Sooooooo last night my roommate and I started talking about ways to make entry/exit of the apartment easier, and I brought up Smart Door Locks. This naturally led us to a discussion of how we could get someone into the actual building so we could then let them use the Smart Door Lock to get into our specific apartment.

He mentioned it would be cool to have some sort of remote control for the buzzer/intercom system, and I told him it’s funny he said that, because I’d been thinking about it too.  My guess was that it’s probably a simple wire jumper that triggers it.

Well, I was right.  And now I have a new project to work on. 🙂  This will be an ongoing series, because I think this is fscking awesome.

Continue reading


So, I’m a bit behind on posts this month because I’ve been dealing with landlord and contract negotiations, as well as studying for my boat safety certificate exam. Sorry about that!

Today’s question is about Apache and auth: why is it so peculiar?

Continue reading

But in an odd way.

Per reporting from TechCrunch, Apple has gone ahead and released an update (iOS 9.2.1) for iPhone devices with third-party-replaced Home Buttons and screens.

In case you missed the gaffe: anyone who had a third-party shop replace their screen or Home Button and then attempted the latest iOS update (primarily via iTunes) received an Error 53 during the update, plus an Apple-branded extra: a bricked phone.  Literally bricked.  You couldn’t go back.  You couldn’t go forward.  You couldn’t even start your phone.  The device permanently showed “Connect to iTunes,” and connecting it to a PC with iTunes reported “Error 53” for infinity.  There was simply NOTHING they could do.

Apple has gone back and released a software update which will get you back to a working state, with one caveat: you will LOSE TOUCH ID.

Continue reading

The FBI demand for Apple to unlock Those-Who-Shall-Not-Be-Named’s iPhones (the San Bernardino shooters) has officially been ruled on by a federal judge. The verdict is in: Apple must give investigators access to the encrypted data on the iPhone used by the shooters. (Link)

But wait, there’s more!  The FBI demand (apparently) includes a requirement that Apple help decrypt the data.  This is something that Apple steadfastly (and rightly so) claims to not have the ability to do (if you believe Tim Cook, which I do in this case).  The FBI claims that Apple simply MUST have that ability, and, lacking that ability currently, must develop a method to do so.

Continue reading

Original Articles: MSNBC, CNBC, Business Insider, Washington Free Beacon

Well, this is a long time coming in my book.  Just wow.  We first learned that Hillary Clinton ran a private email server where she was corresponding with people about government matters while she was the Secretary of State in March (or so) of 2015.  It is now February 2016.  That is almost a full year of “What the hell were you doing, FBI?!?”  I really don’t understand why it took this long to get any sort of meaningful announcement from the FBI.  A former inspector general, according to this New York Post article, claims that Hillary “never set up an agency email address for her in the first place” – this would mean that any and all communication she was doing as the Secretary of State via email had to be over these insecure channels (namely her private email server).

Continue reading

Recently I’ve heard of a few states (NY and CA, I’m looking in your direction) thinking about outright banning the sale of phones that are capable of encrypting phone contents.  Specifically they claim that the state (namely the police) should have the ability to decrypt and access all the contents of your personal mobile devices (because reasons).  Interestingly enough though, these two states have taken the stance of punishing the seller, not the user (this is a common theme in law).  That means that Apple, Google, and Microsoft (and all other cell phone manufacturers like LG, HTC, OnePlus; and all cell phone providers like Verizon, T-Mobile, Sprint) would be unable to sell their equipment in NY and CA (or face stiff penalties of up to $2,500 in the case of CA).  These penalties would be retroactive (how is that even legal???) back to January 1st, 2016.  I don’t see how any of this makes any sense.

Look, I get it.  Law enforcement agencies exist to arrest and subsequently convict people of crimes (under the pretense of the greater public welfare and trust).  Law enforcement needs information to make their cases as air-tight as possible.  Law enforcement also understands that people have their lives on their phones.  Law enforcement then made the (what I can only assume they thought to be) logical jump to say: we need complete and unrestricted access to cell phone contents.  I do not see how a cell phone (and its contents) is not protected by the 4th Amendment of the Constitution of the United States.  The police cannot just barge into your home (without a warrant) and root around for anything that might be suspected of being used in a crime.  Even more than that, even if they do get a warrant, it has to be very specific (at least in theory; in practice lately this does not seem to be the case, but that isn’t really in the purview of this discussion) or else the results of the search can be ruled inadmissible.

Encryption is a natural backlash to a string of perceived slights against the public by law enforcement.  Encryption simply denies access to the information to anyone without the access code.  Courts have recently held that the 5th Amendment applies to the access codes to your devices.  This means that if your phone is encrypted, you cannot be legally coerced into providing your password.  Therefore, an encrypted device would be largely inaccessible to law enforcement.  I can see why they’d be bothered by this.  What I can’t see is how they have any legal basis to declare that encryption is inherently bad (unless it’s in their hands).

Encryption is your final line of defense against people who would use your mobile phone and the data therein to build a case against you (even for something you may not have initially been suspected of).  You should be using it (and there are instructions later in this post about getting it done) for your own (and your contacts’) well-being and protection.  It also has the handy ability to turn your phone into a paperweight upon being stolen (most devices encrypt the bootable partitions of the device, meaning you must enter the decryption code before the device will even start up, and meaning you cannot format or recover the device without the code).

The funny part is that these laws supposedly would apply to goods purchased outside the state and shipped in as well, but not to goods you physically purchase in another state and then transport in by hand.  This means you would not be able to buy an Apple phone in NY, or via Amazon shipped into NY, but you would be able to drive into NJ, buy the phone, then drive back.  Are they seriously trying to kill their own tax revenues by limiting technology sales?  That seems like a recipe for disaster.

All things considered: I am not surprised by NY claiming that encryption is evil and that police should have access to your data at all times.

I am, however, completely surprised (and taken aback) by CA making the same claim.  I wonder how Apple and Google feel about their headquarters states now?  It astounds me that a state so rife with technology can be so utterly left in the dark ages via their politics.

And if these states honestly expect Apple and Google to stop full device encryption, then I think those states are definitely in for a rude awakening (assuming the bill even passes, which I doubt will happen).  Apple’s CEO Tim Cook challenged this anti-encryption mentality in early 2015 with his statement: “history has shown us that sacrificing our right to privacy can have dire consequences.”  I am honestly surprised that the heads of Google and Microsoft haven’t come out with a similar statement or sentiment.  Regardless, I have no doubt that any of these companies would be willing to forgo sales in a particular state (knowing full well that someone who wants their device would just go one state over to get it).

Whatever the case may end up being one thing is clear: 2016 is going to be an interesting year for encryption technology and end user rights.

For your information:

  1. iPhone
    1. Encrypt your iPhone device
    2. Encrypt your iOS backups
  2. Android
    1. Encrypt your Android device
  3. Windows Phone
    1. Encrypt your Windows Phone device

Source: WPScan by the WPScan Team

If you’re running a WordPress site then you really should be using the WordPress scanner WPScan.  It’s SUPER simple to install and very user-friendly.

I heard about it from ma.ttias.be’s website, which I’ve been following for a while now since he’s pretty spot-on when it comes to IT security and does a good deal of work with Zabbix (Mobile Zabbix UI, if you haven’t checked it out, is pretty sweet).

Returning to the original subject: WPScan.

For me, installation was a simple series of commands (I’m running Ubuntu 14.04.2 LTS):

  1. sudo apt-get install libcurl4-openssl-dev libxml2 libxml2-dev libxslt1-dev ruby-dev build-essential
  2. git clone https://github.com/wpscanteam/wpscan.git
  3. cd wpscan
  4. sudo gem install bundler && bundle install --without test
  5. ./wpscan.rb --update
  6. ./wpscan.rb --url http(s)://yourwebsite.whoa

Running the scan on my website revealed three things: an HTML file that reveals the WordPress version (not in and of itself a vulnerability, but why give an attacker any information right off the bat?); open registration being enabled (I don’t mind – it isn’t a vulnerability, it just results in me getting a LOT of spam); and directory listing being enabled (pretty significant in my book).
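Each of those findings has a quick fix.  Here’s a sketch of the cleanup, run against a scratch directory so it’s safe to execute as-is; in practice you’d point `WP_ROOT` at your real WordPress document root (the paths and the `.htaccess` approach are assumptions for a stock Apache setup with `AllowOverride` enabled):

```shell
# Scratch copy so this is safe to run as-is; use your real WordPress
# root (e.g. /var/www/html) in production.
WP_ROOT="$(mktemp -d)"
touch "$WP_ROOT/readme.html"   # stand-in for the version-leaking file

# 1. Remove the readme file that reveals the WordPress version:
rm -f "$WP_ROOT/readme.html"

# 2. Disable directory listings via .htaccess:
echo "Options -Indexes" >> "$WP_ROOT/.htaccess"
```

(Open registration, if you do want it off, is a checkbox in the WordPress admin under Settings → General → Membership – no shell needed.)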

All in all, the process took about 15 minutes from install to secured.

This is highly recommended in my book.

Cheers,

-M

So it’s been a little over 24 hours since the Heartbleed Bug and the associated fixes were announced.  If you haven’t checked your SSL-enabled site yet, I highly recommend that you do so.  The test is available at Qualys SSL Labs’ SSL Tester; if you don’t pass, the site will give you recommendations on how to fix it.  I’ve been testing our web-facing equipment at work all morning, and the results are largely decent, with a few minor exceptions.

That being said, the question of the hour becomes: how much damage was done?

The answer to this question is largely unknown.  If you haven’t been following the Heartbleed Bug, I will try to explain it as best I understand it.

Thanks to Nick, I understand that the bug allowed a remote attacker to read data from server memory.  The attack can be repeated many times, allowing an attacker to basically dump the web server’s memory completely.  Things like passwords, usernames, and security keys could be exposed.  Usernames and passwords are one thing: users can change them almost at will (and a lot of people, myself included, will be changing ALL their passwords over the next few days), so they’re largely not the problem.

The real problem lies with the private keys for SSL certificates.  If a certificate’s private key was compromised before the bug patch was deployed to that server, then the server must still be considered compromised until its SSL certificates are regenerated (which I will also be doing this week, once I get Apache upgraded from 2.2.22 to 2.4.x).  If an attacker has the private key for an SSL certificate, then the encryption that certificate provides is basically null and void: the attacker can decrypt data fairly easily.
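If you’re not sure whether a given box is affected, the vulnerable range is a quick check: Heartbleed exists in OpenSSL 1.0.1 through 1.0.1f, and 1.0.1g carries the fix.  A minimal sanity check:

```shell
# Print the OpenSSL version this system is running.  Heartbleed affects
# 1.0.1 through 1.0.1f; 1.0.1g is fixed.  Caveat: distros like Ubuntu
# ship backported fixes that keep the old version string, so check your
# package changelog too before trusting the number alone.
openssl version
```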

So at the end of the day, the question becomes: how bad is this?

The answer is: REALLY, REALLY, REALLY (potentially) BAD

Recommendations:

  1. For the love of god, if you haven’t patched your SSL library (OpenSSL) yet, please do so.  The attack information has been published for over 24 hours.  Attacks will become prevalent VERY soon.
  2. If you do any sort of e-commerce now, or have the potential to do so any time soon (or if you even have users who log in to your pages to post content, etc.), then REGENERATE YOUR SSL CERTIFICATES WITH NEW KEYS.  Otherwise, your site integrity is basically worthless.
  3. Change your passwords for critical sites.  Things like Google accounts, bank accounts, and shopping accounts are all big targets.  Do you want unexpected purchases and charges on your cards?  I don’t think so.
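For item 2, “regenerate” means starting from a brand-new private key, not just reissuing the certificate over the old one.  A hedged sketch with OpenSSL (the filenames and the CN are placeholders; your CA may want different CSR fields):

```shell
# Generate a brand-new 2048-bit RSA private key plus a CSR to send to
# your CA.  The old key must be treated as leaked -- do not reuse it.
openssl req -new -newkey rsa:2048 -nodes \
  -subj "/CN=example.com" \
  -keyout fresh.key -out fresh.csr
```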

I do not wish to seem alarmist or even crazy, but cyber security is a BIG DEAL and we need to pay attention to it.

Relevant sites for extra reading:

Heartbleed Bug
Matthew D Fuller’s Blog
Business Insider’s Article
Storify’s Article
Relevant XKCD