Study both AWS and Azure for High-level Cloud Understanding

Recently, I’ve found it’s not enough to simply say, “I’m an AWS person” or “I’m an Azure person” when it comes to learning and understanding the cloud. The greatest benefits in learning surface from seeking a high-level understanding of cloud platforms, in general.

These two leading cloud providers, AWS and Azure, are distinct enough from each other, but fundamentally there are features that neither of them can avoid if they are going to maintain market share, or simply provide viable solutions to their customers.

When folks spin up a VM instance at AWS or in the Azure portal, they are likely to overlook a few critical things. 1) These environments are networks unto themselves, 2) where you might have been operating a private cloud in the past, which is essentially a traditional data center, you're now using a public cloud, and 3) where you might envision your life getting simpler with the cloud, it's actually getting more complicated.

With point #1, AWS and Azure both provide ways for their customers to create networks. There may be slight differences in how these networks are set up, but the concepts are the same. Azure calls theirs virtual networks and AWS describes them as virtual private cloud or VPC instances. (Oracle calls theirs virtual cloud network or VCNs.)

Call it what you want, it is a network. And if you don't know that, you're likely in for a world of hurt later when you need to scale or secure your cloud environment. And if you can't do the same thing in AWS as you do in Azure on this front, then you don't really understand the fundamentals. And if you try another cloud platform and find you simply can't fine-tune networks the way you like, then you'll know why AWS and Azure are leaving them behind.
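The fundamentals really are the same on both sides: whether AWS calls it a VPC or Azure calls it a virtual network, you are carving subnets out of a private address block. A minimal sketch with Python's standard `ipaddress` module shows the kind of planning that applies to either platform (the CIDR below is a hypothetical example, not a recommendation):

```python
import ipaddress

# One address block for the whole network -- a VPC in AWS,
# a virtual network in Azure. Same concept, different name.
vpc_cidr = ipaddress.ip_network("10.0.0.0/16")

# Carve the block into /24 subnets, e.g. one per availability zone.
subnets = list(vpc_cidr.subnets(new_prefix=24))

print(len(subnets))              # 256 possible /24 subnets
print(subnets[0])                # 10.0.0.0/24
print(subnets[0].num_addresses)  # 256 addresses in each subnet
```

If you can reason about address space like this once, you can do it in any cloud's console or CLI.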

With point #2, you are now participating in the use of a public data center. This means it is exceedingly easy to expose your services in ways you maybe didn't intend. Take a careful look at how AWS and Azure overlap, or don't, in their approach to exposing services. If you only ever had your head in AWS, you might assume that certain settings are always set to be public, and the same might be true of Azure.

They're both evolving from month to month in how they handle default public exposure, so I won't go into too much detail. But if you work with them simultaneously while you're learning, you'll make fewer assumptions and ask more critical questions than you would otherwise.

With point #3, generally speaking, your life as an IT professional who uses the cloud to build IT infrastructure and services, and to take advantage of pre-built applications, is about to get a bit more complicated. Why? Because now, in addition to all the systems you need to support locally, which help you move data in and out of the cloud, you need to manage the cloud as well. And if you're using a service that "manages itself," you need to manage perpetual change just to keep up with it.

Understanding integration in both AWS and Azure will help you make fewer assumptions about the way things should be and, again, think critically about the fact that much of the design is actually up to you. You can't afford to turn off your critical thinking skills. Having the courage to ask tough questions is even more important than it has been in the past. So if someone asks you, "Should I learn AWS or Azure, or (name your favorite alternative)?", the answer should be "yes". There is no "or" if you're going to be the master of your own cloud destiny. 🙂

‘The Cloud’ is Still New

It feels like folks have been talking about 'the cloud' forever. But levels of cloud utilization in the form of IaaS, PaaS, etc. have really only ramped up significantly in the last couple of years. The tendency is to think that there are 'cloud' people who were just born knowing 'cloud', and that the chasm between 'cloud' and 'on-prem' is so great that the 'on-prem' folks simply won't understand this new realm.

Fact is, ‘the cloud’ is still new. And no one is born knowing anything, especially not best-practices around cloud utilization, security, and architecture. Herein lies both risk and opportunity. If we can all just put down our pretensions around cloud know-how and get busy learning, we might actually be able to build, configure and secure our cloud environments in a way that delivers consistent, beautiful results.

But the first step is to remind ourselves how new all of this is, and how revolutionary it is. Instead of saying, "Hey, what do you know about cloud? Oh, you don't know anything? Okay, bye," organizational leaders need to say, "Hey, let's get learning! See what you can find out about the cloud that will help us meet our goals." Because the reality is, most of us don't know everything there is to know about the cloud. It is still new! And it is going to be new for a long time!

If leaders don’t charge their teams with learning, these same leaders will have their business strategies singularly handled by vendors — well meaning as they may be. And the best solutions and the most remarkable features of ‘the cloud’ will never arrive. Innovation happens with a sense of ownership and dedication. This is less likely to happen when innovative work is attempted by 3rd parties who have ample room to over promise and under deliver.

The cloud is still new! Let's respect that fact and not presume that the best solutions live elsewhere. Bring your teams into this new world and get ready to be blown away. Give them a chance to learn and innovate; don't write them off. Sometimes the best innovations are right under our noses, but we can't see them because we're blinded by the glare of shiny, well-marketed solutions that can be low on substance.

Security Hygiene is Boring and Critical

This has been said many times before by people many times more credentialed than me. There are sexy vulnerabilities out there that take considerable expertise to understand. Then there are vulnerabilities or configurations that are the equivalent of leaving your car door unlocked.

The calculation so often made goes like this: “it hasn’t happened before”, or “I’ll only be gone for a few minutes”.

Oddly, many who have an incredibly honed financial sense about them, and who understand that 'past performance does not equal future results', have great difficulty extending this concept elsewhere. But nowhere is it more applicable than in security. Past performance does not equal future results! (Or you may have been hacked in the past and you don't know it.)

The oversight that causes an organization to get hacked in the first place is likely something simple. Are you missing two-factor authentication? Are you still using a default login? Is your password “Spring2019” and do you use it everywhere? These are security concerns that don’t take heaps of expertise to understand; they are boring and critical.

Attackers don’t want to work hard to steal data or install ransomware, so they’re likely to look for simple vulnerabilities or poorly configured networks in order to get the job done. Don’t sweat the small stuff, sweat the simple stuff.
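The "simple stuff" is often checkable with very little code. Here's a tiny illustrative sketch of the kind of boring-but-critical password screening described above; the pattern list is a made-up sample for demonstration, not a real audit tool (a real check would use a breached-password list):

```python
import re

# A handful of obviously weak shapes -- season+year passwords like
# "Spring2019", "password"-style strings, and default-style logins.
WEAK_PATTERNS = [
    r"^(spring|summer|fall|winter)\d{4}$",
    r"^password\d*$",
    r"^admin$",
]

def looks_weak(password: str) -> bool:
    """Return True if the password matches an obviously weak pattern."""
    return any(re.match(p, password.lower()) for p in WEAK_PATTERNS)

print(looks_weak("Spring2019"))                    # True
print(looks_weak("correct horse battery staple"))  # False
```

No heaps of expertise required, which is exactly the point: the unlocked-car-door class of problem is cheap to find and cheap to fix.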

“The Cuckoo’s Egg”: An Old Story – New to Me

Two weekends ago I finished reading “Tribe of Hackers: Cybersecurity Advice from the Best Hackers in the World”. (Please read the previous blog entry to learn more.) I was amazed at how many of the “Tribe of Hackers” contributors recommended an old book, “The Cuckoo’s Egg: Tracking a Spy Through the Maze of Computer Espionage,” which was written by Clifford Stoll in 1989.

The story actually begins at Lawrence Berkeley National Laboratory in 1986. I won’t go into too many details about the setting or the time. In computer years, it was ages ago. So my question: “How could such an old book about tracking down a hacker be so routinely recommended by a slew of highly knowledgeable and well-respected info sec professionals?”

Turns out cybersecurity hasn't changed much. In “The Cuckoo’s Egg,” the hacker who is being tracked by Stoll, an astronomer, is aided by all of the following: 1) default credentials, 2) processes that run as root but shouldn't, 3) well-known vulnerabilities, 4) the fact that folks can be fooled into entering their credentials into fake sites, 5) the desire of organizations to not share information, 6) the fact that various US agencies described this sort of attack as not their 'bailiwick', 7) the fact that various agencies didn't have the expertise to fully comprehend the risk to their data and network infrastructures, and 8) the fact that organizations could not possibly imagine someone actually penetrating their 'high security' environments. I'm sure I'm missing a few, but you get the idea.

Besides being a great old book, published when I was a curious, modem-tapping, BBS-surfing adolescent, it's an excellent primer on the foundations of modern cybersecurity. Sure, the technology has changed, but the fundamentals haven't moved an inch. Maybe all cybersecurity professionals have heard of this book except for me, but if you haven't, consider reading it. Even if you're not after the education, it's wonderfully entertaining.

Postman API Learning, Testing, and Development

I'm pretty late to the API game. Recently I was on a call with a handful of security engineers and they explained that they couldn't afford to have their people staring at console screens any more. Instead, they rely almost entirely on APIs to automate and streamline their work. I've been hearing about API development forever, but I'd not gotten past the first hurdle: how to start. My answer to this is Postman.

Once you have an API you want to consume, you can start doing 'POST' and 'GET' requests pronto and see results immediately. Also, one critical tipping point for me was when I watched a number of the introductory videos that Postman provides. For example, I didn't understand what the 'Tests' section was for. The videos demonstrated that this is where you can write JavaScript to traverse the JSON results of your requests.
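The traversal itself is the same idea in any language. Postman's 'Tests' tab does it in JavaScript against `pm.response.json()`; the sketch below does the equivalent in plain Python against a hypothetical response body, just to show the pattern of asserting on fields and digging into nested results:

```python
import json

# A sample JSON body like one you might get back from a GET request.
# The shape here is made up for illustration.
body = json.loads("""
{
  "status": "ok",
  "results": [
    {"id": 1, "name": "alpha"},
    {"id": 2, "name": "beta"}
  ]
}
""")

# The same kind of checks a Postman test would make:
assert body["status"] == "ok"
names = [item["name"] for item in body["results"]]
print(names)  # ['alpha', 'beta']
```

Once that click happens, the 'Tests' section stops being mysterious: it's just small scripts that inspect the response you got back.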

Currently, I'm only using a free account. I'm in learning mode, but as I move toward doing more work with APIs in the future, I'll absolutely be using Postman to test and verify my efforts. It's also a great introduction to the security advantages and disadvantages of using APIs.

If you have a desire to dig into APIs and consider what they can do to add value to your work, try Postman. And don't forget to check out a few of their tutorial videos.

Discovering “2600 Magazine: The Hacker Quarterly”

Not long ago I did one of those “Strengths Finder” assessments put out by the folks at gallupstrengthscenter.com. At the top of my “strengths” list was the designation “Learner”. It essentially confirmed what I already almost knew — that I enjoy learning or getting to a point of understanding on a variety of topics.

Recently a colleague at work recommended that I consider taking a look at 2600 Magazine. So I did. I read the Kindle version of the most recent edition. What I really enjoy about reading the Hacker Quarterly is that it is filled with articles written by people who love to learn and understand things, specifically related to computers and technology.

Also, as someone who works in cyber security, it is exceedingly helpful for me to understand the types of vulnerabilities that are written about in Hacker Quarterly articles. For example, I read an article by an individual who was able to 'investigate' a very large number of routers in Malaysia. Initially, he had resource constraints, but discovered that by using a Spot Instance at AWS he could considerably broaden his reach at a very low cost: ten dollars. I'll be seeking to understand these AWS Spot Instances and the impact they may have on the security of organizations in the future.

By and large the spirit of the “Hacker Quarterly” is centered around learning and understanding. And the culture of the group is such that criminal activity is frowned upon, though they do skirt the edges of legality from time to time. To have a window into this world is marvelous. I’m now reading through a whole ‘digest’ of issues from the past year. And if you’re a “Learner” like me, I suggest you do the same. Here’s their website: https://www.2600.com/

Holiday Hacking with SANS

Perpetual learning is paramount for folks in any profession, but I've found that for individuals who work in cyber security it is absolutely critical. A significant part of the work I do involves knowing what risks lurk, both in the wild and internally, that can stand in the way of an organization's future success. Staying up with these risks, mitigation techniques, and controls is vital.

There are all types of learning that help new concepts find a home in my brain. One comprehensive learning experience that I recommend for anyone in cyber security is an event put out each year by SANS, which is an organization that trains cyber security professionals. The event is called the SANS Holiday Hack Challenge.

This year my 9-year-old son helped me in ways that blew my mind. His little mind went after small details that I thought were insignificant that turned out to be a pretty big deal. He was very excited by what he was able to uncover…and so was I.

The SANS Holiday Hack Challenge introduces cyber security professionals and pen-testers to new technologies and opens their minds to risks and mitigation techniques that they had not previously considered. I greatly enjoy their 'terminal challenges', which provide hints toward solving objectives. Never before had I decrypted HTTP/2 traffic using Wireshark and SSL keys. So awesome! Here's the link for this year's challenge, which has been a wild ride for me, to say the least: https://www.holidayhackchallenge.com/2018/.

Stop in and poke around. Solve a terminal challenge or two then put it on your holiday to-do list for next year. You won’t regret it!

Investigating the Arduino

For Christmas we got our son an Arduino Uno starter kit. It's not officially an Arduino, though. The hardware specifications are the same, but it is made by a company called Elegoo. What we purchased was the "Complete Starter Kit". I highly recommend it. So far we've made prototypes for the following: 1) blinking LED lights, 2) a joystick controlling a servo motor, and 3) an ultrasonic sensor that tells us how far objects are from it. There have been a few other things, but those are what come to mind as I write.

Besides being extremely fun and interesting, these prototypes foster a new understanding of all the electronic things we use and how they may be wired. We could have gotten a kit for a robot or a remote-controlled car, but testing out a range of sensors seems to broaden our view of what's possible. If we decide on a full project, we'll have a much better idea of what we'll need and whether it will work.

Also, as a side note, since I'm using my Chromebook for these projects, I'm not using a locally installed IDE. Instead, I'm paying $1 a month to use the cloud service provided by Arduino for building sketches. So far it has worked flawlessly. Though ChromeOS does have a Linux sandbox now, so I'm going to see if I can install it that way, too.

Amazon Athena – What?

If you’re like many IT professionals who’ve had anything to do with large amounts of data, you’ve become immune to the phrase ‘big data’. Mostly because the meaning behind that phrase can vary so wildly.

Processing 'big data' can seem out of reach for many organizations, either because of the costs in infrastructure required to establish a foothold on this front or because of a lack of organizational expertise. And since the meaning of 'big data' can vary so much, you may find that you're doing 'big data' work and then ask yourself, "Is this big data?" Or an observer can suggest that something is 'big data' when you know full well that it isn't.

With my own background in data, I'm ever curious about what's out there that can make the threshold into 'big data' seem less insurmountable. Also, I'm interested in the security considerations around these solutions.

In the last week or so, I've gotten more familiar with AWS S3 buckets and a querying service called Amazon Athena. Here's the truly amazing thing. You can simply drop files in an S3 bucket and query them straight from Amazon Athena. (There are just a couple of steps to go through, but they are mostly trivial.) And for the most part, there's not much of a limit on how much data you can query and analyze. You can scan 1 TB of data for $5. What? That's right. And you didn't have to set up servers, database platforms, or any of that. I'll be exploring Amazon Athena more and more over the coming weeks. If you have an interest in this sort of thing, I suggest you do the same.
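To make that $5-per-TB figure concrete: an Athena query is just SQL over your S3 files, and the bill is driven by bytes scanned. A small back-of-the-envelope sketch (the table and column names are hypothetical, and the math uses decimal terabytes as an approximation of how AWS meters scans):

```python
# An Athena query is plain SQL against files sitting in S3.
# Table and columns below are made-up examples.
query = """
SELECT user_id, COUNT(*) AS events
FROM access_logs
GROUP BY user_id
"""

PRICE_PER_TB = 5.00  # the per-TB-scanned price mentioned above

def estimated_cost(bytes_scanned: int) -> float:
    """Rough cost estimate for an Athena query, given bytes scanned."""
    return bytes_scanned / 1_000_000_000_000 * PRICE_PER_TB

print(round(estimated_cost(1_000_000_000_000), 2))  # full 1 TB scan: 5.0
print(round(estimated_cost(50_000_000_000), 2))     # a 50 GB scan: 0.25
```

This is also why columnar formats and partitioning come up so often with Athena: anything that shrinks the bytes scanned shrinks the bill.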

One note: Google has something similar called BigQuery, so that might be worth a look as well. I've explored BigQuery briefly, but I keep coming back to various AWS services since they seem to be holding strong as a dominant leader in emerging cloud technologies. But as we all know, the emerging technology landscape can change very quickly!

Retropie Old School Gaming Goodness

For some time, I've been interested in learning about the Raspberry Pi. It's a little bare-bones computer that packs a big punch. And to top it off, it's quite affordable. Through work I heard about a way to use a Raspberry Pi for an OS called Retropie. Retropie is an emulation platform that lets you play scores of old games…if you have the digital files for them, many of which can be found with the help of Google.

I'm not much into modern video games (as in games from the last 20 years or so), but I did play NES games back when I was in jr. high and high school. And I do still have my original NES, but it has a number of issues that make it less than reliable for playing. My kids are interested in the older games because I'll actually join them when they play. And, quite frankly, because the older games are super fun to play and easy to learn.

Anyway, Retropie is a great way to get familiar with the Raspberry Pi. You simply burn the Retropie image to a micro SD card, pop it in the micro SD card slot, and boot it up! There are a few other things you need to know, but that's the gist of it. Get a few games, a controller or two, have a monitor with an HDMI plug-in handy, and you're good to go. That's a bit of an over-simplification, but please do explore Retropie and Raspberry Pi if you're at all interested in this sort of thing and are looking for a good way to get familiar with the Raspberry Pi world.

Here are a couple key links:

https://www.raspberrypi.org/

https://retropie.org.uk/ 

Jeshua