I have been an IT professional for more than four decades, starting in the 1980s. So much has changed, yet so much has stayed the same or, more appropriately, come back around to the same. That is one of the reasons I love embracing innovation. Innovation is built on the foundation of what we already know. However, if you do not break down a new technology, you may not see the familiar, foundational elements. When we rely on the hype, the evangelist's view of why this new technology is the greatest thing since sliced bread, we can miss those familiar elements.
Looked at from that perspective, new technologies can be frightening. New terms such as "Internet of Things," "blockchain," "quantum computing" and "machine learning" do not help matters. However, when we take these technologies apart, we often find that while there is an element of something new, there is also a lot we already know how to audit and protect, and for which we already know how to define risk.
Much of Computing Is Cyclical
Speaking of foundations, a lot of innovation is simply the reuse of older ideas. It is not just Hollywood that recycles previous award winners. In many ways, the familiar saying, “There is nothing new under the sun,” applies. Here are several examples.
Virtualization and Cloud Computing
A great example of a computing idea that has come back around is virtualization. Mainframe platforms had means of virtually carving up compute resources dating back to before the advent of client-server platforms. After all, a mainframe is a large computing platform with significant I/O capability, and very few processes or systems could fully consume its complete compute power. It is also a relatively expensive investment, one on which any organization would want to maximize the return on investment (ROI). Therefore, it made sense to "carve up" that power and spread it out. This is the basic concept of virtualization.
Cloud computing is another nod in that direction. It used to be that you had a relatively low-powered client that interfaced with a mainframe. Perhaps you even rented time on the mainframe. It was not "your" computer, but you were able to run workloads on it. That is the cloud computing model: you run workloads, even if it is a Software as a Service (SaaS) solution, on someone else's system. Therefore, cloud computing as a concept is not new, although how things are implemented and secured is.
Edge Computing
We like to give new names to old ideas. Another example is edge computing. About a year ago, I was having a conversation with a scientist and consultants from one of the larger IT companies in the world. The scientist spoke excitedly about how many early cloud adopters started moving all of their data straight to the cloud, even if they were going with a hybrid approach. What they found was that constantly moving all the data to the cloud first was not efficient: they were encountering issues transmitting as much data as they were producing. Moreover, because they were moving data into the cloud first, they had to wait until all the data needed for a given process arrived. This delayed processing and caused enterprises to start compiling data at the various locations where the data were collected, doing the processing at those sites, and only then pushing what was absolutely needed into the cloud. The scientist called this "edge computing."
My first thought was, "That is how we used to handle branch offices with slow Frame Relay connections." When a remote office only had a 64-kilobit-per-second uplink back to the main office, data transfers had to be efficient. In many cases, we could not push everything and then let processing occur; the bandwidth available was too small. As a result, systems were developed to collect and process data locally and then push only what was needed to the appropriate destination. Taking this a step further, what we were doing was something we also called "distributed computing."
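To make the pattern concrete, here is a minimal sketch in Python of that same collect-locally, push-only-what-is-needed approach. The function names and readings are hypothetical illustrations, not taken from any particular product.

```python
# A minimal, hypothetical sketch of edge/distributed processing:
# aggregate raw readings at the remote site and transmit only the
# summary over the narrow uplink. All names here are illustrative.
from statistics import mean

def read_sensor_batch():
    """Stand-in for raw data collected and stored at the remote site."""
    return [20.1, 20.4, 19.8, 20.0, 21.2]  # e.g., temperature readings

def summarize(readings):
    """Do the processing locally; keep only what headquarters needs."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
    }

def push_to_destination(summary):
    """Stand-in for the slow link; only a few bytes travel, not the batch."""
    print("Transmitting:", summary)

raw = read_sensor_batch()            # collected and kept locally
push_to_destination(summarize(raw))  # the summary travels, not the raw stream
```

Whether the constraint is a 64 kbps Frame Relay uplink or the volume of sensor data overwhelming a cloud ingestion pipeline, the design choice is the same: process where the data lives and move only the result.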
THE KEY IS TO TAKE WHAT YOU ALREADY KNOW AND SEE HOW THE REUSE OF THOSE TECHNOLOGIES HAS MODIFIED AND/OR EXTENDED THE TECHNOLOGIES.
Understanding How the Old Is Applied in a New Way
As an auditor, you have likely already been exposed to an older form of almost any new technology, as the previous examples suggest. Knowing that, the key is to take what you already know and see how the reuse of those technologies has modified and/or extended the technologies. Just as new technologies leverage what we already know, we can use our own knowledge to understand the changes and innovations we encounter. There are several good examples.
Listening Devices and Our Digital Assistants
Listening devices have been around in various forms for a long time, and we have built structural impediments and other countermeasures against them. For instance, the US military and federal government made continuous efforts around something code-named TEMPEST.1 Many of the practices and countermeasures developed under TEMPEST are applicable today.
After all, many of us carry around potential listening devices without thinking about it: our smartphones and tablets. Recently, security researchers were able to turn Amazon- and Google-approved applications (apps) leveraging Alexa and Google Home into voice-capturing systems.2 In other words, those devices we find so convenient present the same scenario and risk as someone planting a "bug" on us. How will workplaces adapt? Likely with the same technologies and techniques used to dampen and eliminate signal transmission under TEMPEST.
Edge Computing Is a Form of Distributed Computing
What about edge computing? We are not necessarily talking about actual remote offices with a low-bandwidth network connection. However, data are still being stored at the site and processed there. How can we determine the risk and put the controls in place to protect our organization?
While the amount of bandwidth is different, the methods for auditing and controlling branch offices simply have to be adapted. In years past, we worried about a security breach at those offices, but it was not a computing-related attack we were protecting against; it was a physical one. When you compare the two scenarios, old vs. new, the types of issues to overcome are similar:
- Remote offices do not tend to have as much traffic/visibility as main offices. It is harder to notice an intrusion.
- When data are being stored at a branch office, and not all of the data are being moved to a central location, there is a greater risk of data loss. Previously, we worried about a building fire destroying physical records. Now we are more worried about an adversary wiping or stealing a system with that data on it.
- Processes at remote sites receive less visibility and scrutiny than they do at the main office. Therefore, there is more risk to any process that involves manual human effort. Previously, we worried about documents and their processing, especially in physical form. Now we worry about processes running on computing platforms at the location.
While edge computing may be the new phrase, and while we are now dealing with a more digital situation, the challenges to an organization are almost identical to what they were in the past. We may have to use different controls, ones appropriate for the now-digital nature of the data, but the same types of scenarios exist. Therefore, as auditors, we know what to look for and what to evaluate to ensure we can protect our organizations.
BECAUSE IT SOUNDS LIKE EVERYTHING IS NEW ALL THE TIME, AUDITING AND DEVELOPING CONTROLS MAY SEEM LIKE A DAUNTING AND NEAR IMPOSSIBLE TASK.
Deep Fakes
Leighton Johnson, CISSP, CISM, has an excellent talk on "Auditing Big Data Systems."3 In that presentation, he discusses what to do with live streams, since there are so many of them now. For instance, if your organization is analyzing social sentiment, it probably has something constantly watching Twitter and Facebook streams. Those systems do not give you discrete sets of data. Other systems deliver streams of data as well, such as anything feeding a security information and event management (SIEM) system. In the case of streams, analysis has to be done based on the reputation and trustworthiness of the stream. Why does this matter? Let us close by looking at "deep fakes."
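As a sketch of what that reputation-based analysis might look like, the short Python example below gates incoming stream events on a source trust score before they feed downstream analysis such as a SIEM. The sources, scores and threshold are assumptions made for illustration, not a real feed or API.

```python
# Hypothetical sketch: admit stream events to downstream analysis
# only when the source's reputation clears a trust threshold.
# The reputation table and threshold values are illustrative.
REPUTATION = {
    "internal_firewall": 0.95,
    "partner_feed": 0.70,
    "unverified_social": 0.30,
}
TRUST_THRESHOLD = 0.50

def admit(event):
    """Accept an event only if its source is sufficiently trustworthy."""
    score = REPUTATION.get(event["source"], 0.0)  # unknown sources score 0
    return score >= TRUST_THRESHOLD

events = [
    {"source": "internal_firewall", "msg": "port scan detected"},
    {"source": "unverified_social", "msg": "claimed outage"},
]
trusted = [e for e in events if admit(e)]
print(trusted)  # only the firewall event survives the reputation gate
```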
Deep fakes are pictures and videos manipulated by artificial intelligence (AI) to make it look as though something happened that really did not. The idea is not new. Fooling the enemy in warfare by appearing smaller than you are when you are large in number, or appearing larger than reality, is something Sun Tzu wrote about in The Art of War,4 which dates from antiquity. However, the sophistication and capabilities of modern technology may bring us to a point where we simply cannot tell the difference between what is real and what is not by analyzing the image or video alone. We will need to rely on other methods of validation, such as those we are already having to apply to big data.
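As one illustration of such out-of-band validation, the sketch below compares a received file's cryptographic hash against a digest the source published over a trusted channel. This does not detect manipulation on its own; it only confirms we are analyzing the copy the source actually released. The content and digest here are placeholders, not real media.

```python
# Hypothetical sketch: validate received media against a digest the
# trusted source published out of band, rather than trusting our eyes.
import hashlib

def sha256_hex(data: bytes) -> str:
    """Compute the SHA-256 digest of the supplied content."""
    return hashlib.sha256(data).hexdigest()

# In practice, video_bytes is the media we received, and published_digest
# arrives from the source over a separate, trusted channel.
video_bytes = b"illustrative stand-in for a received media file"
published_digest = sha256_hex(b"what the source actually released")

if sha256_hex(video_bytes) != published_digest:
    print("Content does not match what the source published; treat as suspect.")
```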
Conclusion
We say that in information technology, change happens at breakneck speed. After all, innovation is pushing us forward. Because it sounds like everything is new all the time, auditing and developing controls may seem like a daunting, near-impossible task. However, in almost everything new, there is plenty of old. The key is to take the time to look at a technology in detail: see how it works, what it is built upon, and where we can apply what we already know and understand. In many cases, we already have the tools we need to protect our organizations. We might have to modify them a bit, but we are not operating in completely new territory. That is why we should not be afraid of what is to come. We have a way forward, even with the rapid pace of change in today's world.
Endnotes
1 National Security Agency, "TEMPEST: A Signal Problem," PL86-36, released 29 September 2007, USA, http://www.nsa.gov/Portals/70/documents/news-features/declassified-documents/cryptologic-spectrum/tempest.pdf
2 Goodin, D.; "Alexa and Google Home Abused to Eavesdrop and Phish Passwords," Ars Technica, 20 October 2019, http://arstechnica.com/information-technology/2019/10/alexa-and-google-home-abused-to-eavesdrop-and-phish-passwords/
3 Johnson, L.; "Auditing Big Data Systems," slides from presentation to ISACA Charlotte (North Carolina, USA) Chapter, 5 June 2018, http://www.isaca.org/chapters3/Charlotte/Events/Documents/Event%20Presentations/06052018/Auditing%20Big%20Data%20Systems%20(Johnson).pptx
4 Sun Tzu; The Art of War, translated by Lionel Giles, Internet Classics Archive at Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, USA, http://classics.mit.edu/Tzu/artwar.html
K. Brian Kelley, CISA, CSPO, MCSE, Security+
Is an author and columnist focusing primarily on Microsoft SQL Server and Windows security. He currently serves as a data architect and an independent infrastructure/security architect concentrating on Active Directory, SQL Server and Windows Server. He has served in a myriad of other positions including senior database administrator, data warehouse architect, web developer, incident response team lead and project manager. Kelley has spoken at 24 Hours of PASS, IT/Dev Connections, SQLConnections, the TechnoSecurity and Forensics Investigation Conference, the IT GRC Forum, SyntaxCon, and at various SQL Saturdays, Code Camps, and user groups.