Panel #3: Securing Mobile Devices (Panel Summary)

Posted: 13 Apr 2012 03:53 PM PDT

Tuesday, April 3, 2012

Panel Members:
Saurabh Bagchi, Purdue University
David Keppler, MITRE
Jeremy Rasmussen, CACI

Panel summary by Robert Winkworth.

The panel was moderated by Keith Watson, CERIAS, Purdue University.

In light of its unprecedented growth, wireless mobile communications remains a major focus of security research. The stated purpose of this panel was to address the challenges of securing data and processing, limiting communication to designated parties, protecting sensitive data against loss of the device, and handling new classes of malware.

Professor Bagchi opened the discussion with these key points and predictions:

- 3G routing often circumvents institutional barriers and filters.
- Information is leaking from one application to another within the device.
- More anti-malware software packages are sold now, and this will increase.
- Virulent code will spread by near-field technologies such as Bluetooth.
- Unauthorized remote monitoring is becoming more lucrative.
- Encryption for mobile services will improve in the future.
- Behavior-based detection will become more popular.
- New features are often rushed to market before being functionally secure.

MITRE's David Keppler joined the discussion with these thoughts:

- Mobile devices are single-user devices and are highly personalized.
- On the device, we are separating apps rather than users.
- Contacts, social network data, banking info, etc. are stored on mobiles.
- Locking down devices can reduce productivity; users like to have one device for many different activities.
- A single compromised device can enable a threat against many network users.
- Mobiles are "always connected", and that brings security implications.

CACI's Jeremy Rasmussen contributed:

- DoD facilities are still trying to prevent mobile activity on premises.
- New proposals would extend popular connectedness to government workers.
- Policy is lagging behind what technology provides.
- Everything needed, even to NSA standards, is available as free software.
- Vouching for a unit means vouching for every combination of apps it can run.
- The US government struggles greatly to keep pace with technology.

The audience then submitted questions:

Attendee: "What will it take to make mobiles as secure as desktops?"

David: "I would argue that the vulnerabilities of a handheld are actually no worse than those of a laptop. A proper risk assessment should be done for each. Expect that exploits will always be possible, and invest accordingly."

Saurabh: "Protocols and architecture need to be standardized. This will be helpful to developers. And we need openness in standards."

Attendee: "Does it seem inevitable that Android will allow lower-level access to the hardware in the future?"

Jeremy: "Yes, and that can benefit the user, who really should unlock the device and install a personalized solution. We must have root access to the phone to get better security. An app cannot protect the user from system abuses that occur at a lower level than the app."

David: "I agree. What we must do is break the current security in order to rebuild it in a more robust way. There are also some underlying market issues at work here. Commercial products are unfortunately vendor-specific, but need to be standardized. How can this happen where there is DRM?"

Attendee: "What are the key differences in user experience between desktop and mobile?"

Saurabh: "Energy consumption, bandwidth, and limitations in the user interface."

David: "Users trust mobiles MORE rather than less than their desktops. They have not grasped the magnitude of the mobile threat."

Keith: "What advice would you have for CSOs/CIOs as they face these threats?"

Saurabh: "CSOs and CIOs don't ask me for advice! [laughter] What I would recommend, though, is strong isolation between applications, and a means to certify them before loading."

David: "There are some utilities available that employers can have users run if they're going to be on a private network. Some risk is inevitable, though. There is no perfect solution."

Jeremy: "Yes—NAC (Network Access Control) used to be required for user devices before they'd be allowed on a corporate network. We need that for mobiles, but I don't see how it's possible; it can be circumvented so easily."

Panel #2: Big Data Analytics (Panel Summary)

Posted: 13 Apr 2012 03:44 PM PDT

Tuesday, April 3, 2012

Panel Members:
William S. Cleveland, Purdue University
Marc Brooks, MITRE Corporation
Jamie Van Randwyk, Sandia National Laboratories
Alok R. Chaturvedi, Purdue University

Panel summary by Matt Levendoski.

The panel was moderated by Joel Rasmus, CERIAS, Purdue University.

A quick review of big data: big data represents a new era in data analysis in which the volume of data to analyze is so large that it cannot be handled with traditional database technologies and algorithms. The size of the data sets that need to be collected, stored, shared, analyzed, and visualized continues to grow as information is produced at an unprecedented rate by ubiquitous mobile devices, RFID technologies, sensor networks, web logs, surveillance records, search queries, social networks, and so on. Increasing volume is only one of the challenges of big data. In fact, Gartner analyst Doug Laney defined the big data challenges and opportunities as the "3 Vs":

- Volume: the increasing volume of data, as mentioned above.
- Velocity: the time constraints in collecting, processing, and using the data. A traditional algorithm that can process a small data set quickly may take days to process a large one and produce results. For real-time needs such as national security, surveillance, and health care, taking days is no longer good enough.
- Variety: the increasing array of data types that need to be handled. This includes all kinds of structured and unstructured data: audio, video, images, transaction logs, web logs, web pages, emails, text messages, and so on.

Panel discussion: First, each of the panelists gave their perspective on, and experience with, big data analytics.

William S. Cleveland, Shanti S. Gupta Professor of Statistics, Purdue University, describing his research group's challenges and experience in handling large volumes of data, presented their divide and recombine (D&R) approach: parallelize processing by dividing the data into small subsets and applying traditional numeric and visualization algorithms to those subsets, exploiting the parallelism exhibited by the data itself. Cleveland described their tool RHIPE, built on this concept and available to the public at www.rhipe.org. RHIPE is a merger of R, a free statistical analysis package, and Apache Hadoop, an open-source MapReduce framework. A conceptual sketch of the D&R idea follows.
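The following is a minimal sketch of the divide-and-recombine idea in Python; it is not RHIPE itself (which pairs R with Hadoop), and the function names and the toy statistic are illustrative assumptions only.

    # Divide-and-recombine (D&R), conceptually: split the data into small
    # subsets, apply a traditional method to each subset in parallel, then
    # recombine the per-subset results. A local process pool stands in for
    # the MapReduce workers that RHIPE would provide via Hadoop.
    from multiprocessing import Pool
    import random

    def divide(data, num_subsets):
        """The 'divide' step: split the data into roughly equal subsets."""
        return [data[i::num_subsets] for i in range(num_subsets)]

    def analyze(subset):
        """Apply a traditional method to one small subset. Here a toy
        per-subset mean; RHIPE would run arbitrary R code instead."""
        return sum(subset) / len(subset), len(subset)

    def recombine(results):
        """The 'recombine' step: merge per-subset statistics into one
        overall estimate (a weighted mean, which is exact for the mean)."""
        total = sum(mean * n for mean, n in results)
        count = sum(n for _, n in results)
        return total / count

    if __name__ == "__main__":
        data = [random.gauss(0, 1) for _ in range(1_000_000)]
        with Pool() as pool:
            results = pool.map(analyze, divide(data, 8))
        print("D&R estimate of the mean:", recombine(results))

For a statistic like the mean, recombination is exact; for more complex analyses, D&R trades a small approximation for the ability to run standard methods on data that would otherwise be too large.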
Marc Brooks, Lead Information Security Researcher, MITRE Corporation, focused mainly on anomaly detection in large data sets. He raised the question of how one can detect an anomaly without sufficient test data sets, which are, in his opinion, expensive to create. For this reason, Brooks sees a trend away from supervised learning toward unsupervised learning such as clustering. Most big data sources provide large amounts of unstructured data; we know how to handle structured data well because a schema already exists for it. He asked what the effective ways of handling unstructured data are, and argued that there should be a fundamental change in the way we model such data. He also touched on what it takes to be a data scientist, which is becoming an attractive career path these days; in his view, the skill set is a mixture of software engineering, statistics, and distributed systems. A toy example of the clustering-based approach he mentioned appears below.
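As a rough, self-contained illustration of unsupervised, clustering-based anomaly detection: fit clusters to unlabeled data, then flag points far from every centroid. This is a sketch only; the synthetic data, the deliberately tiny k-means, and the 99th-percentile cutoff are all assumptions, not anything the panel prescribed.

    # Unsupervised anomaly detection via clustering: no labeled training
    # set is needed; points that sit far from all cluster centers are
    # flagged as suspicious.
    import math
    import random

    def dist(a, b):
        """Euclidean distance between two points."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def centroid(points):
        """Component-wise mean of a non-empty list of points."""
        n = len(points)
        return tuple(sum(p[d] for p in points) / n
                     for d in range(len(points[0])))

    def kmeans(points, k, iters=25):
        """A deliberately tiny k-means; real systems would use a library."""
        centers = random.sample(points, k)
        for _ in range(iters):
            clusters = [[] for _ in range(k)]
            for p in points:
                nearest = min(range(k), key=lambda j: dist(p, centers[j]))
                clusters[nearest].append(p)
            centers = [centroid(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        return centers

    if __name__ == "__main__":
        random.seed(1)
        # Unlabeled "normal" observations plus one planted oddball.
        data = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(500)]
        data.append((9.0, 9.0))
        centers = kmeans(data, k=3)
        scores = [min(dist(p, c) for c in centers) for p in data]
        cutoff = sorted(scores)[int(0.99 * len(scores))]  # 99th percentile
        flagged = [p for p, s in zip(data, scores) if s > cutoff]
        print("flagged as possibly anomalous:", flagged)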
Jamie Van Randwyk, Technical R&D Manager, Sandia National Laboratories, started off with the relativity of the term "big data": in his opinion, big data means different sizes and complexities to different organizations, particularly regarding the volume at which data counts as "big". Randwyk mentioned that while commercial entities such as Amazon, Microsoft, and Rackspace handle the big data needs of industry, Sandia mainly focuses on US government agencies. He noted that although we use Hadoop and other technologies to perform analytics and visualization on large volumes of data, we still do not know how to secure the data in these big data environments. Randwyk and his team deal mainly with cyber data, which is mostly unstructured, and he pointed out the challenge of analyzing large volumes of unstructured data given the lack of a schema.

Alok R. Chaturvedi, Professor, Krannert Graduate School of Management, Purdue University, opened with the idea that one has to collect as much information as possible from multiple sources and make the real information stand out. Chaturvedi briefly explained his group's big data analytics work on real-time monitoring of multiple markets and multiple assets. A challenge in doing so in the real world is that data is often inconsistent and fragmented. They build behavioral models from data feeds from sensors, news feeds, surveys, and political, economic, and social channels, and use these models to perform macro market assessments by region in order to spot investment opportunities. Chaturvedi thinks big data analytics will continue to play a key role in such analysis.

After the panelists' short opening talks, the floor was opened to questions from the audience.

Q: Is behavioral modeling effective? What are the challenges involved?

A: The panelists identified two ways in which behavior can change: adversarial activities, and perturbation of the data or the business itself. It is important to understand these two aspects and build the behavioral model accordingly; a model that does not keep up with the changes will be less effective at identifying the behaviors one is looking for. Some of the challenges involved are deciding what metrics to use, defining those metrics, understanding the context (data perturbation vs. malicious activity), and keeping the model updated. It is also important to attribute the correct cause to an event: for example, 9/11 was due to a security failure, not anything else.

Q: Do you need expertise in the field in order to better utilize big data technologies for identifying anomalies?

A: Yes. Big data analytics will point to red flags, but you need to be knowledgeable in the subject matter in order to dig deeper and get more information.

Q: Is it practical to do host-based modeling using big data technologies?

A: Yes, but you have to restrict your monitoring domain. For example, it may not be practical to do host-based monitoring for the whole Purdue network.

Q: How do you do packet-level monitoring if the data is encrypted?

A: Cleveland is of the view that one cannot do effective packet-level monitoring if the data is encrypted. In their work, they assume the packets are transmitted in cleartext.

Q: To what extent is intelligent response being worked out? Can it be done without human intervention?

A: Even with big data analytics there will be false positives, so we still need a human in the loop to pinpoint an incident accurately. These people should have backgrounds in computer security, algorithms, analysis, etc. A challenge with current big data technologies like Hadoop is that near-real-time analysis is still difficult.

Q (panel to audience): What are your big data problems?

A (an audience member): Our problem is scalability. There is nothing off the shelf that we can buy to meet our needs; we have to put a lot of effort into building these systems by piecing various components together. Instead of spending time defending against attacks, we spend a lot of time on operational tasks.

Q: Is it better to have a new big data framework for scientific data?

A: It is not the science per se that you have to look at; you have to look at the complexity and size of the data in order to decide. From an operational perspective a definition or framework may not be important, but from a marketing perspective it may be; for example, defining the size of the data set could be useful.

Q: We want to manage EHRs (electronic health records) for 60 million people. Can these people be re-identified using big data technologies?

A: Even EHR data conforming to Safe Harbor rules, with the 18-19 specified elements removed, may be re-identified. Safe Harbor rules are neither sufficient nor necessary: they will protect most people, but not all, and data can be protected even without Safe Harbor. This is a very challenging problem, and CERIAS has an ongoing research project on it.

Q: Have you seen adversaries intentionally trying to manipulate big data so that they go undetected? Specifically, have you seen adversaries that damage a system slowly, to stay below detection thresholds, or very quickly, to overwhelm the system?

A: We have seen adversaries learn your protocols, whether your packets are encrypted or not, and so on, so that they can behave like legitimate users. I have heard anecdotal stories of data manipulation at banks and other financial institutions, but I can't point to any specific incident.

Q: Often we have to reduce the scope when many parameters are to be analyzed, due to the sheer volume of data. How do you ensure that you still detect anomalies (no false negatives)?

A: You have to analyze all the data; otherwise you may get false negatives.

Panel #1: Securing SCADA Systems (Panel Summary)

Posted: 13 Apr 2012 01:52 PM PDT

Tuesday, April 3, 2012

Panel Members:
Hal Aldridge, Sypris Electronics
William Atkins, Sandia National Laboratories
Jason Holcomb, Lockheed Martin, Energy and Cyber Services
Steven Parker, Energy Sector Security Consortium
Lefteri Tsoukalas, Purdue University

Panel summary by Matt Levendoski.

The panel was moderated by Charles Killien, Computer Science, Purdue University.

Dr. Hal Aldridge, Director of Engineering at Sypris Electronics, opened today's first panel on the currently popular topic of SCADA security. Dr. Aldridge first presented his current research interest: defining who takes true ownership of, and responsibility for, the security of our nation's backbone infrastructure, its SCADA and control systems. An interesting question he posed: what if the responsible party doesn't have a well-defined background in security? Dr. Aldridge then delved into smart grids and the fact that they are everywhere. Hal observed how sobering it is to consider how much code is used to run the control systems of an automobile; in some respects, cars contain more code than a variety of our current fighter jets. He also joked about the concept of an Internet-based coffee maker. All concepts aside, these systems have their downsides, which show up in the form of security problems. Dr. Aldridge closed by saying that he greatly appreciates the interdisciplinary stance of CERIAS and how it enables innovation in industry and in current academic research.

William Atkins, a Senior Member of Technical Staff at Sandia National Laboratories, followed with his perspective on the difference between SCADA and control systems. He focuses on general computing systems security, and he introduced the term "cyber-physical systems". He described the recent trend calling for these systems to be interoperable, because customers don't want to be locked into a single vendor for their solutions. He stressed that this topic is vague and largely unknown, which has generated a lot of media attention, most notably around the Stuxnet worm. Atkins then addressed current security trends as they relate to control systems: these systems are changing from a manual, analog approach to a more automated, digital methodology. We want our systems to do more yet require less, and this trend tends to bring unforeseen consequences, especially when these systems reach an unknown, inoperable state. Additionally, the hypothetical attacks being posed to the public are actually becoming a reality: attackers can now purchase or acquire the hardware online via surplus sales, eBay, or the like. Atkins closed with his perspective on SCADA security and how the odds are asymmetrically stacked in favor of the offense versus the defense. Essentially, security tends to get in the way of security: the Stuxnet worm is a great example, in that it exploited the privileged access level of anti-virus software to mount a lower-level attack.

Jason Holcomb is a Senior Security Consultant at Lockheed Martin in Energy and Cyber Services. He opened with an interesting story about how he got involved with SCADA security: he had indirectly introduced a denial-of-service condition in the SCADA system he was working on, which he then had to remediate. Jason presented Lockheed's current approach to the security threats within SCADA systems. Their current research and solutions aim to bring some of the advantage back to the defense, a notable contrast to the picture William Atkins had painted. Jason then introduced the following Cyber Kill Chain (a toy sketch of the defensive logic follows the list):

- Reconnaissance: gather information (names, emails, employee info, etc.)
- Weaponization: create the malware, malicious document, webpage, etc.
- Delivery: deliver the malware (e.g., an email hyperlink)
- Exploitation: exploit a vulnerability to gain access to assets
- Installation: install on the assets
- Command and Control: create a channel of communication back to the attacker
- Actions on Objectives: the adversary carries out their objectives
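A minimal sketch of why this model favors the defense: the attacker must succeed at every stage, so the defender wins by breaking any single link. This is an illustration only; the stage names follow the list above, and the detection inputs are hypothetical.

    # The Cyber Kill Chain as an ordered list of stages. An intrusion is
    # stopped as soon as any one stage is blocked, which is the defensive
    # leverage Holcomb describes.
    KILL_CHAIN = [
        "Reconnaissance", "Weaponization", "Delivery", "Exploitation",
        "Installation", "Command and Control", "Actions on Objectives",
    ]

    def first_broken_link(detections):
        """Return the earliest stage at which a defensive control fired,
        or None if the attack ran the whole chain undetected.
        detections maps stage name -> True if a control stopped it there."""
        for stage in KILL_CHAIN:
            if detections.get(stage):
                return stage
        return None

    # Example: a phishing campaign caught by an email gateway at Delivery.
    print(first_broken_link({"Delivery": True}))  # -> Delivery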
Steven Parker is the Vice President of Technology Research and Projects at the Energy Sector Security Consortium. Steven stated that when it comes to control systems and SCADA, we don't necessarily need to solve the hard problems, but should focus more on easy solutions. He then compared the security industry to the diet industry: the diet industry has dietitians and we have CISSPs; they have nutritional labeling and we have software assurance; everyone wants a no-effort weight-loss program, while security wants an easy solution for everything; and, lastly, the diet industry has a surgical procedure called gastric bypass, where the security industry has something called regulations and compliance. He closed with the notion that many of the challenges aren't necessarily technical: they include economic strategies, human interactions, public policy, and legal issues.

Lefteri Tsoukalas is a Professor of Nuclear Engineering at Purdue University. Prof. Tsoukalas jumped right into the statement that the energy markets are undergoing a phase transition: demand is no longer dampened by high prices, as resources have shifted from abundance to scarcity. This is why energy allocation is key; we need to utilize resources when energy prices are lower rather than during peak-cost periods. Prof. Tsoukalas also suggested that we take the same perspective as Europe and look into alternative resources. At this point in time we aren't sitting as comfortably on our supply of energy resources as we were, say, 100 years ago.

Q&A Session

Question 1: There is a lot of research in SCADA/control systems. How do we adapt our research to be more applicable to control systems?

Answers/Discussion:
- Turn the problem away from keeping attackers out and focus on other aspects.
- Look at domain-specific research, but don't limit research to a very specific area; rather, apply it across all platforms.
- The issue is not that systems are attached to the Internet, but that we need better control of these systems in both the physical and cyber worlds. From the console perspective things may look fine when they aren't; we can't always rely on the digital tools.
- Understanding the business is critical for research.
- Develop methods for evolved systems. Resilience is key; protect privacy and confidentiality.

Question 2: How do we get a handle on global regulations?

Answers/Discussion:
- A lot can be shared that doesn't involve personal or corporate data.
- Here the offense has the advantage over the defense: the offense doesn't care about regulations, whereas the defense has to.
- Discussion then moved to a more local level: the differences and difficulties in sharing data across large and small companies, and how smaller companies tend to be more agile in this respect.

Question 3: What skills do students and staff need to be effective in this area?

Answers/Discussion:
- Good communication, an understanding of business requirements, and a wide range of experience and skills.
- The industry has more openings than there are security experts to fill them.
- Technical experience; being a good social engineer also helps.
- With the core fundamental concepts in place, you can be trained to flourish in this domain.
- Students may also want to acquire physics skills for operating in the control systems/SCADA world.

Question 4: What types of attacks have you actually experienced?

Answers/Discussion: This question was deflected for confidentiality and security reasons. The discussion instead offered the following perspective:
- Be careful with internal use of thumb drives and the like.
- Attackers don't always know what they are looking for; rather, they collect data until they find something of interest.

Opening Keynote: Arthur W. Coviello, Jr. (Keynote Summary)

Posted: 13 Apr 2012 01:40 PM PDT

Tuesday, April 3, 2012

Keynote summary by Gaspar Modelo-Howard.

The State of Security
Arthur W. Coviello, Jr., Chairman, RSA, The Security Division of EMC

Mr. Coviello opened his keynote with a quote from Nicholas Negroponte: "Internet is the most overhyped, yet underestimated phenomenon in history." This statement, Mr. Coviello argued, is still true today, and to determine the state of security, one does not have to look beyond the state of the Internet.

The growth of the Internet has driven the evolution of computing over the last few decades. Computing has gone through radical transformations: from its early days of mainframes, to personal computers, to networks in the 80s, and then to the rise of the Internet and the World Wide Web in the mid-90s. We are currently experiencing a confluence of technologies and trends (cloud computing, big data, social media, mobile, consumerization of IT) that makes clear the next transformation of computing is well underway, creating new challenges for security.

Coviello contended that the past evolution of the IT infrastructure gives clear signals of the fast, deep changes security should continue to experience. As an example, in just a few years the IT industry has moved from 1 exabyte of data to 1.8 zettabytes, from the iPod to the iPad, from 513 million to over 2 billion Internet users, from speeds of 100 kbps to 100 Mbps, and from AOL to Facebook (which, counting its users as population, would be the third largest country in the world).

Coviello then used an interesting analogy to explain the security impact of the Internet's continuous growth, and therefore the need to better empower security. Imagine the Internet as a highway system experiencing exponential growth in the number of cars that use it. The highway system needs to add lanes to existing roads, build new roads, and provide better ways for cars to enter the system. But all this growth also increases the number and complexity of accidents on the roads, so security needs to grow accordingly to better manage (prevent, detect, and respond to) the new landscape of potential accidents.

In the security world, things have also changed dramatically over the years. Not long ago there were tens of thousands of viruses and corresponding signatures, whereas now there are tens of millions. Organized crime and online spying are very real threats today that were not really present in 2001. The situation is thus more difficult for security practitioners trying to protect their networks, and Stuxnet opened a new threat era. We have long moved past the times of script kiddies. The new breed of attackers includes: (1) non-state actors, such as terrorists and anti-establishment vigilantes; (2) criminals, who act like technology companies, expanding their markets around the world to distribute their products and services through sophisticated supply chains; and (3) nation-state actors, who are stealthy, sophisticated, difficult to detect, well-resourced, and efficient.

Coviello briefly explained the high-profile breach RSA experienced in 2011. The company was attacked by two advanced persistent threat (APT) groups. From the steps taken, it is very clear that a lot of research on the company was done before the attack. Phishing emails, sent to a carefully selected group of RSA employees, were used to get inside the network.
The messages included an Excel attachment containing a zero-day exploit (an Adobe Flash vulnerability) that installed a backdoor when triggered. The attackers knew what they wanted, and went low and slow. The attack went on for two weeks, with RSA staying two to three hours behind the attackers' moves. The attackers were able to exfiltrate information from the networks, but RSA ultimately determined that the company suffered no loss from the attack. As for the experience, Coviello acknowledged that it is still not a good idea for a security company to get breached.

We are past the tipping point where the physical and virtual worlds could be separated. Additionally, the confluence of technologies and trends is creating more "open" systems, which challenge the security industry because open systems are more difficult to secure than closed systems, each under a single domain. We need to secure what, in a sense, cannot be controlled. It is then not difficult to explain the recent wave of breaches: in 2011, in what others have labeled the "Year of the Security Breach", many high-profile attacks hit big organizations such as Google, Sony, RSA, PBS, BAH (Booz Allen Hamilton), and DigiNotar, as well as governmental entities such as the Japanese Parliament and the Australian Prime Minister.

Coviello argued that vendors and manufacturers must stop the security industry's linear approach of adding layer after layer of security control mechanisms. Security products should not be silos. We need to educate computer users, but keep in mind that people make mistakes; after all, we are human. Our mindset must change from simply playing defense, as perimeter protection alone does not work. Security practitioners and technologists must also show an ability for big-picture thinking and have people skills. We need to get leverage from all security products, hence the need to move away from the security-silo architecture.

Fortunately, the age of big data is arriving in the security world. Coviello offered a definition of big data: collecting data sets from numerous sources, at large scale, and producing actionable information by analyzing them. The security objective is then to reduce the window of vulnerability for all attacks. The age of big data should also promote the sharing of information, which today is unfortunately synonymous with failure: organizations do not work together to defend against attacks.

Mr. Coviello called for the creation of multi-source intelligence products. They must be risk-based, since there are different types of risks, and should consider the different vulnerabilities, threats, and impacts affecting each organization. The intelligence products should be agile, with deep visibility into the protected system; they should detect anomalies in real time, and the corresponding responses should be automated in order to scale and be deployed pervasively. Unfortunately, today's systems are a patchwork of security products focused only on compliance. Finally, the intelligence products should have contextual capabilities: the ability to succeed against attacks depends on having the best available information, not only security logs, and such information should come from numerous sources, not only internal ones.

The Q&A session after the stimulating talk included several interesting questions. The first asked about possible impediments to achieving the goals outlined in the talk. Coviello pointed out three potential roadblocks.
First, a lack of awareness at the top board of the organization regarding the impact of a security incident; top management should understand that security problems are the responsibility of the whole company, not just the IT department. Second, ignoring the requirement to follow a risk-based approach when making security decisions and developing strategies. Third, failing to grow security programs as organizations increasingly rely on their IT systems.

A question was asked about the asymmetric threat security practitioners face and what can be done about it. Coviello pointed to the need to work on risk analysis in order to reduce the potential risks organizations face. It should be understood that digital risk cannot be reduced any more than physical risk can, so organizations should get more sophisticated in their analytics, following a risk-based approach.

A member of the audience pointed out that several federal cybersecurity policies are based on the concept of defense in depth, a concept not driven by risk, which ultimately might raise costs for organizations required to comply with policies and regulations. Coviello agreed that if a risk-based approach is not followed, security programs might not be cost-effective. He also said that defense in depth is sometimes misunderstood: it is not merely a layering mechanism for implementing cybersecurity, and it should encompass information sharing among organizations and even countries. As an example, he called for ISPs to play a more aggressive role and work with organizations to stop the botnet threat.

A final question concerned the push by elected officials to use electronic voting, especially in small counties that may lack the resources to protect those systems. How can elected officials be made to understand the risks of electronic voting when their jurisdictions usually do not have the capability to secure the voting systems? Coviello sounded less than enthusiastic about electronic voting. More importantly, he said there is a need to aggregate security expertise and services so they can be outsourced to small and medium-sized organizations; the security industry should follow in the footsteps of the software and hardware industries by offering outsourced services and products.
