Opticks and a Treatise on the PRISM Surveillance Program (Guest Blog)

Posted: 16 Jun 2013 08:16 PM PDT

By Mark Rasch and Sophia Hannah

Last post, we wrote about the NSA's secret program to obtain and then analyze the telephone metadata relating to foreign espionage and terrorism by obtaining the telephone metadata relating to everyone. In this post, we will discuss a darker, but somewhat less troubling program called PRISM. As described in public media and leaked PowerPoint slides, PRISM and its progeny permit the NSA, with the approval of the super-secret Foreign Intelligence Surveillance Court (FISC), to obtain "direct access" to the servers of internet companies (e.g., AOL, Google, Microsoft, Skype, and Dropbox) to search for information related to foreign terrorism – or, more accurately, terrorism and espionage by "non-US persons." Whether you believe that PRISM is a wonderful program narrowly designed to protect Americans from terrorist attacks, a massive government conspiracy to gather intimate information to thwart Americans' political views, or even a false-flag operation to start a space war against alien invaders, what the program actually is, and how it is regulated, depends on how the program operates. When Sir Isaac Newton published his work Opticks in 1704, he described how a prism could be used to – well, shed some light on the nature of electromagnetic radiation. Whether you believe that the Booz Allen leaker is a hero, or that he should be given the full Theon Greyjoy for treason, there is little doubt that he has sparked a necessary conversation about the nature of privacy and data mining. President Obama is right when he says that, to achieve the proper balance, we need to have a conversation. And to have a conversation, we have to have some knowledge of the programs we are discussing.

Different Data

Unlike the telephony metadata, the PRISM programs involve a different character of information, obtained in a potentially different manner.
As reported, the PRISM programs involve not only metadata (header, source, location, destination, etc.) but also content information (e-mails, chats, messages, stored files, photographs, videos, audio recordings, and even interception of voice and video Skype calls). Courts (including the FISA Court) treat content information differently from "header" information. For example, when the government investigated the ricin-laced letters sent to President Obama and NYC Mayor Michael Bloomberg, they reportedly used the U.S. Postal Service's Mail Isolation Control and Tracking (MICT) system, which photographs the outside of every letter or parcel sent through the mails – metadata. When Congress passed the Communications Assistance for Law Enforcement Act (CALEA), which among other things established procedures for law enforcement agencies to get access to both "traffic" (non-content) and content information, the FBI took the position that it could, without a wiretap order, engage in what it called "post-cut-through dialed digit extraction" – that is, when you call your bank and it prompts you to enter your bank account number and password, the FBI wanted to "extract" that information as "traffic," not "content." So the lines between "content" and "non-content" may be blurry. Moreover, with enough context, we can infer content. As Justice Sotomayor observed in the 2012 GPS privacy case:

… it may be necessary to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties. E.g., Smith, 442 U.S., at 742, 99 S.Ct. 2577; United States v. Miller, 425 U.S. 435, 443, 96 S.Ct. 1619, 48 L.Ed.2d 71 (1976). This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks.
People disclose the phone numbers that they dial or text to their cellular providers; the URLs that they visit and the e-mail addresses with which they correspond to their Internet service providers; and the books, groceries, and medications they purchase to online retailers.

But the PRISM program is clearly designed to focus on content. Thus, the Supreme Court's holding in Smith v. Maryland – that people have no expectation of privacy in the numbers called, etc. – does not apply to the PRISM-type information. Right? Again, not so fast.

Expecting Privacy

Simple question. Do you have a reasonable expectation of privacy in the contents of your e-mail? Short answer: Yes. Longer answer: No. Better answer: Vis-à-vis whom, and for what purposes? You see, privacy is not black and white. It is multispectral – you know, like light through a triangular piece of glass. When the government was conducting a criminal investigation of the manufacturer of Enzyte (smiling Bob and his gigantic – um – putter), it subpoenaed his e-mails from, among others, Yahoo! The key word here is subpoena, not search warrant. Now that's the thing about data and databases – if information exists, it can be subpoenaed. In fact, a Florida man has now demanded production of cell location data from – you guessed it – the NSA. But content information is different from other information. And cloud information is different. The telephone records are the records of the phone company about how you used their service. The contents of emails and documents stored in the cloud are your records, of which the provider has incidental custody. It would be like the government subpoenaing your landlord for the contents of your apartment (they could, of course, subpoena you for this, but then you would know), or subpoenaing the U-stor-it for the contents of your storage locker (sparking a real storage war).
They could, with probable cause and a warrant, search the locker (if you have a warrant, I guess you're gonna come in), but a subpoena to a third party is dicey. So the Enzyte guy had his records subpoenaed. This was done pursuant to the Stored Communications Act, which permits it. The government argued that it didn't need a search warrant to read Enzyte guy's email because – you guessed it – he had no expectation of privacy in the contents of his mail. Hell, he stored it unencrypted with a third party. Remember Smith v. Maryland? The phone company case? You trust a third party with your records, you risk exposure. Or as Senator Blutarsky (I-NH?) might opine, "you ()*^#)( up, you trusted us…" (actually Otter said that, with apologies to Animal House fans). Besides, cloud provider contracts, and email and internet provider privacy policies, frequently limit the privacy rights of users. In the Enzyte case, the government argued that terms of service permitting scanning of the contents of email for viruses or spam (or, in the case of Gmail and others, embedding context-based ads) meant that the user of the email service "consented" to have his or her mail read, and therefore had no privacy rights in the content. ("Yahoo! reserves the right in their sole discretion to pre-screen, refuse, or move any Content that is available via the Service.") Terms of service providing that the ISP would respond to lawful subpoenas made the ISP a "joint custodian" of your email and other records (like your roommate) who could consent to the production of your communications or files. Those policies your employer has that say "employees have no expectation of privacy in their emails or files"? While you thought that meant your boss (and the IT guy) could read your emails, the FBI or NSA may take the position that "no expectation of privacy" means exactly that. Fortunately, most courts don't go so far.
In general, courts have held that the contents of communications and information stored privately online (not on publicly accessible Facebook or Twitter feeds) are entitled to legal protection even if they are in the hands of potentially untrustworthy third parties. But this is by no means assured. Clearly, though, the data in the PRISM case is more sensitive and entitled to a greater level of legal protection than that in the telephony metadata case. That doesn't mean that the government, with a court order, can't search or obtain it. It means that companies like Google and Facebook probably can't just "give it" to the government. It's not their data.

The PRISM Problem

So the NSA wants to have access to information in a massive database. They may want to read the contents of an email, a file stored on Dropbox, whatever. They may want to track a credit card through the credit card clearing process, or a banking transaction through the interbank funds transfer network. They may want to track travel records – planes, trains, or automobiles. All of this information is contained in massive databases or storage facilities held by third parties – usually commercial entities. Banks. VISA/MasterCard. Airlines. Google. The information can be tremendously useful. The NSA may have lawful authority (a court order) to obtain it. But there is a practical problem. How does the NSA quickly and efficiently seek and obtain this information from a variety of sources without tipping those sources off about the individual searches it is conducting – information which itself is classified? That appears to be the problem the PRISM programs attempt to solve. In the telephony program, the NSA "solved" the problem by simply taking custody of the database. In PRISM, it apparently did not. And that is a good thing. The databases remain in the custody of those who created them. Here's where it gets dicey – factually.
The reports about PRISM indicate that the NSA had "direct access" to the servers of all of these Internet companies. Reports have been circulating that the NSA had similar "direct access" to financial and credit card databases as well. The Internet companies have all issued emphatic denials. So what gives? Speculation time. First, the NSA and Internet companies could be outright lying. David Drummond, Google's Chief Legal Officer, ain't going to jail for this. Second, they could be reinterpreting the term "direct" access. When General Alexander testified under oath that the NSA did not "collect any type of data on millions of Americans," he took the term "collect" to mean "read" rather than "obtain." Most likely, however, is that the NSA PRISM program is a protocol for the NSA, with FISC approval, to task the computers at these Internet companies to perform a search. This tasking is most likely indirect. How it works is, at this point, rank speculation. What is likely is that an NSA analyst, say in Honolulu, wants to get the communications (postings, YouTube videos, stored communications, whatever) of Abu Nazir, a non-US person, which are stored on a server in the U.S., or stored on a server in the Cloud operated by a US company. The analyst gets "approval" for the "search," by which I mean that a flock of lawyers from the NSA, FBI, and DOJ descend (what is the plural of lawyers? [a "plague"? --spaf]) and review the request to ensure that it asks for info about a non-US person, that it meets the other FISA requirements, that there is minimization, etc. Then the request is transmitted to the FISC for a warrant. Maybe. Or maybe the FISC has approved the searches in bulk (raising the Writ of Assistance issue we described in the previous post). We don't know. But assuming that the FISC approves the "search," the request has to be transmitted to, say, Google, for their lawyers to review, and then the data transmitted back to the NSA.
To the analyst in Honolulu, it may look like "direct access": I type in a search, and voilà! Results show up on the screen. It is this process that appears to be within the purview of PRISM. It may be a protocol for effectuating court-approved access to information in a database, not direct access to the database. Or maybe not. Maybe it is a direct pipe into the servers, which the NSA can task, and from which the NSA can simply suck out the entire database and perform its own data analytics. Doubtful, but who knows? That's the problem with rank speculation. Aliens, anyone? But we are basing this analysis on what we believe is reasonable to assume. So, is it legal? Situation murky. Ask again later. If the FISC approves the search, with a warrant, within the scope of the NSA's authority, on a non-US person, with minimization, then it is legal in the U.S., while probably violating the hell out of most EU and other data privacy laws. But that is the nature of the FISA law and the USA PATRIOT Act which amended it. Like the PowerPoint slides said, most internet traffic travels through the U.S., which means we have the ability (and under USA PATRIOT, the authority) to search it. While the PRISM programs are targeted at much more sensitive content information, if conducted as described above, they actually present fewer domestic legal issues than the telephony metadata case. If they are a dragnet, or if the NSA is actually conducting data mining on these databases to identify potential targets, then there is a bigger issue. The government has indicated that it may release an unclassified version of at least one FISC opinion related to this subject. That's a good thing. Other redacted legal opinions should also be released so we can have the debate President Obama has called for. And let some light pass through this PRISM.
Mark Rasch is the former head of the United States Department of Justice Computer Crime Unit, where he helped develop the department's guidelines for computer crimes related to investigations, forensics, and evidence gathering. Mr. Rasch is currently a principal with Rasch Technology and Cyberlaw and specializes in computer security and privacy. Sophia Hannah has a BS degree in Physics with a minor in Computer Science and has worked in scientific research, information technology, and as a computer programmer. She currently manages projects with Rasch Technology and Cyberlaw and researches a variety of topics in cyberlaw. Rasch Cyberlaw (301) 547-6925 www.raschcyber.com

Schrödinger's Catnip: A Review of the NSA Phone Surveillance Program (Guest Blog)

Posted: 15 Jun 2013 02:28 PM PDT

By Mark Rasch and Sophia Hannah

The NSA programs to retrieve and analyze telephone metadata and internet communications and files (the former we will call the telephony program, the latter codenamed PRISM) are at one and the same time narrow and potentially reasonably designed programs aimed at obtaining potentially useful information within the scope of the authority granted by Congress. They are, at one and the same time, perfectly legal and grossly unconstitutional. It's not that we are of two opinions about these programs. It is that the character of these programs is such that they have both characteristics at the same time. Like Schrödinger's cat, they are both alive and dead at the same time – and a further examination destroys the experiment.

Let's look at the telephony program first. Telephone companies, in addition to providing services, collect a host of information about the customer, including name, address, and billing and payment information (including payment method, payment history, etc.). When the telephone service is used, the phone company collects records of when, where, and how it was used – calls made (or attempted), received, telephone numbers, duration of calls, time of day of calls, location of the phones from which the calls were made, and other information you might find on your telephone bill. In addition, the phone company may collect certain technical information – for example, if you use a cell phone, the location of the cell from which the call was made, and the signal strength to that cell tower or others. From this signal strength, the phone company can tell reasonably precisely where the caller is physically located (whether they are using the phone or not), even if the phone does not have GPS. In fact, that is one of the ways the Enhanced 911 service can locate callers. The phone company creates these records for its own business purposes.
It used to collect this primarily for billing, but with unlimited landline calling, that need has diminished. However, the phone companies still collect this data for network engineering, load balancing, and other purposes. They have data retention and destruction policies which may keep the data for as short as a few days, or as long as several years, depending on the data. Similar "metadata" or non-content information is collected about other uses of the telephone networks, including SMS message headers and routing information. Continuing with the Schrödinger analogy, the law says that this is private and personal information, which the consumer does not own and in which the consumer has no expectation of privacy. Is that clear? Federal law calls this telephone metadata "Customer Proprietary Network Information," or CPNI. 47 U.S.C. 222(c)(1) provides that:

Except as required by law or with the approval of the customer, a telecommunications carrier that receives or obtains customer proprietary network information by virtue of its provision of a telecommunications service shall only use, disclose, or permit access to individually identifiable customer proprietary network information in its provision of (A) the telecommunications service from which such information is derived, or (B) services necessary to, or used in, the provision of such telecommunications service, including the publishing of directories.

Surprisingly, the exceptions to this prohibition do not include a specific "law enforcement" or "authorized intelligence activity" exception. Thus, if the disclosure of consumer CPNI to the NSA under the telephony program is "required by law," then the phone company can do it. If not, it can't. But wait, there's more. At the same time that the law says that consumers' telephone metadata is private, it also says that consumers have no expectation of privacy in that data.
In a landmark 1979 decision, the United States Supreme Court held that the government could use a simple subpoena (rather than a search warrant) to obtain the telephone billing records of a consumer. See, these aren't the consumer's records. They are the phone company's records. The Court noted, "we doubt that people in general entertain any actual expectation of privacy in the numbers they dial. All telephone users realize that they must 'convey' phone numbers to the telephone company, since it is through telephone company switching equipment that their calls are completed. All subscribers realize, moreover, that the phone company has facilities for making permanent records of the numbers they dial, for they see a list of their long-distance (toll) calls on their monthly bills." The Court went on, "even if petitioner did harbor some subjective expectation that the phone numbers he dialed would remain private, this expectation is not 'one that society is prepared to recognize as "reasonable."'" By trusting the phone company with the records of the call, consumers "assume the risk" that the third party will disclose them. The Court explained, "petitioner voluntarily conveyed to it information that it had facilities for recording and that it was free to record. In these circumstances, petitioner assumed the risk that the information would be divulged to police." This dichotomy is not surprising. The Supreme Court held that, as a matter of constitutional law, any time you trust a third party, you run the risk that the information will be divulged. Prosecutors and litigants subpoena third-party information all the time – your phone bills, your medical records, credit card receipts, bank records, surveillance camera data, and records from your mechanic – just about anything. These are not your records, so you can't complain.
At the same time, Congress was concerned with phone companies' use of CPNI for marketing purposes without consumer consent, so it imposed statutory restrictions on the disclosure or use of CPNI unless "required by law."

Enter the NSA

There is little doubt that telephony metadata can be useful in foreign intelligence and terrorism cases. Hell, it can be useful in any criminal investigation, or for that matter, a civil or administrative case. But if the CIA obtains the phone records of, say, Abu Nazir (for Homeland fans), and spots a phone number he has called, they, through the NSA, want to be able to find out information about that phone call, and who that person called. The NSA wants this data for precisely the same reason that it is legally protected – phone metadata reveals patterns which can show relationships between people, and help determine who is associated with whom and for what purpose. Metadata and link analysis can help distinguish between a call to mom, a call to a colleague, and a call to a terrorist cell. Context can reveal content – or at least create a strong inference of content. So, in appropriate cases involving terrorism, national security, or intelligence involving non-US persons, the NSA should have this data. And indeed, it always has. None of that is new. If the NSA captured a phone number, say 867-5309, it could demand the records relating to that number from the phone company through an order issued by a special super-secret court called the FISC. The order could say "give the NSA all the records of phone usage of 867-5309 as well as the records of the numbers that they called." Problem is, that is unwieldy, time-consuming, requires a new court order with each query, and in many ways overproduces records. Remember, not only are these terrorism and national security investigations, but the target is a non-US person, usually (but not always) located outside the United States.
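To make the link-analysis idea concrete, here is a minimal sketch in Python of the kind of query described above: start from a seed number, find the numbers it called, and then expand outward to the numbers those numbers called. The records, numbers, and field layout are entirely invented for illustration; they do not reflect any real carrier's schema or any actual program.

```python
# Hypothetical link analysis over call metadata (no content, just who-called-whom).
from collections import defaultdict

# Each record is (caller, callee) -- invented metadata for this sketch.
call_records = [
    ("867-5309", "555-0101"),
    ("867-5309", "555-0102"),
    ("555-0101", "555-0199"),
    ("555-0102", "555-0150"),
    ("555-0199", "555-0123"),
]

def build_graph(records):
    """Index calls by caller so each expansion step is a simple lookup."""
    graph = defaultdict(set)
    for caller, callee in records:
        graph[caller].add(callee)
    return graph

def contacts_within(graph, seed, hops):
    """Return every number reachable from `seed` in at most `hops` calls."""
    frontier, seen = {seed}, set()
    for _ in range(hops):
        # Expand one degree of separation, skipping numbers already swept in.
        frontier = {c for n in frontier for c in graph[n]} - seen - {seed}
        seen |= frontier
    return seen

graph = build_graph(call_records)
print(sorted(contacts_within(graph, "867-5309", 1)))  # direct contacts only
print(sorted(contacts_within(graph, "867-5309", 2)))  # two hops out
```

Even in this toy form, the second query sweeps in numbers that never called or were called by the seed at all, which is the crux of the "degrees of separation" concern: each additional hop multiplies the number of people whose records come into view.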
The Fourth Amendment

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

Read that carefully. You would think that it requires a warrant to search, right? Wrong. Actually, courts interpret the comma after the word "violated" as a semicolon (who says grammar doesn't matter?). "The people," which includes but is not limited to U.S. citizens, have a right to be secure against unreasonable searches and seizures (more on the "and" in a minute). Also, warrants have to be issued by neutral magistrates and must specify what is to be seized. So no warrant is needed if the search is "reasonable." In fact, the vast majority of "searches and seizures" in America are conducted without a warrant. People are searched at airports and borders. No warrant. They are patted down on the streets and in their cars. No warrant. Cops look into their car windows, follow them around, and capture video of them without a warrant. Police airplanes, helicopters (and soon drones) capture images of people in their back yards or on porches. No warrant. Dogs can sniff for drugs, bombs, or contraband. No warrant. And people give consent to search without a warrant all the time. When the police searched the boat for the fugitive Boston bomber, they needed no warrant because of exigent circumstances (and perhaps because the boat's owner consented). Warrantless searches can be "reasonable" and can pass constitutional muster. That's one reason Congress created the FISC.
For law enforcement purposes (to catch criminals), the government can get a grand jury subpoena, a search warrant, a "trap and trace" order, a "pen register" order, a Title III wiretap order, or other orders if it can show (depending on the information sought) probable cause or some relevance to the criminal investigation. But for intelligence gathering purposes, the NSA can't really show "probable cause" to believe that there's a crime, because often there is not. It's intelligence gathering. So the Foreign Intelligence Surveillance Act (FISA) created a special secret court to allow the intelligence community to do what the law enforcement community could already do – get information under a court order – but instead of showing that a crime was committed, they had to show that the information related to foreign intelligence. After September 11, 2001, Congress added terrorism as well. When Congress amended FISA, it allowed the FISA court (FISC) to authorize orders for the production of "books, records, or other documents." Section 215 of the USA PATRIOT Act allowed the FBI to apply for an order to produce materials that assist in an investigation undertaken to protect against international terrorism or clandestine intelligence activities. The act specifically gives an example to clarify what it means by "tangible things": it includes "books, records, papers, documents, and other items." Telephone metadata fits within this description.

The NSA Telephony Program (As We Know It)

So the NSA has the authority to seek and obtain (through the FBI and FISC) telephone metadata. It also has a legitimate need to do so. But that's not exactly what it did here. Instead of getting the records it needed, the NSA decided that it would get all the records of all calls made or received (non-content information) about everyone, at least from Verizon, and most likely from all providers.
The demand was updated daily, so every call record was dumped by the phone companies into a massive database operated by the NSA. Now this is bad. And good. The good part is that, by collecting metadata from all of the phone companies, the NSA could "normalize" and cross-reference the data. A single authorized search of the database could find records from Verizon, AT&T, Sprint, T-Mobile, and possibly Orange, British Telecom, who knows? Rather than having the FISC issue an order to Verizon for a phone record, and then, after that is examined, another order to AT&T, by having the data all in one place, "pingable" by the NSA, a single query can find all of the records related to that query. So if the FISC authorizes a search for Abu Nazir's phone records, this process allows the NSA to actually get them. Also, the NSA doesn't have to provide a court order (which itself would reveal classified information about who it was looking at) to some functionary at Verizon or AT&T (even if that functionary had a security clearance). And Verizon's database would not have a record of what FISC-authorized searches the NSA conducted – information which itself is highly classified. Just because the NSA had all of the records does not mean that it looked at them all. In fact, the NSA and FBI established a protocol, apparently approved by the FISC, that restricted how and when they could ping this massive database. So the mere physical transfer of the metadata database from the phone companies to the NSA doesn't impinge on privacy unless and until the NSA makes a query, and these queries are all authorized by the FISC and are lawful. So what's the big deal? It's all good, man.

General Warrant

Not so fast, Mr. Schrödinger. There are two huge legal problems with this program. Undoubtedly, the USA PATRIOT Act authorizes the FISC to order production of "tangible things," and these records are "tangible things."
But the law does not authorize what are called "general warrants." A general warrant is a warrant that either fails to specify the items to be searched for or seized, fails to do so with particularity, or is so broad or vague as to permit the person seizing the items almost unfettered discretion in what to take. A warrant which permitted seizure of "all evidence of crimes" or "all evidence of gang activity" would be an unconstitutional general warrant. It's important to note that such a warrant is "legal" in the sense that it was for information relevant to a crime (or, say, terrorism), that the obtaining of the warrant was authorized by law, that a court issued the warrant, and that the proper procedures were followed. But the warrant is unconstitutional, and so is the search and seizure. This is particularly true where the warrant seeks information that relates to First Amendment protected activities, like what books we are reading and with whom we are associating. So when Texas authorized the search and seizure of records relating to "communist activities" (the ism before terrorism) and cops got a warrant to take such books and records, the Supreme Court had no problem finding that the warrant was an unconstitutional "general warrant." Even though the FISC warrant to Verizon specified exactly what was to be seized ("everything"), it was undoubtedly a general warrant. Remember, the Fourth Amendment prohibits unreasonable "searches" and "seizures." A warrant authorizing seizure of all records of millions of people who did nothing wrong, particularly when it is designed to figure out their associations, is about as general as you can get. And that is assuming that the searches, or pings to the database, which happen later, are reasonable. What's more, by taking custody of all of these records, the NSA abrogates the document retention and destruction policies of all of the phone companies. We can assume that the NSA keeps these records indefinitely.
So long after Verizon decides it doesn't need to know what cell tower you pinged on July 4, 2005 at 6:15:22 PM EST, the NSA will retain this record. That's a problem for the NSA because now, instead of subpoenaing Verizon for these records, a defendant (especially in a criminal case, where the defendant has a constitutional right to the records if relevant to a defense) can be expected to subpoena the NSA (or the FBI, which obtained the records for the NSA). While the NSA and FBI would undoubtedly claim that the program is classified, clearly my own phone records are not classified. A federal law called the Classified Information Procedures Act provides a mechanism to obtain unclassified versions of classified data. So if you were charged with a crime by the FBI, and the same FBI had records (in this database) that indicated that you did not commit the crime, they would have to search the database and produce the records. And when Verizon tells you that the records are gone, well… it ain't true anymore. But wait, there's more. Even if the "seizure" is a general warrant, the government would argue that it is "reasonable" because it is necessary to effectuate the NSA's function of protecting national security, and its impact on privacy is minimal because the database isn't "pinged" without court approval. The "collection" of data about tens of millions of Americans doesn't affect their privacy, especially when the Supreme Court said that they have no privacy rights in this data and it doesn't even belong to them. (Even though the Director of National Intelligence testified in March that the NSA did not "collect" any data on millions of Americans.) Besides, the NSA would argue, there is no other way for the government to do this.

What Does the NSA Do with the Records?

Here's where there is an unknown. At present, we do not know what the NSA does with the telephone metadata database.
Do they simply query it – e.g., give me all the records of calls made by Abu Nazir – or do they perform data mining, link analysis, and pattern analysis on the database in order to identify potential Abu Nazirs? If the latter, then the NSA is clearly searching the records of millions of Americans. If the former, it is still troubling, for a few reasons.

Six Degrees of Separation

First, the NSA's authority revolves around non-US persons. While there may be "inadvertent" collection on U.S. persons, the target of the surveillance must be a non-US person for the program to be legal. According to the leaked documents, the NSA took a very liberal interpretation of what this means. First, it determined that as long as there was a 51% chance that the target was a non-US person, the NSA was entitled to obtain records. Second, it may – and we stress "may" – have interpreted its authority as providing that, if the target of the investigation was foreign (again, a 51% chance), then it could obtain records related to calls between two US persons wholly in the US. Finally, the NSA apparently deployed a "two degrees of separation" test. If Abu Nazir (51% foreign) called John Smith's telephone number, the NSA could look at who Smith (100% US) called within the US (first degree of separation). If Smith called Jones, the NSA could then look at Jones' call records (second degree of separation). At this point, even if the pinging of the database is authorized by the FISC, we are a long way from Abu Nazir. Toto, I'm afraid we are in Kansas.

Writs of Assistance

OK, but what's the big deal? The seizure of the database is authorized by the FISC, under a statute approved by Congress, with Congressional knowledge and oversight (maybe), and under strict control by the NSA, the FBI, and DOJ. Every search of the database is approved by the super-secret court, right? Not so fast, Kemo Sabe. It is highly unlikely that the FISC approves every database search.
More likely is that the FBI and NSA have established protocols and procedures designed to ensure that the searches are within their jurisdiction, are designed to find information about terrorism and foreign intelligence, that the targets are (51%) foreign, and that there is a minimization procedure. These protocols – rather than the individual searches themselves – are what the FISC approves. The NSA then most likely reports back to the FISC (through the DOJ) about whether there was an "inadvertent disclosure" of information not related to these objectives. So the court most likely does not approve every search. And that's another problem. You see, each "search" of the database is – well – a search. That search must be supported by probable cause (in a criminal case, probable cause to believe that there's a crime; in a FISA case, espionage, foreign intelligence, or terrorism) and must be approved by a court. Each search. Not the process. We have been down this road before. In fact, this is precisely what led to the American Revolution in general and the Fourth Amendment in particular. When the British Parliament issued the Navigation Acts imposing tariffs on goods imported into America, many colonists refused to pay them (as Boston lawyer James Otis noted, "taxation without representation is tyranny"). So Parliament authorized King George II to issue what are called "writs of assistance." Such a writ, issued by a court, authorized the executive branch (a customhouse officer with the assistance of the sheriff) to search colonists' houses for unlawfully smuggled items. These writs did not specify what the sheriff could search for or seize, or where he could look. Like the NSA program, the court approved what could be done; the executive had discretion in how to do it. When George II was succeeded by George III (the writs expiring with the death of the King), Parliament reauthorized them under the hated Townshend Acts. 
James Otis urged resistance, and it was the use of these unspecific writs authorizing searches that galvanized public opinion (and that of John Adams in particular) to urge revolution. It is why the Fourth Amendment demanded that a search warrant specify, based on probable cause, the specific place to be searched and the items to be seized. It is also why writs of assistance are prohibited by the Constitution. The FISC-approved NSA searches would be like a judge in Los Angeles issuing a search warrant to the LAPD which said, "you may search any house as long as you smell marijuana in that house." While any given search may be reasonable – and indeed, if the LAPD had applied for a warrant to search a house after they smelled marijuana, a court probably would have issued the warrant – the broad blanket approval of these searches is more akin to a writ of assistance. So the NSA digital telephony program, while legal in the sense that it was approved by both Congress and the Foreign Intelligence Surveillance Court, has some serious constitutional problems. Telephone Company Liability? The phone companies could be on the hook for participating in the program, even though they have both immunity and had no choice but to participate. In fact, they could not legally have even disclosed the program. In the FISA amendments, Congress expressly gave the phone companies immunity for making "good faith" disclosures of information pursuant to Section 215. So why would the phone companies be in trouble? The problem is the "good faith" part. In 2012, in Messerschmidt v. Millender, the Supreme Court looked at the question of when someone (cops, in that case) should have immunity for a good-faith search pursuant to an unconstitutional warrant. The cops got a warrant for all records of "gang related activity" and all guns in a particular house. The court agreed that the warrant was overbroad, unconstitutional, and should not have been issued. 
The question was whether the cops who executed the warrant should have immunity from civil liability because they acted in "good faith." The Supreme Court noted that the fact that they got a warrant at all was one indication that they acted in good faith, but that "the fact that a neutral magistrate has issued a warrant authorizing the allegedly unconstitutional search or seizure does not end the inquiry into objective reasonableness. Rather, we have recognized an exception allowing suit when 'it is obvious that no reasonably competent officer would have concluded that a warrant should issue.'" In other words, the cops are generally permitted to rely on the fact that a court issued a search warrant, unless the warrant itself (or the means by which it was procured) is so obviously unconstitutional, overbroad, general, or otherwise prohibited that you cannot, in good faith, rely on it. While the court found that the cops had immunity because the warrant was not so overbroad as to lead to the inevitable conclusion that it was unconstitutional, it is hard to make that same argument where the FISA warrant essentially asked for every record of the phone company. Hard to imagine a broader warrant. Justice Kagan pointed out that it's not illegal to be a member of a gang, and that a warrant that authorized seizure of evidence of gang membership per se called for associational records, which are protected. Much like the phone logs here. Justices Sotomayor and Ginsburg went further, noting: The fundamental purpose of the Fourth Amendment's warrant clause is "to protect against all general searches." Go-Bart Importing Co. v. United States, 282 U. S. 344, 357 (1931). The Fourth Amendment was adopted specifically in response to the Crown's practice of using general warrants and writs of assistance to search "suspected places" for evidence of smuggling, libel, or other crimes. Boyd v. United States, 116 U. S. 616–626 (1886). 
Early patriots railed against these practices as "the worst instrument of arbitrary power," and John Adams later claimed that "the child Independence was born" from colonists' opposition to their use. Id., at 625 (internal quotation marks omitted). To prevent the issue of general warrants on "loose, vague or doubtful bases of fact," Go-Bart Importing Co., 282 U. S., at 357, the Framers established the inviolable principle that should resolve this case: "no Warrants shall issue, but upon probable cause . . . and particularly describing the . . . things to be seized." U. S. Const., Amdt. 4. That is, the police must articulate an adequate reason to search for specific items related to specific crimes. They found that the search by the police without probable cause was unreasonable even though there was both judicial and executive oversight, and that therefore there should be no immunity because the actions were not in "good faith." The phone companies run that risk here. Mark Rasch is the former head of the United States Department of Justice Computer Crime Unit, where he helped develop the department's guidelines for computer crime investigations, forensics, and evidence gathering. Mr. Rasch is currently a principal with Rasch Technology and Cyberlaw and specializes in computer security and privacy. Sophia Hannah has a BS degree in Physics with a minor in Computer Science and has worked in scientific research, information technology, and as a computer programmer. She currently manages projects with Rasch Technology and Cyberlaw and researches a variety of topics in cyberlaw. Rasch Cyberlaw (301) 547-6925 www.raschcyber.com

Spafford Answers Cyber Security Questions on CNN.com


Posted: 23 May 2013 07:19 AM PDT

More information »

Cloud Computing: A Way to Reduce Risk?

Posted: 22 May 2013 06:45 AM PDT

Spafford, a computer science professor at Purdue, sees issues that often aren't discussed in cloud computing conversations. "Too often, organizations [are] told that moving things to the cloud will be safer and cheaper, and cheaper as we know is always what tends to dominate these conversations and lead to new vulnerabilities," Spafford says. More information »

Spafford Taking Cyber Security Questions on CNN.com

Posted: 08 May 2013 06:33 AM PDT

(CNN) The Pentagon's claims in a new report that China is trying to extract sensitive information from U.S. government computers have put cyber security issues back in the media spotlight. But how serious is the threat to U.S. interests? How can America respond? And what other issues should be attracting policymakers' attention? Cyber security expert Eugene Spafford, a professor of computer sciences at Purdue University and former member of the President's Information Technology Advisory Committee, will be taking questions from GPS readers. More information »

Spafford Joins EPIC Advisory Board

Posted: 26 Apr 2013 08:31 AM PDT

April 23, 2013 - EPIC has announced the 2013 members of the EPIC Advisory Board. They are Michael Froomkin, Distinguished Professor of Law at the University of Miami School of Law; Sheila Kaplan, student privacy advocate and founder of Education New York; Eugene Spafford, a/k/a "Spaf," professor of Computer Science at Purdue University; and Tim Wu, professor at Columbia Law School and author of "The Master Switch." The EPIC Advisory Board is a distinguished group of experts in law, technology, and public policy. Joining the EPIC Board of Directors in 2013 are current Advisory Board members David Farber, Joi Ito, and Jeff Jonas. For more information, see EPIC: EPIC Advisory Board. More information »

Will New Hires Impede Future Security?

Posted: 16 Apr 2013 06:48 AM PDT

The rush to find qualified IT security professionals to meet current cyber-threats could jeopardize IT systems' security in the not-too-distant future, say two leading IT security experts, Eugene Spafford and Ron Ross. More information »

Opening Keynote: Todd Gebhart, Co-President McAfee Inc. (Summary)

Posted: 16 Apr 2013 06:41 AM PDT

Wednesday, April 3, 2013 Summary by Gaspar Modelo-Howard The Changing Security Landscape Why do we, as cybersecurity professionals, go to work each day? Mr. Gebhart reflected on this question to start his presentation, suggesting a very clear and concise answer: it is to protect the many things and people that are so important to our lives. Security professionals need to protect families from threats like cyber bullies and identity thieves, protect against risks to financial information, defend new business ideas and our critical infrastructure, and help protect those who protect us, such as law enforcement and first responders. This is why a multidisciplinary approach, such as the one CERIAS follows, as Mr. Gebhart pointed out, is required to come up with the ideas and solutions that achieve our goal as cybersecurity professionals. In the early days of malware, it could have been considered a nuisance. After all, there were about 17,000 pieces of malware in 1997, and for some people antivirus software could be updated every few months. But malware has been growing at a rapid pace. McAfee stores more than 120M malware samples in its database, up from 80M in 2011. The growth is also fast in the mobile landscape. There were 2K unique pieces of mobile malware in 2011, while last year that figure grew to 36K. And as the mobile market becomes more popular and we move from multiple operating systems to just two today, Google's Android and Apple's iOS, there will still be room for malware to grow. McAfee's stats show that (1) Android is the most targeted operating system for malware, (2) many application stores for phones host malware, and (3) half of all iOS phones are jailbroken. Other trends explain the ever-changing landscape of information technology, and therefore of security. For example, the growth in the number of devices connected to the Internet and their changing profiles. 
There are approximately 1B devices today, and that number should reach 50B by 2020. People think about computers and phones when asked which electronic devices are connected to the Internet. But there are many others, such as automobiles, televisions, dishwashers, and refrigerators, being connected every day, helping to put the control of our lives at our fingertips: how much energy we consume, what we eat, or how we communicate and with whom. So today's risks are more about the devices and the data they store, rather than just malware, and everybody is at risk. At the personal level, there are always reports of attacks aimed at individuals. Mr. Gebhart recounted Operation High Roller, which targeted corporate bank accounts and wealthy people using a variant of the Zeus Trojan horse. At the business level, he talked about the incident known as Operation Aurora, discovered by McAfee Labs, where attackers were after intellectual property from 150 companies. It is also common nowadays to hear about state-sponsored cyberattacks on businesses. For example, McAfee believes it is one of the most attacked companies in the world (given its position as both a security services provider and a consumer), as it sees many frequent attacks around the world, run by well-funded, professional organizations. One of the areas most at risk is critical infrastructure, and governments around the world show growing concern about malware. The Stuxnet malware seemed to come out of a spy movie, as it was created as a stealthy, offensive tool to cause harm. The Citadel Trojan is another example of how incisive and targeted malware can be, attacking individual organizations while also harvesting credentials and passwords from users. So the malware found in the wild nowadays is more targeted and automated, which explains the growing concern about highly important systems such as critical infrastructure. Additionally, the commercialization of malware keeps increasing. 
Hackers as a Service (HaaS) and off-the-shelf malware are all too common now, with malicious code and people's services being openly sold. Mr. Gebhart suggested that new partnerships are required to deal with malware; it is no longer only a technical issue. This pointed back to his earlier comment about dealing with cybersecurity in a multidisciplinary way. An organization's board should be involved, and new strategies need to be created. Whereas malware used to be a topic that involved only a mid-level business manager, it is now a high-level management discussion topic everywhere you go. It is on everybody's mind, with people not limiting the conversation to the technical aspects of an attack, but also talking about the impact to the business. Today, those who make decisions for the business must be included in order to defend against malware in a timely manner and to plan for security. Innovation is also paramount in order to successfully protect systems, and Mr. Gebhart mentioned several current initiatives. For example, companies are increasingly using cloud-based threat intelligence systems to deal with real-time and historical data, in ever-increasing quantities. McAfee monitoring systems receive about 56B events a month from 120M devices. Many of the events are hashed and sent to their systems in the cloud to determine whether they are malicious or not, allowing McAfee to block (if necessary) similar traffic. Response capabilities have also improved, as there now exist algorithms to classify the events, determine which ones to handle, and respond fast. The DeepSAFE Technology is another innovation example, coming from the partnership between McAfee and Intel. The jointly developed technology serves as a foundation for new hardware-assisted security products. 
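The hash-and-lookup scheme Mr. Gebhart described can be sketched roughly as follows: a client fingerprints a sample and sends only the digest to a cloud-side reputation service. The reputation table, sample bytes, and verdict labels below are invented for illustration; this is not McAfee's actual service or API.

```python
import hashlib

# Hypothetical cloud-side reputation table keyed by SHA-256 digest.
REPUTATION = {
    hashlib.sha256(b"known-bad-payload").hexdigest(): "malicious",
    hashlib.sha256(b"signed-os-update").hexdigest(): "clean",
}

def classify(sample_bytes: bytes) -> str:
    """Look up a sample by fingerprint; the contents never leave the client."""
    digest = hashlib.sha256(sample_bytes).hexdigest()
    return REPUTATION.get(digest, "unknown")  # unknown samples get deeper analysis

print(classify(b"known-bad-payload"))   # malicious
print(classify(b"never-seen-before"))   # unknown
```

Sending a fixed-size digest instead of the sample itself is what makes screening 56B events a month tractable: the lookup is a constant-time table hit, and only "unknown" samples need heavier analysis.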
Today's malware detection software sits above the operating system, whereas DeepSAFE operates without that restriction, closer to the hardware, offering a different vantage point from which to detect, block, and remediate hidden attacks such as Stuxnet and SpyEye. To close his presentation, Mr. Gebhart reminded the audience not to forget whom we are working for, and to protect global access to information and the identities of our users. It is an exciting time to be involved in cybersecurity, with the changing landscapes of information technology and security. Computing has come a long way in the last few decades, but we still have to build trust around it so people can confidently rely on computing.

Keynote: Christopher Painter, Coordinator for Cyber Issues, U.S. Department of State (Summary)

Posted: 13 Apr 2013 02:20 AM PDT

Thursday, April 4th, 2013 Summary by Kelley Misata As Christopher Painter, Coordinator for Cyber Issues within the US Department of State, began his keynote address to the CERIAS Symposium audience, he humorously admitted, "Today I'm flying without a net" – a PowerPoint presentation net, that is. This set the tone for an informal and informative discussion about the changing threat landscape in cyberspace. In the early 1990s Christopher Painter began his federal career as an Assistant U.S. Attorney in Los Angeles, a time when most people were not that interested in cyber crime and the issues we are facing today were unimaginable. These issues weren't at the forefront of most people's minds, which gave Mr. Painter an opportunity to dive in and get involved at all levels of the cyber investigations happening at the time. Mr. Painter led some of the earliest and most infamous cyber crime cases, including the prosecution of Kevin Mitnick, one of the most wanted cyber criminals in the United States. Through his work leading case and policy discussions in the Computer Crime and Intellectual Property Section of the US Department of Justice, Mr. Painter has become a leading expert in international cyber issues. However, in this impressive journey he shared with the CERIAS audience, one of the most marked times in his career came with President Obama in 2009. Reminding the audience of the campaign hacking incident that raised awareness of cyber threats to the office of the President, Mr. Painter discussed how a shift in focus toward cyber issues was starting to occur. Charged with identifying the gaps in national cyber policies, Mr. Painter led a research initiative that involved over 60 interviews with individuals from government, private industry, academia, and civil society; the results of this study became the premise for President Obama's landmark speech on cyber security in May 2009. 
Over the past 5 years the conversations in cyber security have evolved dramatically. Initially these conversations were so highly technical in nature that government officials handed them to the technical community to find the solutions. Today, with cyber issues expanding beyond domestic boundaries, it was quickly realized that for solutions to be sustainable they needed the "push" of senior policy makers and CEOs from the private sector. As Mr. Painter stated, "We have come a long way even though the challenges continue to mount; we need to remember we still have a long way to go." Today, the cyber security threat landscape has changed from the days of the "lone gunman hackers" to organized, transnational groups. Cyber security professionals are facing mounting challenges in international law, forensic processes, and the introduction of new actors in the arena of bad guys. However, reflecting back again on President Obama's 2009 speech on cyber security, Mr. Painter recalls the President's reference to the "economic threat of cyber crime" – an important distinction from merely addressing cyber crime as a security threat, identifying it instead as an economic threat to the country. Public awareness is changing, and so are the conversations within the U.S. government. Remembering President Obama's 2013 State of the Union address, Mr. Painter remarked, "this was to a national audience who are not cyber folks - it is another great example of how the cyber issues have transitioned to be government issues." This landmark speech resulted in a new surge of collaboration and coordination among government agencies: "This is a big shift in how these groups are running interagency meetings, as there is a new commonality and purpose to these issues." Looking toward the future, the world will continue to grapple with the constantly changing cyber threat landscape and the equivalence of these issues in the physical world. These are global challenges. 
As a result, in partnership with the Department of Homeland Security, Mr. Painter and his team are bringing technical information and training to over 100 countries, working to help technologically advancing countries mitigate the increasingly complex cyber threats around the world. Concurrently, they are evaluating key policy issues including: 1. international security – the US has taken the lead in establishing international law through systems that build confidence and transparency; 2. cyber security due diligence – challenging the international community to continue to develop national policies, build institutions, and foster the due diligence process; 3. identification of cyber crimes; 4. internet governance – through existing technical organizations and a multi-stakeholder approach; and 5. internet freedom – principles around openness and transparency online. As the audience processed this incredible professional journey along with the changing landscape in cyberspace, Mr. Painter closed his keynote address by describing the efforts of his team in working closely with agencies across the US government, the private sector, and academia around the world. They are also actively conducting important dialogues and advancing key cyber issues with the governments of Brazil, South Africa, Korea, Japan, and Germany, to name a few, bringing cyber security strategies, the changing landscape, and key policy issues to these emerging countries.

Tech Talk #3: Stephen Elliott (Summary)

Posted: 12 Apr 2013 02:01 PM PDT

Thursday, April 4th, 2013 Associate Professor Stephen Elliott, Industrial Technology, Purdue University Director, Biometric Standards, Performance and Assurance Laboratory Summary by Kelley Misata Title: Advances in Biometric Testing Starting the conversation, Stephen reminded the audience that what makes biometrics such an interesting field is the unpredictability of the humans in the testing and evaluation processes. In traditional biometric testing environments, researchers work with algorithms and established metrics and methodologies. However, as biometric testing moves to operational environments, there are more uncertainties to contend with, which makes the work harder to do. Considering these two important testing environments, what biometric researchers are now trying to do is understand further how a biometric system performs in any environment and identify what (or who) could be the possible cause of errors. As Stephen pointed out, there have been several papers addressing how individual error impacts biometric performance and the potential causes of these errors. Some of these errors are now being traced to gaps in biometric testing, including training (e.g., "How do you train someone who is difficult to train or doesn't want to be trained?"), accessibility (e.g., "Are the performance results different in an operational environment than those collected in a lab?"), usability (e.g., "Can the system be used efficiently, effectively and consistently by a large population?"), and the complexities of human factors in biometric testing performance – raising the question, is the error always subject-centric? In order to fill in some of these gaps, Stephen and his graduate students are looking at the traditional biometric modes and metrics to determine whether they are suitable in today's testing and evaluation environments. During the CERIAS tech talk, Stephen spotlighted the research of three of his graduate students: 1. 
The Concept of Stability, a thesis by Kevin O'Connor – an examination of fingerprint stability across force levels; 2. The Case of Habituation by Jacob Hasselgren – quantitatively measuring habituation in biometric testing environments; and 3. Human Biometric Sensor Interaction, highlighting Michael Brokly's research on test administrator errors in biometrics, including the effects of operator training, the workloads of both test administrators and test operators, fatigue, and stress. The biometrics community continues to investigate these questions in order to understand how the vast array of players in an operational data collection environment impacts performance. In his closing statements, Stephen reiterated the complexities and challenges in biometric testing and how researchers are looking deeper into the factors affecting performance, beyond a simple ROC/DET curve.
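The "simple ROC/DET curve" mentioned in closing summarizes a matcher by sweeping a decision threshold over two score distributions: genuine (same-person) and impostor (different-person) comparisons. The sketch below computes the two standard error rates at a few operating points; the match scores are made up for illustration.

```python
# Hypothetical match scores (higher = more similar).
genuine  = [0.91, 0.85, 0.78, 0.60, 0.95]   # same-person comparisons
impostor = [0.30, 0.45, 0.55, 0.20, 0.62]   # different-person comparisons

def error_rates(threshold, genuine, impostor):
    """False non-match rate and false match rate at one operating point."""
    fnmr = sum(s < threshold for s in genuine) / len(genuine)    # genuines rejected
    fmr  = sum(s >= threshold for s in impostor) / len(impostor)  # impostors accepted
    return fnmr, fmr

# Sweeping the threshold traces out the ROC/DET curve.
for t in (0.4, 0.6, 0.8):
    fnmr, fmr = error_rates(t, genuine, impostor)
    print(f"threshold={t:.1f}  FNMR={fnmr:.2f}  FMR={fmr:.2f}")
```

The curve captures only the threshold trade-off; Stephen's point is that factors like training, habituation, and administrator error shift these score distributions themselves, which a single curve does not reveal.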

Featured Commentary: The Honorable Mark Weatherford, DHS Deputy Under Secretary for Cybersecurity

Posted: 12 Apr 2013 01:39 PM PDT

Thursday April 4, 2013 Summary by Marquita A. Moreland During the introduction, Professor Spafford discussed Mark Weatherford's experience prior to becoming Deputy Under Secretary for Cybersecurity at DHS. He mentioned that Mr. Weatherford was CIO of the states of Colorado and California and director of security for the electric power industry. He noted that Mr. Weatherford has won a number of awards and spent many years working in cybersecurity in the Navy. He also mentioned that under sequestration rules Mr. Weatherford was not allowed to travel. Mr. Weatherford wished to be present, but since he could not attend, he created a video instead. Mark Weatherford began his commentary with the "For Want of a Nail" rhyme because he believes it is a good way to think about how to approach the business of security. Mr. Weatherford expressed his appreciation for Professor Spafford, thanking him for how much he has helped advance the field of cybersecurity and develop some of the nation's security leaders. Mr. Weatherford proceeded to state that "we're in a business where ninety-nine percent secure means you're still one hundred percent vulnerable." An example he used was from 2008, when a large mortgage company, now out of business, was concerned with the loss of its clients' information. The company disabled the USB ports on thousands of machines to prevent employees from copying data. They missed one machine, which was used by an analyst to load and sell customer data over a two-year period. The cybersecurity threat, DHS's role in cybersecurity, the President's Executive Order on cybersecurity, and the lack of cyber talent across the nation were the four topics that Mr. Weatherford briefly explained. Cybersecurity Threat: The danger of a cyber attack is the number one threat facing the United States, bigger than the threat of Al Qaeda. There is a lack of security practices, and water, electricity, and gas are dangerously vulnerable to cyber attacks. 
The banking and finance industry has been under a series of DDoS attacks since last summer; almost every week a new set of banks comes under siege. Other notable incidents include the Shamoon attack on Saudi Aramco and the attack on Qatari RasGas. In February of this year, the emergency broadcast system in four states was attacked, with a message that said the nation was being attacked by zombies. The fact that someone can get into these systems raises safety and security concerns. The Office of Cybersecurity and Communications (CS&C) has the largest cybersecurity role in DHS. It helps secure the federal civilian agency networks in the executive branch, primarily the .gov domain. It also helps the private sector in the .com domain, with a focus on critical infrastructure. It leads and coordinates the response to cyber events and works on national and international cybersecurity policies. There are five divisions: Network Security Deployment, Federal Network Resilience, Stakeholder Engagement and Cyber Infrastructure Resilience, the Office of Emergency Communications, and the National Cybersecurity and Communications Integration Center. Last year US-CERT resolved over 200,000 incidents involving different sectors, and ICS-CERT responded onsite to 177 incidents. President's Cybersecurity Executive Order (EO): The EO was announced during the State of the Union speech, which contained two paragraphs regarding cybersecurity. Mr. Weatherford mentioned that when he was CIO, he worked every year to try to get at least a single sentence into the Governor's State of the State speech, but was unsuccessful. The EO will help achieve: Establishment of an up-to-date cybersecurity framework. Enhancement of information sharing among stakeholders by: Expanding the voluntary DHS Enhanced Cybersecurity Services (ECS) program. Expediting classified and unclassified threat reporting to the private sector. 
Expediting the issuance of security clearances to critical infrastructure members in the private sector. Cyber Challenges: Mr. Weatherford stated that "the common denominator to all the work we do is the requirement for well-trained and experienced cyber professionals." DHS sponsors the Scholarship for Service (SFS) program with the National Science Foundation and co-sponsors the National Centers of Academic Excellence (CAE) program. Purdue was one of the first seven universities in the nation designated as a CAE, in 1999. The lack of qualified people is one of the biggest problems, and Mr. Weatherford's suggestions are: Make people want to choose cyber security. Government, academia, and industry need to work together to change the public perception and figure out how to make cybersecurity "cool". Mr. Weatherford closed his commentary by stating, "DHS wants to be your partner in cybersecurity, whether you're in government, academia or the private sector. No one can go it alone in this business and be successful, so think of us as partners and colleagues; we really can help."

Panel 3: Security Education and Training (Panel Summary)

Posted: 12 Apr 2013 01:17 PM PDT

Thursday, April 4th, 2013 Panel Members: Diana Burley, Associate Professor of Human and Organizational Learning, George Washington University Melissa Dark, Professor, Computer and Information Technology, CERIAS Fellow, Purdue University Allan Gray, Professor and Director, Center for Food and Agricultural Business and Land O'Lakes Chair in Food and Agribusiness, Purdue University Marcus K. Rogers, Professor, Computer and Information Technology, CERIAS Fellow, Purdue University Ray Davidson, Professor of Practice and Dean of Academic Affairs, SANS Technology Institute Moderator: Professor Eugene Spafford, Executive Director, CERIAS Summary by Rohit Ranchal Current technological advances and the shortage of cyber security professionals require us to focus on cyber security education. The main challenge is how to fit the identified needs into a practical education or training program. Given the modern trend toward and popularity of MOOCs (Massive Open Online Courses), it is very important to consider online and distance education for cyber security. One important requirement is to have a business model in place to structure the MOOCs, because right now they are just doing information dissemination. We need a structured curriculum that can take advantage of the freely available MOOCs. The current trend in security problems suggests that we are moving away from traditional problems like protocol vulnerabilities and reviewing RFCs to fix them. Most problems, such as policy-based vulnerabilities and social engineering, occur at the application and end-user level. So it is important to have exposure to the changing problems and an understanding of the associated legal and regulatory environment. Professionals need to be trained in organizational dynamics, such as budgeting and investments, which are important to the business. Awareness of the bigger issues is also important, along with technical expertise. 
One important thing to consider in information security education is the target population. When we consider educating everyone in the security-awareness space, we focus on campaigns, reaching into K-12, educating elderly people, talking about cyber security war, etc. But the instruction language is not particularly persuasive. It is very important to think about the instruction language when the target audience is the mass of people. Our current education system focuses on professionalization. Professionalization is a social phenomenon. A cyber security professional is someone who has to deal with high levels of uncertainty and high levels of complexity. A professional can have a specific technical background or expertise, or can have skills in the interdisciplinary space. The framework proposed by the National Initiative for Cybersecurity Education lists seven high-level job roles, including some non-technical ones. Cyber security professionals do not work only within the security profession; many hold hybrid roles in the interdisciplinary space. Thus professionals should be educated and trained so that they can carry out multiple tasks in these hybrid roles. Professionalization could also mean credentialing, education/degrees, codes of ethics, certification, training, apprenticeship, etc. Professionalization can be debated in terms of various aspects: applied vs. theoretical knowledge, concepts vs. technologies, vocational training vs. degree education, immediate needs vs. future needs, generalists vs. specialists, etc. We need to consider all of these aspects. The underlying point is that professionalization induces a change in behavior. An important way to achieve that is through apprenticeship and mentoring. In some other professions, apprenticeship and mentoring are strictly required after completion of a degree in order to acquire practical training; only on successful completion is the person considered a professional. 
We need to bring apprenticeship and mentoring back into the security education curriculum. But things in the security space are changing so rapidly that, no matter how much education is given, professionals will have to deal with high levels of uncertainty and complexity. One way to ensure this is to have people who are excited about the profession and willing to constantly learn and enjoy it. Professionalization should not be treated as an end-point at which one arrives. The obvious question is how to find such people. Some institutes, like the SANS Technology Institute and (ISC)2, aim to address this problem through certification. But how can we measure whether the certifications have any real value? It depends upon the training, knowledge, and experience that go into the certification. There are many different types of certifications, from weekend certifications to highly specialized ones. Another thing to consider is that certification implies that a professional has some valuable knowledge today, but it says nothing about tomorrow, when the threats, situations, and environment change. There is a shortfall of individuals at present, but how can we ensure that our education system balances that need for today with the need for professionals who are able to learn, analyze, and synthesize the challenges of tomorrow that are not yet known? If we look at other professions, many of them require licensing. Professionals in those fields have to renew their licenses to stay current with technologies and skills. Another difference is that cyber security professionals don't carry the same liability if something goes wrong (e.g., a system gets hacked) as members of some other professions do; if a bridge falls down, you can talk to the civil engineer. 
Suppose all security jobs required a certification, and an organization hired an uncertified professional to build a system that was later broken into; there could then be terrible consequences, such as lawsuits. Also, consider that building a system requires system designers, developers, and users; it's not easy to declare one person liable. The liability model is not appropriate at present, but we should move in that direction. An important concern while educating and training security professionals is how to prevent them from turning bad, such as ethical hackers becoming unethical hackers. The argument is that there is high risk with information dissemination alone, but with education that risk is lowered. The goal of education is not just to impart knowledge but to provide the context, the morality, and the ethics, and to teach that there are consequences to actions. Education is a socialization and acculturation process that induces the change in behavior. Education curricula should be designed so that the mentor can effectively measure that change in behavior. While addressing the education problem, it is important to understand that governments tend to be reactionary and focus on present problems rather than being visionary, so it is very important for universities and industry to be visionary and drive education and training that focus on the future and not the past.

Panel 2: NSTIC, Trusted Identities and the Internet (Panel Summary)

Posted: 12 Apr 2013 11:25 AM PDT

Wednesday, April 3rd, 2013 Panel Members: Cathy Tilton, VP Standards and Technology, Daon Solutions; Elisa Bertino, Professor, Computer Science and CERIAS Fellow, Purdue University; Stephen Elliot, Associate Professor, Technology Leadership & Innovation and CERIAS Fellow, Purdue University; Stuart S. Shapiro, Principal Information Privacy and Security Engineer, The MITRE Corporation; Moderator: Keith Watson, Information Assurance Research Engineer, CERIAS. Summary by Ruchith Fernando. Cathy Tilton was the first to present her views, and she opened with an introduction to NSTIC. She mentioned that the NSTIC strategy document, which came out in April 2011, is an outcome of the President's cyber security review. Daon's objective is "Enhancing commercial participation cross sector in the identity ecosystem" in collaboration with AARP, PayPal, the Purdue University IT Department, the American Association of Airport Executives, and a major bank. Daon's pilot study consists of four components. Technology component: a risk-based multi-factor authentication solution that leverages mobile devices, called IdentityX. Based on the risk level of the transaction, the relying party dynamically invokes some combination of authentication methods. Research component: Daon teamed with the Purdue Biometrics Lab to analyze data coming from operational pilots to evaluate usability, accessibility, privacy, security, user acceptance, and performance of the solution in various environments. Trust frameworks: a research effort attempting to identify what gaps exist in trying to fit the IdentityX solution into existing trust frameworks. Operational pilot: there are five relying parties from different sectors, some with larger and some with smaller subscriber bases, implementing different use cases. Professor Stephen Elliot presented his work at the Purdue Biometrics Laboratory, which has focused on testing and evaluating various biometrics since 2001. 
There is a multi-faceted approach to the testing philosophy in this project, which involves "in lab" testing, surveys, and "in the wild" testing. In-lab testing takes place in a controlled environment where users carry out controlled transactions. These tests are carried out on three different operating systems and evaluate interoperability by assessing whether users remember how to use the device and whether they can transfer that knowledge to another operating system. These sessions are recorded and are conducted over 4 to 6 weeks. In-the-wild testing attempts to mimic various real-life scenarios: the test subjects are given a mobile device for a month. The focus groups involved in testing include elderly, disabled, and able-bodied individuals, with about 10 to 15 participants in each group. Professor Elisa Bertino was the next to present her views. She defined digital identity and introduced the concepts of strong identifiers and weak identifiers. Strong identifiers identify an individual uniquely; weak identifiers do not. Depending on the context, an identifier may be strong or weak. Security and interoperability are concerns: in most identity management systems, the user is redirected to an identity provider when authenticating with a relying party. But this leads to privacy issues, because the identity provider learns information about the user's transactions. Protocols developed in the VeryIDX project use an identity token given to the user by the identity provider, which can be used without further interaction with the identity provider. These protocols are very different, with different information and different interaction models; therefore, achieving interoperability with other protocols is a challenge. 
Linkability: when a user carries out two transactions with two different relying parties, the two relying parties may be able to use information they collect to determine that they are interacting with the same user. The same applies to a user carrying out two transactions with the same relying party. Stuart S. Shapiro expressed his views on two main issues. NSTIC promotes selective attribute disclosure. In the case where an individual subscribes to an online newspaper, from the subscriber's privacy perspective, as long as the service provider can verify that he/she is a valid subscriber, there is no need for any other identity information. But based on the business model, the service provider may need to know certain demographic data about the individual to be able to target advertisements and to be able to charge for them. This is the issue of "functional minimums vs. business-model minimums": business models may require much more information than the functional requirements cover. NSTIC does not clearly address this issue. Service providers are "not interested in individuals but are interested in categories": this categorization may be either benign or harmful to individuals. Therefore, even with privacy-preservation techniques, if an individual can be categorized, this might lead to the leak of critical identity information. This concluded the presentations by the panelists, and the audience raised several questions. Q: FIPS 201 seems to be very applicable to your study. Cathy Tilton: With regard to FIPS 201, we are not doing anything with smart card credentials. We are, however, looking at the ICAM (Identity, Credential, and Access Management) certification related to trust framework providers and identity providers. Q: I assume you are trying to certify the protection of data-bearing processes. Please explain how you are doing that on an inherently insecure mobile device without a trust anchor. 
Cathy Tilton: We include a private key in the keychain of the phone. We consider the phone an untrusted device. We use the key to set up a mutually authenticated TLS session with the phone. Once this secure channel is established, it can be used to collect information on the phone and send it back to the server, where verification is performed. Q: What about jailbroken devices? Cathy Tilton: The device is considered untrusted. When secure elements become commercially available on mobile devices, we will make use of them. Our approach has been a BYOD (bring your own device) model for usability and familiarity; therefore, we have to work with the capabilities and security features of those devices. Q: How does liability fit into NSTIC? Cathy Tilton: In our situation we are really more of a credential provider, so it is a shared responsibility. Most of our relying parties already have all the identity information about their subscriber base. They do not share that information with us, and they do the identity proofing. What they are looking for from us is a strong credential. What becomes critical is the binding of the credential to the identity. Prof. Bertino: The problem of liability is a much-debated question. In software engineering, who is liable for software mistakes? Identity management is very much the same: it is software, which needs to be secure to work properly. Q: How does NSTIC accommodate potential issues such as forcing users under duress to authenticate sensitive transactions? Cathy Tilton: In our solution we have provisions for a "duress PIN," which the relying party handles according to its policy. Stuart S. Shapiro: In a certain context, duress is the status quo right now. In some cases, users lie about the requested information to obtain the service and avoid providing real identity information. If certified attributes are required, there will be no option to lie. In such cases duress can increase rather than decrease. 
Q: How do you see us achieving a tradeoff, given that we have to reveal certain information about ourselves while certain generalized categories already reveal so much about us? Prof. Bertino: Services should give customers the choice of revealing information and provide alternatives, such as paying for those services. Systems should be flexible enough to support such options. Sometimes categorization can be benign, but, for example, if a user lands in the wrong passenger profile, then he/she might have trouble getting through airport security. Even if a user is very private, if he/she simply possesses a certain feature in a population, he/she might be automatically classified. I think this is a separate problem, and I don't think NSTIC has to solve it. Cathy Tilton: NSTIC supports both pseudonymous and anonymous credentials, since there are many transactions that do not require any more information. Q: How is the biometric data stored on the server? Is there anything equivalent to secure password hashes for biometric data? Cathy Tilton: All biometrics are protected using the mechanisms normally used to protect data at rest, such as encryption and audited access behind a firewall. If the cryptographic mechanisms fail, certain biometrics can leak information, but biometrics are not stored alongside identity information. When acting as a credential provider, we have no identifying information associated with the biometrics. Q: What methods are available to verify the liveness of a subject? Cathy Tilton: Our liveness support includes using photographs of a face from different angles and a challenge-response mechanism with random, longer phrases for voice authentication. We are working on using video as well. Q: What are interesting open research problems? Prof. Elliot: There are many research problems, with a lot of challenges in areas such as mobile usability testing. Prof. 
Bertino: There's a lot of work to be done on anonymity techniques for digital identity and on linkability analysis. Stuart S. Shapiro: More sophisticated privacy-risk modeling techniques are required, as are techniques for integrating privacy in an engineering sense.
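The risk-based, multi-factor behavior described for IdentityX above -- the relying party dynamically invoking some combination of authentication methods based on the transaction's risk level -- can be sketched in a few lines. This is a minimal illustration under stated assumptions, not Daon's actual implementation: the factor names and risk thresholds are invented for the example.

```python
# Hypothetical sketch of risk-based step-up authentication in the spirit of
# the IdentityX design discussed above. Factor names and thresholds are
# illustrative assumptions, not the product's actual parameters.

def required_factors(risk_score):
    """Map a transaction risk score in [0.0, 1.0] to the list of
    authentication factors the relying party would demand."""
    factors = ["pin"]                    # baseline: something you know
    if risk_score >= 0.3:
        factors.append("device_key")     # something you have (keychain credential)
    if risk_score >= 0.6:
        factors.append("fingerprint")    # something you are
    if risk_score >= 0.8:
        factors.append("voice_phrase")   # challenge-response voice liveness check
    return factors

# A low-risk balance inquiry needs only the PIN; a high-risk wire
# transfer steps up to every available factor.
print(required_factors(0.1))   # ['pin']
print(required_factors(0.9))   # ['pin', 'device_key', 'fingerprint', 'voice_phrase']
```

The design point is that the policy lives with the relying party: the same credential can gate both cheap and sensitive transactions, with friction proportional to risk.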

Panel 1: Security Analytics, Analysis, and Measurement (Panel Summary)

Posted: 12 Apr 2013 11:04 AM PDT

Wednesday, April 3rd, 2013 Panel Members: Alok Chaturvedi, Professor, Management, Purdue University; Samuel Liles, Associate Professor, Computer and Information Technology, Purdue University; Andrew Hunt, Information Security Researcher, The MITRE Corporation; Mamani Older, Senior Vice President, Information Security, Citigroup; Vincent Urias, Principal Member of Technical Staff, Sandia National Laboratories; Moderator: Joel Rasmus, Director of Strategic Relations, CERIAS. Summary by Ben Cotton. With "Big Data" being a hot topic in the information technology industry at large, it should come as no surprise that it is being employed as a security tool. To discuss the collection and analysis of data, a panel was assembled from industry and academia. Alok Chaturvedi, Professor of Management, and Samuel Liles, Associate Professor of Computer and Information Technology, both of Purdue University, represented academia. Industry representatives were Andrew Hunt, Information Security Researcher at the MITRE Corporation; Mamani Older, Citigroup's Senior Vice President for Information Security; and Vincent Urias, a Principal Member of Technical Staff at Sandia National Laboratories. Joel Rasmus, the Director of Strategic Relations at CERIAS, moderated the panel. Professor Chaturvedi made the first opening remarks. His research focus is on reputation risk: the potential damage to an organization's reputation, particularly in the financial sector. Reputation damage arises from the failure to meet the reasonable expectations of stakeholders and has six major components: customer perception, cyber security, ethical practices, human capital, financial performance, and regulatory compliance. In order to model risk, "lots and lots of data" must be collected; reputation drivers are checked daily. An analysis of the data showed that malware incidents can be an early warning sign of increased reputation risk, giving organizations an opportunity to mitigate reputation damage. 
Mr. Hunt gave brief introductory comments. The MITRE Corporation learned early that good data design is necessary from the very beginning in order to properly handle a large amount of often-unstructured data. They take what they learn from data analysis and re-incorporate it into their automated processes in order to reduce the effort required of security analysts. Mr. Urias presented a less optimistic picture. He opened his remarks with the assertion that Big Data has not fulfilled its promise. Many ingestion engines exist to collect data, but analysis of the data remains difficult. This is due in part to the increasing importance of the meta-characteristics of data. The rate of data production is challenging as well: making real-time assertions from data flowing at line rate is a daunting problem. Professor Liles focused on the wealth of metrics available and how most of them are not useful. "For every meaningless metric," he said, "I've lost a hair follicle. My beard may be in trouble." It is important to focus on the meaningful metrics. The first question posed to the panel was "if you're running an organization, do you focus on measuring and analyzing, or mitigating?" Older said that historically, Citigroup has focused on defending perimeters, not on analysis. With the rise of mobile devices, it has recognized that mere mitigation is no longer sufficient. The issue was put rather succinctly by Chaturvedi: "you have to decide if you want to invest in security or invest in recovery." How do organizations know whether they're collecting the right data? Hunt suggested collecting everything, but that's not always an option, especially in resource-starved organizations. Understanding the difference between trend data and incident data is important, according to Liles, and you have to understand how you want to use the data. 
Organizations with an international presence face unique challenges, since legal restrictions and requirements can vary from jurisdiction to jurisdiction. Along the same lines, the audience wondered how long data should be kept. Legal requirements sometimes dictate how long data must be kept (either at a minimum or a maximum) and what kind of data may be stored. The MITRE Corporation uses an algorithmic system to determine the retention period and storage medium for data. Liles noted that some organizations are under long-term attack, and sometimes the hardware refresh cycle is shorter than the duration of the attack; awareness of what local log data is lost when a machine is discarded is important. Because much of the discussion had focused on ways that Big Data has failed, the audience wanted to know of successes in data analytics. Hunt pointed to the automation of certain analysis tasks, freeing analysts to pursue more things faster. Sandia National Labs has been able to correlate events across systems and quantify sensitivity effects. One audience member noted that as much as companies profess a love for Big Data, they often make minimal use of it. Older replied that it is industry-dependent: where analysis drives revenue (e.g., in retail), it has seen heavier use. An increasing awareness of analysis in security will help drive future use.

A Different Approach To Foiling Hackers? Let Them In, Then Lie To Them

Posted: 08 Apr 2013 05:28 AM PDT

(Forbes) Last month Heckman, a researcher for the non-profit IT research corporation MITRE, gave a talk with fellow MITRE researcher Frank Stech at Purdue's Center for Education and Research in Information Assurance and Security, describing a cyber war game scenario MITRE played out internally in which she and Stech tried an unorthodox defensive strategy: instead of trying to purge a Red Team of hackers from the network a Blue Team was defending, Heckman and Stech let the attackers linger inside, watched them, and fed them confusing misinformation. The result: despite the Blue Team's network being deeply compromised by the Red Team's hackers, Blue managed to trick Red into making the wrong moves and losing the game. (Forbes) Related: CERIAS Information Security Seminar, Mar 20, 2013

On Competitions and Competence

Posted: 07 Apr 2013 01:11 PM PDT

This is a follow-up to my last post here, about the "cybersecurity profession" and education. I was moderating one of the panels at the most recent CERIAS Symposium, and a related topic came up. Let's start with some short mental exercises. Limber up your cerebellum. Stretch out and touch your cognitive centers a few times. Ready? There's another barn on fire! Quick, get a bucket brigade going -- we need to put the fire out before everything burns. Again. It is getting so tiring watching all our stuff burn while we're trying to run a farm here. Too bad we can only afford the barns constructed of fatwood. But no time to think of that -- a barn's burning again! 3rd time this week! Hey, you people over there tinkering with designs for sprinkler systems and concrete barns -- cut it out! We can't spare you to do that -- too many barns are burning! And you, stop babbling about investigating and arresting arsonists -- we don't have time or money for that: didn't you hear me? Another barn is burning! Now, hurry up. We're going to have a contest to find who can pass this pail of water the quickest. Yes, it is a small, leaky pail, but we have a lot of them, so that is what we're going to use in the contest. The winners get to be closest to the flames and have a name tag that says "fire prevention specialist." No, we can't afford larger buckets. And no, you can't go get a hose -- we need you in the line. Damnit! The barn's burning! Sounds really stupid, doesn't it? Whoever is in charge isn't doing anything to address the underlying problem of poor barn construction. It doesn't really match the notion of what a fire prevention specialist might really do. And it certainly doesn't provide deep career preparation for any of those contestants... it may even condemn them to a future of menial bucket passing because we're putting them on the line with no training or qualification beyond being able to pass a bucket. Let's try another one. 
Imagine that every car and automobile in the country has been poorly designed. They almost all leak coolant and burn oil. They're trivial to steal. They are mostly cheap junkers, all built on the same frame with the same engines, accessories, and tires -- even the ones sold to the police and military (actually, they're the same cars, but with different paint). The big automakers are rolling out new models every year that they advertise as being more efficient and reliable, but that is simply hype to get you to buy a new car because the new features also regularly break down. There are a few good models available, but they are quite a bit more expensive; those more expensive ones often (but not always) break down less, are more difficult to steal, and get far better mileage. Their vendors also don't have a yearly model update, and many consumers aren't interested in them because those cars don't take the common size of tire or fuzzy dice for the mirror. The auto companies have been building this way for decades. They sell their products around the world, and they're a major economic force. Everyone needs a car, and they shell out money for new ones on a regular basis. People grumble about the poor quality and the breakdowns, but other than periodic service bulletins, there are few changes from year to year. Many older, more decrepit cars are on the road because too many people (and companies) cannot afford to buy new ones that they know aren't much better than the old ones. Many people argue -- vociferously -- against any attempt to put safety regulations on the car companies because it might hurt such an important market segment. A huge commercial enterprise has sprung up around fixing cars and adding on replacement parts that are supposedly more reliable. People pour huge amounts of money into this market because they depend on the cars for work, play, safety, shopping, and many other things. 
However, there are so many cars, and so many update bulletins and add-ons, there simply aren't enough trained mechanics to keep up -- especially because many of the add-ons don't work, or require continual adjustment. What to do? Aha! We'll encourage young people in high school and maybe college to become "automotive specialists." We'll publish all sorts of articles with doom and gloom as a result of the shortage of people going into auto repair. We especially need lots more military mechanics. So...we'll have competitions! We'll offer prizes to the individuals (or teams) that are able to change the oil of last year's model the most quickly, or who can most efficiently hotwire a pickup truck, take it to the garage, change the tires, and return it. The government will support these competitions. They'll get lots of press. Some major professional organizations and even universities will promote these. Of course we'll hire lots of mechanics that way! (Women aren't interested in these kinds of competition? We won't worry about that now. People who are poor with wrenches won't compete? No problem -- we'll fill in with the rest.) Meanwhile, the government and major companies aren't really doing anything to fix the actual engineering of the automobiles. There are a few comprehensive engineering programs at universities around the country, but minimal focus and resources are applied there, and little is said about applying their knowledge to really fixing transportation. The government, especially the military, simply wants more mechanics and cheaper cars -- overall safety and reliability aren't a major concern. Pretty stupid, huh? But there does seem to be a trend to these exercises. Let's try one more. We have a large population that needs to be fed. They've grown accustomed to cheap, fast-food. Everyone eats at the drive-thru, where they get a burger or compressed chicken by-product or mystery-meat taco. It's filling, and it keeps them going for the day. 
It also leads to obesity, hypertension, cardiac problems, diabetes, and more. However, no one really blames the fast-food chains, because they are simply providing what people want. It isn't exactly what people should have, and is it really what everyone wants? No, there are better restaurants with healthy food, but that food is more expensive, and many people would go hungry if they had to eat at those places given the current economic model. Of course, if they didn't need to spend so much on medicine and hospital stays, a healthier diet would actually be cheaper. Also, those better places aren't easy to find -- small (or no) advertising budgets, for instance. The government has contracted with the chains for food, and even serves it at every government office and on every military base. The chains thus have a fair amount of political clout, so every time someone raises the issue of how unhealthy the food is, they get muffled by the arguments "But it would be too expensive to eat healthy" and "Most people don't like that other food and can't even find it!" We have a crisis because the demand for fast food is so great that there aren't enough fry cooks. So the heads of major military organizations and government agencies observe that we are facing a crisis because, without enough fry cooks, our troops will be overwhelmed by better-fed people from China. Government officials and industry people agree because they can't imagine any better diet (or are so enamored of fried potatoes that they don't want anything else). How do they address the crisis? By mounting advertising campaigns to encourage young people to enter the exciting world of "cuisine awareness." We make it seem glamorous. Private organizations offer certifications in "soda making" and "ketchup bottle maintenance" that are awarded after 3-day seminars. DOD requires anyone working in food service to have one of these certificates -- and that's basically all. 
We see educational institutes and small colleges offering special programs in "salad bar maintenance." The generals and admirals keep showing up at meetings proclaiming how important it is that we get more burger-flippers in place before we have a "patty melt Pearl Harbor." The government launches a program to certify schools as centers of "Cuisine Awareness Excellence" if they can prove they have at least 5 cookbooks in the library, a crockpot, and two faculty who have boiled water. Soon, there are hundreds of places designated with this CAE, from taco trucks and hot dog stands to cordon bleu centers -- but lots are only hot dog stands. None of them are given any recipes, cooks, or financial support, of course -- simply designating them is enough, right? When all of that isn't seen to be enough, the powers-that-be offer up contests that encourage kids to show up and cook. Whoever is able to most quickly defrost a compressed cake of Soylent Red, cook it, stick it in a bun, and serve it up in a bag with fries is declared the winner and given a job behind someone's grill. Actually, each registered contestant gets a jaunty paper cap and an offer of an immediate job cooking for the military (assuming they are U.S. citizens; after all, we know what those furriners eat sure isn't food!). And gosh, how could they aspire to be anything BUT a fry cook for the next 40 years -- no need to worry about any real education before they take the jobs. Meanwhile, those studying dietetics, preventative health care, sustainable agriculture, haute cuisine, or other related topics are largely ignored -- not to mention the practicing experts in these fields. The people and places of study for those domains are ignored by the officials, and many of the potential employers in those areas are actually going out of business because of a lack of public interest and support. The advice of the experts on how to improve diet is ignored. Find that disconcerting? 
Here -- have a deep-fried cherry pie and a chocolate ersatz-dairy item drink to make you feel better. Did you sense a set of common threads (assuming you didn't blow out your cortex in the exercise)? First, in every case, a mix of short-sighted and ultimately stupid solutions is being undertaken. In each, there are large-scale efforts to address pressing problems that largely ignore fundamental, systemic weaknesses. Second, there is a set of efforts putatively being made to increase the population of experts, but only with those who know how to address a current, limited problem set. Fancy titles, certificates, and seminars are used to promote these technicians. Meanwhile, longer-term expertise and solutions are being ignored because of the perceived urgency of the immediate problems and a lack of understanding of cost and risk. Third, longer-term disaster is clearly coming in each case because of secondary problems and growth of the current threats. Why did this come up with my post and panel on cybersecurity? I would hope that would be obvious, but if not, let me suggest you go back to read my prior post, then read the above examples, again. Then, consider: Nationally, we are investing heavily in training and recruiting "cyber warriors" but pitifully little towards security engineers, forensic responders, and more. It is an investment in technicians, not in educated expertise. 
We have a marketplace where we continue to buy poorly-constructed products then pay huge amounts for add-on security and managing response; meanwhile, we have knowledgeable users complaining that they can't afford the up-front cost required to replace shoddy infrastructure with more robust items Rather than listen to experts, we let business and military interests drive the dialog We have well-meaning people who somehow think that "contests" are useful in resolving part of the problem One of the most egregious aspects is this last item -- the increasing use of competitions as a way of drawing people to the field. Competitions, by their very nature, stress learned behavior to react to current problems that are likely small deviations from past issues. They do not require extensive grounding in multiple fields. Competitions require rapid response instead of careful design and deep thought -- if anything, they discourage people who exhibit slow, considerate thinking -- discourage them from the contests, and possibly from considering the field itself. If what is being promoted are competitions for the fastest hack on a WIntel platform, how is that going to encourage deep thinkers interested in architecture, algorithms, operating systems, cryptology, or more? Competitions encourage the mindset of hacking and patching, not of strong design. Competitions encourage the mindset of quick recovery over the gestalt of design-operate-observe-investigate-redesign. Because of the high-profile, high-pressure nature of competitions, they are likely to discourage the philosophical and the careful thinkers. Speed is emphasized over comprehensive and robust approaches. Competitions are also likely to disproportionately discourage women, the shy, and those with expertise in non-mainstream systems. In short, competitions select for a narrow set of skills and proclivities -- and may discourage many of the people we most need in the field to address the underlying problems. 
So, the next time you hear some official talk about the need for "cyber warriors," or promoting some new "capture the flag" competition, ask yourself if you want to live in a world where the barns are always catching fire, the cars are always breaking down, nearly everyone eats fast food, and the major focus of the "authorities" is attracting more young people to minimally skilled positions that perpetuate that situation...until everything falls apart.

The next time you hear about some large government grant that happens to land within 100 miles of the granting agency's headquarters, or corporate support for a program whose only distinction is that the CEO is an alumnus, with no history of excellence in the field, ask yourself why their support is skewed towards building more hot dog stands.

Those of us here at CERIAS, and some of our colleagues with strategic views elsewhere, remind you that expertise is a pursuit and a process, not a competition or a 3-day class, and some of us take it seriously. We wish you would, too.

Your brain may now return to being a couch potato.

Seven CERIAS Faculty Promoted

Posted: 05 Apr 2013 11:12 AM PDT

The Purdue board of trustees approved faculty promotions for seven CERIAS-related faculty, effective July 1:

To Professor:
Saurabh Bagchi, professor of electrical and computer engineering
David Love, professor of electrical and computer engineering
T.N. Vijaykumar, professor of electrical and computer engineering
Dongyan Xu, professor of computer science
Chris Clifton, professor of computer science

To Associate Professor:
Ramana Kompella, associate professor of computer science
Jennifer Neville, associate professor of computer science and statistics

Congratulations to all of them on their promotions and continuing success!