NASIG 2013: Knowledge and Dignity in the Era of Big Data

2013-06-10 · Photo: “Big Data” by JD Hancock (CC BY 2.0)

Speaker: Siva Vaidhyanathan

Don’t try to write a book about fast moving subjects.

He was trying to capture the nature of our relationship to Google. It provides us with services that are easy to use, fairly dependable, and well designed. However, that level of success can breed hubris, and he was interested in how that hubris drives the company toward its audacious goals.

It strikes him that what Google claims to be doing is what librarians have been doing for hundreds of years already. He found himself turning to the core practices of librarians as a guideline for assessing Google.

Why is Google interested in so much stuff? What is the payoff to organizing the world’s information and making it accessible?

Big data is not a phrase that Google uses much, but the notion is there: more and faster equals better. Google is in the prediction/advertising business. The Google Books project is their attempt to reverse-engineer the sentence. By learning how sentences work, they could simulate how to interpret and create them — a step toward simulated artificial intelligence.

The NSA’s deals that give it a backdoor into our data services create data insecurity, because if the NSA can get in, so can the bad guys. Google keeps data about us (and has to turn it over when asked) because doing so benefits its business model — unlike libraries, which don’t keep patron records, precisely to protect privacy.

Big data means more than just a lot of data. It means we now have countless instruments for gathering it: cheap, ubiquitous cameras and microphones, GPS devices we carry with us, credit card records, and more. All of these streams feed into huge servers that can store the data, paired with powerful algorithms that can analyze it. Despite all of this, there is no policy governing these practices, and no real conversation about how best to manage them in light of their impact on personal privacy. There is no incentive to curb big data activities.

Scientists are generally trained to understand that correlation is not causation. Yet we seem happy enough to draw pictures with correlation and move on to the next one. With big data, it is far too easy to stop at correlation, and that is a potentially dangerous way of understanding human phenomena. We are autonomous people.

The panopticon was supposed to keep prisoners from misbehaving because they assumed they were always being watched. Foucault, writing in the 1970s, described the modern state as a panopticon. At this point, though, the metaphor doesn’t quite fit. What we have is a cryptopticon, because we aren’t allowed to know when we are being watched. And unlike the panopticon, it wants us to be on our worst behavior — unguarded behavior produces the most useful data. How can we inject transparency into this cryptopticon?

Those who know how to manipulate the system will, while those who don’t know how — or don’t know it is happening — will be negatively impacted. If bad credit can get you on the no-fly list, what else may be happening to people whose poor choices in one aspect of their lives affect other aspects in ways they never anticipated? There is no longer anonymity in our stupidity: everything we do, or nearly so, is online. The mistakes of teenagers will follow them into adulthood in ways we’ve never experienced before. Our inability to forget renders us incapable of looking at things in context.

Mo Data, Mo Problems

ER&L 2010: Opening Keynote – Librarians in the Wild: Thinking About Security, Privacy, and Digital Information

Speaker: Lance Hayden, Assistant Instructor, School of Information – University of Texas

He spent six years with the CIA, after that he attended the UT iSchool, which was followed by working with Cisco Systems on computer security issues. The team he works with does “ethical hacking” – companies hire them to break into their systems to find the holes that need to be filled so that the real bad guys can’t get in.

Many of us are not scared enough. We do things online that we wouldn’t do in the real world. We should be more aware of our digital surroundings and security.

In computer security, “the wild” refers to things that happen in the real world (as opposed to the lab). In cyberspace, the wild and civilization are not separate — they are co-located. Civilization means confidentiality, integrity, and availability. We assume our online communities are entirely civilized, but we are too trusting.

The point is, if you’re not careful about keeping your virtual houses secure, then you’re leaving yourself open to anyone coming in through the windows or the basement door you never lock.

Large herds attract big predators. As more people are connected to a network or virtual house, the motivation to hit it goes up. Part of why Macs seem more secure than Windows machines is because there is a higher ROI for attacking Windows due to the higher number of users. Hacking has gone from kids leaving graffiti to organized crime exploiting users.

Structures decay quickly. The online houses we have built are made of software that lives on real-world machines, and every day people are finding vulnerabilities they can exploit. Sometimes they tell the manufacturers/vendors, sometimes they don’t. We keep adding more to the infrastructure, which increases the possibility of exposure. The software and systems we use are not monolithic entities — they are constructed from millions of lines of code. Trying to find the mistake in a line of code is like trying to find a misplaced semicolon in War and Peace. It’s more complex than “XYZ program has a problem.”

Protective spells can backfire. Your protective programs and security systems need to be kept up to date, or they can work against you. Make sure that your magic is tight. Online shopping isn’t any less safe than shopping in person, because the vulnerabilities lie more in what the vendor keeps in their systems (which can be hacked) than in the connection itself. Your physical vendor holds the same information, often on computer systems that are just as hackable.

Knowledge is the best survival trait (or, ignorance can get you eaten). Passwords have been the bane of security professionals since the invention of the computer. When every single person in an institution uses a password that is a variation on the same template, it’s easy to hack. [side note: The Help Desk manager at MPOW recommends using a personalized template and just incrementing the number at the end at every required password change. D’oh!] The nature of passwords is that you can’t pick one that is completely secure. What you’re aiming for is a password secure enough to dissuade all but the most persistent attackers. Hayden suggests using phrases, replacing some characters with numbers, and making the password longer, because length increases the number of possibilities an attacker must try.
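A rough back-of-the-envelope sketch of why length matters so much (this is my illustration, not Hayden’s: it assumes a randomly chosen password and measures the brute-force search space in bits, where each extra bit doubles the attacker’s work):

```python
import math

def keyspace_bits(length, alphabet_size):
    """Bits of entropy for a randomly chosen password of the given
    length drawn from an alphabet of the given size."""
    return length * math.log2(alphabet_size)

# An 8-character password over all 94 printable ASCII symbols...
short_complex = keyspace_bits(8, 94)    # ~52.4 bits

# ...versus a 20-character passphrase over just lowercase letters
# and spaces (27 symbols) -- far easier to remember.
long_simple = keyspace_bits(20, 27)     # ~95.1 bits

print(round(short_complex, 1), round(long_simple, 1))
```

The longer, simpler passphrase has a vastly larger search space than the short “complex” password, which is the intuition behind Hayden’s advice to prefer phrases.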

Zuckerberg says that people don’t care about privacy anymore, so don’t blame Facebook — but to a certain extent, Facebook is responsible for changing those norms. Do companies like Google have any responsibility to protect your information? Hayden’s students assume that because Google gives them services for free, they have no claim to privacy, and in fact expect that Google will use their information however it wants.
