All-woman tech panel warns of inadequate online legal protections

OURSA, set up as alternative to RSA conference, says emerging technologies can be used for surveillance of users

Poor security and inadequate legal protections enable emerging technologies and online platforms to be used for surveillance and other attacks on users, according to a speaker panel at OURSA, a women-run, one-day alternative to the RSA security conference in San Francisco.

OURSA was set up to run in a venue adjacent to the RSA conference after RSA controversially listed only a single woman, Monica Lewinsky, in its initial lineup of keynote speakers.

The all-woman panel on the security and ethics of emerging technologies was moderated by journalist and Recode co-founder Kara Swisher, who said she believed “the lack of diversity and ethics in a lot of [technology] companies” had led to recent scandals involving user data breaches.

“Tech ethics involves balancing moral, technical and practical factors,” University of Washington professor Anna Lauren Hoffman told the audience.

“These help inform and shape our decisions. Within each of these corners of the triangle there are also little decisions we have to make and assumptions we make about the world and who occupies it,” she said.

Seemingly small choices, such as whether to offer a transgender identity option on dating software or airport body scanning security software, could prevent people being socially and physically harassed and assaulted, she said.

Jennifer Granick, surveillance and cybersecurity counsel at the American Civil Liberties Union (ACLU), noted that “increasingly, modern surveillance is mass surveillance” which can be facilitated by new technologies and the internet.

Secretive, large-scale surveillance differs from warrant-directed searches in the volume and depth of data collected, and could be abetted by the ease of converting in-home appliances with microphones and cameras into “surveillance machines”, she said.

In addition, she noted several other problems: the possible falsification of data, which a victim cannot counter because people do not have to be notified that they have been surveilled; the government acting as an attacker rather than a defender; the state “losing” exploits, such as the National Security Agency (NSA) tools stolen by hackers and used in the WannaCry attack; the state “importing” software vulnerabilities without informing product vendors and manufacturers; and an overall lack of public trust.

“As lawyers, our approach is to try to legislate out of [these problems],” Ms Granick said, but she believed lawmakers and lawyers did not have adequate tools to deal with the scale and secrecy of the problem.

Nicole Ozer, the technology and civil liberties director of ACLU California, said companies like Facebook and other social media networks have created platforms for government surveillance.

Some of it isn’t even particularly hidden, she said, noting that police departments have used surveillance software tools such as Geofeedia and Media Sonar to monitor activists of colour by tracking hashtags such as #blacklivesmatter.

After this was revealed, Facebook, Twitter and Instagram tightened their policies against surveillance uses of their platforms – Facebook banned Media Sonar over a year ago – but did not address which types of data were available to third-party applications or whether access to that data could be controlled, she said.

This failure had enabled the Department of Homeland Security (DHS) to bring in a policy to surveil immigrants and anyone who interacts with them on social media. The DHS has explicitly encouraged vendors of security products to exploit publicly available information, she said.

Although Mark Zuckerberg told Congress last week that users “had complete control” over their user data, “the fact is, that is simply not true,” she said. The public availability of data such as Facebook profile pictures, cover photos, gender and lists of networks “has really dire consequences with respect to the state of our social and political life”.

Better and safer code in applications would help address some of these problems, but developers keep repeating the same mistakes, said Ashley Tolbert, a cybersecurity engineer and researcher at Stanford University’s SLAC National Accelerator Laboratory. Citing a 2017 security report by Veracode, she said: “We’re seeing the same coding errors over and over.”

The DHS said in 2015 that 90 per cent of security incidents involved exploits against defects in software, she noted.

“What’s stopping us from thinking about security in our code and giving developers proper security training?”

Many coding schools don’t list security on their syllabus at all, she said, and when it is there, it’s often an option, not a requirement.

“There are also misconceptions, such as that security costs more and slows down development. But I’d say fixing your reputation costs a lot more.”

Thanks to a new technical development known as “precision farming”, farmers are emerging as a perhaps unexpected group at the heart of many of these ethical and security issues, said Rian Wanstreet, a PhD student at the University of Washington.

Increasingly, farming machinery is carefully monitored and controlled on farms, sometimes remotely, she said.

Agriculture leads in some of the areas only now coming into public awareness, such as automated, driverless vehicles, she said, noting that John Deere not only has a driverless tractor, but is “the largest purveyor of automated vehicles on the planet”.

Precision agriculture lets farmers take data from historical records, satellite images, and sensors all over the farm to create specialised plans for variables such as how much fertiliser to use on a field, where to plant, or where to deploy robots, including vegetable and fruit picker bots and “hortibots” that incorporate lasers and flamethrowers for weeding.

With the precision agriculture market predicted to grow from $3 billion in 2016 to $19 billion by 2021, technology companies like Microsoft and IBM are joining traditional agricultural vendors like Deere, she said.

But this rapid growth and expansion is leading to two big issues: data protection and cybersecurity.

“Farmers really want to know, how could this data possibly be used against me? Could the government, a landlord, or neighbours get access?”

She said there’s also a lack of transparency about how algorithms used for precision agriculture are structured. And, she said, “farmers are asking, why should others make money off my data?”

While these are all issues, Wanstreet said she is also worried about the “entire infrastructure. Could actors bring down part of the US annual harvest?” The FBI is concerned about the growth of this sector, she said, and recently sent out a warning that farmers could be vulnerable to ransomware attacks.

“The open source community thinks the correct solution is to move away from proprietary software to open source tools,” Wanstreet said, but noted that one of the biggest problems with such a plan is the sheer scale.

One proposed solution for giving farmers better control of their data is to create neutral bodies to manage and let farmers sell and control their data, separate from the proprietary system manufacturers.

“The issues we talk about today – surveillance capitalism, surveillance platforms – they are becoming as prevalent on farms as they are on phones,” she concluded.

Karlin Lillington, a contributor to The Irish Times, writes about technology