An Ethical Horizon for Artificial Intelligence (Part 1/4)

 

Self-governance leadership can improve the future of AI, if companies are brave enough to adopt ethical tools and new business models now.

PART 1

Artificial intelligence is mature enough for professional ethics, but legal and academic haggling could roll on for many years, as it has with privacy policy governance. We are in a world quick to fund and produce weaponized artificial intelligence. Commercial AI leads a quietly unchallenged data reign, relatively unfettered by ethical disciplines. While examples of poor ethical behavior involving AI abound, the consumer public can't necessarily afford to wait for policy wonks to emerge with a brand of consensus.

A significant percentage of the US academic community enamored with AI will continue to enable power differentials that actively harm human rights interests. If you leave subjective ethical preferences exclusively to academic AI developers, you may wait behind the political will of public grant funders. If you leave ethics to the companies who use and market AI, you might invoke consumer or market preferences, take your business elsewhere, and still feel the effects of encroachment.

The future looks bright when a business gets hold of conscious capital principles. For example, the health food market started out rough. It improved every two or three years with better-quality food sources, increasingly diverse options, and adaptation to culinary trends. Thirty years later, it poses significant competition to conventional market offerings, and conventional grocers now stock more health food due to consumer demand. Competition stemming from privacy limitations sharpens the understanding of what is and what can be. If you want more privacy in the market, you will have to create it, and the environment for it.

Privacy and security positioning shouldn't take 30 years, given the current levels of risk involved. You also don't have to wait long, because social and technical innovations are already present in the marketplace now. Smaller companies are in a great position to adopt flexible UI, security, and ethics principles from the ground floor; larger companies take much longer to retool their offerings. Loyal consumers should continue to speak up for what they want and affirm the right direction for privacy and security options.

The good news is that AI has reached a level of business maturity and adoption that justifies demand for ethical balances and corporate restraint. Corporate self-governance frameworks can expedite ethics as a deliverable, competitive offering to consumers now. There are de-identification tools and ethics proposals on the table all over the modernizing world, from thoughtful social innovators who want computing futures to succeed without harming consumers.
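To make "de-identification tools" concrete, here is a minimal sketch, in Python, of one common technique: pseudonymizing a record's direct identifiers with a keyed hash before the data enters an analytics pipeline. The field names and the choice of HMAC-SHA256 are illustrative assumptions, not a description of any particular vendor's product.

```python
import hashlib
import hmac
import os

# Secret salt kept separate from the data store; without it, the
# pseudonyms can't be reversed by a simple dictionary attack.
SALT = os.urandom(32)

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a truncated keyed hash."""
    return hmac.new(SALT, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def deidentify(record: dict) -> dict:
    """Pseudonymize fields that directly identify a person (hypothetical field names)."""
    cleaned = dict(record)
    for field in ("name", "email", "ip_address"):
        if field in cleaned:
            cleaned[field] = pseudonymize(cleaned[field])
    return cleaned

record = {"name": "Jane Doe", "email": "jane@example.com", "purchases": 3}
safe = deidentify(record)  # identifiers replaced, analytics fields intact
```

The point of the sketch: the business keeps the analytically useful fields while the identifying ones become stable but meaningless tokens, which is exactly the kind of restraint a self-governance framework can make a deliverable.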

The span of concerns over harm is proportional to AI's ubiquitous presence in the marketplace. Big Data (machine learning), the Internet of Things (IoT), and drone robotics are examples of AI innovation that can conflict with human interests. Social innovation can help manage need in key areas flagged for ethical safeguards: bias, fair information practice, and proprietary rights with accountability.

I will examine each of these areas for social innovation in the coming days.

COMING NEXT.. An Ethical Horizon for Artificial Intelligence, Bias (Part 2/4)

4 Damaging Illusions to Consumer Self-Protection Online

 

The Internet is a creative, user-endorsed environment supporting information exchanges for every purpose known to man.  So what is it about Internet use that could be so self-defeating when it comes to consumer privacy?

There are a few well-laid deceptions in the marketplace keeping the Internet hostile to personal privacy.

“The Internet is free."

Have you ever stopped to ask yourself how the Internet can be a multi-billion-dollar business and still be free for so many to use? The truth is that the Internet is not 'free'. Nothing in life is free. There are costs.

How the Internet pays for free-to-you services starts with online beacons, which track, trace, and evaluate your traffic and identity. This is usually your home or work IP address via your Internet Service Provider. After a while you leave a distinct 'footprint' online, which many marketing algorithms then compare with one another. This all takes place hundreds or thousands of miles away from most online consumers, at server farms and data brokerage firms. The data firms keep tabs on any information you volunteer to the "free" service: age, sexual preferences, when you have free time, whether you're working at work or unemployed, what kind of car you drive, and so on.
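The beacon mechanism itself is simple enough to sketch. Below is a hedged, simplified Python illustration of what a tracking-pixel endpoint does: log who asked and return an invisible 1x1 image. The IP, user agent, and page path are made-up example values, and real brokers write to databases, not lists.

```python
import datetime

# The smallest valid transparent GIF: the actual payload a "web beacon" serves.
PIXEL_GIF = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
             b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
             b"\x00\x00\x02\x02D\x01\x00;")

tracking_log = []  # stand-in for a broker's database

def serve_beacon(ip: str, user_agent: str, page: str) -> bytes:
    """Answer a beacon request: record the visitor, return the invisible pixel."""
    tracking_log.append({
        "ip": ip,                  # ties the hit to a household or office
        "user_agent": user_agent,  # part of the browser 'footprint'
        "page": page,              # which article or product was viewed
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return PIXEL_GIF

# One page view, as the ad server sees it (example values):
body = serve_beacon("203.0.113.7", "Mozilla/5.0", "/articles/health-tips")
```

The visitor sees nothing; the server side accumulates a timestamped trail of IPs and page views, which is the raw material of the 'footprint' described above.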

Then the firms sell it to whomever is buying. That is how Internet companies pay each other millions to stay in business while you use a "free" account. You are the product they are selling.

“My personal information is protected by US law.”

Test this unfortunate half-truth. If the government can hack you and never suffer consequences, or a corporation can help themselves to your contacts for years without consequences, are you being protected by the law?

There is a wide variety of laws, but no holistic federal law to protect all consumer data and personal information. Protected areas of consumer privacy are scattered through a variety of policy areas: health, employment, driver protections, data breach notification. Protections also vary from state to state in the US, but again, there is no holistic coverage. There's just a sense of policy running scared from your serial outrage in the capitols of our country.

Some countries and continents have a national consumer data protection policy or law. In the US, it varies from agency to agency. So privacy protection across the US federal government and its member states remains as spotty as a Jackson Pollock painting.

The best fix for this problem is a fearless examination of state and federal privacy laws covering the areas you are most concerned about. You can do a casual search online or visit your local law library. The more informed you are, the better decisions you will make about who you trust with your information.

“I am owed whatever I can get from the Internet.”

Nothing sets you up for failed privacy results more than presuming that someone else's server farm, computing code, and the staff hired to market and manage your transactional information are beholden to you. If the Internet's architects can fool you into believing that the space you rented with the currency of your data is actually yours, you are deceived.

This illusion is typically dispelled by being booted off or banned by an online moderator. Some users have campaigned to claim a stake in an online company's space because they are avid users. They are likely presented with a document, created by a very well-paid army of lawyers, explaining that the information they fed into the system actually belongs to the company because of an End User License Agreement. That would be the biggest deception of all.

The only thing you own in the cloud is your information and your data. That never changes. If you want to change the balance of user power, you have to stop feeding the beast the data it needs to thrive.

“I erased my data.”

There's a saying in the privacy field that 'data never dies'. It is somewhat true. Forensics teams use the same tactics corporate data-recovery pros use after, say, a storm surge knocks out computer networks. That's great news if involuntary data loss would ruin your business or create financial havoc. However, if you want to scrub personal information from the online universe, you will need to visit a different kind of reputation specialist, like Privacy Duck.

These service specialists address a unique data brokerage and reputation stratum called People Finders, which licenses personally identifiable information. People Finders sell your address, location, work, age, and contact information to anyone for any price. An even less legitimate version of this takes place on the dark web, selling to online criminals.

THE BOTTOM LINE

If you want to better protect your personal information, adopt a consumer privacy regimen for your household. You are always the best gate-keeper of what goes in and what comes out of every information portal of your life.  Digital privacy is a new consumer discipline.  However, it is having increasingly great & powerful results coaching the market to regard your privacy.

You can be the next person in line to demand anonymized data ecosystems like PDDP, HTTPS, increasingly secure communication, encryption, and ad blockers. If you already use services like Ghostery, Mozilla private browsing, and anonymizing search engines like DuckDuckGo, you are on the path to reorganizing an online currency system.

Online businesses continue to put your privacy on the sacrificial altar when they don't have to. Your end of the business agreement needs to require privacy by design, warrants for your data, and anonymization of the data they use in marketing exchanges.

Demand better protections. They are technically within reach.

When IMSI became Mass Surveillance

This is an infographic I sent to Harper's Magazine as a query; it was thrown into "that hole" and never heard from again. It took quite a bit of time, because Tim Cushing and I spent a couple of hours pulling this information up from the dusty archives of our memory. We had to scrap plans to co-author an article.

Unfortunately, this is all I have to show for our never-happened article, "Sunset at Midnight: How law enforcement use of cell site simulators numbered the days of the Patriot Act".


When privacy apologetics are like 'vegan leather'

What is vegan leather? 'Vegan leather' is a term of pretense for a leather-like product made from vegan materials. The label presumes no animals were harmed in the making of the product. Yet in some stores, you can purchase 'vegan leather' that is a dead animal hide dyed in all-natural vegetable dyes made from plants. The leather is not vegan, but the plant dyes are 100% vegan.

Real vegans typically won't buy 'vegan leather'. They'll buy belts and shoes made from felt, rubber, canvas, and vinyl. Fake leather products are not usually labeled 'vegan leather'; they have labels or tags detailing the nylon or other synthetic materials. However, you'll never know what kind of 'vegan leather' you might be dealing with unless you investigate further.

'Vegan leather' can be a misleading marketing term for ignorant and/or superficial crowds who will buy things to appear more 'conscious' rather than actually being more conscious. What would motivate someone to buy a product that openly exhibits misappropriated ethics? Whoever they are, they feel compelled to camouflage themselves among those with high ethical standards, so they can witness something they'll never be committed to doing unless the standard hits critical mass. If someone is buying vegan leather, the ethical numbers have these actors on the defensive.

So how are privacy apologetics like vegan leather?

Before I say anything: I respect the efforts of all privacy proponents when they are actually being proactive about data ownership and using ethical privacy UX development practices.

However, there's a wide gulf between the professional practice of "user privacy principles" and the real-time market practice of privacy. That's why you see all the news drama and color between the license-and-spreadsheet fire sales of PII and an employee-caused breach leading to civil liability. The truth is somewhere between Privacy by Design and hasn't-gotten-caught-by-the-FTC.

For instance, it may feel counterintuitive to ask an institution like the NSA to adopt basic privacy principles, but it isn't. If the NSA, or any other mass surveillance aperture, is collecting PII and diverse sensitive personal information, it is responsible for protecting that information. Every other business and institution on the planet has to regard personal data rights or face civil liability. They must comply with the laws that protect data owners, just like the Big Data 4: Google, Microsoft, Facebook, and Palantir.

"BEWARE THE API"

The Big Data 4 are also the face of corporate, or privatized, mass surveillance (see: PRISM and the Snowden leaks). They still hunt and gather for global intelligence authorities, depending on the purchase (or legal) order from mass surveillance authorities on any given day of the week.

Do they regard privacy? The sober answer is, "When their lawyers say so." They face federal regulatory conventions that place fetters on their ability to completely disregard user privacy. The difference between them and a hacker who breaks in to steal your information is a 15-page Terms of Service agreement, which rationalizes your consent to trade use of your datasets in exchange for an account or use of their "free" service.

It has turned out to be more of a Faustian bargain.

So when Facebook and Palantir, both data intelligence gatherers and In-Q-Tel startups who own large parcels of Palo Alto real estate, put on a privacy conference in Sweden, it does not look like authentic privacy standardization at work. By another label, I would call it the privatized hearts-and-minds Swedish massage package, a complimentary consolation prize for the sunken US Safe Harbor conventions. Safe Harbor was a years-long triumph in privacy apologetics, mourned by people who really don't care about authentic global privacy conventions. I would call this occurrence a case study in gross privacy apologetics rather than professional privacy pragmatism.

I did think, "Oh this is just 'vegan leather' for Euros who 'lost' something in Safe Harbor."

I can assure you Palantir's rendition of 'vegan leather' won't hold a candle to Privacy By Design. Not even close.


EU's Safe Harbor Invalidation means 'You are free to do better.'

Europe’s privacy offices are now empowered to do more than look the other way at compromised data transfers.

Data transfer practices at US companies have a reputedly poor standard compared with the rest of the globe, particularly the developed countries of the EU. The plain sense of the Safe Harbor agreement was to create a protected data pipeline to and from countries across the Atlantic. Unfortunately, actions conducted by US and UK intelligence authorities victimized Europe's data partnerships, compromising the intent and integrity of the Safe Harbor agreement. This led to an Irish-based uprising that successfully invalidated a law providing no useful protections for data transfers. The ruling will impact the way e-commerce is conducted almost immediately.

Privacy officers are scrambling to gird themselves under Article 25 of EU privacy law. They are throwing out 'model contracts' as life preservers now that Safe Harbor has been dumped overboard. They are consulting each other on a would-be Safe Harbor 2.0: binding contract resolutions, deferring to the standards of third-party countries (Switzerland), and auditing existing data transfer priorities to produce a legal, viable alternative that lets commerce and trading continue. Some are even still standing on the sinking ship saying, "You may still honor the Safe Harbor stand... BLUB, BLUB...!"

There is enormous potential for good to come from an upset that privacy scholar Daniel Solove attributed to the "cavalier attitudes" of US governance toward EU data protections. The legal privacy vacuum opened up by the Safe Harbor invalidation can now be filled with far better standards for the human rights of computer users in Europe.

The EU has initiated an atmosphere of conditional embargo, with some potential for US-EU commerce, based on practice that has failed to protect consumers. Unlawful smash-and-grabs of non-criminal data, conducted under US laws by the Five Eyes/ECHELON group, violate computer users everywhere. The EU now has an opportunity to impose standardized consumer data protections with real teeth on countries in violation of UNHRC privacy rights. It has cause to cease business relations with any country that doesn't honor its agreements and violates the human rights of its citizens. While no country wants to pause commercial relations for long, the standards erected now could influence the way US companies collect and distribute data in a global economy that respects privacy.

There are terrific, diverse solutions for higher global privacy standards. Almost instantly, the Snowden Treaty became a relevant go-to for trade reform standard discussions. This suggests trade relations would not be harmed or frozen indefinitely over government spying, if companies assume socially responsible protections that do far better than existing law and governance policy. Privacy officials can now bring their most ethical and user-friendly solutions for data management to the table, reforming conventional business practices that even the laziest and most apathetic corporate counsels would be forced to conform to. US businesses may see the new data protections as legal relief, a reason to require company-wide adoption of encryption for all of their consumer and company products. Socially responsible privacy practice has legal means to flourish now that the Safe Harbor falsehood has fallen apart. Finally, they are free to do better.


Reports have indicated that at least 4,500 US companies will be impacted by the Safe Harbor ruling. US businesses will suffer somewhat while Safe Harbor 2.0 scaffolds are drafted, but it's really for the best.

Businesses should assume more risk for privacy

Company privacy policy can do more for the future of a business if companies forge ahead on the curve of socially responsible consumer offerings.

So much of privacy practice centers on the case for threats: threat evaluation, legal risk mitigation and management, civil liability insurance in case of a breach and information security to ward off a breach. Everything orbiting company privacy policy seems to be on some sort of fire plan for businesses.  There are reasons for that.

Most businesses aren't on the cultural advance towards privacy. They approach digital privacy primitively: they throw a stick at it. They try to make sure they don't lose any business due to government rules. The matter gets so complicated so quickly that many of them delegate the whole thing to a lawyer. The lawyers hired do what is lawfully required of a business so it doesn't get sued by consumers or sanctioned by the government. That doesn't always pair with the interests of the consumer, who is left in the cold by data fire sales and third-party information marketing.

Contemporary governments hold a double standard toward corporate privacy. They are an opaque partner who wants exclusive views into business operations and their customers, often without a warrant. Governments give themselves legal means to coerce a business into providing clandestine access to the consumers who trust it with their information. The lawfulness of government orders is subject to ongoing debate. Most businesses do not want to rock the boat by denying access requests from the government. So businesses have to balance what may be legal against whether government requests are socially responsible.

Governments treat everyone like persons of interest, making little distinction between criminal and non-criminal for targeted surveillance. They approach the best and most productive companies in the US and the world to demand that they insert surveillance capability to watch their customers. That's not in the business plan for most businesses. It's certainly not in the interests of continuing a business led by consumer trust.

Consumers have different needs now. What consumers are left with, in privacy terms, is a totally unsatisfying and treacherous experience. Consumers face diverse privacy perils if they adopt a new online service or work with businesses that don't have a socially responsible privacy practice. They don't owe their business to anyone who will trade away their privacy to cushion a bottom line.

Consumers are going to do what's best for them. They will look for companies with good intentions and great information security practices. The relief experienced by a consumer who has the option of adopting privacy ware is immense. For instance, a mobile device company that offers a great user interface is a good candidate for strategic partnerships with privacy ware developers and app store offerings; it already has a department that invests in a great user experience. It's not illegal to produce privacy ware or encryption, and selling or offering it to privacy-concerned consumers opens a new market. So why not offer more privacy provisions to customers and pass along the costs of trying something new?

On the cutting edge of privacy

Privacy-led development can give businesses a new edge in markets that left consumer privacy behind. You can make privacy applications more diverse, invest in practical research (like active penetration testing and hackathons), and produce an offering that so many companies need. However, many data-driven businesses will never see that as an opportunity. They only see a rival.

It may be that a business simply has a nasty attitude towards privacy compliance due to internal conflicts within branches of its business model. Companies have competed in data sales since the '80s as a buffer against harder times. Many have also settled comfortably into close relations with governments, developing contracts for products and services to serve their needs. Providing non-government customers with products that curtail government access is seen as a conflict of interest. These companies have in fact become combative with privacy advocates seeking better privacy service offerings for consumers. After taking the hard road, it would be difficult for them to conceive of the pro-sumer privacy offering. It may also be their saving grace, especially if their businesses thrive on meaningful innovations.

Business edge is about giving the customer a competitive option. They can go to a privacy-shirking business and try to get their needs met, or they can go to someone else who has a real offering for them.

Businesses need an intelligent approach to privacy that works within the marketplace. It begins with a brave internal audit of privacy practices against the standards of the pro-privacy consumer. That means recruiting people with the right networks to help organize research and business partnerships with vendors who tailor technology suites for your customer base. Businesses that take bold, intrepid steps to develop an internal consumer privacy practice, adopted in stages, will validate consumer trust and brand loyalty.

It may be time to allow a new privacy policy to create your loyalty leaders for the future.

-Sheila Dean

For more information about privacy logistics and social responsibility planning, please contact me for an introductory consultancy meeting.