
This is a story* of how people lost ownership of their data to corporations


*That makes sense from an information architecture and cognitive behavior point of view, and triangulated against what I saw first hand

Stories are what ground us in our human condition. They frame our world and help us develop a little more understanding. We fear strangers because of Little Red Riding Hood, and Brené Brown reminds us that “it’s a story we tell ourselves” as we misapprehend our social interactions. Effective marketing uses story to convey more information than the ad’s word count alone could carry.

Stories are a relatable framework to provide information.

Your mind, your decisions, and even your actions are predicated on the information you have available to leverage. All sorts of things go into the parsing, including (but not limited to) the physical mechanics of memory formation, our perception, and what information crosses our path. The cognition happens in instants, but it’s based on known and understood formulations. Complex, but swayable.

Part of all of our stories right now is a misuse of personal information.

What if we got here mostly by accident?

“Here” being where the-data-of-me is:

sold on the open market for others to combine into a full and nuanced picture of behavior;
- Search for “consumer data aggregators”; it’s not just Cambridge Analytica

used to lead by misinformation, targeted and served based on behavioral nuances;
- Cambridge Analytica and Facebook: The Scandal and the Fallout So Far, The New York Times, by Nicholas Confessore, April 4, 2018
- The Great Hack (documentary film), directed by Karim Amer and Jehane Noujaim. Netflix, 2019.

tacitly part of every sale and search and an extra layer of profit, which “can’t” be avoided if ethics begin and end at shareholder value;
- Sale of personal data, www.privacyaffairs.com
- Data is the new gold — how and why it is collected and sold, Usercentrics.com
- Consumer Data: Increasing Use Poses Risks to Privacy, GAO-22-106096, U.S. Government Accountability Office, published September 13, 2022
- Target sued by investor over backlash to LGBTQ merchandise, Reuters, by Jody Godoy, August 9, 2023 (I know, not direct — but if people will sue based on store displays they don’t agree with, using the argument of ‘shareholder value’, the argument for a known profit base is hard to ignore).

collapsed into dehumanized “training data” for an AI to digest and spurt out as a net-new “creation”, specifically in terms of our sweated-over outputs;
- Sarah Silverman sues Meta, OpenAI for copyright infringement, Reuters, by Jack Queen, July 10, 2023
- John Grisham, other top US authors sue OpenAI over copyrights, Reuters, by Blake Brittain, September 21, 2023
- The Times Sues OpenAI and Microsoft Over A.I. Use of Copyrighted Work, The New York Times, by Michael M. Grynbaum and Ryan Mac, December 27, 2023
- Generative AI Has a Visual Plagiarism Problem: Experiments with Midjourney and DALL-E 3 show a copyright minefield, IEEE Spectrum, by Gary Marcus and Reid Southen, January 6, 2024
- The Intercept, Raw Story and AlterNet sue OpenAI for copyright infringement, The Guardian, by Nick Robins-Early, February 28, 2024
- LinkedIn post with synopsis, opinion, and litigation paperwork, by Barry Scannell
- Techno-Optimist Manifesto by Marc Andreessen (you have to read between the lines and trace his core precepts)

while that same AI can function in an abstracted layer, easily sequestered/moated for profitability;
- OpenAI is set to hit $2 billion in revenue — and fast, Quartz.com, by Michelle Cheng, updated February 12, 2024
- How OpenAI Transitioned from a Nonprofit to a $29B For-Profit Company, hackernoon.com, by Chinecherem Nduka, March 27, 2023

seen as marketing, sales, HR, etc., data points rather than people;
- Code-Dependent: Pros and Cons of the Algorithm Age (specifically Theme 3: Humanity and human judgment are lost when data and predictive modeling become paramount), Pew Research Center, by Lee Rainie and Janna Anderson, February 8, 2017
- Weapons of Math Destruction by Cathy O’Neil, 2017; Penguin Books.

and that data impossibly optimized until HR, etc., search for a perfection that doesn’t exist.
- Persona (documentary film), directed by Tim Travers Hawkins, Mark Monroe. HBO Max, 2021.

This isn’t every way in which our data-of-me is being misused and abused, but these are some of the more problematic. Is our humanity being…lost? Set aside? Forgotten?

If we look at the outcomes, we could very easily decide that our lowest denominator must become our common denominator just to stay competitive. We could decide that competition is our species’ modus operandi, since what we do in the name of competition is what becomes the norm. Meta/Facebook became the financial powerhouse it is because they baked the use of divulged data (in the name of being social!!) deep and early into their business model and algorithms (Facebook privacy hearings, Facebook data collection); it’s easy/simple to decide that’s why everyone started capturing and using personal data. This continues, so more people are complicit in it every day. Right?

But how does that reconcile with the fact that, well, yeah, I’ve dealt with the dark triad personally (psychopath, Machiavellian, narcissist; read anything by Robert D. Hare to build understanding)? But…that’s not everyone. They are a vanishingly small percentage of our whole. They exist, really and truly; but they aren’t most of the people you or I will meet in our lives. Instead, what if their behaviors — the disassociation between other-harm-done and personal benefit, gaslighting, goal-post moving, just to name a few — have been embedded in systems that formed as an ongoing effort to solve the next problem?

In other words, what if it wasn’t a matter of mass dismissal of humanity-as-empathic-community-beings, but a system that grew until it was co-opted by a harmful few, who then set a new standard? Then it just becomes a problem to solve against. It becomes a point of acknowledgement, just like when we acknowledged that people murder, so we have to find a way to deal with people who murder and the nuances of the concept.

So here’s the story I built based on that what-if scenario, informed by my understanding of information architecture and human cognitive behavior, and triangulated against what I saw first hand.

It starts with people.

You know people. We have padlocks, and passwords, and we don’t share all our information with everyone, everywhere, all of the time — because of people. Just that little bit of friction is enough to remind most that, hey, your curiosity notwithstanding, this is my space. We build rapport before we share. We know where to go hunt down Bob before we give them a key to our tools. We understand Pat won’t kill us for our cow before we let them know where the cow (and ourselves) are located.

That little bit of friction is also a signal: if someone is taking too long at a locked space, the community is more likely to take notice. Accountability comes into play.

It’s a matter of boundaries and safety.

We don’t tell everyone everything. We share what is needed for the interaction at hand, and we keep some information sequestered.

Entire cultures form around the inclusion or exclusion of what counts as ‘standard’ information exchange. Names, representative images, glyphs, household addresses, religions, and more have been earmarked as expected to share, as sacrosanct, or as worth lying about (while still being considered a good person) in the effort to maintain safety. Our fascination with this is evident in shelves of ethnographic studies going back to the formation of anthropology as a discipline.

We walk our bodies and personality out in the world at large, and keep some information private to have safe spaces.

Think of it like holding data behind your back.

[Diagram: Two overlapping circles, one symbolizing the whole person and one a subset of personal data.]

Then we started exchanging specializations.

AKA, in today’s parlance, “doing business”.

But to do it without being undercut by someone who asked pertinent questions (it always takes less time to learn than to figure out), we learned to keep certain information…unavailable. And we’ve been assuming this as the standard for maintaining our specialization since at least the time of guild formation in Medieval Europe. (1)

Business, after all, starts with finding an object to productionalize or a service to perform that is above par.

In order to keep it performing better-than-average, the ‘smart’ business sequesters the relevant bits and pieces under Intellectual Property (IP) protection. It’s a form of forced scarcity: only we know how, so to get value you need to hire us.

Think of it like holding data behind their backs.

[Diagram: Two overlapping circles, one symbolizing the whole entity (corporation) and one a subset of Intellectual Property data. The entity is connected to a box representing the object/service of interest — what it intends to sell.]

For “business” to commence, a person needs to see a thing-for-sale

Somehow, some way, that special thing (controlled by IP!) needs to be put on display to spark interest.

It’s as old as specialization. It happens as regularly as buying a cup of coffee, picking up a ready-made pie, or consulting an expert.

Someone has to have their desire for a thing engaged, and the object/service needs to be findable. It would be fascinating to go down a rabbit hole of the history of marketing, but someone has done it really well (The Century of the Self, Adam Curtis, 2009), and it’s not a key bit for this story.

[Diagram: Building on the last: the person’s circles (whole and subset) with a dotted line to the object/service presented by a corporate entity.]

Communication ensues

The entity trying to sell something posits what it needs from the entity who wants to assume ownership of that something.

And here’s where the first big mental shift happens: money is just information.

Beyond the symbol (marker-filled clay spheres, cowrie shells, gold, paper, bytes, etc.), money is a time-spanning, resource-allocating, fungible agreement tool.(2)

That’s all. That’s it. It’s information put in symbolic form, attributing a fungible set of others, and shifting with the vagaries of interpersonal behavior. Interpersonal behavior is, itself, another information set (3) until it results in action: sharing, violence, movement, etc. Exchange.

Start adding in the complexity of the transfer of ownership, and the information can increase. Transferring money without cash needs a slew of additional information exchange: banks and account access, verification that you are the true owner of that repository, etc. Delivery needs an address. Warranties need a communication method.
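Since money itself is just information, the real weight of a transaction sits in the information wrapped around it. As a rough, hypothetical illustration (the field names below are invented for this sketch, not drawn from any real payment API), here is how much extra data a single cashless purchase can drag along compared with handing over coins:

```python
from dataclasses import dataclass

# Hypothetical sketch only: the "information payload" of one purchase.
# Field names are invented for illustration.

@dataclass
class CashPurchase:
    item: str
    price: float  # with cash in hand, this is nearly all the seller needs

@dataclass
class CashlessPurchase:
    item: str
    price: float
    buyer_name: str        # de-anonymizes the exchange
    card_number: str       # bank and account access
    identity_check: str    # verification that you own that repository
    delivery_address: str  # delivery needs an address
    warranty_contact: str  # warranties need a communication method

# The object changes hands once; the rest of this information, historically
# transitory, is what later gets captured and retained.
purchase = CashlessPurchase(
    item="ready-made pie",
    price=12.50,
    buyer_name="Pat Example",
    card_number="****-****-****-1234",
    identity_check="billing ZIP + CVV",
    delivery_address="123 Anywhere Lane",
    warranty_contact="pat@example.com",
)
```

Every field beyond the item and the price is part of the “slew of additional information exchange” described above.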

[Diagram: Building on the last: direct/bilateral communication between the whole entity and the whole person, with the entity saying it needs some of the data “held behind back” of the person.]

For generations — millennia? — information was shared and expected to be relatively transitory

The exception was the transfer of the money, and the transfer of the object/service. These were transferred in whole, with the new holder the decider of what happens to the money, object, or embedded knowledge of the object/service going forward.

All the other information? The transferred data was subject to the fungibility of memory, the effort of documentation, and the space required to store documentation.

Repeat business increased memory potential. Everything else was somewhat haphazard, prone to the vagaries of individual memory and environment.

Once the accountability was managed (delivery, taxes), the business could do what it wanted with the documentation — including destruction.(4)

Who wanted to keep reams of unnecessary paper, continually expanding?

[Diagram: Building on the last: communication shifts to transaction, interest becomes attainment, and the data requested is provided. The data request is assumed to be temporary. Accountability and forming reputation are added to the entity side, with human memory making the data capture temporary.]

Then IT happened.

Information technology automated documentation and reduced physical space needs, while requiring zero effort from functional personal memory

The transitory nature of the additional information — for delivery, for future communication on warrantied products, etc. — was made moot.

I can easily see a behavioral mental model of, “well, the reason to destroy the records (office/storage space) is mostly moot, so…keep them?”

[Diagram: Building on the last: the data request becomes permanent. Accountability and forming reputation are supplanted by entity-controlled personal data capture.]

But the buyer’s assumptions didn’t change

Built over the course of generations (millennia?), and with no overt reason to think otherwise…why would a business continue to know where I lived, unless something had sparked their forming a memory of me?

I was working UX problems when this was still a part of the deep thinking: we help them feel known, we help build trust, by making their information something they don’t have to enter a thousand times. That was the focus of every conversation I was exposed to around personal information, until I started getting into CRM use cases.

The core message was: we remember you, you special darling.

[Diagram: Building on the last: the data request is still experienced as temporary, but is functionally permanent.]

BUT the information requests surfaced earlier, and expanded

What MIGHT have started out as a way to de-anonymize transactions that were happening without social interactions became…greedy.

[Diagram: Focus on the personal information asked for a specific function, and how the inclusion has expanded beyond the immediate need to increase future conversion potential.]

Someone figured out they had all this contact information. Nomadic salesmen and traders had been a part of society (shifting aspect based on culture and tools) for millennia.(5) Now we had contact information in a readily parsable format, and a sense of acquaintanceship through direct, personalized information being included (your name, at least). It was no longer a stranger knocking on a door, but someone met-and-forgotten. Oops, let’s listen and smooth the social fumble.

This is the second major mental shift: our assumptions were no longer valid. We people have a tendency to think that what seems to work the same, based on the parts we can see, also works the same in the parts where we don’t have transparency. It’s easier to blame faulty memory than a paradigm shift. Believing the unseen portions have changed, especially without clear paper trails, is a short step away from conspiracy theory (critical thinking and skepticism are the bulwarks). All the evidence that most people see is inconclusive and easily refuted by our understanding and use of standard memory: not remembering when you had contact with someone, not noticing how many ads have been following your online pathways, or for how long. (6)

Until someone did the research (see the introduction for a starting point), or was involved in the formation of the databases and pathways, it was safest to stick to what was understood. Don’t want to be called a nut? Don’t think the world has changed when there’s no outward appearance of change.

And then someone else figured out that this was an information treasure trove. People would pay for this smoother introduction. Sales were made, profits logged, and businesses expanded. A few (ahem, Facebook, see introduction) even jumped the gap to “more behavioral information helps us target ads, target contacts, target messaging.” Which, yes, it would be a boon to cognitive load if the person receiving those ads controlled what they saw, which was part of the early spin that I remember. But that wasn’t the defining pathway. Now a complete stranger — with one-way insight and control — had all the information to present as a best buddy, trusted with and/or “happenstance replicating” your inner cognitive workings. Trust and recognition are powerful; they pave the way to accepting shared learning outside of someone’s core understanding — like trusting your parent with tax advice even though they work as a chef. It might be perfect, or it might be filled with holes; but the trust is the starting point, not critical thinking.

And because it was NOT happening within social interactions, there was no way for people to talk their way out of it — no way for the frustration to become obvious and expansive, tied conclusively to sales. There was no way for the sellers to understand, or to develop care, that people were frustrated and cynical with this increasing number of complete strangers saying, “Hi, [you]! Buy our stuff!” It became a numbers game instead of a process to change, because there was no negative feedback.

It worked enough, and the failures were never seen to contextualize it.

AND the data capture became untied from accountability and forming reputation

Think of it as a variation in Conway’s Law.

Conway’s Law states: [O]rganizations which design systems (in the broad sense used here) are constrained to produce designs which are copies of the communication structures of these organizations.

Or, restated: how you group information so that a finite, right-sized-skillset group of people can work through its complexities in finite, manageable chunks will affect how the information technology supporting it is constructed.

Because the whole captured dataset became so huge, with special workflow and skillset requirements for parts of it, and the accountability factors didn’t require all of the information, it (mostly hypothetically) got sectioned into multiple databases. Because only certain bits of personal information were regulated (e.g., SSN, credit card details), the bits that weren’t covered could easily be put in another database, with no-or-limited oversight, privacy, or confidentiality controls.
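To make that sectioning concrete, here is a small hypothetical sketch (invented table and field names, not any real company’s schema) of how the split tends to play out: the regulated fields go into one locked-down store, while everything else lands in a second store with far looser controls.

```python
import sqlite3

# Hypothetical illustration of the sectioning described above.
# Database and field names are invented for this sketch.

# Store 1: the regulated bits (e.g., SSN, card details). Access controls,
# audits, and encryption requirements attach here.
regulated = sqlite3.connect("regulated_pii.db")
regulated.execute("""
    CREATE TABLE IF NOT EXISTS payment_identity (
        customer_id INTEGER PRIMARY KEY,
        ssn TEXT,
        card_number TEXT
    )
""")

# Store 2: everything else. Name, address, behavior, inferences. Because
# these fields weren't covered by regulation, the same oversight rarely
# followed them here.
behavioral = sqlite3.connect("everything_else.db")
behavioral.execute("""
    CREATE TABLE IF NOT EXISTS customer_profile (
        customer_id INTEGER PRIMARY KEY,
        name TEXT,
        address TEXT,
        email TEXT,
        purchase_history TEXT,
        pages_viewed TEXT,
        inferred_interests TEXT
    )
""")

# Same person, two mental buckets: one handled as "sensitive", one treated
# as ordinary business data, free to be analyzed, shared, or sold.
```

The theory below is about what that physical and mental separation does to the ethics attached to the second store.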

Theory: once it was set aside into another location, with a different level of access and controls, the mental orientation toward the ethical relevance of the information also became fungible and disassociated from the ethics of accountability.

Rephrased: we didn’t have to care, so we didn’t.

[Diagram: Building on the last full diagram (two ago): accountability and forming reputation re-emerge as a separate data function, tied to the transaction.]

EVEN IF mentally the business still considered personal data tied to forming reputation, accountability expanded into its own workflow for transactions

Every entity involved in the transaction process, whether as an interested party (banks and credit cards) or as an overseer (the IRS, consumer protection), needed a little more data, compounded by every transaction.

Why? Because a select few individuals did really shitty things. Which is the same reason people didn’t share all their everything with everyone on first contact, way back when people started deciding not to share everything with everyone. It’s also the same need to manage egregious behavior, like murder: that it happened was irrefutable, and we wanted to mitigate its increase and continuance before it happened to me-and-mine. Talk about your beautiful cow and where it can be found, and the cow goes missing; sometimes even with life lost.

Except now all it took was one person (or one group of people working together) to do harm on a mass scale. In the time it took to steal one cow, an overwhelming amount of data about a ridiculously large number of people could be captured.

[Diagram: Building on the last diagram: accountability and forming reputation get pulled apart as separate functional databases.]

AND NOW the data privacy / confidentiality pieces are getting so egregiously out of whack with ethics that they’ve spawned their own need for accountability

HIPAA — 2003
GDPR — 2018
California Privacy Rights Act (CPRA) — 2023

Just to name a few of the increasing and expanding regulations. There are 22 countries and 15 US states currently enforcing, shepherding entities to deadlines, or likely to pass laws. The space is a moving target, with very opinionated lensing happening; one entity that tries to stay out of the opinion and capture how it’s moving is the IAPP (https://iapp.org/). To be fair, though: they are a privacy organization, and that’s an opinion as a starting point.

When entities select their short-sighted goals as more important than anything or anyone else, regardless of reason, we have to form group agreements and provide a framework for adherence and accountability. It’s that friction provided by a padlock, and a community taking notice that, hey, someone is overstepping, and accountability is in order. That’s what these regulations are doing, in reaction to unabated, “nope, we want this — look, it’s their fault the door is open, they are implicitly asking us to take it” when people tried to reason with them.

[Diagram: Building on the last diagram and the focus-state from three diagrams ago: personal data hardens to permanent in experience, and an extra line of data movement is added between the personal data capture and the standalone accountability function.]

In a way, it started with that sense of “protect IP” — of a sense that information is owned by the business, for it to use for profit making

We built out the capture-and-retention to humanize those technology-driven, faceless interactions;
Sundered from the empathy-building face of personal interaction by technology;
Combined with the knowledge that there are usurious others;
Tacitly separated from ethics in the formation of the storage solution;
The data itself became both the means to gather more sales and a thing whose sale can produce profit — two (profit) birds with one stone;
Creating an overall dehumanization of the data from the people it represents.

[Diagram: Same as the last diagram, with Intellectual Property (IP) data highlighted.]

A story of the behavior is ultimately just a window into balancing the effort; the behavior itself is harming many to benefit a few

The behavior is implicitly and explicitly dehumanizing as part of the construct of the information

The information has been visually and mentally separated from any potential empathy toward the individual.
The information has been visually and mentally separated from the transference of ethics by way of accountability oversight.
The use of the information has become a first-person solution (we profit, I produce more ROI/get raises/land “better” jobs) with no links to the consequences for the leveraged individual that the data represents — the actual first-person owner/life of the data.

To “do it right” means that the individual who the data represents has to have a say in the use of that data. That’s what the laws and regulations are aiming for, some more explicitly and to deeper degrees than others.

We want to be stewards of our own cognitions, decisions, behaviors, and actions. We want to have agency. Not only that, but we are expected to be held accountable for our every choice, set against the social fabric. You and I have potential social repercussions for who we love, what we do, and how we live regardless of the pathway to that choice and point of action.

The more our data is used to put others’ agendas in the path of individuals as a higher priority than individual agency, the harder stewardship of our lives becomes. Add the urgency of late-stage capitalism, the backwards meaning of work-life balance, and how many points of impact our everyday cognitive load has to navigate, and that stewardship cannot live at the same level of priority held by those trying to derail our agency for their benefit.

When taken as a whole, the outcome of all this personal data capture, retention, and use is to short-circuit thinking to sway others’ decisions towards a solipsistically preferred outcome — even if in the process the collateral damage is in sundering any realistic hope that free will exists, dismantling democracy, or making the planet uninhabitable for humans.

I do not believe that this was the intent of most of the captures; in quantity, by tool, at point of contact. It started — for most — in de-anonymizing an anonymous exchange, in trying to maintain a sense of connection and community. It took one person/company, seeing a broader opportunity without considering (or potentially caring about) the dehumanization and societal destruction it could support; and another to see their minority goals attainable; and another to try to keep up; and so on. Who/what that entity was, whether or not they are still functioning in society, is moot. The effects are present, and that is what we need to fix. We need to hold society, as built from individuals and their lived experience, as a higher priority than the continued profitability of usurious business models.

Technology is powerful. It is literally a point of mass impact: one process, understood and codified, can be leveraged to potential infinity (Web of Make Believe, Ron Howard and Brian Grazer, 2022, for various narrative instances of individuals using it as a tool for expanded impact; Weapons of Math Destruction, Cathy O’Neil, 2017, for algorithmic instances). It captures cognitive biases and mental models at the same time, applied without reflection as automation bias kicks in — the belief that the code knows better than you do, without taking that next step to understand that people wrote it, or to remember “garbage in, garbage out”.

This is about people, in all our quirks and inconsistencies; in our brilliance and shifting priorities and continuous learning; in our malfeasance and subtle misunderstandings that cascade and escalate into wars. People, functioning in our skewed priority that imaginary value (money) is important to the point of excluding all else; more important than the environment that supports our continued existence, or than the people who form our community in the past/present/future.

People are not perfect. Despite the gaslighting/goal-post-moving that is so prevalently used in our culture (as mimicked behavioral leverage points, not as evidence of dark triad — in my opinion), especially by those in power who wish to maintain it, perfection is not the starting point of our valid contribution to society. Breathing might be; I don’t know, I’m not perfect. But if we continue to allow dysfunctional processes to survive because we have not found something we can all agree is ‘more perfect’, we’re stuck in a cycle of escalation instead of change.

Notes

1. Keep working information unavailable. I turned to fiction and some rough history for this. Ken Follett’s Pillars of the Earth and his description of the guilds intrigued me enough (15, 20 years ago?) that I went spelunking. It wasn’t a deep or robust spelunking, but enough to believe the description was on track “enough”.
2. Money is a time-spanning, resource-allocating, fungible agreement tool. This is my summation based on scads of financial reading; I was designing in investment management for years, and read all the pieces and a good chunk of the sources referenced. Money: The True Story of a Made-Up Thing by Jacob Goldstein is a fabulous book that explores many of the underpinnings, with much more deep knowledge and evocative writing than I could pull together.
3. Interpersonal behavior as another information set. My core competencies are information architecture and cognitive behavior. They became my primary lenses to understand how to parse together UX design, which shifted to anything technology-and-digesting-information, and personally expanded to include our shifting realities. I am of the opinion that cognitive behavior is a complex information problem; interpersonal behaviors are simply (acknowledging that “simple” is only a surface) the butting-up of multiple cognitive behaviors. So: complexity, compounded by interaction, but ultimately predicated on information.
4. Destroying documentation once accountability was managed. My summation based on pre-computer work in retail and seeing the built-up receipts, hearing everyday frustration, and witnessing the joy when they ran across boxes old enough to destroy; navigating paper files and when we could send old files to be shredded; and the continuing boilerplate in LLC formation documents with 3-year retention as a starting point.
5. “Nomadic salesmen and traders had been a part of society (shifting aspect based on culture and tools) for millennia.” This is an odd parsing, but one made for elegance and getting to the point. One of my fascinations as a child was far-reaching spice trade, the Silk Road, gypsies, traveling/door-to-door salesmen, and even pirates. There is no way for me to back into what I consumed — stories, history — that can provide a short-version starting point of what went into this summation.
6. My theory of the second major mental shift, influencing our digestion of what information was being tracked/used. Cognitive biases are a fascinating subject, and include and are compounded by theories of memory, epistemology, and the development of trust. Look towards Steven Pinker, neurology textbooks, Plato, and Brené Brown as introductions.

This is a story* of how people lost ownership of their data to corporations was originally published in UX Collective on Medium.
