Why we care about Tech for Good and Consumer Empowerment
A few weeks ago I had the honor of speaking on a Tech for Good panel co-hosted by Enterprise Ireland and Deloitte Digital. It was entitled “Tech, Media, and Ethical Innovations” and also featured Facebook VR Product Designer Michelle Cortese and NewsWhip founder and CEO Paul Quigley.
On the panel, we spoke about the intersection of tech, humanity, and design; shared mistakes we think the tech industry has made; and discussed where we are today and where we hope to go in the future as builders, founders, and investors.
In the age of surveillance capitalism and privacy/data breaches, we believe doing the right thing as a business will be a competitive advantage in the next formative decade, a view both Michelle and Paul echoed on the panel.
Paul’s startup NewsWhip tracks the spread of information online, including fake news. The technology behind NewsWhip can predict what’s going to go viral in the next 24 hours. He spoke about the need for companies to do the right thing and think about the information and media diet consumers are ingesting every day.
Paul believes big tech companies can no longer be ingestion engines or agnostic distributors of content. They bear responsibility for the types of information that spread on their platforms and for their effects on society.
Michelle spoke about her research in the VR space and the negative impacts VR can have on victims of sexual assault. About 50 percent of women in VR have experienced some form of sexual harassment, and many assault survivors encounter triggers in VR spaces that mimic their assault.
Michelle has been designing for virtual respect and consent in VR spaces, and published a paper through Facebook Research with her coworker Andrea Zeller on how to design for safe spaces and respect. She was hopeful about the future of design ethics and responsibility as companies realize that safe spaces are important for women online and offline.
During the session, I shared the user empowerment thesis Samsung NEXT uses to inform how we think about investing in and building products, while learning from other panelists how they seek to course-correct the industry’s ethical problems.
How we think about consumer empowerment
One reason I joined Samsung NEXT is the values-driven work we do. We partner with, acquire, invest in, and are helping to build the next wave of companies enabling people to take back control of their time, attention, and data.
Our mission statement is centered around “championing builders and founders inventing a more purposeful future.” That means we not only squint into the future, but also are committed to leaving a positive legacy to the world.
Our consumer empowerment thesis, in particular, is founded on the core principles of humanizing technology, resisting the attention economy, and empowering users. Unlike other tech companies that are based on an advertising business model, Samsung primarily sells hardware. As a result, we aren’t in the business of harvesting your data, time, and attention.
Online advertising has brought tremendous growth and revenue to many big tech companies, but it also means that consumers’ attention and data are bundled and monetized. That business model brings with it the negative externality of optimizing for maximum time spent on each company’s platform.
I would like to believe those companies couldn’t have anticipated, 20 years ago, the current state of technology or the negative ramifications their business models would have on the human psyche, media, and politics.
But with numerous data breaches and increased government scrutiny, consumers are becoming more skeptical and sensitive about how their data is being used and manipulated for tech company gains.
Studies show that 78 percent of global citizens are concerned about their online privacy, and Gen Zers are growing increasingly concerned and outspoken about privacy and data issues. With the rising consciousness of surveillance capitalism and persuasive technology, tech companies now have a fiduciary duty and social responsibility to build more humane technology for their users.
We see this shift in consumer sentiment and consciousness as a great competitive opportunity to mindfully innovate and do the right thing as a business.
For example, we recently invested in a startup called Scroll that offers subscription-based, ad-free browsing of multiple publishers across the web. The service offers a better experience for users while also generating more revenues for publishers than they would receive in an ad-based business model. We believe backing companies like Scroll creates a win-win situation by growing online publisher revenues while also helping consumers and society at large.
I’m hopeful with the rise of VC funds like Attention Capital and nonprofits like the Center for Humane Technology that builders, founders, and investors are innovating on new business models that put the consumer first while making returns.
I also hope that as we move into the new year, more businesses see that being mission-driven and humane in their data privacy practices, harassment policies, and information dissemination will create better business models and returns.