
Lumen Lede

How data privacy builds trust in the age of AI
January 23, 2024

In 2023, artificial intelligence, or “AI,” became a household term, sparking both curiosity and anxiety among people trying to understand its implications. While public fears of deepfakes, inherent biases, job losses, and blatant misuse of AI have been well documented, AI is also a tool that can benefit society.

Defeat the fear

I’ve seen firsthand how AI can be used for good or bad, depending on who wields it and how. For businesses looking to lower costs and increase productivity, AI can be very powerful—accelerating innovation to deliver what customers want faster, better, and cheaper.

However, businesses need to build trust among employees and customers that they’re using AI appropriately. A lack of transparency erodes trust, and “moving fast and breaking things” is not responsible. That said, businesses that delay AI adoption or fail to implement AI solutions successfully will find themselves at a competitive disadvantage. Enter data privacy, an essential ingredient in fulfilling the promise of AI and making people feel more comfortable with it.

Data is essential to the successful use of AI. AI systems rely on data to learn, grow, and deliver value. Often that data includes personal information: information about identified or identifiable individuals. All this data—your data—reflects who you are, what you like, what you do, and who you know. It’s easy to see why people are nervous about AI.

Data privacy practices ensure the proper use of your personal information while building trust in this new and constantly evolving technology. Let’s dig into what that should look like as we celebrate National Data Privacy Day on Jan. 28.

  • Data transparency tells the whole story on how your information is used. People can trust AI systems more if those systems tell them what they do with their data, where they keep it, and how it’s used. That means adopting transparent data privacy policies and letting people choose what happens to their data.
  • Data protection keeps your information safe. In addition to data transparency, we also need to keep everyone’s data safe and secure. That means having strong data privacy policies and cybersecurity protections to stop anyone—especially bad actors—from accessing and using your data without permission.
  • Ethical use of data means fair and honest application of information. If we want people to trust AI systems, we have to use the data behind them in a responsible and fair way that respects fundamental values and rights. That means setting and following rules for how we use AI systems, and being honest about what we do with people’s information and stored data.

Lumen has been an early and responsible adopter of AI. It informs our work on many levels. More than a productivity tool, it’s a differentiator for the customers we serve.

We understand the importance of building confidence and trust in how AI functions, and of addressing areas such as bias, ethics, security, privacy, and compliance head-on. Those are some of the reasons we created a Trust Center.

We know data privacy is a vital component to building trust in AI systems. As we continue to understand the limitations and realize the promise of this new data-driven technology, we must also empower companies and consumers to take charge of their data.


AUTHOR

Hugo Teufel

As Vice President, Deputy General Counsel, and Chief Privacy Officer, Hugo leads the legal aspects of cybersecurity and the legal and operational aspects of privacy at Lumen. He has more than 20 years of experience driving highly effective compliance programs for global companies and federal agencies.