Technologies of togetherness: Shaping an equitable future with AI

Event
June 19, 2025

From innovation to inclusion: AI’s potential for social good   

By Mahmoud Shabeeb

As artificial intelligence (AI) rapidly transforms every facet of our lives, the urgent question is not just what these technologies can do, but how they can serve all of us, equitably and responsibly. At this year’s Big Thinking panel, hosted by the Federation for the Humanities and Social Sciences, experts from diverse sectors came together to explore how we can mobilize technology to foster collective well-being, democratize decision-making, and ensure that innovation is truly inclusive in practice, not just as a buzzword. 

Data as a social contract  

Heather Krause, a leading data scientist and advocate for data equity, suggests reframing data collection as a social contract. “There is ownership. There is control. There is a price involved in the data relationship every time,” Krause explained, emphasizing that “every act of data sharing involves choices for both the giver and the receiver.” She argued that this contract is often overlooked in the rush to gather and analyze data, especially when institutions collect information from communities on the promise of positive change.  

Data, between harming and helping  

Moderator Ryan Morrison raised a critical point: “Data can be harmful, especially to equity-deserving groups.” This harm is not always intentional. As Krause noted, “To use technology and data to attack people is actually incredibly easy, but the problem we’re talking about today is unintentional, or less intentional, harm.” The panelists agreed that even the best statistical analyses can inadvertently replicate and scale existing biases (racism, classism, sexism) if we’re not vigilant about the choices embedded in our algorithms.  

Building solutions from the ground up  

Debra Lam, founding executive director of the Partnership for Inclusive Innovation, offered a pragmatic approach to technology and community research. “We start with the problem first, whether it’s housing, energy, or whatever the problem is. We don’t go in with one generic solution. Instead, we try to understand the problem, collect the data, analyze it, and work from there.” Her team’s work in the state of Georgia demonstrates that inclusive innovation means resisting “silver bullet” technologies and instead co-creating solutions with communities, recognizing that no dataset or algorithm can capture the full complexity of human experience.  

Lam also highlighted how AI can be a force for good when thoughtfully applied, sharing an example from a partnership with Kansas State University. In this project, AI streamlines access to mental health support during crises, quickly directing callers to the right resources so that human expertise can intervene where it matters most. “AI is very useful for low-level things so you can get to people faster, so they can get deeper into it,” Lam explained. This approach aligns with the UN's recommendations on the ethics of AI.  

Shared leadership and collective responsibility  

A recurring theme throughout the panel was the need for shared leadership and collective responsibility in shaping the digital future. Krause challenged the audience to recognize that the pace and direction of AI development are not inevitable, but the result of choices made by a privileged few. “The development of AI does not have to go at the pace that we are experiencing right now. That is a choice… being made by a very, very small group of very wealthy, privileged people,” she said, underscoring the importance of democratizing not just data, but also the governance and design of technology itself.  

Looking forward: Technologies for togetherness  

The panelists agreed that the promise of AI and digital innovation lies not in their novelty, but in their capacity to bring people together, if we design them with intention, transparency, and equity at the core. As Lam put it, “When we talk about career readiness or the advancement of AI or technology, it sounds like a great equalizer, but we have to think about the people first.”  

To build a just digital era, we must treat data as a social contract, remain vigilant against bias, and prioritize community-led solutions. The future of technology—and of togetherness—depends on it.  

For more on these themes, explore Data & Society’s work on data equality and justice.