16: The Case for the Non-profit Tech Company with Rubén Mancha

Seth speaks with Dr. Rubén Mancha about developing more ethical business models and about the potential for a non-profit technology company. Rubén is an assistant professor of information technology at Babson College and an advisor to entrepreneurs and organizations. Rubén researches the social impact of digital technologies and digital business models, their potential for good, and the responsible digital transformation of businesses.  He also acts as one of the advisors for the podcast. 

The key questions for this episode are: 1) Why is data valuable to businesses? 2) How can we best negotiate data sharing, data capture, and data privacy? 3) Can non-profit technology companies succeed in the current marketplace?

Episode Transcript

Seth Villegas 0:05
Welcome to the DigEthix podcast. My name is Seth Villegas. I’m a PhD candidate at Boston University working on the philosophical ethics of emerging and experimental technologies. Here on the podcast we talk to scholars and industry experts about pressing issues in data, business, technology and society from an ethical perspective. On this episode, I have the pleasure of speaking with Dr. Rubén Mancha. Rubén is an Assistant Professor of Information Technology at Babson College and an advisor to entrepreneurs and organizations. Rubén researches the social impact of digital technologies and digital business models, their potential for good, and the responsible digital transformation of businesses. He also acts as one of the advisors for this podcast. I’m personally grateful for all the insights that he’s given me. Before getting to my conversation with Rubén, I want to sketch out some important ideas that will come up in our interview through the following questions. First, why is data valuable to businesses? It is obvious that there are many technology companies that are able to profit because of how they’re able to capture and sell data. This leads to a second question: how can we best negotiate data sharing, data capture, and data privacy? As Rubén will point out, it would seem that as a society, we’re starting to understand more and more the role of data in modern business practices. But only once we understand how and why data is valuable can we start to understand what position we’re actually negotiating from. The notion that these services are free misses just how valuable data actually is. The final question I would like for you to consider is this: is it possible for nonprofit technology companies to succeed in the current marketplace? This is one of the main ideas that Rubén will be putting forth. However, there are some notable challenges to nonprofits, especially in terms of gathering the necessary capital to begin one in the first place. But if the startup cost could be overcome, then it might be possible that both developers and customers would be attracted to a business model that provided a given service at the lowest possible cost. Rubén even suggests that nonprofits may be more focused on the technology, since they will be forced to reinvest their profits back into the technology that they’re developing. If you’re a budding entrepreneur or developer, it may be worth considering whether it is possible to develop a different kind of technology company that will work in a new way. Still, you’ll run into the same problems of just how to gather the needed capital, and how to monetize a product in such a way that it can be kept going. Personally, I think it is exciting to think about what kind of mission-oriented technologies might be possible, given the broad range of needs that could be addressed through technology. Throughout this interview, Rubén will give examples of businesses with practices that might be emulated. You can find links to these businesses in the description of this episode. This podcast would not have been possible without the help of the DigEthix team, Nicole Smith and Louise Salinas. This episode was cut and edited by Julia Brukx. The intro and outro track Dreams was composed by Benjamin Tissot through bensound.com. Our website is DigEthix.org. You can also find us on Facebook and Twitter at DigEthix and on Instagram at DigEthixFuture. You can email us at DigEthix@mindandculture.org.
Now I’m pleased to present you with my conversation with Rubén Mancha.

So thank you so much for joining me, Rubén. It’s really great to have this chance to talk to you. To start, I was wondering if you could just tell me a little bit about what your focus is right now. Where are you concentrating your research interests and everything like that?


Rubén Mancha 3:36
Good morning, Seth. And thank you for inviting me to your podcast. Right now I’m an assistant professor of information systems at Babson College, a small university that is focused on business education with an emphasis on entrepreneurship. And I was recently promoted to associate professor, so this is a good time for me to take stock of what I have been doing and what my area of work is. In general, since I started working towards a PhD in science, my focus has been on finding the ways technology can be used for good, with technology quite broadly defined, because at the beginning I was in the natural sciences, I was studying biology, and my focus was very different from what it is now with digital technologies. But in general, I have always looked into the way that technologies can bring value to society.


Seth Villegas 4:21
So it seems like you have a pretty broad focus then, right? So we’re talking about thing- systems, potentially, that are interacting on multiple levels of society. Is there, say, an area which you’re focused on, or just kind of business in general?


Rubén Mancha 4:36
So my focus is in information systems, which has to do with the way people use technology in organizational settings, and it’s a systemic discipline, we look at systems. And I particularly take the sociotechnical perspective, in which I try to understand the way in which people and technology interact to create value in an organizational setting and in a broader social setting. So, that is my area of focus in general. The underlying focus, the underlying theme, has always been the exploration of technology’s potential for good. But in the past few years, I have focused more and more on digital technologies and digital business models and their impact and potential for good, and the responsible development and use of emerging technologies. So I’m more focused on the business side, on the business model side, and on the use of emerging technologies.


Seth Villegas 5:20
Great. If you had to kind of give a quick explanation of why information and information systems are valuable in a business sense, I don’t know if you would mind explaining that.


Rubén Mancha 5:30
So going back to what information systems is: information systems studies the way people, employees, stakeholders use technology in a business setting, and the way those organizational entities are created to deliver value to the rest of the organization. In a sense, the way information flows and the way those information flows add value to the company, or to anyone that is going to be engaging with the company, including the final client. Now, it’s quite broad. But more recently, I see the information systems discipline expanding to look beyond just the organization itself and the question of how we can make organizations more efficient, to try to integrate a perspective on what value is and who is receiving the value. Are we impacting society? Are we impacting the environment? And how can we optimize, or how can we understand better, that impact?


Seth Villegas 6:20
So I’m trying to think about how a business works, and you can tell me if I’m getting this right: there are kind of different people in an organization, which seems extraordinarily obvious, but coordinating all of those people is actually pretty difficult. And the ways in which people are able to communicate with each other, the rate at which they’re able to communicate, the ways in which people can, say, coordinate over large distances and whatnot, can work remotely: all of that has really changed within the last few decades or so, I think more radically than a lot of people imagine. And it really is this ability of people to communicate different things very, very quickly. And that coordination then allows for bigger kinds of systems to be designed and developed with more sophisticated forms of behavior. And there’s a lot of technology that goes into making that possible.


Rubén Mancha 7:10
First, technology brings new ways, as you’re discussing, of sharing information across the enterprise, which allows for new forms of efficiency, and also of capturing data from the environment, from the customers, from the suppliers. And that results in new business processes that can create new efficiencies; the company can optimize its supply chain. Look at a company like Walmart bringing in digital systems, having a close relationship with suppliers, and also starting to capture more and more data from its customers and understanding better the way customers purchase and in which context. And if we go to the extreme of a mobile application for making a purchase, there is a lot of data that the company is capturing about how the customer is interacting with the product and with the interface. That allows it to create new efficiencies, new ways of understanding the customer and offering a better service to the customer. But beyond that, digital technologies have also been creating new means of creating a business model in the marketplace. And that’s one of- one of my areas of research. In particular, digital platforms: how can a company use digital technologies in a way that changes the way it interacts with the marketplace, the way it brings value? And with digital platforms, what we see is the emergence of a business model in which the company almost separates itself from the delivery of value and becomes an orchestrator or a coordinator of value delivery, and a matchmaker in which it tries to bring someone who can deliver the value in contact with someone who has that need. You can think about an Uber: the company itself applying algorithms to match someone who can drive with the person that needs a ride at a particular time, and optimizing that particular value delivery process, and then being part of capturing the payment, and then delivering some additional layers of value to both the customer and the driver. So that is something in particular that I study: not so much how technologies have been bringing value to companies over the past few decades, but how digital technologies are enabling new types of business models in the marketplace.
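To make the matchmaking idea concrete, here is a minimal illustrative sketch, in Python, of a platform greedily pairing each ride request with the nearest available driver. The names, the 2-D coordinates, and the distance rule are assumptions for illustration only; this is not how Uber or any real platform actually implements its matching.

```python
# Illustrative sketch only: a toy "matchmaker" for a ride-hailing platform.
# All names and the greedy nearest-driver rule are assumptions, not a real
# platform's algorithm.
from dataclasses import dataclass
from math import hypot
from typing import List, Optional

@dataclass
class Driver:
    driver_id: str
    x: float  # simplified 2-D coordinates standing in for a GPS position
    y: float
    available: bool = True

@dataclass
class RideRequest:
    rider_id: str
    x: float
    y: float

def match(request: RideRequest, drivers: List[Driver]) -> Optional[Driver]:
    """Assign the closest available driver to the request."""
    candidates = [d for d in drivers if d.available]
    if not candidates:
        return None  # no one to match; a real platform might queue the rider
    best = min(candidates, key=lambda d: hypot(d.x - request.x, d.y - request.y))
    best.available = False
    return best

drivers = [Driver("d1", 0.0, 0.0), Driver("d2", 5.0, 5.0)]
print(match(RideRequest("r1", 4.0, 4.0), drivers).driver_id)  # prints "d2"
```

The point is only the structural role of the platform: it never provides the ride itself; it coordinates the match and would layer payment and other services on top.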


Seth Villegas 9:07
If I’m going to kind of try to summarize two different kinds of sentiments that I see in the general public, I would say the first is kind of a misunderstanding of what data is and why it’s valuable. So for instance, even just the situation that you’re talking about with your Uber, right: there’s a new kind of service that’s much more efficient when you can just find people, when you can actually locate them with the GPS locator. The second sentiment is, I think, more related to, say, concerns over privacy and kind of the negotiation of data capture with a business in relationship to that person, right: what does that kind of personal data look like? At least for me personally, it kind of raises the question of how is this data capture happening, and how do we kind of negotiate that relationship between the business and the customer, or the business and whoever they happen to be gaining that data from?


Rubén Mancha 10:02
Yeah, I agree with you, data are being captured throughout the interaction between the business and the customer in multiple stages. And then there is little control over how those data are being captured and what is being done with them. Frequently, we hear that the company may be selling customer data for a particular purpose, or customers may learn that data they didn’t know was being captured are being captured and sold. But a common quote in business and business technology is that if you’re not paying for a service, you are the service. In the end, it tells us that companies need to monetize something. And it may not be obvious that what they’re monetizing is the customer data, but for most digital companies, in one way or another, that’s exactly what they’re doing. And the challenge is for customers, for society in general, to understand the way that data are being captured and then monetized. Because, again, it may not tie directly to the delivery of the service. When you take an Uber, you may not imagine that your location, your location when you’re not riding, has any value for the company. But it actually does, because they want to understand you as a potential customer. It may not be directly for the service that they offer, but for secondary services. And this becomes very clear for companies that base their revenue model on advertising: the more they can understand the customer, the more data they can capture in different contexts, the better they’re going to be at matching particular ads with your profile. So I completely agree with you, data are being captured in different settings. But I think the challenge is for society to truly understand the consequences of digital business models.


Seth Villegas 11:35
I’m definitely with you on that. So here at Boston University, I work with a lab called BU Spark with Professor Ziba Cranmer. And one of the things that she continually emphasizes to her students is that they need to really be thinking about monetization. This is a technical class, that is, people building technical systems. But on the business side of this, say, especially if their project becomes something they actually want to go forward with, as a startup or as something else, they often haven’t really considered what it’ll look like as a business. And so they kind of end up defaulting to this sort of monetization-of-data model that you’re referring to, right? Something that plays more into advertising and whatnot. I guess, for people who are looking to get into these sorts of areas, what sorts of things should they have in mind when they’re trying to actually plan out their business model, say, to get ahead of the kinds of problems that we’re talking about here, where you do have to monetize something? There’s a service that’s being provided, and do you have any sort of general thoughts that you tell people kind of within your program?


Rubén Mancha 12:43
At Babson, I frequently advise and meet with entrepreneurs trying to launch their startups in different settings. And this is a common point of discussion. The first comment that I always make is, you need to think about it. And you need to think about it from the broad perspective of the impact that the technology is going to bring, not just to those that are giving you money that you need to, in a way, pay back, those that have invested in your company, but to society in general. What do you want to bring? Why do you want to bring a tech company to the marketplace? So, first, having that awareness about the importance and the impact that technologies are going to have is very important. And then, secondly, engage with the stakeholders, engage with the users, and engage with a broad community of experts in ethics and business ethics and technology and how technology is being used, to have a clear understanding of the future consequences that your technology may bring to the marketplace. If we were to think about a startup like Google when it emerged, it was seen almost purely as a tech startup, not a technology company with a social impact. And years down the road, we see a company like Google almost as a utility, as a social benefit. And it’s been discussed whether it should be treated as a utility that everyone should have access to. I know it’s almost impossible for a tech entrepreneur to anticipate how the technology in the startup is going to evolve. But the company can have a continuous awareness of how the technology is being used and the consequences that it is bringing to society, and instill that agile awareness and response to the market, so that the company can be more responsive to the ethical consequences. So those are usually my- my points of advice: first awareness, and then responsiveness, continuous exploration and responsiveness.


Seth Villegas 14:26
To revisit one of the things that you mentioned there, which I think I’ve talked about in other interviews that I’ve done, is this point about stakeholders. So stakeholders, you know, specifically people who might be affected by a business, who may be of special concern for whatever reason, in terms of what you’re saying, especially since there can be really disparate impacts. And if you think about something like Google, if I imagine the stakeholders, that’s a really, really big group just because of how, you know, kind of ubiquitous the technology is. So how do you think people should go about kind of identifying who the important parties are to include in those conversations, when people are trying to do their due diligence with stakeholders?


Rubén Mancha 15:09
I couldn’t really tell you what the solution is, because it’s highly context dependent. My take is, you just need to be very proactive about it, and be clear that your business is as much about bringing balanced value to different stakeholders as it is about creating a revenue stream for your shareholders. And I know that is very difficult to sell at the beginning as an entrepreneur, but I think it’s on the founder’s mind and initial intentions to set the course of the company in that direction. How to go about it, beyond being very proactive, I think is very context dependent. And it depends on the industry in which you are, the type of technology that you have, how vested it has been, and in which context. I can provide you with some examples, if that may help. A recent case that I wrote was the case of Whoop, a well-being startup, a wellness startup that tries to optimize performance, human performance, that is their tagline. I don’t know if you’re familiar with the company: it is a bracelet that captures lots of analytics data, something like a Fitbit or another wearable device that captures a lot of data. So a student of mine and I interviewed different executives, including the CEO and the COO of the company, to try to understand how they see the creation of value and how they see privacy as a challenge or an opportunity for the company. And something that became very clear is that they don’t want to monetize customer data. From the start, that is something that they are never going to do, because they want to preserve customers’ privacy. And they want to be very thoughtful about the way they are perceived, as a company that provides a very unique service through analytics and not as a company that is trying to gather and compile as much data as possible from its users to, who knows, do something with later and sell complementary services. So I don’t know if that was clear, but this company Whoop comes to me as a very concrete example of a company that understands the context in which it operates, wellness and health care, and has set a very clear course to be protective in the way it presents itself to its customers. And it tells the customers: we will never sell your data, we are not in the data business, we are charging you a subscription every month, and our business is to provide you with optimized analytics, so you can understand yourself better and optimize your day-to-day well-being.


Seth Villegas 17:25
I think this is- you made a lot of really important points. The first one is actually, I think it’s really good to hear honestly how difficult these questions are. Because, again, when I’m kind of in these technical settings with new people who are aspiring tech entrepreneurs of some kind or another, it can be scary to kind of realize how difficult those questions are, and really thinking through the consequences of everything. But I think that reinforces to me, personally, that getting ahead of these questions, kind of revisiting them over time as the product is developing, as the business is developing, is really, really important. But also, even this example of Whoop that you’re giving shows a different kind of business model that goes against the norm. And they’re also really leaning into their potential reputation as “we’re a company that does this.” And I do think that new startups have an advantage of being able to articulate that more, because they can kind of see where things have gone wrong in the past. And I wouldn’t want, say, a young person to assume that they can’t get into that business using that sort of model. Because I do think, especially as society kind of understands more of the ways in which these models work, and maybe people want to opt out of them, they may deliberately choose to go for companies like Whoop that have different priorities and whatnot. I don’t know how you see that. But-


Rubén Mancha 18:44
Yeah, I think it ties back to that earlier comment that societal understanding is limited, but it’s clearly growing. We’re understanding better that companies are using our data, and that has been exposed, and data has been monetized in ways that may have negative consequences for us as individuals or for social groups. And there is pushback, and now individuals are exercising their right and their choice and going for companies that they perceive are more ethical, more responsible, and use data in a way they perceive as ethical, or that they’re willing to compromise on. So I think that understanding has improved, and that adds to social pressure for companies to be more responsible. I also think governments are catching up, slowly, but they’re catching up with regulation. There are examples: in the European Union, we have GDPR, the regulation that controls the way data can be accessed and can be sold. That is bringing some awareness to other countries, like the US, about how those companies are operating and how they should and should not be restricted. I would add there is a third avenue, aside from self-regulation and that control from government, which is frequently slow and trying to catch up to what the companies identify as new opportunities. And this is something I recently wrote about in relation to platforms. It is the use of free market opportunities to control the way that companies operate, through competition. So I can tell you a little bit more about my perspective on digital platforms in this space, for example. With two colleagues at Babson, David Nersessian and John Marthinsen, a law professor and an economics professor, respectively, we looked into the idea of nonprofits making the same use of digital technologies as for-profits do. They’re usually seen as relegated to very niche areas of the market and making very little use of technology. They are not on the tech frontier; they are not startups, they are not tech companies. So we asked the question, what if we were to launch nonprofits as tech companies? What would be the advantages and disadvantages? And what would be the market adjustment that we would see? And, in particular, we were looking into digital platforms and the use of the digital platform business model by nonprofits. Again, instead of delivering value directly to customers, can they become matchmakers between those that need something and those that can deliver it? And how can they interact with those stakeholders? And what type of value are they going to bring? And will they be competitive with companies that operate in the same space for profit? There is a very interesting example, which we discuss in this article I’m referring to, which was recently published in the Social Responsibility Journal, and the name of the company is RideAustin. And this is a rideshare company that operates exclusively in the region of Austin and uses the platform business model to connect drivers with riders. So going through press releases, we saw that there were some limitations in the way this company was using technology that affected the way the value was perceived in the market, the way customers saw this rideshare company. But we tried to go beyond that and understand what would happen if a company truly becomes a digital startup and uses a nonprofit model.
And I think we came to very interesting revelations about the potential impact that nonprofits can have in the digital space, in the sharing economy, even to adjust, as I said, as a third avenue, as a third opportunity, the way that for-profits operate, freely capturing data and using data.


Seth Villegas 19:29
If I was to make a guess, and you can tell me whether this intuition is right or wrong, it’s that the technology has kind of accelerated to a point where maybe it’s low cost enough that you might be able to operate a service similar to Uber on a local scale within a particular city. So I think that what you’re talking about in Austin is a really interesting example of that. Because if I’m thinking about it, a lot of the criticism of Uber has a lot to do with the for-profit model, right, which really directly ties to how much drivers are paid, what their incentives look like, and everything. Whereas if you had kind of a nonprofit service that does coordinate things in a way that really worked for drivers and for the people taking those rides, that seems like, I don’t know, it seems like a net win for everyone, in part because it’s not being artificially inflated in price.


Rubén Mancha 22:57
Yeah, the negative externalities and negative consequences of for-profits, as you were identifying, clearly include the limited protection of workers, the fact that they must grow to create more and more profit, and that growth is going to come with more capture of data and the attempt to monetize it, which will have negative impacts. The way they try to become monopolies or duopolies: they try to gain market dominance to establish that position, think about companies like Google and Uber, and how they then use that dominance, in some cases, to kill innovation, to restrict innovation in a way that other, smaller startups will have a really hard time competing with these companies once they have established dominance. And then also the impact that the algorithms that they develop, which are hidden within the code and are not visible, may have on some social groups. And that goes for both sides, for the providers of value and the receivers of value. So yeah, lots of negative impacts. But that can be counteracted by the fact that companies now have access to technology that is highly scalable, as you were describing. Cloud computing is now accessible to everyone, and deploying platforms is fairly straightforward. You can hire talent, and it is well understood what the basic interactions you want to create are, and there are models to look at: we understand the way a customer wants to be picked up and driven to a location, and all the value-added features that can be added on the side. So that makes it simpler for any type of startup, for-profit or for social innovation, to launch a similar service and start scaling. But then there are some other positive outcomes and benefits that the company may obtain: the fact that nonprofits don’t have the corporate tax rate in the US that for-profits do, the way that a nonprofit may decide to capture data and use it to optimize the service and optimize the value delivered to the customers, and not just to optimize the amount of money that it’s making. Those are clear value propositions. Also, talent will be more attracted to collaborate with companies that operate in a nonprofit space and clearly want to bring more value to society. And then any profit can be reinvested, will have to be reinvested, in those groups, in those societies. It can make the technology better, but it can also bring money back to the communities in which the nonprofit operates, which should contribute to a very positive impact on our perception of that organization in society.


Seth Villegas 25:15
I definitely think that these points we’ve been talking about, of the general public becoming more aware of what’s happening with data capture, and a kind of general interest in more ethical companies, could be a really good way to draw people into these kinds of enterprises. And, on paper, that sounds really good. But if I’m thinking back to what I’ve seen with people who’ve tried to work in startups, even more kind of ethically conscious startups, they have a really hard time within the kind of business landscape of surviving. And, you know, unfortunately, some of their models look like they make something that looks good enough to get bought up, and then nothing ever really happens after that. Or they kind of get absorbed into one of these larger companies at that point. So that sounds really good, but I guess part of me is kind of skeptical after having seen that over the past 10 years or so of, “Oh, this looks like something that might be able to fill this void,” but then kind of not being quite able to do it.


Rubén Mancha 26:15
Yeah, the starting cost is there, for-profit or nonprofit. For a nonprofit in the digital space, the company still needs to invest in the development of the algorithms, the apps, and then in subsidizing its entrance to the market. So the cost is there. And the challenge is that few people are willing to invest in this type of business model, because the objective of the business model is not to pay back the investors. So we discuss this challenge in the paper, and we came to the realization that we need philanthropy. We need people willing to put money into a particular cause that is going to solve a social problem, and not because they want to make a lot of money in return, but because they’re willing to invest in a cause. And if this type of business model using digital technologies were to be deployed, it would become self-sustaining after some starting cost. I would argue that it would become self-sustaining even faster than a for-profit, because it could reinvest part of the cash in building itself. And you would have a strong network effect; you would attract lots of users and service providers to the platform. If you were to drive your car for someone, and one platform offers you more money, and, you know, charges less to the people who are using your service, and you know that it contributes to your communities, versus a company that in the end wants to make billions for those that provided the starting funds, those that invested in the company, what type of service would you drive for? For an equivalent service, with this company using the nonprofit model while operating as a tech company and building the same type of technology, if you were a developer, wouldn’t you be more attracted to be part of it? And then, how do you actually launch a nonprofit?


Seth Villegas 28:02
You had talked earlier about how, say, something like Google could eventually become, or be viewed as, something like a utility. And it seems to me that this model might actually allow for that to be possible, maybe not things that, like, actually operate as a municipal utility managed by, like, local governments and whatnot, but as something that is much more of a social service, just because of how it’s designed to work. So if we could talk a little bit more about the connection with government, which is also something that you mentioned earlier. So you talked about the EU data regulation, which is this, I think, publicly available document for people who want to look it up, which they put out a couple of years ago. In terms of the regulatory side, or businesses kind of interacting with changing government expectations and whatnot, what do you think would actually be helpful towards fostering technologies that are more socially conscious, more ethical, the things that look more like what you’re talking about, like kind of nonprofits? Do you think that the kind of surrounding regulations would have to change? Or is it just that they’d kind of have to outcompete these other companies over time?


Rubén Mancha 29:19
I think both complement each other. And I do believe that government regulations are required to restrict what companies can and cannot do with personal data, and to benefit the actual providers of data, the users. But I also believe there is an opportunity for companies to be more responsible and to compete on the basis of privacy preservation and thoughtful use of the technologies. And as an example, we see now companies like Apple showcasing themselves as highly concerned with customer privacy, and that has given the company an edge in the marketplace. So I think there are opportunities for both: for regulation, and for companies acting in a way that is responsible and using that to be more competitive.


Seth Villegas 30:03
As kind of a follow-up question, there is sometimes a kind of tug of war that goes on between the data capture that a private entity does and the data capture that, say, a government is willing to do. So for instance, Apple got into some, I mean, really bolstered its reputation, honestly, by saying that they’re going to try and keep user data private, over and against kind of requests from the government for it.


Rubén Mancha 30:27
I can see companies like the DuckDuckGo search engine suddenly, with its data preservation, becoming more and more attractive to users who want to browse with increased levels of privacy. And they are still subject to subpoenas, and the government can request to have access to data. But given the fact that they don’t collect the data in the first place, it is almost impossible for the data to be used. In some industries it is necessary to have some data to be able to present the user with the best possible service, and in some others, data are gathered because it benefits the company. And I think that’s the case-by-case, industry-by-industry discussion that we should be having, as we develop the regulations and as startups decide to build new business models: are you capturing data because you’re hoping to monetize it at some point? Or are you capturing data because it is needed to deliver a better service to the customer? And is that a service that the customer actually wants? It’s always challenging to identify, or to discuss, in broad terms, what may be the best possible decisions or paths for a company to take, because it is very dependent on the context and the type of service and the industry.


Seth Villegas 31:38
One of the things that we spoke about in our last conversation, which was a few weeks ago, is kind of the risks and benefits to specific communities. So it’s, you know, it’s one thing to kind of talk about the general public and the way in which technology sort of affects the general public. But I think right now there’s growing concern about, say, vulnerable communities and the ways in which there might be predatory practices, specifically predatory business practices. So I was wondering if you might be able to just talk a little bit about what you’ve seen, what kind of issues are coming into play, and if there are any potential sort of landmines that, you know, we should all sort of be aware of in this area.


Rubén Mancha 32:23
I am glad you’re bringing this up. I support the idea that the broad discussion about the threat of general artificial intelligence, and the potential of this all-encompassing AI to take over humanity, is almost distracting us from the threat and the risk that existing technologies are having, particularly in some communities, because we don’t understand them well, because we are deploying technologies in a way that we’re allowing them to act freely, and we are not challenging them enough. So there are different spaces in which I have seen a lot of them having a negative impact on minorities, minority communities. There are lots of examples in the literature, from self-driving vehicles to the COMPAS system that is used to determine if someone is at risk of committing another crime after committing one, and that may have an impact on future sentences and releases. So there are many different ways in which algorithms are being put out there without enough challenge and have an impact on communities. Are you looking for a particular industry or example?


Seth Villegas 33:28
I think it’d be good, again, to just think about the business perspectives. So for instance, when you’re talking about something like COMPAS, there are these things called recidivism assessments, right, which is basically a private company going to a court and offering them a service to say, you know, this person is this likely to commit a crime in the future, right, just as you’re saying. And there are even, say, other things like, for instance, Clearview, which is a company that offers kind of facial recognition services specifically to law enforcement for the sake of policing; it’s kind of tied to surveillance and other sorts of things. And so, I guess what I’m personally seeing is that these look, on the outside to me, like potentially problematic technologies, you know, in lots of different ways. First off, the fact that you can predict someone’s future behavior, and whether that’s actually relevant to sentencing right now, is kind of a broader legal question. But businesses are kind of identifying these things, like predictive technologies, as a business opportunity. What’s your sense of why these things are kind of attractive to these businesses? I’m just kind of wondering what’s kind of-


Rubén Mancha 34:48
I think they have an opportunity to monetize that data. And there is not enough regulation to tell those entrepreneurs that they shouldn’t, and there are not enough ethical checks to forbid them from using the data. And I think that that is clear. But the opportunity is there, and there are no controls. I can give you a personal example from years ago, this was about 10 years ago. I was approached by a nonprofit, I won’t give details, but the nonprofit had a very clear, actually positive, constructive objective of providing loans without taking into account credit scores, because credit scores already speak to your past, and they wanted to provide credit, small business credit, to minorities to be able to empower particular communities in a state, in a region. They gave me their data set; the objective was to also add some survey data, a survey that we could deploy that would capture additional data, in a way that we could determine whether or not a person was a good candidate to receive a loan and if the person would pay it back. So I worked on this for about six months, and I captured additional survey data. But what I learned building different types of models is that the best predictor, no matter what, was always zip code. Based on past data, what we can learn is that, based on biases that are built into the data, a person who lived in a minority neighborhood, a Hispanic neighborhood, was less likely to pay back according to the data set. And of course, that has built in lots of biases based on previous access to data and resources and secondary processes. So I couldn’t find a simple way of overcoming that challenge of identifying the best possible way of giving a loan to someone who would not be delinquent in the future, just based on past data. And the ultimate recommendation that I gave was: I wouldn’t build an algorithm for you, we cannot automate the process. You need to meet the person, you need to talk with the person, and there needs to be additional support beyond just running a survey online and determining whether or not you’re worthy of getting this loan. So in some cases, I would say the best process is to avoid using technology there until we find a better way. In some cases, we cannot use data that are based on biases and previous processes that are biased and pretend that we are going to obtain something different out of them.
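To illustrate the proxy effect Rubén describes, here is a minimal sketch in Python using synthetic data. It is not the nonprofit’s data set or the models he actually built; the feature names and numbers are assumptions. The point is only that when historical outcomes are correlated with group membership, a model trained without the protected attribute can still recover it through a correlated feature like zip code.

```python
# Illustrative sketch with synthetic data: zip code acting as a proxy for a
# protected group attribute that the model never sees directly.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical protected attribute (never given to the model) and a zip-code
# indicator that is highly correlated with it (90% of group 1 lives in zip B).
group = rng.integers(0, 2, n)
zip_is_b = np.where(rng.random(n) < 0.9, group, 1 - group)

# Historical repayment reflects structural disadvantage tied to group, not
# only to income, so the outcome carries the group signal.
income = rng.normal(50, 15, n)
logit = 0.08 * (income - 50) - 1.2 * group
repaid = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Train WITHOUT the protected attribute: only income and zip code.
X = np.column_stack([income, zip_is_b])
model = LogisticRegression().fit(X, repaid)

# The zip-code coefficient absorbs the group effect, so applicants from zip B
# get systematically lower predicted repayment odds.
print("coefficients [income, zip_is_b]:", model.coef_[0])
print("mean predicted repayment, zip A:",
      model.predict_proba(X[zip_is_b == 0])[:, 1].mean().round(3))
print("mean predicted repayment, zip B:",
      model.predict_proba(X[zip_is_b == 1])[:, 1].mean().round(3))
```

With these synthetic numbers, the zip_is_b coefficient should come out clearly negative, mirroring what Rubén observed: removing race or region from the inputs does not remove it from the predictions.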


Seth Villegas 37:09
I’m going to try to summarize a little bit of what I think you just described, which is, you have this system that’s going into play, and partially you’re looking at automating a process of loan approval, maybe not complete loan approval, but of, you know, kind of giving someone the green light or not. And what you sort of discovered is that even just taking something like zip code would potentially lead to someone from a minority community being denied a loan, on the basis of being higher risk, kind of, like, automatically, kind of regardless of any of the other factors in that person’s life. Am I kind of getting that right?


Rubén Mancha 37:49
Yep. That data set is full of biases, and the algorithms will find proxies for those things that we’re trying to avoid. Of course, even without taking into account race or region, any factor that may be associated with your background, or with a group that you are within, if you belong to any particular group that was marginalized in the past, will be captured by the algorithm as a proxy for risk, for worthiness as a potential receiver of the loan.


Seth Villegas 38:15
Right. And I think that that in itself is really important, in part because, if we’re actually treating the loan in the way the nonprofit would hope, as a real opportunity for, say, getting a business going, having that kind of denied at a systemic level, which you talked about earlier, there’s this larger system that’s been built in place, and you can see kind of from the outset that that’s what the outcome of this system is going to be, very early on. So I think that just feels a little disheartening in some sense, right? That those things can, like, automatically become proxies. And when you’re talking about a non-technical solution to this, right, which is that it’d be better not to automate this because of the bias, that seems to run against the trend towards wanting to have these larger and larger systems in place that don’t need direct human intervention, right? It almost sounds like scaling back rather than sort of scaling up what a service can provide.


Rubén Mancha 39:21
It goes against the optimization principle of trying to use technology to be more efficient to reach more potential users or customers faster and better. It does. But until we understand the consequences of the technology, and we create better processes that are unbiased, that maybe we can try to replicate with technology, I don’t think we should be using technology. Past data is just going to perpetuate those biases in the new processes. So we need to be very thoughtful about the way we use technology as an automation tool.


Seth Villegas 39:52
This again seems like another context in which, on the basis of what’s being discovered, a different kind of solution is necessary; you know, things are very context dependent, as you’ve been saying this entire time. For businesses and nonprofits going forward, how are they going to kind of, like, unearth these things for themselves, right? Like, oh, we can sort of see that there’s a kind of encoding that’s happened with the system that we’re building, and maybe we shouldn’t automate this in such a powerful way because of how, you know, say, forceful it could potentially be in the future. Because I do think, again, for the people that I’m really hoping to talk to, the people who will go on and eventually build new systems, how can we kind of keep that awareness of, “Oh, this is a red flag, and I should really keep this in mind”?


Rubén Mancha 40:42
Yeah, I think the approach goes counter to what we tell entrepreneurs that they should be doing, to the mindset that they should be moving fast and breaking things, and that companies should just be testing things and evaluating and moving forward with things that are bringing value to them. It goes counter, because they should do the opposite: they should stop and think and reflect and have very clear, settled values in the development and adoption of technologies, and stop frequently to understand the consequences, the potential consequences or future consequences, of the use of technology. So it goes counter to the idea that businesses are there to optimize the amount of money that they bring in, because if there are no external incentives to do that, they will not stop, they will not think. Unfortunately, that’s how I see things. I see companies, and we can talk about entrepreneurs and tech startups, but it seems even more important to talk about how companies that have been around for a while are adopting technology to improve processes or to change processes. There are lots of examples of digital transformations that have failed, or that brought in lots of negative consequences, and they didn’t stop enough to think about the consequences, or they put incentives in place that were counter to the idea that technology can bring negative impact. Just to make it more real: Wells Fargo was in a big controversy two years ago for the way they used analytics technology to identify customers and then create fake accounts under their names. And I wrote about that years ago as a failure of analytics. When we pursue analytics maturity as the objective, trying to bring in the latest and best technology to know everything that is out there, but we don’t challenge, we don’t question the tool, how it can be used and how it aligns with the values and the incentives we give the employees, then employees will be incentivized to do things that go against even common sense: opening an account under the name of someone who is not interested in opening a new account, just because we have the ability to know what customer data we can exploit.


Seth Villegas 42:41
And I think this is actually a really important point, in part because when incentives kind of become perverse, I think you actually start to get bad data, right? Like, you have people who may give false data. And then, in that sense, the whole kind of information system itself starts to break apart once those incentives get out of whack.


Rubén Mancha 43:07
Yeah, it breaks down, and data will be biased; definitely, you tend to see, or to capture, the data that you want to see. So going back to the Wells Fargo example, the CEO had a very interesting quote, something like “eight rhymes with great,” because he wanted every customer to have eight accounts open with Wells Fargo. And that is the type of incentive, the type of story, language, narrative, that in the end results in incentives for people to try to find a way they can use technology to optimize that outcome, instead of having a clear understanding of what the company means and represents for society and how to use technology best. So, in a way, I wrote in the past that that “eight rhymes with great” quote from John Stumpf, former CEO of Wells Fargo, clearly set the tone for employees to try to arrive at that number of accounts, and it resulted in millions of customer accounts being created, bogus accounts.


Seth Villegas 44:00
It’s just such an interesting example of an internal incentive just kind of running awry, in which the employees do something because it looks better to the management, even though it’s not even really connected to actual customer behavior. So kind of looking forward, then, what do you think that we should be paying attention to, or what are you kind of personally paying attention to in terms of trends in data and technology?


Rubén Mancha 44:28
So, paying attention to the technologies, as a business leader?


Seth Villegas 44:33
Yeah, so in part I know people are getting, say, increasingly interested in things like cryptocurrencies, the applications of blockchain; there are other things that are kind of on the horizon. And I’m just kind of interested in what you are paying attention to right now and trying to look at as maybe a future area for business models to be really paying attention to.


Rubén Mancha 44:55
Decentralized business models. So, as I think you mentioned before in a previous conversation, blockchain can enable decentralized business models, and because of their potential for scaling, I think those we have to pay close attention to. How smart contracts are enabling transactions, and how those transactions are encoded. How are we going to be able to gain visibility into those encoded business processes and understand them and challenge them and change them over time? I am very interested in the impact that emergent automation technologies are going to have on labor, because I think that’s going to impact our societies greatly. Being in education: how do we retrain the workforce to be able to work alongside digital technologies? And I think automation is going to come along with augmentation. And for augmentation to exist, the human counterpart, the employee, needs to understand how technology is going to deliver value to the role, to the particular task of making decisions. What is that best possible alignment between the human and the technology? So that is something I’m very interested in, as we look at technologies joining the day-to-day operations of businesses in a more active role in which they provide insights. On the other hand, I try to steer away from studying the buzziest technologies, the ones that are emerging as the latest, because I see that as distracting us from looking at the other 99% of technology that is being deployed by businesses, which is the technology having an impact right now. On the contrary, I pay a lot of attention to how businesses are transforming themselves, how industries are transforming themselves, and I advise different companies in different industries that are adopting technology. And I try to get them to think about the responsible adoption of existing technologies, because if they adopt an analytics framework, a software solution, software as a service that analyzes their customer data, but they have no visibility into and understanding of how the data are analyzed and why they are being presented with certain results, I think that can have great consequences, way more than emergent technologies that haven’t reached the market yet.


Seth Villegas 46:58
And I think that, personally, it’s a really good reminder for me. So, you know, I did my undergrad at Stanford, and I think, like a lot of Silicon Valley types, I can be sort of swept up in whatever the emergent technology is, and that’s definitely where my kind of interests are. But as you’re saying, a lot of these systems are already working, they’re already running, and we don’t even have a full grasp of them yet. So perhaps it’s best to kind of take a step back and really examine them in greater detail rather than getting swept up in whatever the next thing is.


Rubén Mancha 47:31
Yeah, we’re years away from human-computer interaction, brain-machine interfaces, reaching the workplace at scale. So I’m curious about those technologies, and I read about them, but it wouldn’t be fair to say that I see them having as great an impact as other technologies are having. So simple analytics, simple data mining, and automated analytics that are now accessible to any enterprise, I think those we have to pay attention to very closely.


Seth Villegas 47:57
All right, great, I think that’s a great point for us to end on. Thank you so much, Rubén.


Rubén Mancha 48:00
Thank you, Seth.


Seth Villegas 48:01
Thank you for listening to this conversation with Rubén Mancha. You can find more information about DigEthix on our website, DigEthix.org, and more information about our sponsoring organization, the Center for Mind and Culture, at mindandculture.org. If you’d like to respond to this episode, you can email us at DigEthix@mindandculture.org. Or you can find us on Facebook and Twitter at DigEthix and on Instagram at DigEthixFuture. To conclude this interview, Rubén advises not to get too swept up in emerging technologies and the technological possibilities. Why? Well, in part because these technological solutions might not always be able to follow through on their promises. Consider the example that Rubén gives about his experience working with a nonprofit on loan data. The data itself was so reflective of embedded biases that he could not control for the overwhelming predictiveness of zip code alone. For that reason, he believes that attempting to completely automate the loan approval process may further embed biases rather than creating efficiencies within the system. In fact, if the goal of the system was to empower people, then sweeping denials of people in those same groups would be contrary to that goal. This isn’t to say that data isn’t powerful and valuable. If anything, this conversation shows just how much power data actually has. But the trap we might fall into is missing how data at times shows us a distorted picture, and that it may also be showing us the picture that we want to see, rather than the one that’s actually there in front of us. Today is certainly a time of technological excitement. And while it is tempting to forge ahead at full speed, I think Rubén is right that we may need to reflect on our technologies from time to time, if for no other reason than to see that they’re truly working as intended. This is Seth, signing off.

Transcribed by https://otter.ai. Transcript edited by Julia Brukx.