Tuesdays: 5:00pm - 6:00pm (EST)
What's the gap between the promise of AI in the future of work and its actual adoption? What separates the organizations that spend millions on technological advances that ultimately fail from those that unlock unprecedented innovation? In this episode, you'll learn the one thing that makes the difference.
EPISODE SUMMARY:
"WHAT YOU WILL LEARN:
Most companies are approaching AI completely backwards. We're going to talk about what actually breaks organizations when they adopt AI — and the human-centered approach that puts them back together. You'll hear how high-achievers in HR and organizational development are sabotaging their own AI initiatives by focusing on the technology instead of the people who use it. We unpack the emotional mechanics behind why leaders make costly AI decisions, and the critical thinking skills that separate successful adoption from expensive failure.
If you've ever felt overwhelmed by AI's rapid evolution but couldn't name exactly what felt wrong about your approach, this episode will offer some insight from someone who's built a framework that flips traditional AI adoption on its head — putting human-centered design at the core of artificial intelligence strategy. Our guest shares the one mindset shift that separates organizations thriving with AI from those drowning in it. We're diving into the intersection of artificial intelligence and human-centered design, exploring why the future belongs to leaders who can balance automation with authentic human connection. Let's rethink your AI strategy.
***
ABOUT OUR GUEST:
Wayne Williams is the Founder of Prospective Tech and a Subject Matter Expert on AI and Human Centered Design. He is a co-author of the white papers “The Intersection of AI and Human Centered Design” and “Connecting the Dots to Entrepreneurship.” Wayne serves as a board advisor for the Harvard Business Review Advisory Council, the Center for Science in the Public Interest, Yale's School of the Environment, and the ACLU, and was an advisor to the White House Council on Hunger, Nutrition, and Health.
***
FIND OUR GUEST HERE:
www.prospectivetechpa.org/
***
IF YOU ENJOYED THIS EPISODE, CAN I ASK A FAVOR?
We do not receive any funding or sponsorship for this podcast. If you learned something and feel others could also benefit, please leave a positive review. Every review helps amplify our work and visibility. This is especially helpful for small, women-owned, bootstrapped businesses. Simply go to the bottom of the Apple Podcasts page to enter a review. Thank you!
Subscribe to my free newsletter at: mailchi.mp/2079c04f4d44/subscribe
Work with me one-on-one: calendly.com/mira-brancu/30-minute-initial-consultation
Connect with me on LinkedIn: www.linkedin.com/in/MiraBrancu
Learn more about my services: www.gotowerscope.com
Get practical workplace politics tips from my books: gotowerscope.com/books
Add this podcast to your feed: www.listennotes.com/podcasts/the-hard-skills-dr-mira-brancu-m0QzwsFiBGE/
Tune in for this innovative conversation at TalkRadio.nyc or watch the livestream.
In this opening segment, Dr. Mira Brancu and guest Wayne Williams explore how the rapid adoption of AI has outpaced human understanding, creating a disconnect between technology and organizational culture. Williams emphasizes that successful AI integration requires a deep understanding of human behavior, emotion, and context—because automation alone cannot replace human insight or adaptability. Together, they call on leaders to slow down, apply humility, and reimagine change management models that align technological innovation with the human-centered principles necessary for sustainable transformation.
In this segment, Dr. Mira Brancu and Wayne Williams unpack how human-centered design can bridge the gap between technology and people, positioning it as both a science and a mindset of humility and self-awareness. Williams explains that successful AI integration begins with studying real human behavior—how individuals think, feel, and interact with technology—to co-create systems that enhance rather than overwhelm. Together, they emphasize that while this approach may seem time-intensive, overlooking it costs far more in wasted investment, underused tools, and employee frustration—making gradual, project-based adoption a smarter path for leaders driving innovation.
In this segment, Dr. Mira Brancu and Wayne Williams explore the limits and lessons of applying human-centered design to AI integration—acknowledging that missteps and human bias are inevitable but essential to learning. Williams highlights that biases are unconsciously built into technology by the people who design it, making collaboration across diverse perspectives vital to reducing blind spots and creating fairer systems. Together, they emphasize that the goal isn’t to replace humans but to refine how humans and AI co-create, arguing that true progress comes not from perfection, but from humility, feedback, and the courage to follow the data-driven science of continuous improvement.
In this closing segment, Dr. Mira Brancu and Wayne Williams reflect on the path forward for leaders integrating AI with a human-centered approach—emphasizing patience, gradual learning, and a deep understanding of how technology impacts both organizations and society. Williams calls for a return to critical thinking as the essential “superpower” for modern leadership, helping decision-makers balance efficiency with humanity and avoid the pitfalls of rushing innovation. Together, they remind leaders that sustainable progress in AI begins not with automation, but with awareness, empathy, and the courage to slow down and think deeply before acting.
00:00:49.060 --> 00:01:05.950 Mira Brancu: Welcome, welcome back to the Hard Skills Show, where we take a deep dive into the most challenging soft skills required to navigate leadership uncertainty, complexities, and change today and into the future.
00:01:05.970 --> 00:01:11.619 Mira Brancu: I'm your host, Dr. Mira Brancu, psychologist, leadership consultant, and founder of Towerscope.
00:01:11.770 --> 00:01:24.519 Mira Brancu: And today, we are talking about the best intersection of soft and hard skills, where they go, right? So we're talking about what actually breaks organizations when they adopt AI,
00:01:24.820 --> 00:01:28.889 Mira Brancu: And the human-centered approach that puts them back together.
00:01:29.150 --> 00:01:43.890 Mira Brancu: So let's rethink your AI strategy. We are in Season 9, when we're focusing on strengthening workplace culture, and I think this is a really good, interesting discussion. I don't think a lot of people think about
00:01:43.890 --> 00:01:57.300 Mira Brancu: AI integration within culture strengthening, but I think once you hear our guest's perspective, you will make the connection quickly. So let me introduce our guest to you today. Wayne Williams.
00:01:57.520 --> 00:02:13.000 Mira Brancu: is the founder of Prospective Tech, and a subject matter expert on AI and human-centered design. He is a co-author of the white papers The Intersection of AI and Human-Centered Design, and Connecting the Dots to Entrepreneurship.
00:02:13.050 --> 00:02:31.600 Mira Brancu: Wayne serves as a board advisor for the Harvard Business Review Advisory Council, the Center for Science and Public Interest, Yale School of the Environment, and ACLU, and was advisor to the White House Council on Hunger, Nutrition, and Health. Welcome to the show, Wayne.
00:02:32.210 --> 00:02:35.900 Wayne Williams: Thank you. Hello, everyone. Thank you for having me.
00:02:35.900 --> 00:02:50.089 Mira Brancu: Absolutely. So, yes, you have such an interesting background. Your, experiences range from hunger policy to environmental work to AI. How do these connect for you?
00:02:50.850 --> 00:02:59.989 Wayne Williams: They're connecting for me, because I think, for me, when I think about all of the things that you just mentioned, I don't know how we can escape the human factor.
00:03:00.940 --> 00:03:07.209 Wayne Williams: No matter where we're at, what industry we're in, for-profit, non-profit, it is the human being.
00:03:07.350 --> 00:03:18.970 Wayne Williams: That we really need to understand and connect with. Even if we're gonna move them from hunger to health, we need to understand them and how they see hunger.
00:03:19.060 --> 00:03:36.410 Wayne Williams: how they see health, that helps us to formulate the approach, plan, and strategy. And I think, personally speaking, across the board, once we understand human beings in that way, it helps us to better not only give communication, but receive communication.
00:03:37.140 --> 00:03:45.330 Mira Brancu: Absolutely, 100%. You're talking to the choir here, Wayne. So, I'm thinking, you've been in all of these rooms with
00:03:45.480 --> 00:03:50.089 Mira Brancu: people of major impact, right? They have huge reach.
00:03:50.240 --> 00:04:07.640 Mira Brancu: White House counsels, Harvard advisory boards, major corporate leadership teams. And I'm wondering, what are they thinking about when it comes to AI? What are their greatest concerns? What are their greatest hopes? How do they discuss AI integration?
00:04:09.570 --> 00:04:12.389 Wayne Williams: They are now taking a step back.
00:04:12.830 --> 00:04:14.850 Wayne Williams: From the last 3 years.
00:04:14.940 --> 00:04:17.920 Mira Brancu: Hmm. The last 3 years was…
00:04:18.089 --> 00:04:22.440 Wayne Williams: a rush to adopt AI, not to understand it.
00:04:22.910 --> 00:04:37.670 Wayne Williams: Certainly with ChatGPT and OpenAI emerging on the scene, it certainly created a rush for not just companies, but countries to begin to implement AI as a tool in the world. That is good.
00:04:37.940 --> 00:04:44.449 Wayne Williams: The problem was that they are now realizing that you have to understand a human being in a role
00:04:44.680 --> 00:04:56.150 Wayne Williams: that the human being is going to play in adapting AI, because AI is designed to work with the human being and vice versa. But the human being has to be the lead
00:04:56.450 --> 00:05:08.180 Wayne Williams: in developing AI. We cannot develop AI solely as a tool, and this is where the industry is now coming to the realization, after billions of dollars
00:05:08.400 --> 00:05:19.860 Wayne Williams: pumped into the investment. They have yet to realize the return on those investments, and the question is becoming why not, and how can they recapture that investment.
00:05:20.210 --> 00:05:21.970 Mira Brancu: And why is it that
00:05:22.190 --> 00:05:35.590 Mira Brancu: this was not a thought 3 years ago, and now it is a thought. Like, what… what transpired for them to start thinking, hold on, put the brakes on, we need to be thinking about this?
00:05:36.660 --> 00:05:46.909 Wayne Williams: It is what we would call a feedback loop. You can't deny reality, you know? And when you start off with any project, any idea, any innovation.
00:05:47.380 --> 00:05:51.520 Wayne Williams: We have to recognize that we're kind of in the blind.
00:05:51.660 --> 00:05:53.729 Wayne Williams: It's okay not to know.
00:05:53.900 --> 00:06:00.840 Wayne Williams: And unfortunately, human beings… are not wired like that anymore. We need to know.
00:06:01.070 --> 00:06:02.829 Wayne Williams: And we need to know right now.
00:06:03.000 --> 00:06:19.700 Wayne Williams: The problem with that is that what we do know right now may not be the truth, it may not be what we need. So we took the position in the early days that we understood where we was going with AI.
00:06:20.300 --> 00:06:35.140 Wayne Williams: And AI is saying to us, I cannot go there. And after billions of dollars and years of doing this, we are now, as they say in the tech world, we are now listening to the data.
00:06:35.420 --> 00:06:49.669 Wayne Williams: And the data's saying, take a step back, because there are key components that we miss going forward, and we need to pause for a moment to recognize those key components, and then begin slowly
00:06:50.030 --> 00:06:51.369 Wayne Williams: Implementing them in.
00:06:52.250 --> 00:07:08.239 Mira Brancu: So, before we get to what is the data saying that indicates we need to take a step back, let's level set for just a second. When we think about incorporating AI, or even generative AI,
00:07:08.490 --> 00:07:14.580 Mira Brancu: In the corporate setting, what does that look like? What are some examples? What do we mean by that?
00:07:15.860 --> 00:07:17.619 Wayne Williams: That's a very good question.
00:07:17.890 --> 00:07:27.630 Wayne Williams: And you will get probably different answers based on the industry. And that is partially the beauty of it, because there's no one answer.
00:07:27.800 --> 00:07:38.990 Wayne Williams: And the reality is that for every industry, for every organization, for every team within that organization, we have to develop a unique approach.
00:07:39.110 --> 00:07:51.569 Wayne Williams: And that is really where AI have to go now to be able to not just give a generic or general approach, but we need to study that organization.
00:07:51.690 --> 00:07:58.250 Wayne Williams: the team, the industry in which their organization is operating in, and then develop AI.
00:07:58.880 --> 00:08:05.660 Mira Brancu: And so then, what are the most common
00:08:06.170 --> 00:08:14.610 Mira Brancu: AI integrations or things that people are incorporating that is causing the data to all converge around
00:08:14.740 --> 00:08:19.959 Mira Brancu: Hey, we need to stop and think about the human side of implementation.
00:08:20.510 --> 00:08:23.980 Wayne Williams: The… I think it would be… the output.
00:08:24.370 --> 00:08:31.860 Wayne Williams: We're not getting what we intended to get, or what we would… what we thought we would get by this time.
00:08:32.230 --> 00:08:32.980 Wayne Williams: To date.
00:08:32.980 --> 00:08:34.540 Mira Brancu: What did we think we would get?
00:08:35.090 --> 00:08:41.609 Wayne Williams: Well, what we thought we was gonna get, it was kind of tricky, Doctor. It was kind of tricky because we had this idea
00:08:41.929 --> 00:08:44.470 Wayne Williams: solely on automation.
00:08:44.620 --> 00:08:49.620 Wayne Williams: And we thought that we were gonna automate everything, and then things were gonna run smooth.
00:08:49.950 --> 00:08:58.140 Wayne Williams: The problem was that the automation system that we was hoping for can't really be
00:08:58.440 --> 00:09:02.460 Wayne Williams: placed, or in place, without the human being
00:09:02.720 --> 00:09:06.770 Wayne Williams: automating it. In other words, we need to know more about
00:09:06.990 --> 00:09:19.500 Wayne Williams: what is it that caused an individual or us to look at a particular movie? What is it that caused us to make the decisions that we're making? We need to have a better understanding of that.
00:09:19.650 --> 00:09:39.109 Wayne Williams: as we go forward to develop AI, to be able, generative AI, to be able to look at those things, which means the AI system now have to look at us in terms of our emotions, our culture, our politics. All those things are factors that will help us to develop a better AI system.
00:09:39.810 --> 00:09:43.370 Mira Brancu: And do you think that
00:09:44.460 --> 00:09:47.310 Mira Brancu: We had to implement it first.
00:09:47.430 --> 00:09:54.689 Mira Brancu: To see how it worked before we would know what went wrong with humans using it, or…
00:09:54.910 --> 00:10:03.500 Mira Brancu: Do you feel like there was greater opportunity in the upfront development where they could have included humans
00:10:03.930 --> 00:10:08.159 Mira Brancu: to develop it in a stronger way. What are your thoughts on that?
00:10:08.290 --> 00:10:12.460 Wayne Williams: I think you're right. I think we needed to create
00:10:12.590 --> 00:10:16.829 Wayne Williams: A situation where we had, like, a prototype.
00:10:17.640 --> 00:10:20.749 Wayne Williams: But I think what happened is we scaled up so fast.
00:10:20.750 --> 00:10:21.979 Mira Brancu: Hmm, hmm, yep.
00:10:21.980 --> 00:10:32.499 Wayne Williams: And we didn't really pay attention to the, again, the information that was coming back to us as we were beginning to integrate this into organizations and systems.
00:10:32.720 --> 00:10:37.810 Wayne Williams: The rush was, of course, the human rush, the economic rush.
00:10:37.940 --> 00:10:40.580 Wayne Williams: was we wanted to get there before
00:10:41.460 --> 00:10:48.060 Wayne Williams: other companies. That's understandable, that's natural, but it interfered with a natural process.
00:10:48.420 --> 00:10:58.420 Wayne Williams: Had we started slow, the information, or the small scale, would have given us the information that we needed
00:10:58.640 --> 00:11:01.729 Wayne Williams: so that we could scale up properly.
00:11:01.870 --> 00:11:09.169 Wayne Williams: We scaled up so quickly, now we look back in hindsight, and we see that there are things that we missed.
00:11:09.270 --> 00:11:11.600 Wayne Williams: But we are so far out now
00:11:12.180 --> 00:11:16.300 Wayne Williams: that to fix that, to correct that, it's gonna cost us a lot.
00:11:16.670 --> 00:11:28.280 Mira Brancu: Yeah, yeah. And I think it's an interesting point, a good point, that, like, our technological advances are moving faster than the human response and readiness.
00:11:28.390 --> 00:11:34.540 Mira Brancu: And, you know, what… Makes me think about,
00:11:34.850 --> 00:11:45.489 Mira Brancu: How we manage change management right now is you gotta get an executive sponsor, and then you've got to talk to everybody, and then you've got to get input, and then you've got to try it, and then you've got to…
00:11:45.490 --> 00:11:59.369 Mira Brancu: Sort of… so there's a… there's a… in traditional change management, you need a lot of alignment before you even implement, because otherwise the execution is going to fall apart without the alignment, even if you have a strong vision.
00:11:59.840 --> 00:12:06.630 Mira Brancu: However, with how rapidly our technology is constantly changing, you don't…
00:12:06.860 --> 00:12:14.439 Mira Brancu: necessarily have that luxury anymore. You kind of have to change your change management processes in a way. You kind of have to, like,
00:12:14.580 --> 00:12:21.739 Mira Brancu: just pilot, throw all the things at it, see what happens, see what breaks, and then…
00:12:21.860 --> 00:12:39.779 Mira Brancu: realize, okay, now we've learned how humans interact with it, now what? And it… you're right, it's a bit more costly, and yet, at the same time, it is the reality of how technology works. Is that a good understanding of, kind of, the change management plus technology issue?
00:12:40.110 --> 00:12:48.400 Wayne Williams: That is a good understanding, and I would only add that with that understanding, that change management technique, they have to adapt.
00:12:48.620 --> 00:12:50.000 Wayne Williams: to these realities.
00:12:50.000 --> 00:12:50.730 Mira Brancu: Yeah.
00:12:50.730 --> 00:12:58.350 Wayne Williams: In other words, that change management technique would have been, and was perfect, say, in another
00:12:58.510 --> 00:13:07.240 Wayne Williams: time, another circumstance. But with technology moving as fast as it is, and then you factor in the human beings.
00:13:07.660 --> 00:13:12.930 Wayne Williams: ability or inability to keep up. How do you align those two?
00:13:13.910 --> 00:13:18.840 Wayne Williams: So we have to come up with a different model. And actually, there's yet
00:13:19.020 --> 00:13:23.630 Wayne Williams: to be a tool, we haven't developed it yet. It's moving too fast.
00:13:24.810 --> 00:13:29.340 Wayne Williams: So we have to find a tool that can move as fast as technology is moving.
00:13:29.840 --> 00:13:31.010 Mira Brancu: Yeah, yeah.
00:13:32.020 --> 00:13:36.710 Mira Brancu: Why is it that now, with the data that we have,
00:13:37.290 --> 00:13:45.330 Mira Brancu: It is going to be a more costly endeavor to try to
00:13:46.250 --> 00:13:54.020 Mira Brancu: use the data to apply to the human side of AI in order to make it run more smoothly? What is the…
00:13:54.200 --> 00:13:55.760 Mira Brancu: challenge here?
00:13:57.360 --> 00:14:00.240 Wayne Williams: Well, in short, the challenge is the human being.
00:14:00.760 --> 00:14:05.050 Wayne Williams: We really need to pause. When I say the challenge is the human being,
00:14:05.210 --> 00:14:10.789 Wayne Williams: We have to grow to recognize what we don't know and what we don't understand.
00:14:11.440 --> 00:14:16.489 Wayne Williams: We cannot bring alignment on something that we do not understand how it works.
00:14:16.650 --> 00:14:23.170 Wayne Williams: So, those of us that's in this space, that's leading this space, we have to… humility is a great tool.
00:14:24.250 --> 00:14:30.239 Wayne Williams: That's how we learn. We have to accept the fact there are certain things that we cannot yet predict.
00:14:30.340 --> 00:14:40.549 Wayne Williams: understand or do. No problem. Then we should begin to study those things so that we would know how to do them, and then, therefore, we can predict them.
00:14:41.470 --> 00:15:00.620 Mira Brancu: Yeah, yeah. This is a good way to close out our first section here. We've kind of level set a little bit, and I'm super excited when we come back from the ad break, we are reaching an ad break. When we come back, I'd love to hear how human-centered design could be a potential solution.
00:15:00.620 --> 00:15:07.880 Mira Brancu: to this big challenge. So, you are listening to the Hard Skills with me, Dr. Mira Brancu, and our guest, Wayne Williams.
00:15:07.920 --> 00:15:20.790 Mira Brancu: The Hard Skills is sponsored by Towerscope, my leadership and team development consulting firm. You can learn more about it at GoTowerscope.com. And the Hard Skills Show livestreams on Tuesdays at 5 p.m. Eastern Time.
00:15:20.970 --> 00:15:37.899 Mira Brancu: on LinkedIn, YouTube, Twitter, Twitch, all over the place through talkradio.nyc. If you are here, live with us on a Tuesday, this Tuesday, October 7th, 5pm, you can also interact with us. You can ask questions, and we will respond.
00:15:37.900 --> 00:15:41.949 Mira Brancu: So, we will be right back in just a moment with our guest.
00:17:23.819 --> 00:17:27.859 Mira Brancu: Welcome, welcome back to the Hard Skills with me, Dr. Mira Brancu, and…
00:17:27.859 --> 00:17:28.239 Wayne Williams: We…
00:17:28.240 --> 00:17:34.480 Mira Brancu: Williams, talking about AI integration
00:17:34.510 --> 00:17:53.270 Mira Brancu: and human-centered design. So, we talked about AI for a moment. Let's talk about, how human-centered design could be a solution to the challenges that we have found with uptake. What is… let's start with what is human-centered design? A lot of people are not… are still not familiar with that, so what is that?
00:17:53.610 --> 00:18:01.299 Wayne Williams: Human-centered design is, unfortunately, when we use the term, we use it like… a concept.
00:18:01.500 --> 00:18:05.940 Wayne Williams: or intellectually. Human-centered design really is a science.
00:18:06.200 --> 00:18:23.180 Wayne Williams: It is understanding the human being, as you… certainly, we cannot understand the complete… I don't understand myself completely. But the key is, understanding my limitations help me to understand what I can do, what I understand, what I can't understand.
00:18:23.350 --> 00:18:37.550 Wayne Williams: If I move forward without recognizing my limitations, I am prone to make mistakes. On a larger scale, we are making these mistakes because we are not really, I would say.
00:18:38.230 --> 00:18:40.889 Wayne Williams: We're not that self-aware.
00:18:41.140 --> 00:18:46.550 Wayne Williams: to understand our limitations. And limitations are seen as a negative.
00:18:46.750 --> 00:18:49.250 Wayne Williams: A limitation is not a negative.
00:18:49.540 --> 00:18:58.389 Wayne Williams: say humility is not a negative. I think those are the key elements that will help us to be able to escape
00:18:58.990 --> 00:18:59.850 Wayne Williams: the…
00:19:00.380 --> 00:19:09.049 Wayne Williams: human nature, if you will, the trap of human nature. We have done this throughout centuries, and so we always… I like to use a phrase.
00:19:09.210 --> 00:19:11.629 Wayne Williams: If I only knew then what I know now.
00:19:11.910 --> 00:19:22.429 Wayne Williams: Well, there's a way that we can know not everything, but there's a way in which that we can know now some of the things that we do need to know. That means we take it step by step.
00:19:23.030 --> 00:19:25.089 Wayne Williams: We use the known.
00:19:25.300 --> 00:19:27.270 Wayne Williams: To understand the unknown.
00:19:27.640 --> 00:19:32.000 Wayne Williams: We do not try to understand the unknown by the unknown.
00:19:32.050 --> 00:19:44.500 Wayne Williams: We start with what we do know and understand. Human-centered design is designed just like that. It goes into the human being and understands how the human being is going to
00:19:44.500 --> 00:19:56.490 Wayne Williams: Think, behave, what culture is he or she coming from? What language, not verbal language, what language do he or she speak? What is their understanding? What is their perception?
00:19:56.560 --> 00:20:06.999 Wayne Williams: What is their emotional level? All of those things are key factors in designing a human-centered design approach, not just to technology, but to life.
00:20:08.030 --> 00:20:09.649 Mira Brancu: And so,
00:20:09.860 --> 00:20:19.479 Mira Brancu: Can you give an example of what that would look like, applying the human-centered design science
00:20:19.770 --> 00:20:28.899 Mira Brancu: to addressing a large, nebulous question like making AI work for humans?
00:20:30.020 --> 00:20:48.670 Wayne Williams: That's a good question. And what we would do, we would start with the human, and what we would do is start on a very small scale, and find just a piece of that AI, if you will, and then we would integrate that into that human being. The meaning, the tools, the software that we develop, the AI that we develop.
00:20:48.870 --> 00:21:04.040 Wayne Williams: We will pass that in to the human being, watch how the human being interact with that, because in that interaction is the information or data that we're looking for to form the next step of AI.
00:21:05.780 --> 00:21:07.360 Wayne Williams: Does that make any sense?
00:21:07.560 --> 00:21:12.300 Mira Brancu: Yeah, so, like, for example, it's almost like if,
00:21:12.780 --> 00:21:30.489 Mira Brancu: if I was at work and they decided to incorporate a, you know, ChatGPT specific for our environment, right? And I'm using it to try to, like, improve my emails and, you know, the content that I'm writing.
00:21:30.840 --> 00:21:40.469 Mira Brancu: Would it be, like, you actually observing me do that, or you looking at my output, or input and output? What would… what would it look like?
00:21:40.470 --> 00:21:50.790 Wayne Williams: I am watching every keystroke and timing how long it takes you to do that. That will help me to understand where you may be confused at.
00:21:50.970 --> 00:22:00.199 Wayne Williams: I find that, and it's that area that I see that we need to develop a program, a specific program for that particular user.
00:22:00.800 --> 00:22:06.579 Mira Brancu: Interesting, and do you need… how many people do you need to feel like you have valid
00:22:06.770 --> 00:22:14.070 Mira Brancu: Data on… People within a certain organization or environment in order to make adjustments.
00:22:14.470 --> 00:22:25.800 Wayne Williams: In order to make the adjustments, I would start with just the team that's going to be using that. For example, if you are thinking about using ChatGPT to integrate it into your work.
00:22:25.890 --> 00:22:37.589 Wayne Williams: and you're planning to use it for emails and set up an automated system, then I would start with those that's going to use it like that. Because you're going to teach me
00:22:37.950 --> 00:22:43.079 Wayne Williams: how to develop the AI system for you uniquely.
00:22:43.350 --> 00:22:52.739 Wayne Williams: Where you are having, say, missteps at, what you don't understand, where you may be, intimidated at.
00:22:52.900 --> 00:23:10.119 Wayne Williams: Those things help me to understand, because if I'm intimidated by the use of AI because I don't understand it, I want to remove that out of the way. Because as long as that is part of that individual's working and interaction with AI, we can't get a good product.
00:23:10.670 --> 00:23:15.679 Wayne Williams: can't get a good output. So we need to track all that to understand that human beings
00:23:15.920 --> 00:23:17.829 Wayne Williams: Use of AI.
00:23:17.950 --> 00:23:26.860 Wayne Williams: And then develop the AI system that will help, because they, like, they're co-creators. They're creating the next step, together.
00:23:27.750 --> 00:23:34.519 Mira Brancu: Yeah, and so, if I'm… if I'm an organizational leader, and I'm thinking, oh my gosh.
00:23:34.690 --> 00:23:42.009 Mira Brancu: like, what Wayne is talking about is super time-intensive. This is, I'm sure, very costly, right?
00:23:43.150 --> 00:23:55.840 Mira Brancu: what is the difference between the cost of, overlooking, dismissing human-centered design when it comes to AI integration versus incorporating it? What… what have you seen?
00:23:56.370 --> 00:24:06.489 Wayne Williams: What we're seeing is what we talked about earlier. When you look at the amount of money that has been… went into AI development, into the billions, that is the cost.
00:24:07.040 --> 00:24:21.550 Wayne Williams: And we are at a point now with AI development where we have not yet… we are pumping billions of dollars into systems that we're not getting the return back yet. So either we honestly begin to look at that.
00:24:21.720 --> 00:24:27.280 Wayne Williams: Address that, and begin to fix that now, or we can continue to go down that…
00:24:27.560 --> 00:24:36.419 Wayne Williams: financial black hole, if you will, without being able to produce the end results that we have. There's certain hard truths
00:24:36.820 --> 00:24:38.520 Wayne Williams: That we're gonna have to deal with.
00:24:38.790 --> 00:24:41.040 Mira Brancu: Yeah. We're not going to escape that.
00:24:41.470 --> 00:24:54.719 Mira Brancu: So, I'm thinking about, like, the metaphor of sunk cost fallacy, when, you know, you've got, like, this car that you invested so much money in, and in the beginning, it was, like, the fanciest car ever, it was talking to you, it had
00:24:54.720 --> 00:25:04.989 Mira Brancu: blings and dings and, you know, so fancy, and over time, you started realizing you're not using half the stuff. You don't even know how to use half the stuff.
00:25:04.990 --> 00:25:10.799 Mira Brancu: The things that you are using in this fancy car are not working the way that you hoped.
00:25:10.800 --> 00:25:17.320 Mira Brancu: And yet, you keep avoiding getting trained up in using it. You're not even using the most
00:25:17.450 --> 00:25:34.730 Mira Brancu: you know, useful qualities about this car or something like that. That's kind of like… does that, is that a good metaphor for how somebody, like, in a leadership role could think about the, cost-benefit analysis of engaging in this kind of work?
00:25:35.130 --> 00:25:37.240 Wayne Williams: That is a perfect metaphor.
00:25:37.510 --> 00:25:41.809 Wayne Williams: Because that metaphor, if accepted by leadership.
00:25:42.130 --> 00:25:44.750 Wayne Williams: It puts leadership in the decision-making
00:25:44.980 --> 00:25:45.950 Wayne Williams: position.
00:25:46.500 --> 00:25:50.740 Wayne Williams: Either we will teach that person how to use that car.
00:25:51.080 --> 00:25:53.720 Wayne Williams: Or that person would continue to…
00:25:53.880 --> 00:25:57.400 Wayne Williams: In other words, that person's not getting the best out of that car.
00:25:57.400 --> 00:25:59.230 Mira Brancu: Right. You think about the phone.
00:25:59.390 --> 00:26:08.260 Wayne Williams: I mean, how often, you know, the phone… how many of us actually know how to use it? I mean, we make phone calls. This is the whole world in your hand.
00:26:08.530 --> 00:26:20.289 Wayne Williams: And we're just making phone calls and texting. The things that we can do with technology, you know, we are maybe touching on 20% of it. 80% of it is not being used.
00:26:20.580 --> 00:26:26.419 Mira Brancu: Yeah, well, that reminds me of when my mom first switched over to, like.
00:26:26.630 --> 00:26:33.460 Mira Brancu: you know, a fancy phone, right? And, you know, the iPhone, or, or whatever, Google.
00:26:34.220 --> 00:26:36.789 Mira Brancu: And she was using it for a watch.
00:26:36.960 --> 00:26:52.880 Mira Brancu: Why pay all of that money to use it for a watch? There's so much more you can do with it. But it was too overwhelming, too much, and she just said, forget it, right? And that is the cost, literal cost.
00:26:53.020 --> 00:27:02.009 Mira Brancu: Of getting your employees this fancy stuff that they refuse to use, because it's too much, it's too confusing, it's too overwhelming, it's,
00:27:02.110 --> 00:27:14.420 Mira Brancu: Or they're frustrated with it, because they haven't gotten the appropriate training or support, or it's not fitting into their, like, normal day-to-day lifestyle of how they work, right?
00:27:15.080 --> 00:27:22.720 Wayne Williams: That is correct, and that's what we as organizations and leadership have to look at. You don't want to give this tool to anyone.
00:27:23.470 --> 00:27:25.979 Wayne Williams: We need to understand the…
00:27:26.830 --> 00:27:39.249 Wayne Williams: people that we are integrating this tool with, because if they are the way that you just described them, they would not yet be the right candidates. Let's find the best candidates
00:27:39.300 --> 00:27:52.049 Wayne Williams: in that organization, candidates that is… that are more adaptable to using AI, more up to the challenge. And let's integrate AI into the system with them.
00:27:52.140 --> 00:27:58.800 Wayne Williams: And then we can use them as teachers to help others that may be fearful of
00:27:58.820 --> 00:28:17.889 Wayne Williams: the integration of AI, that may be just using a phone for a clock, and we can show them that you can do more with this. And I think as we take them along, showing them the possibilities with AI, I think gradually they will begin to adapt
00:28:17.890 --> 00:28:19.280 Wayne Williams: to these systems.
00:28:19.280 --> 00:28:31.009 Mira Brancu: Yeah, I was just gonna ask about, you know, under what conditions, you know, when should organizations and executives
00:28:31.140 --> 00:28:39.220 Mira Brancu: think about making an effort to infuse human-centered design? You mentioned one of those conditions is
00:28:39.270 --> 00:28:52.150 Mira Brancu: When you have, high-level folks with, excitement, interest, capability to adopt as, like, early adopters, right?
00:28:52.680 --> 00:29:02.160 Mira Brancu: that's a good case. Are there other sorts of situations, cases, or contexts
00:29:02.320 --> 00:29:06.880 Mira Brancu: when leaders should be thinking about
00:29:06.990 --> 00:29:14.190 Mira Brancu: using human-centered design in order to optimize AI and other technologies?
00:29:14.990 --> 00:29:17.869 Wayne Williams: Yes, the short answer to that is projects.
00:29:18.890 --> 00:29:23.770 Wayne Williams: When we have a team working on a project, we need to gradually,
00:29:23.920 --> 00:29:27.339 Wayne Williams: slowly integrate that into that project.
00:29:27.990 --> 00:29:33.220 Wayne Williams: Because when… because we learn from our experience, we learn from doing.
00:29:33.620 --> 00:29:40.559 Wayne Williams: You know, of course, we all know in the last 4 or 5 years, we've been in a thousand trainings.
00:29:41.520 --> 00:29:54.410 Wayne Williams: And these trainings have, yes, they, have given us a lot of information, but our best way to learn is actually applying immediately what we learn. So, find a project.
00:29:54.800 --> 00:29:57.459 Wayne Williams: And integrate gradually, slowly.
00:29:57.680 --> 00:30:09.299 Wayne Williams: a human-centered design approach into that AI system for that project. And when they begin to see the return on their time, that gets buy-in.
00:30:09.690 --> 00:30:20.049 Mira Brancu: Yeah, that makes a lot of sense. You know, if you've already gotten, you know, AI or generative AI, you know, in order to enhance and optimize people's work.
00:30:21.340 --> 00:30:23.680 Mira Brancu: And there's a project coming up.
00:30:23.970 --> 00:30:27.829 Mira Brancu: It makes perfect sense, like, hey, let's make this project
00:30:27.940 --> 00:30:37.470 Mira Brancu: You know, let's maximize the way that we use the technologies that we have with this project, and the best way to do that is
00:30:37.610 --> 00:30:55.610 Mira Brancu: you know, incorporate this human-centered design to make sure that people are actually capable of doing this well, and they're finding utility, and it's exciting, and it's, you know, energizing instead of frustrating to use this technology, for sure, yeah, that makes a lot of sense. So,
00:30:55.940 --> 00:31:05.659 Mira Brancu: somehow, I don't know how we got to this point so quickly, we're reaching another ad break. So, when we come back, let's, continue talking about,
00:31:05.710 --> 00:31:22.539 Mira Brancu: you know, AI, you know, use cases for different situations, the impact on humans, and more. You're listening to The Hard Skills with me, Dr. Mira Brancu, and our guest, Wayne Williams, and we will be right back in just a moment.
00:32:46.790 --> 00:32:59.190 Mira Brancu: Welcome, welcome back to The Hard Skills with me, Dr. Mira Brancu, and our guest, Wayne Williams.
00:32:59.520 --> 00:33:05.929 Mira Brancu: Wayne, we just got done talking about, like, conditions in which,
00:33:06.140 --> 00:33:16.039 Mira Brancu: thinking about applying human-centered design to AI integration makes sense. One of those is early adopters, the other is projects.
00:33:16.890 --> 00:33:27.109 Mira Brancu: Are there situations that are not good for human-centered design application? And even more so, are there…
00:33:27.310 --> 00:33:31.640 Mira Brancu: Situations that you've seen where you've had early adopters.
00:33:31.910 --> 00:33:35.110 Mira Brancu: Or you've had projects where you thought.
00:33:35.420 --> 00:33:41.799 Mira Brancu: I don't think they're ready for this, or I don't think this human-centered design approach is quite right for this.
00:33:42.200 --> 00:33:43.250 Mira Brancu: That's my question.
00:33:44.150 --> 00:33:49.849 Wayne Williams: Yes, and… Honestly, that's part of the process, and we should not fear that.
00:33:50.840 --> 00:33:54.729 Wayne Williams: And you… I can't fear the missteps.
00:33:55.360 --> 00:33:56.680 Wayne Williams: They're gonna happen.
00:33:57.240 --> 00:34:12.359 Wayne Williams: My job is to learn from those missteps, to embrace them, because as we all know, we learn more from that misstep than we did from the success. So there are many times that I approach a project, and honestly, I approach it wrongly.
00:34:12.540 --> 00:34:20.529 Wayne Williams: I need to accept that and understand that. It's okay, because the goal, principally speaking, is to create an environment
00:34:21.489 --> 00:34:23.659 Wayne Williams: Where we can learn.
00:34:24.239 --> 00:34:29.329 Wayne Williams: We need to create an environment where the early adopters can come in
00:34:29.550 --> 00:34:35.409 Wayne Williams: But we have to find a way to work them gradually into
00:34:35.540 --> 00:34:45.479 Wayne Williams: those that may not be early adapters. So that means we have to begin to think about the culture that we have in that organization, and begin to find
00:34:46.530 --> 00:34:53.019 Wayne Williams: entry levels where we can begin to gradually change the culture, because it's not just the AI.
00:34:53.130 --> 00:34:54.699 Wayne Williams: It is the human being.
00:34:54.940 --> 00:35:01.120 Wayne Williams: And Dr. Brancu, I think that's what we're trying to get around, the human being. We can't get around that.
00:35:01.120 --> 00:35:02.959 Mira Brancu: Yeah. Dang! Dang!
00:35:02.960 --> 00:35:03.830 Wayne Williams: Yeah.
00:35:03.830 --> 00:35:05.449 Mira Brancu: You can't. You can't.
00:35:05.450 --> 00:35:12.219 Wayne Williams: We can't get around that. And any one of us, or any institution, or organization, or leadership.
00:35:13.020 --> 00:35:18.110 Wayne Williams: that I would say that is courageous enough and bold enough to understand that.
00:35:18.550 --> 00:35:34.910 Wayne Williams: And the value of attacking that, don't worry about getting it right. Don't worry about it being perfect. Be bold enough to recognize it, and then begin to gradually address those human factors, that we can begin to understand the role
00:35:35.280 --> 00:35:43.270 Wayne Williams: That human beings play in our organization, and all of our success. We're not going to be successful without them.
00:35:43.720 --> 00:35:52.669 Mira Brancu: Yeah, yeah. And, this, this, actually gets to the next thing I've been thinking about, which is, yeah, the human factor.
00:35:52.670 --> 00:36:05.330 Mira Brancu: We make mistakes, we don't always get it right. And in your white paper, you mentioned that AI systems can also unintentionally perpetuate biases.
00:36:05.730 --> 00:36:11.910 Mira Brancu: How do those look like? How do they contribute to the problem? How does it interact with human bias?
00:36:12.350 --> 00:36:17.890 Wayne Williams: That is a good question. It interacts because the human bias is being…
00:36:18.180 --> 00:36:21.490 Wayne Williams: gradually put into the AI system.
00:36:22.330 --> 00:36:27.259 Wayne Williams: The human bias is, if I have a limited experience.
00:36:28.120 --> 00:36:32.189 Wayne Williams: With, say, technology, or with culture, or with language.
00:36:32.520 --> 00:36:35.909 Wayne Williams: as I develop the AI system or tool.
00:36:36.560 --> 00:36:38.560 Wayne Williams: I have no other choice but…
00:36:38.960 --> 00:36:40.709 Wayne Williams: to put that into that tool.
00:36:41.050 --> 00:36:59.429 Wayne Williams: That is my bias, my likes and dislikes. And so I put that into the system. That system that I just created determines what is right and what is wrong, but it is based on my own biases. I have to get out of the way,
00:36:59.640 --> 00:37:11.680 Wayne Williams: Which means that we have to integrate, when we think about developing these systems, we have to really integrate all spectrums of this… in this system.
00:37:11.680 --> 00:37:20.519 Wayne Williams: We have to be clear about our limited experience. You know, they say that human beings actually have, I think, 150 different biases.
00:37:20.550 --> 00:37:21.620 Mira Brancu: That's a lot.
00:37:22.640 --> 00:37:30.260 Wayne Williams: That is a lot. Now, how do we tackle all those? We can't. So what we have to do is, first and foremost, be honest enough.
00:37:30.520 --> 00:37:48.509 Wayne Williams: as we develop these tools, to understand that I have a limitation, and bring other people in that may not have my limitation, and let them be a part of not just the early phases, but they also have to be a part of the decision-making process.
00:37:48.640 --> 00:37:52.409 Mira Brancu: Yeah, this actually, brings me to a very tough
00:37:52.760 --> 00:37:56.809 Mira Brancu: Question. I'm curious if you have any thoughts on this.
00:37:57.290 --> 00:37:58.510 Mira Brancu: Given
00:37:58.860 --> 00:38:16.830 Mira Brancu: how many biases humans bring, and they incorporate them in all the things that they design, including AI technologies, which are supposed to make our lives easier, but somehow we still make it more complicated, more difficult for ourselves.
00:38:17.190 --> 00:38:19.350 Mira Brancu: Some might start thinking.
00:38:19.690 --> 00:38:28.109 Mira Brancu: why not develop a system without humans? Or a system that replaces humans, right? So that
00:38:28.280 --> 00:38:36.560 Mira Brancu: we could try to eliminate all the messiness that humans bring to ruin our perfect AI system.
00:38:36.810 --> 00:38:38.330 Mira Brancu: And,
00:38:38.730 --> 00:38:47.249 Mira Brancu: I'm sure you're not for that, but I'm curious, why not, right? Why not, what is… what is the…
00:38:47.370 --> 00:38:55.539 Mira Brancu: sort of opportunity here of humans interacting with AI if we could find a way to address
00:38:55.700 --> 00:39:00.750 Mira Brancu: these biases that are, you know, in both systems, AI and human.
00:39:01.410 --> 00:39:07.469 Wayne Williams: That's a good question. And I think, for me, the answer is that whatever system that we develop.
00:39:07.860 --> 00:39:10.139 Wayne Williams: It will be developed by human beings.
00:39:10.500 --> 00:39:13.510 Wayne Williams: And those human beings that developed that system,
00:39:13.660 --> 00:39:16.509 Wayne Williams: they, in their attempt to escape
00:39:17.120 --> 00:39:23.339 Wayne Williams: creating a biased system, have just, in fact, created a biased system.
00:39:23.340 --> 00:39:23.970 Mira Brancu: Hmm…
00:39:23.970 --> 00:39:25.129 Wayne Williams: You can't escape this.
00:39:25.420 --> 00:39:28.650 Mira Brancu: Again, back to, we cannot bypass humans.
00:39:28.650 --> 00:39:31.090 Wayne Williams: Cannot.
00:39:31.780 --> 00:39:40.199 Mira Brancu: Yeah, so, it… do you feel like AI… we… when we started talking.
00:39:40.920 --> 00:39:44.739 Mira Brancu: You said, like, 3 years ago, AI was made for automation.
00:39:45.020 --> 00:39:49.469 Mira Brancu: Right? It's to… it was to simplify and, you know, make us more efficient.
00:39:49.470 --> 00:39:50.130 Wayne Williams: Right.
00:39:50.130 --> 00:39:55.149 Mira Brancu: But I know that you've also referenced the potential
00:39:55.350 --> 00:40:00.860 Mira Brancu: for AI to have a positive impact on humans, not… not just
00:40:01.240 --> 00:40:08.899 Mira Brancu: as a source of, automating, but also creating positive impact. So.
00:40:09.470 --> 00:40:20.699 Mira Brancu: Where do you think, like, AI stands right now? Is it simply to replace routine tasks and fetch information, or is there an opportunity here that if we could
00:40:20.920 --> 00:40:25.850 Mira Brancu: Work hand-in-hand, it could actually improve our world.
00:40:27.280 --> 00:40:32.609 Wayne Williams: That we are really in a position where the opportunity persists.
00:40:32.840 --> 00:40:33.960 Wayne Williams: As this.
00:40:34.340 --> 00:40:37.650 Wayne Williams: Where we can use AI to improve our world.
00:40:37.900 --> 00:40:44.450 Wayne Williams: But in order for us to do that, we have to first understand what AI is, and how it works.
00:40:44.790 --> 00:40:47.979 Wayne Williams: we can't… if you think of AI as a tool.
00:40:48.500 --> 00:41:02.429 Wayne Williams: It's not, it's something much more than that. Which is funny, because, you know, when we think about this technology, whether it be algorithms or AI or data or whatever, we think about these things as a singular entity, and it's not.
00:41:03.760 --> 00:41:09.309 Wayne Williams: AI systems are combined with other systems.
00:41:10.020 --> 00:41:13.960 Wayne Williams: And there's math within those systems that make those systems work.
00:41:14.340 --> 00:41:28.070 Wayne Williams: properly. If the math is off, then those systems are not going to work properly. The output that we are looking for is not the one we're going to get. And so, the idea of creating a system
00:41:29.200 --> 00:41:36.339 Wayne Williams: for us is that we have to really take a step back. I keep, you know, for me, I tell my team.
00:41:37.460 --> 00:41:39.000 Wayne Williams: You can't forget me.
00:41:39.970 --> 00:41:43.319 Wayne Williams: It begins with me and ends with me.
00:41:43.450 --> 00:41:45.880 Wayne Williams: But here's the thing, I am fallible.
00:41:46.470 --> 00:41:54.099 Wayne Williams: So somebody need to help me to understand what I'm missing. That's not a bad thing. That is a good thing.
00:41:54.300 --> 00:41:59.630 Wayne Williams: The systems that we're developing right now is kind of bypassing that.
00:42:00.220 --> 00:42:03.310 Wayne Williams: I'll give you an example. It's like having…
00:42:04.550 --> 00:42:19.539 Wayne Williams: are not good boss, or good… someone that's not good in leadership. Someone need to tell them that, don't you think? At some point, someone needs to say, respectfully, wisely, someone needs to say to me, you could… I'll give you a quick story. I have a friend.
00:42:19.850 --> 00:42:22.339 Wayne Williams: And I kind of mentor him, he's in Charlotte.
00:42:22.910 --> 00:42:27.039 Wayne Williams: We was at a tech show in Charlotte, Smart City Design.
00:42:27.290 --> 00:42:30.380 Wayne Williams: to make a long story short, I was talking to some people.
00:42:30.580 --> 00:42:41.600 Wayne Williams: And when I got done, and he's kind of younger than me, and he said to me, he said, Wayne, he said, you know, your ideas are good. He said, but your execution is horrible.
00:42:42.260 --> 00:42:47.470 Wayne Williams: And when he told me that, you know, certainly we're human beings, right? I kind of looked at him.
00:42:47.980 --> 00:42:51.600 Wayne Williams: But I had to digest that because he was right.
00:42:52.490 --> 00:42:56.400 Wayne Williams: my execution was horrible. That don't mean I'm horrible.
00:42:56.650 --> 00:43:01.100 Wayne Williams: The execution was, sometimes we do things that are not
00:43:01.950 --> 00:43:08.399 Wayne Williams: produce the outcome we want. That's okay. That don't mean that we're not good. We need someone
00:43:08.570 --> 00:43:13.100 Wayne Williams: That help us when we find ourselves in those situations.
00:43:13.390 --> 00:43:23.609 Mira Brancu: Yeah, yeah. I'm curious… have you seen an organization get this right,
00:43:23.930 --> 00:43:27.440 Mira Brancu: be able to implement an AI solution
00:43:28.000 --> 00:43:32.490 Mira Brancu: well with human-centered design? And if so,
00:43:32.650 --> 00:43:35.629 Mira Brancu: what did it look like? What did they do differently?
00:43:36.830 --> 00:43:40.929 Wayne Williams: I have not yet seen which is good.
00:43:41.310 --> 00:43:46.250 Wayne Williams: Because any organization that will get it right this early really would be getting it wrong.
00:43:46.470 --> 00:43:51.839 Wayne Williams: they will get something right. And so, because we're so early into this.
00:43:51.970 --> 00:43:57.730 Wayne Williams: we're not nowhere near the… even the middle. So I see organizations
00:43:57.910 --> 00:44:01.360 Wayne Williams: Changing their approach now, and that is good.
00:44:01.780 --> 00:44:07.599 Wayne Williams: Have they yet gotten the concept right, or the science right? Not yet.
00:44:07.730 --> 00:44:22.040 Wayne Williams: The science could be made right if we follow the science. You can make it right, you know, if we follow the data. You have to really follow the science, and then you can make it right. The science will make it right, not us.
00:44:22.930 --> 00:44:35.899 Mira Brancu: Yeah, yeah. We are reaching another ad break, and so when we come back, let's continue talking about, what would it look like from start to finish,
00:44:35.900 --> 00:44:53.909 Mira Brancu: for… especially now that some companies are in the middle of this process, what's left that they're missing with the science of applying human-centered design? So, you're listening to The Hard Skills with me, Dr. Mira Brancu, and our guest today, Wayne Williams, and we will be right back in just a moment.
00:46:23.810 --> 00:46:30.369 Mira Brancu: Welcome, welcome back to The Hard Skills with me, Dr. Mira Brancu, and our guest today, Wayne Williams.
00:46:30.450 --> 00:46:46.690 Mira Brancu: So, Wayne, you were telling me about how, you haven't yet seen the full, final impact of organizations applying human-centered design to AI integration because it's a little bit too early.
00:46:46.780 --> 00:46:54.199 Mira Brancu: And there's a big science to it, too. And so, I'm curious,
00:46:54.490 --> 00:46:57.040 Mira Brancu: What is necessary to get this right?
00:46:58.500 --> 00:47:09.250 Wayne Williams: That is a strong question, Doctor, because what is really necessary is, one, we have to do it gradually. This is going to take time to get it right.
00:47:10.180 --> 00:47:17.899 Wayne Williams: If we want to rush, We can get to a destination that we're gonna have to adapt later.
00:47:18.250 --> 00:47:24.389 Wayne Williams: It's a question of, I think the old saying is, an ounce of prevention is worth a pound of care.
00:47:24.500 --> 00:47:29.989 Wayne Williams: We have to decide that we want, as a society, As an industry.
00:47:30.260 --> 00:47:33.309 Wayne Williams: That we are willing to embrace.
00:47:33.610 --> 00:47:35.819 Wayne Williams: That ounce of care.
00:47:36.120 --> 00:47:39.120 Wayne Williams: Which means that we have to understand
00:47:39.300 --> 00:47:46.789 Wayne Williams: The tools that we are using and developing, and how they are impacting us.
00:47:47.200 --> 00:47:52.869 Wayne Williams: Not just in the organization, But how are they impacting society?
00:47:53.130 --> 00:48:12.080 Wayne Williams: We cannot ignore those factors, because through understanding how they are impacting society, there's information or data in that that will help us to better develop a system that will allow us to be able to create the
00:48:12.450 --> 00:48:13.770 Wayne Williams: outcomes
00:48:14.000 --> 00:48:26.580 Wayne Williams: that are beneficial, that's there, hidden inside of the AI, but we need to use it, or go into the AI, application-wise, to be able to bring that out.
00:48:26.710 --> 00:48:30.359 Wayne Williams: I look at it like this. You think of a cell.
00:48:31.590 --> 00:48:36.429 Wayne Williams: All of the information that we need is hidden in that cell.
00:48:37.250 --> 00:48:39.180 Wayne Williams: You and I came from a cell.
00:48:39.510 --> 00:48:40.710 Wayne Williams: Here we are.
00:48:40.860 --> 00:48:58.240 Wayne Williams: Everything that we need to understand about AI is in what you would call the origin of the human being, meaning the interaction of the human being with AI. We need to study those things first, starting with the human factor.
00:48:58.240 --> 00:49:01.080 Wayne Williams: To understand: how do human beings see this?
00:49:01.630 --> 00:49:04.159 Wayne Williams: And how do human beings see themselves?
00:49:05.280 --> 00:49:17.139 Wayne Williams: You know, because if the human being don't see themselves properly, the human beings bring into the table, what, their own biases, their own ideas, their own concepts. While they are good, we can't solely depend on that.
00:49:17.630 --> 00:49:19.670 Mira Brancu: Yeah. One of the things that…
00:49:20.170 --> 00:49:28.520 Mira Brancu: It's making me reflect a lot about, is… A lot of the newest technologies.
00:49:29.300 --> 00:49:34.560 Mira Brancu: Have been about… Helping us be more efficient with our time.
00:49:35.100 --> 00:49:44.129 Mira Brancu: Right? Like, let's… let's take, a lot of those calendar apps, right? Now, instead of us emailing each other a billion times.
00:49:44.290 --> 00:49:45.670 Mira Brancu: About…
00:49:46.030 --> 00:49:52.959 Mira Brancu: Does this time work? No. Does this time work? No. Does this time work? No. You know? You can say, here's a link to my calendar app.
00:49:53.100 --> 00:50:00.379 Mira Brancu: please find a time, right? Which, technically helps us, right?
00:50:00.490 --> 00:50:15.279 Mira Brancu: And if I can train my calendar app, which I have, to be thoughtful about my preferences, it really does help me. However, there's been a number of advances where
00:50:15.750 --> 00:50:29.250 Mira Brancu: The organization's preferences to improve productivity and efficiency have created situations where we are actually, as humans, more exhausted
00:50:29.840 --> 00:50:32.720 Mira Brancu: Because there's very little room to be humid.
00:50:33.060 --> 00:50:41.299 Mira Brancu: And that includes for example, Zoom after Zoom after Zoom meeting, instead of what used to happen, which is
00:50:41.530 --> 00:50:47.430 Mira Brancu: 10 minutes to walk to this meeting, and 10 minutes to walk to this meeting, and, like, breathing room, space.
00:50:47.620 --> 00:50:53.200 Mira Brancu: And so now we have 13 meetings instead of 8 meetings a day, or whatever, right? And so,
00:50:53.330 --> 00:50:54.750 Mira Brancu: What you're saying
00:50:54.770 --> 00:51:07.710 Mira Brancu: is, in order for us to think about the future of human-centered design and AI application, we have to take into consideration
00:51:07.730 --> 00:51:16.700 Mira Brancu: Than what humans need, but that can be at odds with what organizations want, which is minimal margin.
00:51:17.280 --> 00:51:23.659 Wayne Williams: That is true. And that is the… really the problem. We have to try to find the right balance.
00:51:24.330 --> 00:51:26.579 Wayne Williams: You have been able to…
00:51:26.880 --> 00:51:42.059 Wayne Williams: you actually really explained the human-centered design approach with your calendar. Your calendar knows you now, but how did it get to know you? Because you gave it the information that it needed, so now you and your calendar are on the same page.
00:51:42.060 --> 00:51:43.150 Mira Brancu: It's my best friend.
00:51:43.150 --> 00:51:46.510 Wayne Williams: That's right. And so you trained your best friend.
00:51:46.510 --> 00:51:47.690 Mira Brancu: Yes.
00:51:47.690 --> 00:51:54.259 Wayne Williams: train. And so, with every piece of technology, and particularly, like, with AI, we have to train it.
00:51:54.860 --> 00:52:04.509 Wayne Williams: But we have to understand what we're training. The problem with the… with the Zoom meetings, after Zoom meeting, I don't see the technology as the problem.
00:52:04.510 --> 00:52:05.280 Mira Brancu: Mmm.
00:52:05.280 --> 00:52:07.559 Wayne Williams: I think the human being is the problem.
00:52:07.560 --> 00:52:08.310 Mira Brancu: Mmm.
00:52:08.310 --> 00:52:21.890 Wayne Williams: Somebody sent that Zoom link out. You know, I was at, like, probably like you, I've been a thousand different trainings in the last couple years, and I can tell you, I know when I'm in a good training or a dangerous one.
00:52:22.820 --> 00:52:32.029 Wayne Williams: They start with the saying that, I know we got a lot of material to cover, and we're gonna try to get through it quickly. Once I hear that, I know that's real intro.
00:52:33.470 --> 00:52:40.119 Wayne Williams: I know that we are in a training or a meeting that someone did not really consider.
00:52:40.460 --> 00:52:44.629 Wayne Williams: what are we doing here? More information is not better.
00:52:45.680 --> 00:52:50.419 Wayne Williams: more information is not better. And so we are in a cycle
00:52:50.620 --> 00:53:00.409 Wayne Williams: since the pandemic, where we have given out so much information, but not the room or the time to process it. Because to walk from
00:53:00.460 --> 00:53:11.300 Wayne Williams: one room to another room en route to a meeting, that is time to prepare for that meeting, if that makes any sense. That's time to think about that meeting. That's the human factor.
00:53:11.910 --> 00:53:20.169 Wayne Williams: But when you're just on a Zoom, and you're on a Zoom for an hour, and then the next one is coming in 5 minutes, you never even get a chance to
00:53:20.400 --> 00:53:28.530 Wayne Williams: Debrief. What did you just learn? What did you just… and so that, to me, is not a technology problem. I think that's a human problem.
00:53:28.890 --> 00:53:33.810 Mira Brancu: Yeah, yes, and what that,
00:53:34.150 --> 00:53:42.039 Mira Brancu: informs me is also that as much as you might want to improve efficiency.
00:53:42.320 --> 00:53:54.629 Mira Brancu: it has an effect on effectiveness. I can't be as effective by the end of the day in any meeting if what you've done is improved my quote-unquote efficiency.
00:53:54.900 --> 00:54:07.309 Mira Brancu: with having more meetings, because I haven't had a chance to process, to think, to plan next steps, to… right? To rest, which humans actually need.
00:54:07.440 --> 00:54:13.229 Mira Brancu: For better or for worse, we need some rest in between, and we're most effective
00:54:13.620 --> 00:54:18.870 Mira Brancu: When there's an ideal balance of efficiency to rest.
00:54:20.070 --> 00:54:20.790 Wayne Williams: Yes.
00:54:20.930 --> 00:54:26.599 Mira Brancu: Yeah, yeah. Oh, good conversation, and
00:54:26.820 --> 00:54:35.779 Mira Brancu: Our, our engineer said in the chat, it's more efficient in theory, but less efficient in practice.
00:54:35.780 --> 00:54:37.060 Wayne Williams: That is right.
00:54:37.060 --> 00:54:44.020 Mira Brancu: Very, very good, yes. Okay, so, if people want to learn more about your work, where can they go?
00:54:44.570 --> 00:55:01.059 Wayne Williams: They can go to Perspective Tech website, and I'll put it in the chat. And then, of course, if they want to learn more about that, they can reach out to me, and I will also put my email address into the chat. Well, naturally, you can find it on the website. And…
00:55:01.170 --> 00:55:04.540 Wayne Williams: I would be happy to sit down with, anyone.
00:55:04.770 --> 00:55:06.179 Wayne Williams: Yes, thank you.
00:55:06.180 --> 00:55:21.739 Mira Brancu: Yes, I'm sharing it right now. For those of you who are watching live, or are going to watch the recording later, you will be able to see, prospectivetechPA.org.
00:55:22.020 --> 00:55:36.699 Mira Brancu: So that you can find out more about their work, their AI team, their mission, the white paper that I referenced, and all of this is also going to be in the show notes, how to access and contact, Wayne Williams.
00:55:37.100 --> 00:55:41.320 Mira Brancu: And, if there's one thing
00:55:41.780 --> 00:55:49.849 Mira Brancu: Wayne, that you would like people to take away from our conversation together? What would that one thing be?
00:55:53.100 --> 00:56:00.570 Wayne Williams: That one thing would be… that I think, as… as we enter into this.
00:56:01.330 --> 00:56:08.740 Wayne Williams: We need to really utilize and understand the value and power of critical thinking.
00:56:09.570 --> 00:56:13.879 Wayne Williams: Critical thinking, some say today, is the new superpower.
00:56:14.640 --> 00:56:27.559 Wayne Williams: What do that mean in application? That means that if I'm looking at or listening to someone, whether it be in an organization, structure, or my wife, or whoever, if I'm using critical thinking skills.
00:56:27.670 --> 00:56:32.459 Wayne Williams: I can really hear them, understand them, and I can make better decisions.
00:56:33.070 --> 00:56:37.640 Mira Brancu: Yeah, I think that's a great message, because,
00:56:38.300 --> 00:56:46.770 Mira Brancu: So often, we want to short-circuit And… just… Have a simple answer.
00:56:46.900 --> 00:56:47.800 Mira Brancu: But…
00:56:47.810 --> 00:56:56.869 Mira Brancu: The simple answer is not always the best answer. Sometimes you really actually need to spend time thinking about the right answer, not the simple answer.
00:56:56.870 --> 00:57:08.170 Mira Brancu: Right? And that requires a little bit more critical thinking. So, I did see that, you shared in the chat, and I want to share it with others who are going to be listening and not necessarily able to… to watch,
00:57:08.610 --> 00:57:22.590 Mira Brancu: If you want to contact Wayne Williams, it is, Wayne with a Y, W-A-Y-N-E, Williams, at prospectivetechpa.org, if you want to contact him directly. So…
00:57:22.700 --> 00:57:25.770 Wayne Williams: Audience, what did YOU take away?
00:57:25.770 --> 00:57:26.830 Mira Brancu: from today.
00:57:27.150 --> 00:57:32.979 Mira Brancu: And more importantly, what is one small change you can implement this week based on what you learned from Wayne?
00:57:33.190 --> 00:57:48.799 Mira Brancu: It could just be critical thinking skills. Or it could be more than that, really deeply thinking about how you're integrating your AI into the world of work or others. So, whatever that is, share it with us on LinkedIn.
00:57:48.800 --> 00:57:53.869 Mira Brancu: And at talkradio.nyc, so we could cheer you on and respond to you.
00:57:54.120 --> 00:58:07.499 Mira Brancu: The Hard Skills is also on Facebook, Instagram, Twitter, Twitch, Apple, Spotify, and Amazon Podcasts, and many other places. If today's episode resonated with you, share it with a colleague, or leave a review.
00:58:07.750 --> 00:58:19.119 Mira Brancu: And if you're also looking for more personalized leadership development or team development support, you can also head to GoTowerscope.com to schedule a consultation with me.
00:58:19.230 --> 00:58:35.529 Mira Brancu: Thank you to talkradio.nyc for hosting. Together, we will navigate the complexities of leadership and emerge stronger on the other side. Thank you for joining me and Wayne Williams today on this journey where we talked about one of the greatest, most
00:58:35.660 --> 00:58:38.440 Mira Brancu: Interesting complexities of today's leadership.
00:58:38.670 --> 00:58:49.010 Mira Brancu: Which is technology and AI, right? This is Mira Brancou signing off. Until next time, stay steady, stay present, and keep building those hard skills muscles muscles.