Compro Technologies

EdExec Summit 2025: Tuesday Morning Keynote

  • Writer: Ravi Ranjan
  • Sep 10
  • 8 min read

Updated: Sep 11

"State of the Market Breakfast: Back to School is Here. Back to Normal is... Not So Much."


Part 2: “Reaching All Learners.”



Below is a transcript (edited for conciseness) of the EdExec panel discussion. This is Part 2 of a three-part series; this installment focuses on “Reaching All Learners.” Part 1, “Security & Compliance,” is available at: https://www.comprotechnologies.com/post/edexec-summit-2025-tuesday-morning-keynote


Prepare for the new school year with a panel discussion designed for companies and school districts. This session will tackle key topics, including procurement tips, ensuring safe and compliant technology, fostering cultural responsiveness, and creating a culture of inspiration. Of course, AI will be a recurring theme throughout the discussion, but this expert panel will provide broader advice and insights to help you prepare for a successful and productive back-to-school season.


Speakers:

Deshbir Dugal, SVP, Compro Technologies

Sarabjot Singh, Co-founder & Chief Data Scientist, Tatras Data

Jason Fournier, VP, Product Management AI Initiatives and Data, Imagine Learning

Greg Bagby, former Coordinator of Instructional Technology for Hamilton County Schools

Moderator: Kevin J. Gray, CEO and Founder of Product & Process


“Reaching All Learners.”


Kevin Gray

How can leaders and educators collaboratively evaluate and select edtech products that are culturally responsive and meet the diverse needs of all students? And here we're talking about the start of the process. I know cultural responsiveness is kind of a hot term, but schools still have an obligation to meet the needs of their students. So how does technology help with that? Sarab, I'm going to start with you, because you had a lot of thoughts about what you should be centering when you're thinking about the technology.


Sarabjot Singh

Sure. So, you know, obviously, we can't escape from technology. It's part of our lives, and especially with the advent of AI, we are seeing certain fears around using that technology, because we feel people are outsourcing their thinking to AI now, and that's definitely not what we want our students to be doing. So there's a lot of talk around creativity and critical thinking and how we actually nurture them in the age of AI. One of the things I've found to be a silver lining with some of the communities I work with, which are low-resource language communities, is that it's pretty obvious there that AI makes mistakes, right? And so you're actually helping them partner with AI, to work with it the way we'd like all our students to do. Students in highly resourced languages have a different problem: great faith is being placed in the technology, primarily because it just sounds so credible. And while we know that it hallucinates and so on, we often forget. So we've got to keep that in mind, develop technologies accordingly, and get students to befriend them, because that's what they're going to do in their lives. We're seeing jobs change.


Kevin Gray

Yeah, agreed. Deshbir, you have built a lot of products. Can you talk a bit about that recursive evaluation, reevaluation process that Sarab has mentioned? 


Deshbir Dugal

Building on what you just said: artificial intelligence. One could argue there's nothing artificial and nothing intelligent about AI. It's essentially a system, it's a technology, it's just like, you know, yesterday somebody said, you can look at AI just as another technology, and that's what it is. You have something that's matching the patterns, mimicking what it's seen and has been trained on. So if it's been trained on data that has a bias, it'll do a great job in perpetuating that bias. 
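As a concrete illustration of the point about pattern-matching, here is a toy sketch. The data and the `majority_model` function are invented for illustration: a "model" that simply memorizes the most common association in its training set will not just reflect an 80/20 skew in the data, it will amplify it into a 100% skew in its output.

```python
from collections import Counter

# Invented training set: an 80/20 skewed association between a role and a pronoun.
training = [("engineer", "he")] * 8 + [("engineer", "she")] * 2

def majority_model(training):
    """'Trains' by memorizing the most common pronoun seen for each role."""
    counts = {}
    for role, pronoun in training:
        counts.setdefault(role, Counter())[pronoun] += 1
    # Predict only the majority association, discarding the minority entirely.
    return {role: c.most_common(1)[0][0] for role, c in counts.items()}

model = majority_model(training)
prediction = model["engineer"]  # an 80% skew in the data becomes 100% in output
```

The point is not that real language models work this simply, but that any system which mimics its training distribution will faithfully carry the distribution's biases forward.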


My advice is: be aware of what AI is, and don't get blown away by the form. A lot of people have said great things about AI, but it's not magic, and the more you understand how it works, the better your decisions will be. Testing your solutions by running them through your students, re-evaluating that feedback, correcting, and benchmarking: those are all things that I think are so important.
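The test, re-evaluate, correct, and benchmark loop can be sketched as code. This is a minimal illustration, not any particular product's pipeline; `run_pilot`, `benchmark`, and the canned-answer "solution" are all hypothetical names standing in for an AI feature under evaluation.

```python
def run_pilot(solution, student_tasks):
    """Run each task through the solution and record graded feedback."""
    feedback = []
    for task in student_tasks:
        answer = solution(task["prompt"])
        feedback.append({
            "task": task["prompt"],
            "answer": answer,
            "correct": answer == task["expected"],
        })
    return feedback

def benchmark(feedback):
    """Fraction of tasks the solution handled correctly."""
    if not feedback:
        return 0.0
    return sum(f["correct"] for f in feedback) / len(feedback)

# Toy "solution" with canned answers, standing in for an AI feature.
canned = {"2 + 2": "4", "capital of France": "Paris"}
solution = lambda prompt: canned.get(prompt, "I don't know")

tasks = [
    {"prompt": "2 + 2", "expected": "4"},
    {"prompt": "capital of France", "expected": "Paris"},
    {"prompt": "first US president", "expected": "George Washington"},
]

fb = run_pilot(solution, tasks)
score = benchmark(fb)                                 # benchmark this cycle
wrong = [f["task"] for f in fb if not f["correct"]]   # what to correct next cycle
```

Each pilot cycle produces a score to benchmark against and a list of failures to feed back into the next round of corrections, which is the recursive evaluation the panel describes.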


Kevin Gray

And that's coming from the developer end, but I want to go back to something Sarab said and then turn it to you, Greg: the training of users. What do we need to be doing with the users themselves, the students, to get them thinking about how to evaluate this technology? What are we missing that is an obligation on the student half, or the instructor half, of the relationship?


Greg Bagby

I think one of the things we're missing is this: yes, these large language models are training on all the things that are out there, but that doesn't include everything. I was on a call with Dr. Padmanabhan from Fordham University, and we were talking about AI and AI usage. He was saying that through some of his work he realized there are whole people groups left out of these LLMs, whose data is not being trained on, because they don't have the access to put their data into the computer, so to speak. We were talking about this place in the Congo, and he was like, yeah, they don't have the computers to input their data, so the LLMs aren't training on it. And he started naming different places all around the world: their data isn't being used for training. So we need to be aware that there are people groups this whole LLM enterprise is forgetting, because they still don't have access. And once we understand that it's missing people groups, of course it's going to be biased, and of course it's going to hallucinate, because it's not truly trained on all the things.


Kevin Gray

Jason, I want to go back to this idea of shared responsibility. You've talked a lot about what you want buyers of your technology to be doing, the calculations and benchmarks type of thing. What should a good buyer do, so that you as a seller are sure the circle has been completed?


Jason Fournier 

I think in the product space, there's a bunch of techniques we've always classically used to make sure the solutions we build meet the needs of the customer. And I would argue that to the extent our customers believe cultural responsiveness is important, we want to deliver that. We have beliefs about that ourselves as well, but it's a need we have to figure out together. So we think about research. We think about piloting together. We think about co-development. We think about gathering data, whether that's through research advisory boards, groups we convene to give us insight into how our products are working, survey work, or classroom observation. All of that is partnership. So part of it is carving out some time and some capacity to work with your vendors to help improve the tools they're building. That's real work, though, so hold them accountable for compensating your teachers and finding ways to give back. If we don't do that together, the solutions will consistently fall short.


I also think, you know, thinking about AI specifically, there's an aspect of this that has to do with AI literacy. And I was sharing with the team, I was talking to a provider of a product recently, it's a provider all of you would recognize. And they were very proud of the fact that through interactions with their tool, they had anticipated a couple of cases of potential student violence and they were able to intervene and prevent that from happening. But the reason they were able to do that is they felt as though kids didn't understand that the technology they were interacting with was collecting this data and that they were mining that data and then looking for these things. And so there's a problem there. I mean, if kids don't understand what they're engaging with and how these systems work, then we're falling short of our job to help them understand how to use these tools effectively. 


There's also a dangerous piece in terms of how students are attaching to this technology. And so I think it's really important that we think about the literacy and that's inclusive of our own AI literacy. What questions do we need to ask? What do we need to understand about this technology to be better consumers of the products that we're adopting and to better understand what we're purchasing and putting into play in our districts?


All that said, though, it's powerful, and it can allow for more representation. We've done a lot of work using image generation technology to create avatars of kids. Traditionally, we'd have to pay illustrators; you could only make so many of those, and that meant the number of students who could see themselves in your product was limited. With tools like Midjourney and Stable Diffusion, you can make a lot of images, but to the points made earlier about lack of training data, they are not always representative. Put aside, for a second, race and nationality and body type. The systems all have gaps in those areas, but also try to make images that include assistive devices: hearing aids, wheelchairs, braces of different types, crutches. They all struggle with that type of image generation, yet in the classroom we know there are kids who need those things.


So knowing what to look for, that's the AI literacy piece. When you generate an image, look to see whether it's perpetuating traditional stereotypes: men in traditionally male roles, women in traditionally female roles. Is it representative? And then think about things like assistive devices, test your systems against those, and see what they generate.
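The audit described above can be made systematic. Below is a minimal sketch, with illustrative categories, that generates a matrix of test prompts crossing roles with assistive devices, so a reviewer can run each one through an image generator and inspect what comes back.

```python
from itertools import product

# Illustrative categories; a real audit would use a district's own checklist.
roles = ["a scientist", "a nurse", "a school principal", "a student"]
assistive = ["", "using a wheelchair", "wearing hearing aids",
             "using crutches", "wearing leg braces"]

def audit_prompts(roles, assistive):
    """Cross roles with assistive-device variants to form a test set."""
    prompts = []
    for role, device in product(roles, assistive):
        prompt = f"a portrait of {role} {device}".strip()
        prompts.append(" ".join(prompt.split()))  # collapse extra spaces
    return prompts

prompts = audit_prompts(roles, assistive)
# 4 roles x 5 device variants = 20 prompts to run through the generator
```

Generating the matrix is the easy half; the human review of the resulting images, against the stereotype and representation questions above, is where the literacy comes in.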


Kevin Gray

Sarab, I want to go back to you for a second because you told a whole story about how you were using AI to help students connect to their cultural background. Would you share that? 


Sarabjot Singh

Yeah, so one of the things we've started to talk about is vibe storytelling, right? We talk about vibe coding, but again from the perspective of creativity, of instilling creativity in the students rather than, like I said, having them outsource all of their thinking to AI. We started to get them to create stories with LLMs, and they were able to create some amazing images that they could identify with. There were clearly errors in those images, and that taught them that AI is very different from Google, where you just put in a keyword, take the results, and that's it, you go home. So being able to consume the output, give your feedback, and interact with AI to produce these stories was a great way of instilling that kind of creativity in them. It also gave them opportunities to see where AI fell short. A lot of the references being made were wonderfully correct, and on the other hand some were completely incorrect, right?


And that really helped them evaluate in their own mind that AI is not something to be followed, but is to be a co-creator within your journey in the future. I think that was very, very useful.



Stay tuned for Part 3, which discusses how to use technology to build community and inspire students, teachers, and the surrounding education community.



 
 