EdExec Summit 2025: Tuesday Morning Keynote

  • Writer: Ravi Ranjan
  • Sep 16
  • 13 min read

Updated: Sep 17

"State of the Market Breakfast Back to School is Here. Back to Normal is...Not So Much."


Part 3: “Building Community to Inspire Learners.”



Below is a transcript (edited for conciseness) from the EdExec Panel Discussion. This is Part 3 of a 3-part series focused on “Building Community to Inspire Learners.” It also includes several audience questions from the end of the panel.


Part 1 “Security & Compliance” is available at: Read Here


Part 2 “Reaching All Learners” is available at:


Prepare for the new school year with a panel discussion designed for companies and school districts. This session will tackle key topics, including procurement tips, ensuring safe and compliant technology, fostering cultural responsiveness, and creating a culture of inspiration. Of course, AI will be a recurring theme throughout the discussion, but this expert panel will provide broader advice and insights to help you prepare for a successful and productive back-to-school season.


Speakers:


Deshbir Dugal, SVP, Compro Technologies

Sarabjot Singh, Co-founder & Chief Data Scientist, Tatras Data

Jason Fournier, VP, Product Management AI Initiatives and Data, Imagine Learning

Greg Bagby, former Coordinator of Instructional Technology for Hamilton County Schools

Moderator: Kevin J. Gray, CEO and Founder of Product & Process


“Building Community to Inspire Learners”


Kevin Gray: I agree, and I think that leads into our last point really nicely, which is beyond the purchase, how can leaders effectively communicate and engage staff, students, and community to ensure successful adoption and inspiring ed-tech work? Greg, when we were talking before, you talked about the need not only for AI literacy for the students, but also from the administration and sort of the top-down approach. Can you talk about how you've seen that work and where that's worked well and what needs to happen to make that happen?


Greg Bagby: Yes. Working with a couple of different districts and… thinking about how you're going to get this AI rolled out. We not only have the teachers that were the grassroots, so to speak, doing the things, but having the leaders lead in areas of, "Hey, this is what we're going to do today. This is how we're going to use generative AI for this." And when he or she uses gen AI in their presentations, to call it out so that other folks know that they're using it. And even go beyond just the classroom with making sure, oh, we're going to use AI, and this is how I'm doing it, but also bring in your other stakeholders, bring in community members. Bring in children, oh, wait, students, yes, have committees with those groups and just have discussions around AI and what things you could be doing and how you can bring those things to your school or to your district.


I was talking with someone earlier – I guess it was yesterday – about how the whole idea of bringing in a community of folks working around AI for the whole district and your school can make a difference: not only do the stakeholders see what's going on, which shows transparency, but they're more likely to accept some of those things. If you're a part of the team or if you're asked to be a part of it, then of course you may want to back it a little bit better than if it's just something that's thrown at you. There's a group out here, maybe they're here, and they actually asked, "Hey, so when you adopt this product, do you have just the teaching and learning and the tech group, or do you bring in all the stakeholders?" I saw this on one of the little information things I was filling out for the district. I was like, ooh, this is a great idea. And it seems to make a difference when you bring in the whole community to take part in adoption and use of products.


Kevin Gray: Jason, you've been an advocate for that level of diversity, not just in the adoption, but in the development itself. Do you want to speak to that a bit?


Jason Fournier: Yeah. I think products are always improved when you have diverse and multicultural perspectives brought to the table. That's true as you're building them because you're more likely to find gaps in your perspective. It's also true as you're deploying them because you might find gaps in the implementation that you didn't anticipate. So I'm a big fan of having broad groups, to Greg's point, that include all the populations the products are intended to serve. And then I think you have to transition from the build into the implementation with that same mindset. I think that you can also use your partners. You can build into – I mean, we started by talking about contracts and how you set up the procurement. You can build into your contracts certain types of support.


That includes having account execs or customer service managers who are attached to your account who can help with training. It includes having professional development, professional learning attached so that you can get that support. So there are mechanisms that you can get from the providers, but then you need to be able to help your staff have the time to participate in that. And I think we're all busy. We deploy tools in our company that I'm supposed to use that I don't know how to use, and I don't have the time to figure out how to learn them. So it's also about kind of making sure you set aside the time and the capacity to train and develop.


And then downstream, there's the measurement. So I think when you engage in using this technology, you need to go into it with some sense of what your goals are. What is it that you're trying to achieve, and how will you know? As a product person, we talk about OKRs, and we talk about North Stars, and I think with implementations of new software that you've purchased, new platforms that you've purchased, it's about understanding those things up front so you know what to look for: what will let us know if we're successfully deploying this, what will let us know if our teachers are adopting and using it. If you don't have those measurements, you won't be able to tell. So working with that cross-functional group, establish what those success metrics will be, and then as you implement, try to measure those things where you can.


Kevin Gray: Yes, and of course, software doesn't exist as a single piece of software, it exists in an ecosystem. And Deshbir, I know that's an area you have a lot of experience with, really thinking about how to smooth out integrations so that it functions the way you want it to. Can you talk a bit about that, about the sort of buyer-reseller responsibility there?


Deshbir Dugal: Absolutely, I think there are very few systems that work in isolation today. Any scenario you can set up will have multiple platforms, multiple systems, and there's data flowing all through them. I think I'll give the example of a student's profile. One would argue that the district should have systems that own the profile and only provide anonymous tags to the cloud AI systems to provide personalization. So therefore, that's kind of one way you address privacy.


So it's these partnerships where, working out who's responsible for what, what's the right sub-organization where the data should lie and so on. It's coming together, working together as a team. I come in from a point of view of an implementation partner, and we really work the best when we have a collaboration with our clients to really figure out what's the right way of solving that problem, what's the right point of solving some of these issues.


Kevin Gray: Sarab, we talked a little bit as well about how there's always a fear with new technology, right?


Sarabjot Singh: Sure, so I think technology on its own is never useful. It has to be the community that comes around it. So as technology developers, we've always got to think in terms of what is the problem we are trying to solve? And educators, you know, they're in an unenviable position: we all know that the ideal class size is one, the ratio, right, of educator to student. But they're dealing with 20 to 30 students, maybe 40, 45 students in certain parts of the world. They don't have the time, and they have administrative stuff to also be doing.


At the same time, if you say, now can we actually start looking at this new shiny object that I've just created, it's not going to happen. So the first thing I think we need to think about as technology creators is to say, "What is the problem we're trying to solve? Is it a real problem?" And if it is a real problem you're trying to solve, then it can definitely get the community behind it, right? We ourselves as a foundation have seen that we have limited resources, and we've been able to scale what we do in our teaching because of technology. But only because we focused on essentially what are the problems, what are the bottlenecks. Let's figure out those bottlenecks rather than just saying, here's some great technology that we bring into the market.


Kevin Gray: I mean, that really goes back to your first comment when we started this about focusing on the user, not on the feature set.


Sarabjot Singh: Absolutely.


Kevin Gray: Good. I think we have a few minutes for questions, if there are questions from the audience.


Anssi Valiaho: Hello, I'm Anssi from Finland. We teach digital skills, keyboarding, coding, digital citizenship, digital literacy, game-based pedagogy, nice things, engaging. Of course, as part of it, we're doing AI literacy lessons for the students. First lessons coming out this fall. And therefore, I'm also asking out of interest in AI. So what do you think is the biggest opportunity and the biggest risk of AI in education over the next few years? So biggest opportunity and biggest risk in education. And how do we tackle them together?


Kevin Gray: Jason, you and I were just talking about this at the table, not from an education standpoint, but from a workforce disruption standpoint. Can you talk about the role of curiosity and creativity in this? Because I think those soft skills really play into – thank you for setting that up – the conversation we just had with the panel.


Jason Fournier: Yeah, so, I mean this is, Kevin warned us earlier that he was going to have to cut us off, and this is one where I'll get the hook halfway through because I can talk about this forever. I mean, opportunity, I think in education we've always talked about personalization. It's been something we aspire to, but having worked on the vendor side throughout my entire career, there's a lot of expense in personalization. So oftentimes we would have to pick, you know, you might do an on-level version, a slightly ahead version, a slightly behind version. I think one of the biggest opportunities is that with technologies like generative AI, we can create many, many, many permutations. We can bring data there on how we create those, and so now it's not about one lesson plan for those 30 or 45 students. It might be about individual lesson plans for every one, tailored on a daily or weekly basis depending on how students are performing. That takes a completely different approach as a teacher. It changes your role. It changes how students engage, and it creates demand for data in ways that today we don't really service, and we need to think about that. So that's the opportunity. The risk is it's moving so fast, and it's being driven by a very small group of technocrats, and, I would argue that the faces of the people in the rooms making some pretty big decisions are not representative of the faces of all of us. And I think that's a problem.


So that's where curiosity comes in. I think all of us need to show up and be curious about what this technology is capable of and what it does. And if we abdicate our responsibility to be engaged, then we're going to get products that don't take into account our points of view and don't address the risks that we see inherent in those products. Jack Clark at Anthropic has a quote that I really like about curiosity and the fact that it's neither cultivated nor evenly distributed. And in this moment, with technology moving as fast as it is, the most curious of us will win. They will see the biggest rewards. And it's because they're leaning in, they're engaging, they're experimenting.


The labs make general-purpose technology. They're not well-positioned to understand how it will perform or what its strengths are in any specific domain. ChatGPT is not an expert in education. It's just good at responding to user queries. It's tuned to respond to intent and to satisfy. And it's on all of us to then take that technology and figure out, in our domain, how is it best applied and what is it most capable of. And so I would encourage all of you to bring your expertise to the table in experimenting with that, and to lean in and be curious.


And if that means the first time it doesn't do what you expect, don't walk away and assume that it's bad or that it can't do it. Continue to learn, continue to level up, and try different things. And eventually, you'll have what many of us who've worked with AI would describe as that mind-blowing moment where it does something you didn't think that it could do in a way that you didn't anticipate. And when you get to that point, what I find is that's a revelatory experience for most people. You know, for me, it was asking it to create a mini proof of concept. I didn't think it could do it, and it did it in one shot, and I was quite surprised. I was using research in education and assessment to have it create a proof of concept that brought that to life, and I didn't think that it knew enough to do that, and it did, and it surprised me, and I've been around this technology for a little bit now.


So, I would just encourage all of you to kind of stick with it until you have that moment, and cultivate those soft skills, and I think when you think about AI literacy, for me, it's the soft skills that are really transferable. Learning ChatGPT is not a durable skill. The next version is going to blow away anything that you trained on last week, and we just saw that with ChatGPT 5. It's curiosity, it's tenacity, it's willingness and ability to learn. Those are the skills that stick, that you can reuse as this technology matures and evolves.


Kevin Gray: Deshbir, you talked a lot about not thinking of a technology out of the box, but really getting it to do what you need it to do. Can you add on to what Jason said there with that kind of mindset?


Deshbir Dugal: Absolutely. I think everything that Jason said is perfect. For me it's as simple as it's a human with a machine. AI is meant to make teachers more effective and to scale better. That's the opportunity. The danger is that you start believing that the AI can be more independent. It's not intelligent. It's pattern matching. That's what it is. That's the danger and that's the risk if you don't really see that. It's balanced. It's got great potential and opportunity to solve certain types of problems but not necessarily all problems, not at this point in time.


Kevin Gray: We probably have time for one more question.


EdExec Delegate: So let me frame a problem I see. There's a disconnect – I think the disconnect is more apparent today than it was 20 years ago – between the model of education we have, which is very comprehension-based and transactional, with standardized tests and such as the goal, and the fact that we have 12-year-olds who are able and willing to create and be authentic contributors to communities and just change the whole model of what they're doing. That was not possible when I was 12, and it is today. So my question around that disconnect is, from a vendor's perspective and an educator's perspective, how can we help ensure that AI is not helping us run the wrong race faster? Like we're developing tools to meet that goal instead of the other way around.


Kevin Gray: I feel like that's Jason or Sarab.


Jason Fournier: I think it's a great question, and frankly, if you have the answer to it, let's connect. I don't have an answer to that.


We're thinking a lot about that right now, and I think what's clear is that as technology evolves – and it's not just AI; we've been on a path of many technology innovations that are hitting classrooms and changing what we do – this is just faster in some ways than what I think we've felt before. I would argue education hasn't always moved very quickly, and so we've always been a little slow to catch up with innovation, and now we're in this place where it feels like a runaway train that we're trying to keep up with. I think what's important is that we ask the questions, and I think there is an instinct in moments like this to put your head in the sand and keep doing what you were doing. This technology, I think, is going to be very transformational. If you think about what's happening in the press right now, the general thread is that we're going to see a lot of job loss. And so if part of the mission of K-12 education is preparing learners to go out into society and contribute, whatever that contribution looks like: start a company, be an employee at a company, be a contributor to democracy. In the face of potential job loss due to AI, what is our goal? Is it to create liberal arts thinkers? Is it to create entrepreneurs? Is it to create, I don't know, consumers of universal basic income?


I think these are the questions we need to be grappling with, and I would say I don't think there are enough people in our industry really grappling with the question at the moment. So I hope that doesn't seem hand-wavy. I want to be honest: I think we're all struggling with that question. Fundamentally, I think we will see education change, across all levels. What that looks like and what's on the other side we have to create together, and I think that's something that is still TBD.


Kevin Gray: I was going to give you the hook because I wanted to make sure we heard from Greg. I'll give you my mic. As someone who's been through the technology – because you started with Apple, right, and you've seen this implementation – let's leave you with the last word.


Greg Bagby: One of the things I was thinking of: a lot of districts across the country are doing a portrait of a graduate. That's really cool, that's the thing. But they're not working with their partners in the industry to create these portraits of the graduates. A lot of them are not. There are some that are. And if partners can work with the districts that are creating these portraits of a graduate, we can try to answer those questions as to, yeah, we're trying to make these little bots of people that can do this work, this job, this thing. But I think if we worked closer together – I know that sounds crazy; yes, silos are for grain, not education – I think we can overcome this fear, or at least do better. Because we've not done well for a long time. And when you know better, you do better, and we've known better, but we've not done better.


Kevin Gray: All right. Thank you. So thank you to the panelists. Please join me in thanking them. These are pretty weighty topics, so I'm sure you can carry any of them on over the next day or two, probably later with some wine, along with all the things we couldn't fit on stage today. Thank you.
