EdExec Summit 2025: Tuesday Morning Keynote
- Ravi Ranjan
- Sep 5
Updated: Sep 10
August 19, 9:00-9:50 am

"State of the Market Breakfast: Back to School is Here. Back to Normal is...Not So Much."
Prepare for the new school year with a panel discussion designed for companies and school districts. This session will tackle key topics, including procurement tips, ensuring safe and compliant technology, fostering cultural responsiveness, and creating a culture of inspiration. Of course, AI will be a recurring theme throughout the discussion, but this expert panel will provide broader advice and insights to help you prepare for a successful and productive back-to-school season.
Speakers:
Deshbir Dugal, SVP, Compro Technologies
Sarabjot Singh, Co-founder & Chief Data Scientist, Tatras Data
Jason Fournier, VP, Product Management AI Initiatives and Data, Imagine Learning
Greg Bagby, former Coordinator of Instructional Technology for Hamilton County Schools
Moderator: Kevin J. Gray, CEO and Founder of Product & Process
Below is a transcript from the EdExec Panel, broken into three parts. In part 1, the panelists delve into how to balance security and compliance concerns while providing market-leading technology products.
Part 1: Introductions; Security & Compliance
Kevin Gray: Morning. Thank you. Good morning, everyone. I am really excited to be hosting and moderating this panel this morning. It's been really fun putting this together. We've had some really great conversations. I'm going to try and give you a few minutes at the end for questions. First, I want to introduce the folks. So what I'm going to ask is that each of the folks on the panel introduce themselves, give their title, and then also a throwback to Jim's circle exercise yesterday. Tell us about your role. What lens are you bringing to this conversation? So, Greg.
Greg Bagby: Greetings all. My name is Greg Bagby. I've worked at a district for 32 years and now I'm working for myself as a consultant as well as at a private school. I've done work through about, I guess, my 32 years with districts, not just mine but others across the country, with Apple and NCCE.
Deshbir Dugal: Good morning. I'm Deshbir Dugal, Senior Vice President of Compro Technologies. I'm essentially an engineer: I help districts and clients get things done, implement this technology, pull things together, and really take you across the line, get things into production.
Sarabjot Singh: I'm Sarabjot Singh. I used to be an academic in the United Kingdom for around two decades. I've moved back to India now to wear two different hats, or turbans, as you may say. I run a foundation where we teach AI and machine learning to students, but also have an AI product development house, so we work on both sides.
Jason Fournier: Nice to meet all of you and be here today, especially in the very enviable spot of breakfast, first thing in the morning. I'm Jason Fournier, VP of Product Management for AI Initiatives and Data at Imagine Learning. We are focused on curriculum-informed AI, the ability to bring AI systems to market that are grounded in the curricula that have been adopted by districts and classrooms. My lens is, I bring a product development perspective to the table. What jobs are customers hiring our products to do, and how do we help them find efficiency to get things done in the classroom?
Kevin Gray: Thank you all. So, we've got three questions we're going to go over this morning about the state of the market. It will be tech-informed. AI is actually going to come up twice, three times here, but we're also going to talk about the general state of the market. We're going to talk about security and compliance. We're going to talk about culturally responsive education. We're going to talk about how to inspire learners. So, we'll start with, what are the most critical steps companies and school district leaders can take to ensure their tech is both secure and compliant with student privacy regulations? Deshbir, I want to start with you because in a lot of our conversations, and I think this theme is going to come throughout, we talk about balance, right? The need to balance risk and benefit. So, can you start us off in that area?
Deshbir Dugal: My view, coming from the lens of, you know, actually getting these things done, is that there has to be awareness. So, yes, as a vendor, you can provide solutions, you can provide options, but it's really important that some of the leaders in this room are also aware of what it means to get security done, to challenge some of the solutions that are demoed. Ask pertinent questions. For example, you know exactly what the issues might be with your students, your environment, so it needs to be sort of a foundation. Yes, as vendors, we do have a lot of options, some really magical solutions, but there is no magic to it. At the same time, you know, there are really good solutions out there, especially for security. It's a problem that matters.
Kevin Gray: Sarab, if you would take this a bit further. You've talked about working around PII, but also some of the dangers that can still emerge. Can you share a bit about that?
Sarabjot Singh: Sure. So, you know, we're obviously very aware of PII data, and now, of course, technology can be created without having access to PII data. There's a lot that can be done. Machine learning and AI can be developed without personal information being tied up in them. I think what is often missed, though, is the data leaks that can happen. I lived for a number of years in Northern Ireland, and it was completely not done to ask somebody whether they were a Protestant, a Sikh, or a Catholic. But by asking about the school they had studied in, they divulged that information. There are high correlations there.
And making sure that we are aware of those correlations when we are dealing with vendors that want certain data, and making sure that we're protecting our children from that, I think, is very important. It does become difficult when you're dealing with speech data, for example. I've worked in AI across multiple domains. One of the areas we found the police are very interested in is analyzing social media to understand the background of people and gangs based on the language they use. So when we get to natural language, when we get to speech, it's tougher. But we need to be aware of what can be learned from giving that information away.
Jason Fournier: I think one of the questions of our time right now is how much benefit the technology needs to offer for the risks that go along with it to be acceptable. And I think in the product space, that's an important question for all of us to be asking, whether we make products or whether we adopt products. There are a number of technologies being developed right now that will listen to audio of a classroom and then analyze that audio and provide insights to teachers or insights on student engagement. For some of our products, we know we have low digital use, but there's a lot of activity that happens in the classroom; a lot of learning happens through dialogue. So there's tremendous value in collecting that data. But to Sarab's point, now you've recorded student voices, which is biometric data, and you need to protect it. Conversations can also happen in the classroom that are sensitive, where kids might not want that recorded and shared.
So holding vendors and product developers to account for what data they're collecting and how they're using it is really important. Asking good questions: how long do you store this? Do you need to store it? Just because you recorded it as audio doesn't mean you need to keep the audio. You can get rid of the biometric data and keep the transcript and still be able to do really interesting work with that. So as things evolve, we want to evaluate the capabilities and adopt the things that have meaningful impact, and we want to make sure we do it in a way that's safe.
Kevin Gray: Yeah, you really talked about having to understand your context and your procurement, and so Greg, you're somebody who has a lot of experience with that sort of procurement process. Can you talk about that, from a district standpoint, about what you are doing and what you are expecting vendors to be doing?
Greg Bagby: Of course, yes, we're thinking of FERPA and COPPA compliance. One of the things I ran into with my district a couple of years ago: there was a company from overseas that wanted to provide some solutions for us, and they said, oh, this is GDPR compliant. Someone said, I don't know what that is, and I was like, you know, that's actually a little more stringent than our COPPA and CIPA compliance. It causes us to do more. They're protecting us a little bit more. So just think about the things we have to be compliant with, and then think about that when you're creating these policies, as some call them.
We have these, I just lost the acronym, AUPs. You have the acceptable use policies for your students, but you also have a privacy agreement that you would write with the district and the company, so that you're setting the guidelines: this is the information we're gonna share, this is the information you're allowed to have. And then understanding that data life cycle, like you were talking about, explaining, hey, you're gonna have our data, and our kids matriculate at this time. I need all that data to go away. What are you gonna do with that data? How are you destroying it? How are you getting rid of it?
And I found that there are some districts that don't even think about what's gonna happen with the data after the company takes it and the student leaves. It just boggles my mind when I ask, so what is your data privacy agreement? And they say, oh, we didn't do one with this company, but with this company we have this. Districts need to be aware, and companies, if you have a boilerplate DPA, go ahead and say, by the way, we have these things, and allow districts to see them, because believe it or not, there are some districts that kind of forget about those things.
Kevin Gray: Well, and I'll add, this came out of a conversation yesterday, and I won't single the individual out, but she pointed out that the companies also forget, right? Particularly companies that have been acquired. She was talking about needing to end a contract and needing that data scrubbed, and her salesperson and the customer service person were no help. So I think one of the things to really think about here is, it's the balance, right? Risk/reward. It's also the relationship between the vendor and the district, making sure that these things are held to what's in the agreement, and knowing what the agreement is.
Greg Bagby: And I think that's a great point about companies being acquired, because when I was in my district, we were working with one company, and then another, larger company acquired it. And we did not want to do business with that company. It's one of those things where you're, okay, so we paid this money for this product, but now they own this product. How do we know that our data is safe?
Kevin Gray: So, Christine, probably a good session for the future is how do you navigate getting out of contracts with vendors because it sounds like it's very, very sticky.
Stay tuned for Parts 2 and 3. Part 2 will cover how to ensure technology reaches all learners, while Part 3 discusses how to use technology to build community and inspire students, teachers, and the surrounding education community.