When I decided to join eSpark Learning 6 years ago, I wanted to know two things: 1) “Are the people on the bus here for the right reasons?” and 2) “Does this technology work?”
It was important to me that a mission-driven organization like eSpark had mission-driven people building tools for students. I’ve come to learn that the team at eSpark Learning is a passionate group of individuals focused on ensuring the student experience in our products is both engaging and academically enriching.
In the fragmented edtech market, it’s all too easy for companies to build a shiny new product, attach a few buzzwords to it, and start selling it to overwhelmed school leaders who need solutions fast. While it’s difficult, companies that want to meaningfully transform education must measure product efficacy and share the results, good or bad.
In 2012, I conducted my first academic efficacy analysis on our flagship product, eSpark, for a school district with 80 students. Since then, my team and I have conducted academic efficacy analyses covering over 200,000 students. These studies have consistently shown that eSpark leads to significant gains in student achievement, and each one has taught us valuable lessons that have allowed us to refine our personalized learning system.
When eSpark Learning launched Frontier, our newest product designed to build better readers and writers, we applied the same methodology focused on engagement and academic enrichment. Partnering with researchers at Columbia University’s Teachers College, we set out to determine if Frontier was contributing to student growth and identify opportunities to improve our product. We asked the Teachers College researchers to answer two key questions:
- Does Frontier improve academic achievement for struggling learners?
- What can we learn about what engages students most in Frontier?
The Columbia University researchers found that more time in Frontier is significantly correlated with stronger performance on the NWEA MAP Assessment, particularly among the most struggling students (those who start off performing below the 50th percentile). Specifically, among these students, those who spent 45 minutes per week in Frontier were 21% more likely to meet expected growth targets than peers who spent 5 minutes or less in Frontier each week. School districts measure themselves using this metric, so it’s important to us that our success and school district success are aligned.
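The kind of comparison behind that 21% figure can be sketched as a simple cohort analysis. The function, field names, thresholds, and toy data below are illustrative assumptions for exposition only, not the Teachers College researchers’ actual methodology or data:

```python
# Hypothetical sketch: compare the share of students meeting growth
# targets in a high-usage cohort vs. a low-usage cohort.
# All names and numbers here are illustrative, not eSpark's real analysis.

def growth_rate_by_usage(records, low_max=5, high_min=45):
    """Split students into a low-usage group (<= low_max min/week) and a
    high-usage group (>= high_min min/week), then return each group's
    rate of meeting growth targets plus the relative lift."""
    low = [r for r in records if r["minutes_per_week"] <= low_max]
    high = [r for r in records if r["minutes_per_week"] >= high_min]
    low_rate = sum(r["met_target"] for r in low) / len(low)
    high_rate = sum(r["met_target"] for r in high) / len(high)
    return low_rate, high_rate, (high_rate - low_rate) / low_rate

# Toy data: weekly Frontier minutes and whether the growth target was met.
students = [
    {"minutes_per_week": 3,  "met_target": False},
    {"minutes_per_week": 5,  "met_target": True},
    {"minutes_per_week": 4,  "met_target": False},
    {"minutes_per_week": 2,  "met_target": True},
    {"minutes_per_week": 50, "met_target": True},
    {"minutes_per_week": 45, "met_target": True},
    {"minutes_per_week": 60, "met_target": False},
    {"minutes_per_week": 48, "met_target": True},
]

low, high, lift = growth_rate_by_usage(students)
# With this toy data: low = 0.5, high = 0.75, lift = 0.5 (50% more likely).
```

A real study would of course control for baseline achievement and other confounds; this sketch only shows the shape of the headline comparison.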
The research team also examined whether stronger-performing students gravitated toward particular Frontier topics (e.g., sports, science, pop culture). They found no such pattern: in every topic category, the students who used Frontier more were also the most likely to meet expected growth targets. It was encouraging for our curriculum design team to learn that Frontiers across all categories were engaging and enriching for students.
Knowing that the products we’re building have strong academic efficacy, particularly among students who typically struggle in the classroom, assures us that we’re on the right track to transforming education for the better. By continuously monitoring the impact Frontier has on students, we can adjust our lessons to better address the unique needs of classrooms nationwide.