Students Achieve Even More Math Growth Thanks to Program Improvements
Happy Numbers Students Achieved Even More Math Growth This Year! How Did We Make It Happen?
What we’ve achieved
Our students now successfully solve more tasks and spend more time on the platform than they used to. On average, the number of successfully completed tasks increased by 24% compared to the previous year. This reflects significant math growth, which we can also see in the results of the End-of-Year Assessment.
How did we achieve this feat? We improved 342 of our tasks based on teacher and student feedback and on anonymous metrics showing how students work through the lessons. The research was fun and challenging, and in this article we’d like to share some of the findings from our analytical adventure.
Continuous improvement: keeping it challenging (but rewarding!)
The main focus of the Happy Numbers Content Team is to make learning meaningful and fun, and to always keep students in their Zone of Proximal Development. To do so, we continually monitor the difficulty of our tasks: they should be challenging but achievable. First of all, it’s important that every student start from the appropriate place in our curriculum. You can learn more about how we developed our Placement Test here.
But what happens after the test, when students encounter exercises that are too difficult, unclear, or boring? They just close the tab! That’s why a 25-person team of methodologists, user experience designers, illustrators, Spanish translators, programmers, and quality assurance engineers is constantly working to improve the content!
Here's how we spot problematic tasks
As you know, Happy Numbers constantly collects anonymized data on student performance and uses it to monitor the following metrics:
- Number of mistakes students make
Since each task has to present some level of challenge, student errors are expected. With individual feedback on different kinds of mistakes, however, students should be able to reach the goal with only a few errors. If students make more than 4 mistakes on a task, it means we haven’t provided the required level of support, and the exercise should be improved. In a well-designed task, we therefore expect the number of students to fall off exponentially as the mistake count grows: most succeed with zero or one mistake, and very few need more.
- Amount of time students spend on a task
We expect that, on average, students should be able to solve one complete task within 3 minutes in PK-K, within 5 minutes in grades 1-2, and within 7 minutes in grades 3-5. Our research shows that students who take longer get frustrated.
- Number of students who drop off after a task
We monitor the percentage of students who stop during or after a task and do not return to Happy Numbers within a month. This is another signal that an exercise is boring or hard to understand. We rework these tasks to maintain student interest.
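The three metrics above can be combined into a simple flagging rule. The sketch below is purely illustrative: the record format, the 10% tolerance on the mistake count, and the 5% drop-off cutoff are our assumptions for the example, not Happy Numbers’ actual pipeline.

```python
# Illustrative sketch of the three flagging metrics described above.
# Field names, thresholds, and data layout are assumptions for this example.

# Expected time to finish one complete task, per grade band (seconds).
TIME_LIMITS = {"PK-K": 3 * 60, "1-2": 5 * 60, "3-5": 7 * 60}

MAX_MISTAKES = 4            # more than 4 mistakes means support was insufficient
MISTAKE_TOLERANCE = 0.10    # assumed: flag if >10% of students exceed MAX_MISTAKES
DROP_OFF_THRESHOLD = 0.05   # assumed: flag if >5% of students don't return in a month

def flag_task(records, grade_band, drop_off_rate):
    """Return the list of reasons a task needs review.

    records: one dict per student attempt, with 'mistakes' (int)
    and 'seconds' (float). drop_off_rate: fraction of students who
    stopped at this task and did not return within a month.
    """
    reasons = []
    n = len(records)

    # Metric 1: number of mistakes students make.
    over_limit = sum(1 for r in records if r["mistakes"] > MAX_MISTAKES)
    if over_limit / n > MISTAKE_TOLERANCE:
        reasons.append("insufficient support: too many mistakes")

    # Metric 2: amount of time spent on the task.
    avg_time = sum(r["seconds"] for r in records) / n
    if avg_time > TIME_LIMITS[grade_band]:
        reasons.append("too slow: average time over the grade-band limit")

    # Metric 3: students dropping off after the task.
    if drop_off_rate > DROP_OFF_THRESHOLD:
        reasons.append("drop-off: students stop after this task")

    return reasons
```

For example, a grade 1-2 task where 3 of 10 students made 6 mistakes would be flagged for insufficient support even if the average solve time stayed under 5 minutes.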
How do we analyze problematic tasks?
First, after we identify a problematic task, we go through the topic and take a fresh look at it. Sometimes the problem can be spotted right away. In other cases, though, the cause is less obvious, and we have to dig deeper into the student answers themselves.
There are a few important markers that we observe:
- We track the share of students who solve a problem after their first or second mistake. Since we provide individual feedback on each mistake, more than 80% of students should be able to correct themselves after receiving it.
- We monitor the maximum number of mistakes a student makes in one exercise. A high number of attempts to solve the problem can show that an additional explanation of the concept is needed.
- We analyze the most common incorrect answers. For example, at first sight, mistakes in the number sentence below might suggest that students don’t have a strong understanding of bar graphs. But after looking deeper at students’ answers, we found that 52.7% of students used the bars from the previous questions: purple and green. So students misread the question out of inattention rather than lack of knowledge.
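The markers above can be computed with a few lines of analysis code. The sketch below is a hypothetical illustration: the data shape (one list of answer strings per student, in the order they were given) and the function names are our assumptions, not the real schema.

```python
from collections import Counter

# Hypothetical sketch of the answer-analysis markers described above.
# attempts_per_student: one list of answers per student, in order given.

def self_correction_rate(attempts_per_student, correct_answer):
    """Share of students who answer correctly within their first three tries,
    i.e. after at most one or two mistakes. Expected to be at least 0.80."""
    corrected = sum(
        1 for attempts in attempts_per_student
        if correct_answer in attempts[:3]
    )
    return corrected / len(attempts_per_student)

def common_wrong_answers(attempts_per_student, correct_answer, top=3):
    """Most frequent incorrect answers. In the bar-graph example, this is
    how reused bars from earlier questions ('purple', 'green') would
    surface as the dominant wrong answers."""
    wrong = Counter(
        a for attempts in attempts_per_student
        for a in attempts if a != correct_answer
    )
    return wrong.most_common(top)
```

A wrong-answer tally dominated by one or two specific responses, as in the bar-graph example, usually points to a question-design problem rather than a gap in understanding.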
How we increased math growth (+24% compared to last year)
1. We completely redesigned several topics, making them more interesting, more colorful, and animated. Part of learning math is, unavoidably, solving a huge number of tasks, so no wonder students get bored when the exercises all look the same and are not engaging.
We even got some feedback that the kids thought they were solving the same problem multiple times. To engage our students, especially the youngest ones, we’ve added game-like elements:
2. We also made a number of small changes to fix these common problems:
- An explanation or model was more complex than the concept it represents;
- An interaction with the interface was not easy or intuitive enough;
- A task provoked mistakes caused by inattention rather than misunderstanding;
- An explanation was insufficient, leading to difficulties with abstract problems.
We now see that students solve more exercises and spend more time on the platform this year. Compared to last school year, the average number of exercises completed per student has increased by 24%! Like our little colleagues, your students, we’re always learning and improving. Our entire Content Team thanks you and your students for not letting us relax, and for pushing us to continually improve and grow!