
A/B Testing Your Classroom for Student Achievement

Teacher Pinterest boards are filled with innovative new classroom ideas. “Oh, awesome lab for teaching the water cycle.” “Oooh, I like this classroom motivation strategy!” We Pin and bookmark with the intention of building lesson plans that reach the level of engagement and learning all teachers strive for. But are we truly experimenting in a meaningful, purposeful way?

The relationship between teaching and digital marketing

In digital marketing, there is something called A/B testing. It’s a strategy (or experiment) where just one variable is changed in digital content: a red button versus a blue button, large font versus small font. The idea is to keep experimenting with little tweaks to see which garners the greatest returns, e.g., click-through rate in emails. As a former teacher and now a student of digital marketing, I know that teachers already do A/B testing, even if it’s not explicit. You might start with a Think-Pair-Share instead of your Do Now between first and second period iterations, or you might pass out scissors before passing out construction paper between yesterday’s and today’s iterations. Which decision produced the smoother classroom workflow or the deeper student thinking? You’ve just A/B tested your lesson plan.
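For the quantitatively curious, here is a minimal sketch (in Python, with made-up numbers) of how a marketer would compare two lesson-plan variants the same way they’d compare two email subject lines. The scenario and counts are hypothetical:

```python
import math

def ab_compare(successes_a, total_a, successes_b, total_b):
    """Compare two lesson variants by 'success' rate (e.g., exit-ticket mastery).

    Returns each variant's rate and a two-proportion z-score; |z| > 1.96
    suggests the difference is unlikely to be chance at the 95% level.
    """
    rate_a = successes_a / total_a
    rate_b = successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_b - rate_a) / se if se else 0.0
    return rate_a, rate_b, z

# Variant A: Do Now opener (1st period); Variant B: Think-Pair-Share (2nd period)
rate_a, rate_b, z = ab_compare(18, 28, 24, 27)
print(f"A: {rate_a:.0%}  B: {rate_b:.0%}  z = {z:.2f}")
```

With class-sized samples the statistics are rough, of course; the point is the habit of changing one variable and counting the result.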


Image via Six Nutrition

Now that it’s summer, it’s time to go to the next stage of A/B testing: evaluating, “grading,” and iterating. Teachers know first and foremost what works and what doesn’t, but do you truly know? Is it backed up by evidence of student work, with identifiable patterns? Our teaching should be laser-like and precise, but it cannot be that way without hard-won evidence. Let’s do a deep dive:

1. Evaluating gives you a map of where you need to go

After you’ve gathered and organized your evidence, it’s time to study your past work, just as you ask your students to do, and ascertain its success. Did the students “get it”? What was the response level? The purpose of evaluating is not only to prove the lesson’s level of success, but also to give you cold, hard evidence of what did not work. Nominet Trust provides 10 reasons why you should start evaluating: evaluating gives you insights into unexpected outcomes, and more variables for future “A/B tests.” Most importantly, if a lesson is working, you can know why it worked. You are then better prepared to show the value of your experimentation to your colleagues and administrators.

Image via Mappio

2. Grading yourself

Lifehacker quotes one teacher who graded herself by asking, “How many kids am I steering in the right direction?” Grading your teaching might take the form of how many students “got it” or what kinds of lessons were created. You’ve graded yourself on college course evaluations where the professor asked you to grade your own participation— and you were always stumped, probably giving yourself a lower grade than you deserved. We’re our own worst enemy, but when you ask yourself the right questions, you can give yourself a truly authentic grade that measures the value of your work. The data you gain from this grading exercise will be helpful for the final step in A/B testing your classroom.

Image via Soda Head

3. Iterate, iterate, iterate

I hear this word a lot in the digital marketing world, but it’s a word I didn’t hear much when I was a teacher. However, as with everything mentioned above, teachers already do this! First we evaluate our work, then we grade ourselves. The next step is to make little changes in the next experiment, see the results, and start the whole process over again. A/B test your lesson plans! Maybe in a scientific method lesson plan, you taught the fundamentals of graphing before the lesson, to mediocre success. In the next iteration, you can teach graphing within the scientific method lesson plan so that students can better contextualize it.


Evaluate. Grade. Iterate. It takes a great teacher to be able to look at their past work, student work, and ideas and give an objective evaluation and grade, but in the end, it is all part of the journey of fine-tuning our craft to become the best teachers we can be.

Tomorrow, we’ll have a ProTip that will help take this first step further— by asking the right questions.

Want to teach your students how to evaluate, grade, and iterate their learnings? Sign up for Gradeable and learn more!


The Key to Improvement is Reflection

Like all you first-year teachers out there, I am wrapping up my very first school year. I started writing for the Gradeable blog back in September of 2013, and it’s been a whirlwind of researching pedagogy, learning my audience of educators, and keeping on a schedule. More often than not, taking a good, hard look at my work took a backseat to keeping up with the blog. Sometimes, I felt like I was missing the forest for the trees. Perhaps you can relate.


The key to improvement is in our backlog

As the school year winds down, I have more time to clean up the blog, assess areas for improvement, and gear up for the next school year—just like you are probably doing before September rolls around.  Some of it is fun—revisiting pieces I forgot about. Some of it makes me cringe—like finding posts without pictures. Most of it is tedious—going back to make sure everything is organized. Although it would be much easier to leave this year behind, charge forward, and focus on next year, improvement starts by making sense of what’s behind us.

So it’s the end of the year. You’re surrounded by extra copies of assignments long past, old projects students never took home, lost homework that suddenly reappeared. If you’re like me, you have an overwhelming urge to file everything in the circular bin and start anew. However, we all know that in that mound of work lies a primary resource for improving your craft. According to educational theorist David Kolb, “reflection plays an important major role in the transformation of experience into knowledge.” [sic] The key to improvement is in our backlog.

To be effective, reflection for improvement must be deliberate. I’m not just rereading my blogs to reminisce about blogging days gone by—I’m analyzing my efforts to see how we can pivot to create better content, to increase reader engagement, and to organize more logically. For example, one thing on my agenda is to comb through all our blog posts to make sure they are labeled consistently. I’m making sure all the exit ticket posts are tagged “exit tickets” and “formative assessment example”. That way, at the end of next year, I can avoid manually re-labeling everything. By looking back on the things I did and adjusting accordingly, I set our team up for success the next time around.

Some questions to ask yourself

Reflection of a school year starts with the syllabus or your lesson plans. Dig up those bad boys (or girls) and match up your original plan with how you executed it. The University of California, Berkeley has some pretty good reflection questions to start with:

  • What worked well in this class, and why? What didn’t, and why?
  • Where did the students seem to have difficulties?
  • Were there any noticeable points where the students seemed very engaged with the material?
  • What types of things may need greater clarification the next time?
  • Were there any particular pedagogical strategies that seemed to work well?
  • What will I change the next time I teach this topic?

Of course reflection is unique to each teacher and each lesson plan, but the idea is the same:  analytical reflection helps us act instead of react. Next time around, what will you do differently based on what you’ve learned?

We’ll be spending our summer months talking about strategies and philosophies around reflection for improvement. Got ideas to share? We’d love to hear them, so comment below! Wondering how to streamline the reflection process in your classroom? Learn more at www.gradeable.com.

Bonus for the engineers out there: Doesn’t all this feedback talk make you think of control systems and feedback loops? =D

Double bonus for people interested in control systems: “Feedback loops take the system output into consideration, which enables the system to adjust its performance to meet a desired output response.”
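Triple bonus: the idea in that quote can be sketched in a few lines of Python. This is a toy proportional controller (entirely illustrative, with made-up numbers), but it is the same loop teachers run: measure, compare to the goal, adjust, repeat.

```python
def feedback_loop(target, current, gain=0.5, steps=8):
    """Minimal proportional feedback loop: each step, adjust the output
    by a fraction (gain) of the error between target and current value."""
    history = [current]
    for _ in range(steps):
        error = target - current      # measure how far off we are
        current += gain * error       # correct a fraction of the error
        history.append(current)
    return history

# The output converges toward the target because the loop keeps
# "taking the system output into consideration."
print([round(x, 2) for x in feedback_loop(target=100, current=20)])
```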


How Tracking Student Data Improves Teaching Practices


Guest Post: Holly Walsh, Common Core Resources Associate, The Achievement Network

Holly is a former teacher and now works as a Common Core Resources Associate at The Achievement Network. She is also a Teach for America alumna and Corps Member Adviser at TFA’s teacher training Institute, where she led instructional sessions on why and how to start tracking classroom data. Read on to learn how she did it!


My first semester as a teacher was… well, exhausting. It was certainly exciting and challenging (and many other things, too) but the bottom line was that I was tired. When I heard about student data tracking practices (in this case, meaning students would individually keep track of their progress in class), I was excited to give it a try, but I felt like I just could not add one more thing to my schedule or management load. I decided to carve out some time during winter break to plan and introduce tracking practices the following semester. The results were incredible.

Why should students track progress?

First and foremost, tracking progress was a huge motivator for my students. I saw immediate changes in student investment after I started tracking in my classroom. My students were empowered to analyze their strengths and growth areas and set individualized academic goals. They learned to articulate which classroom and study habits contributed to success as well as how to calculate their growth percentage over the course of a unit.

Tracking also helped me reflect and improve upon my practices as a teacher. Students were excited about tracking progress after each assessment, thus holding me accountable to staying organized and returning graded assessments the next day.   I had immediate data on hand to send home or share during phone calls home and parent-teacher conferences. Seeing trends across my students’ trackers also helped me understand which types of teaching practices were working better than others.

What does student tracking look like?

Student tracking can take on a number of different forms. I taught my students to track data in several ways, but I found the most effective system was when students calculated and recorded their mastery by standard for each pre-test, quiz, and test we took throughout the unit. They received a new tracker every time we started a new unit.

The trackers looked like this:

student goal setting worksheet

I know… it looks daunting. Reading and filling out the tracker was a process I had to teach my students step-by-step. But, once students understood the process, they LOVED them! They have even convinced a few other teachers to use a similar tracking template in other classrooms.

As you can see in the sample tracker above, I asked students to track their percent mastery (listed on the y-axis) for each standard (listed on the x-axis). (Note: SWBAT stands for “student will be able to,” which is how I presented my daily objective to the class.)

You’ll see each standard is broken down further into “pre-test,” “quiz” and “test” columns. At the beginning of each unit, students would take a pre-test, correct it, and calculate their percent mastery for each standard. Next they drew the bar to the correct height in the “pre-test” column. As we progressed through the unit, students would again calculate their mastery for that standard after taking the quiz, and then again after taking the test.
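The per-standard mastery calculation the students did by hand can be sketched in a few lines. This is our own illustrative version, not Holly’s actual worksheet; the “SWBAT” standard labels and quiz data are hypothetical:

```python
from collections import defaultdict

def mastery_by_standard(results):
    """results: list of (standard, correct) pairs from one assessment.
    Returns {standard: percent mastery}, i.e., correct / total per standard."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for standard, is_correct in results:
        total[standard] += 1
        correct[standard] += int(is_correct)
    return {s: 100 * correct[s] / total[s] for s in total}

# Hypothetical quiz: each question tagged with a SWBAT-style standard
quiz = [("SWBAT-1", True), ("SWBAT-1", True), ("SWBAT-1", False),
        ("SWBAT-2", True), ("SWBAT-2", False)]
print(mastery_by_standard(quiz))  # SWBAT-1 ≈ 66.7, SWBAT-2: 50.0
```

Running this after the pre-test, quiz, and test gives the three bar heights a student would draw in each standard’s column.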

As students filled in new mastery scores throughout the unit, they would set study and improvement goals at different points. Sometimes, I would have them discuss their progress and goals with a parent or guardian as part of their homework.

Eventually, the trackers would look something like this:

student data tracking templates (two student examples)

While I didn’t require students to color code their trackers (I had a limited number of coloring supplies and wanted to make this process as efficient as possible), many students started doing this on their own. It definitely makes a tracker easier to read. As you can see in the second student example above, the student color-coded her diagnostic (a.k.a. “pre-test”) mastery scores blue, her quiz mastery scores pink, and her test mastery scores a darker purple-ish color.

Having this data readily available was incredibly helpful. Students traced clear connections within their performance over the unit and were motivated by seeing their mastery improve. Alternatively, if mastery did not improve, it was a great starting point for us to think about factors for why this was the case.

How should this look in my classroom?

Different classrooms have different tracking needs (depending on grade level, content area, etc.). I encourage you to use any of the ideas I described above or create your own system if something else would better meet your needs. Either way, I’d suggest remembering the following:

  1. Students should know what they are tracking, what it means, and how they are growing.
  2. Tracking needs to be consistent in the classroom. Otherwise, it is ineffective. Set up systems that will ensure you return materials (assessments, performance reviews, whatever you’re tracking) on time and students know what is expected of them during class time dedicated to tracking.
  3. Remember to CELEBRATE progress and successes!


Here’s a photo of my “[Au]some” Student Work wall. I would often spotlight entire trackers on this wall at the end of a unit.

Happy Tracking!

See how Gradeable can help you and your class reach your goals at www.gradeable.com.


Video: Why is Data Important?

On March 6, we held a panel for our Assessment Debate. Three experts came in to discuss the finer points of assessments: low-stakes, high-stakes, and alternative forms of measuring student learning. Here are the key take-aways from the “Why is Data Important?” cut:

  1. Determine what the data is telling you, then decide how you will use that data.
  2. Data must be timely.
  3. Frame the data in a larger context. Don’t get too focused on one question or one standard.
  4. Having student data is nice, but teachers need time and support to do something with it.

Read full transcript below, or check out all videos from the Assessment Debate.

Kattie: Alexis, in your work as a coach at the Achievement Network (ANet) using data and interim data, do you have any strategies, or tips and tricks that you can give to teachers out there on how to better use this data and better leverage it in their classroom?

Alexis Rosenblatt, ANet: Sure, I think you have to figure out what that data is telling you. So are you looking at that data to see what you’ve taught and what students have learned? Are you looking at the data to decide whether or not you understood… I mean there is a movement in the country around the Common Core State Standards which have been here for a little while, but I think is actually settling into schools now, and the schools that I work with, so you have to decide, am I looking at something I haven’t taught yet and the students are actually showing some mastery on something so they have some of those skills coming into the classroom so I can leverage when I teach this topic?

But I think you can’t do everything. So if I have an assessment that has 30 or 40 items on it, I have to stop and decide, what am I gonna tease out that I can do tomorrow? What do I need to do long term? What is fitting with the curriculum that is in somebody else’s classroom—that maybe, “Oh look this fits really nicely in science,” or “I’d love for history teachers to be teaching more informational nonfiction text. Let me connect with my peers around this.” So sort of trying to figure out what is the data telling you. And you can’t do everything, so to figure out how to not be overwhelmed by too much data.

Jonathan Ketchell, HSTRY: No exactly, I think that’s one of the problems I’ve encountered throughout my teaching career is teachers never have time… we just never have time for anything, unfortunately, but I think that’s why the digital age is actually gonna be beneficial to everyone. We’re actually gonna create—hopefully through the work that Gradeable is doing—we’re gonna create more time so teachers can collaborate and better their classes, clearly.

Alexis: I think that the idea of getting data in real time, or very quickly, is important so—no offense to the MCAS—but taking an assessment in March and getting data in September or October, there’s so little action you can take on that in terms of those students. So the more actionable the data is I just think the better, and timely is awesome… either in real time or a short amount of time.

Jennifer Spencer, MATCH Charter High School: I think also teachers have a tendency, when they get the data, to hone in too deeply on each individual item as well. So looking at why a student got one particular question wrong and then drilling that kind of question over and over and over again with students, rather than looking at the bigger picture about what kinds of errors the student made on that particular question. I’ve seen that a few times where the teachers that I’ve worked with have said, “Oh well, we need to do this kind of question, we need to make sure the students understand this question better,” and they are kind of… I think I’m gonna use the word flummoxed… is that…

Alexis: Mmhmm, that’s a word.

Jennifer: …about why they are still not doing so well on that particular standard on the next assessment with a different item. And so I think in terms of the moving target aspect, teachers try to steady that target by nailing down that one particular assessment item rather than looking at the bigger picture, as Alexis said, the idea of it being one tool, looking at the data, what does the data exactly show? That’s why it’s important to have someone who is not the teacher helping to frame the data in the greater context.

Alexis: But I think just to [Jonathan’s] point before about time: yes, you want the data to be timely, and then you want the actual space and support to do something with it. So getting data but then having no opportunity to collaborate with your peers or to sit down even on your own and try to do this—I think there needs to be actually time for teachers to then look at, and plan, from the data.


Data Quality Campaign: Better Data for Better Classrooms


With technology becoming an integral part of our classrooms, teachers are handling more and more information on our students. Luckily, there are data proponents out there who make it their livelihood to ensure that the data is accurate, useful, and available for the community.

One of the biggest national proponents is the Data Quality Campaign, a nonprofit, nonpartisan, Washington-based organization that exists to promote better data use within the education sector. Each year, DQC issues a report on the growing capacity of State Longitudinal Data Systems (SLDS, pronounced “sleds”) to provide timely, actionable information to parents, principals, and policymakers. I spoke with Chris Kingsley, DQC’s associate director for local policy, who focuses particularly on making these resources available to community leaders working to improve student achievement both inside and outside of schools.

Below is a lightly edited excerpt from our conversation.

Why was the Data Quality Campaign founded? What is their goal?

The Data Quality Campaign was founded with the core belief that educators do more effective work with access to good data and the skills to use those data—and that this is as true in the classroom as it is at school board meetings and in state capitols.

“Good data” is the key phrase here, and it’s important to understand that different stakeholders define that a little differently. DQC would say that, at a minimum, data need to be timely, tailored, and of high quality. 

Timely data

The need for more timely data is a constant theme in our conversations with teachers. It does no good, for example, to deliver educators information on where their students need additional assistance after those students have moved on to the next grade. You might take a listen to the interview we recently conducted with Evansville Vanderburgh School Corporation (EVSC) in Indiana, which we recognized this year for its really smart work to use data to improve instruction. EVSC voluntarily added an additional set of assessments to their school calendar so that they could get their classroom teachers quicker feedback on where students needed more attention. And while you can imagine that it wasn’t a universally popular decision to add another assessment to the schedule, it does seem to be paying off.

“Tailored” data

By “tailored” data, we mean that different stakeholders have different informational needs – they have different questions that need answering. If you want to see a great example of this principle put into practice, take a look at how carefully the Texas Education Agency designed the StudentGPS dashboard they’re rolling out to schools across the state. Ed-Fi Alliance, which largely designed StudentGPS, sent hardcopies of different prototypes to educators across the state and asked them to literally take their red pens out and let Ed-Fi know what they needed to see and how they needed to see it. StudentGPS offers differently tailored reports for classroom teachers, principals, and district administrators. Good design like this comes from listening carefully to the people who will ultimately use the data.

Beyond compliance data

Unfortunately, a lot of data in the education sector right now is used only for compliance rather than continuous improvement—to prove that a school did what it said it would do, or what a legislature mandated that it do. Those kinds of data may be necessary, but they are not sufficient. And so DQC works to build the preconditions for data uses that really do have impact: not only the technological infrastructure, but the policies to govern the security and privacy of data, strategies to more effectively communicate to parents about why collecting and using data in these ways is valuable, and training for educators at all levels to build this into their professional practice.

How do you collaborate with schools? What data do you collect?

The Data Quality Campaign doesn’t collect student data. What we do is work with states and partner organizations to make smart policy about how schools are collecting and using data. To give you one concrete example, we are currently facilitating a working group to develop guidance for states about how they publish information on the achievement of students, schools, and districts through state “report cards.” One of the common frustrations we hear from the field is that right now, these data are hard to find. And, assuming you do find them, they’re ugly: oftentimes a series of Excel tables or charts that are just about impossible for a parent to make sense of. States have not traditionally made the usability of these reports a priority. That’s changing—and by raising up the great work of a few states in this area and circulating principles that other states can use to raise their game, we expect to see the whole education sector providing better resources to decision makers.

Why do educators listen to you?

We were created, essentially, by the organizations doing this kind of work at the state level, groups like the Council of Chief State School Officers, and our relevance comes from our track record of being able to provide useful guidance to the people charged with getting the work done. They have a big job to accomplish with limited resources. We do what we can to help. 

How many states are involved?

The Data Quality Campaign tracks the progress of every state and the District of Columbia toward accomplishing the 10 State Actions to Ensure Effective Data Use (defined on DQC’s website). This year we are really excited that two states, Arkansas and Delaware, met all 10 Actions. As we move forward, DQC is gaining more traction by working with states to move beyond this fairly binary report (the presence or absence of a specific capacity), to differentiation in quality and understanding what really high quality feedback reports, professional development, governance policies and so forth look like.

What kinds of conversations and concerns do you discuss with policy makers?

At the moment, some of the most important conversations we are having are centered around student privacy and public trust. Legislators and state agencies are quite rightly concerned that the imperative to get better information to educators doesn’t outweigh the need to protect students and families. So we are working with policymakers in a number of states to ensure they are putting in place the necessary technical safeguards and—even more importantly—making and enforcing smart rules about who can access what information for what purposes.   

Future of DQC?

What I’m excited about is the increasing attention we have been able to give to taking these new, incredible resources that states have built and putting them to work for local leaders. When you speak with mayors, and with “Collective Impact” groups working across the country, or with education funders—what you find is a tremendous thirst to know more about the impact of their investments, about “what works” for kids. We are in a better position to answer those questions than ever before and, as a consequence, to make the kinds of gains in achievement that I think the country is counting on from its schools.  

For the latest, follow DQC at @EdDataCampaign and Chris Kingsley at @emersonkingsley.


Leadership in Educational Technologies and Data with Vinny Cho

via @profvinnycho

Professor Vinny Cho studies leadership practices and technologies that support data use in education. As an assistant professor at Boston College, his aim is to help educators make the most out of their information on students.

Since Gradeable’s aim is to make information on student learning more accessible and useful for educators, I interviewed Dr. Cho for his thoughts on how data is shaping education, and how it starts with communication from the top down.

Looking at the “whole kid”

Schools are starting to collect more emotional data on students (how they feel about school, if they feel safe, what they aspire to be) to try to draw connections/predictions in conjunction with standard data points like state test scores and attendance.

Also, as data collection and analysis gets more sophisticated, Cho is starting to see more proactive analysis. For example, if you know a student was out the majority of the days you taught fractions, maybe you can give him/her a little more attention before the test. Or, say the system picks up that a student was out three days in a row; an automated phone call goes out to the home.

IT needs a seat at the table

What drives results effectively is when administration acknowledges the need for a person or persons to oversee the rollout of technologies. Leaders should recognize the job to be done (say, by a liaison between tech, teachers, admin) and organize toward that goal. IT people need to have a seat at the table when schools are discussing instruction/curriculum. In addition, administration must provide support, training, and setup help as people get acclimated to the technology.

Decide what information you want

Leaders must set expectations for what they want technology to show them. Cho says that if leaders don’t have the conversation with community members to decide what they want for students, they’re passing the buck to tech designers, who literally shape what someone sees about the kid. A bad scenario is a profile of a student that features his worst Facebook photo and all his disciplinary records.

We still don’t know what we’re looking at

Data and technology have changed so much that we’re only starting to realize what’s possible. Even with all this information on a student, we don’t have to use it. At this point, data checks in, but it doesn’t check out. So again, it remains to be seen how all this information will be utilized.

In conclusion, Dr. Cho’s suggestion for success is communication. Educators must create a common language where everyone understands what the school wants for a student.

For Professor Cho’s latest, visit his blog.


ProTip Wednesday: 10 Tips on Using Data for New Teachers


Being a new teacher is never easy when you have classroom management and routines to strengthen, so use these tips to start slowly bringing data into your classroom. Knowing is half the battle; back it up with data! But don’t take our word for it—listen to Gradeable users and education experts on how they use data in their classrooms.

1. Don’t re-invent the wheel

Don’t try to tackle data by creating an immense Excel or Google spreadsheet all by yourself. Chances are, the teacher next door or in your Twitter PLN already has a tracker they’d be happy to share with you! (If not – we do!)

2. Use tools you have to visualize the data

Take advantage of built-in tools, already in your hands. Many grade books, such as Easy Grade Pro, have summaries and reports that visualize student scores for an easy, at-a-glance analysis.

3. Make sure your data has context

Don’t take data at face value — be sure to evaluate the data for authenticity and possible skew. Were any students absent? Were there enough students? And then — how were you planning to make the data actionable?

Our power user and middle school reading teacher, Colin T., wants to make sure new teachers start with these nuggets of wisdom:

4. Start small and use the data you already have

Begin with an exit ticket that just has 2-3 questions on it, assessing one objective.  At the end, your data should tell you which students mastered the objective, and which still need practice. By counting the number who got it, you will know whether to re-teach the objective to the whole class, just do a short review, or pull a few students for tutoring.

5. Get a buddy

Find a veteran teacher who is a wiz with data, and have her/him show you the spreadsheets s/he uses, and how to use them.

6. Plan ahead

Set up the assessment so it’s easy to analyze the data. Organize it by objectives, and put a key at the end for yourself. Start with lower-rigor questions that all students can get right, then build to higher-difficulty questions that only a few will master. That way, you can chart exactly where student understanding breaks down by looking at which questions were most frequently missed.
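Finding the most frequently missed questions is easy to automate once responses are in a spreadsheet or dictionary. Here is an illustrative sketch (the question IDs, student names, and answer key are invented):

```python
from collections import Counter

def most_missed(answers, key):
    """answers: {student: {question: response}}; key: {question: correct answer}.
    Returns questions sorted by how many students missed them."""
    misses = Counter()
    for responses in answers.values():
        for question, correct in key.items():
            if responses.get(question) != correct:
                misses[question] += 1
    return misses.most_common()

key = {"Q1": "B", "Q2": "D", "Q3": "A"}
answers = {
    "Ana":  {"Q1": "B", "Q2": "C", "Q3": "A"},
    "Ben":  {"Q1": "B", "Q2": "D", "Q3": "C"},
    "Cara": {"Q1": "A", "Q2": "C", "Q3": "A"},
}
print(most_missed(answers, key))  # Q2 is missed most often
```

If the assessment builds from low to high rigor, the first heavily-missed question in sequence marks where understanding breaks down.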

Gradeable user Rik Rowe is the #SBLchat moderator and math learning facilitator. These are some great tips for new high school teachers on how to use data in the classroom:

7. Compare pre-test and post-test grades

Teachers often forget to pre-test and jump into a unit without knowing exactly where students are before learning a new concept. Tracking their growth from pre-test to post-test allows both teachers and students to celebrate growth!
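There are several ways to quantify that growth. One common version (just a sketch, not a prescribed formula) credits students for the share of available points they gained, so a student starting at 40 isn’t penalized relative to one starting at 90:

```python
def growth(pre, post, max_score=100):
    """Points gained as a share of the points that were still
    available to gain: (post - pre) / (max_score - pre), as a percent."""
    return 100 * (post - pre) / (max_score - pre)

# A student who moves from 40 to 75 closed over half the remaining gap.
print(f"{growth(40, 75):.1f}% of possible growth")  # → 58.3% of possible growth
```

Reporting growth this way gives every student, including the highest scorers, a meaningful target to beat.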

8. Analyze Summative Assessments to pinpoint weaknesses

Teachers should analyze summative assessment grades and determine what they reveal about their students’ weaknesses and how to close that gap with the next unit.

Jennifer S. is a high school math teacher and a recent Gradeable Social panelist. She’d like new teachers to keep these things in mind when analyzing data for the first time:

9. Purpose of the paper

Are you using the ticket to leave to figure out if students know how to do a skill? Make it easily correctable, correct the tickets quickly, incorporate the results into your next day’s do now/class materials, and return them to students to correct errors. Then, toss. To correct, I often know the answers by heart, stand at the door to collect the Ticket to Leave (TTL), and give students an immediate yes or no, or a quick tip, as they walk out.

10. Get the wide view

How are students generally doing on a group of standards? Use data to get the big picture with programs like Gradeable. Take the time to correlate questions to standards so you can track them.

Looking for more thoughts on data in the classroom? Check out this video from our recent panel on assessment and the importance of data.

Highlights from the Third Gradeable Social

(L to R) Jonathan Ketchell, Alexis Rosenblatt, Jennifer Spencer

Our third Gradeable Social featuring an assessment debate was a success. Our three panelists (Alexis Rosenblatt of ANET, Jennifer Spencer of Match Charter High School, and Jonathan Ketchell of Hstry) fielded questions all about high-stakes, low-stakes, and alternative assessments. For video of the debate, check out our YouTube playlist. Meanwhile, here are some highlights:

Q: Some say that high stakes, standardized testing is inaccurate because students are “moving targets,” especially on test days. What is your take on that?

Q: How do you feel about “teaching to the test?” What role does creativity have in the realm of test prep?

Q: There are some added hidden benefits to authentic assessments. At the end of the day, how do these help with your big goal of helping students?

Q: Why is data important?

Q: Aren’t these pictures great?

Party like an education enthusiast!

Mikaila chats it up with some terrific teachers… like Mr. Rowe!

ANET’s Alexis talks to our friend Karen from Listen Edition

Kattie hangs with our old friend Lillie and our new little friend

Like what you see? Well, there’s plenty more in the videos of our debate. Hope to see you next time!

Thought Leaders of Data-Driven Education

Mrs. Bullen’s Data-Rich Year via dataqualitycampaign.org

Data in education is a topic we’ll be getting into over the next few weeks. At its simplest, data means entering test scores into your trusty Excel spreadsheet; at its most powerful, it means using the results of formative assessments to drive teaching and shorten the learning loop. The field of thought is vast and still growing, so here are some places to start when embarking on the data journey.

TeachThought

TeachThought is a site that’s dedicated to supporting 21st-century educators in the ever-evolving world of teaching. Their mantra is simply “teach better,” and we’ve actually highlighted their director, Terry Heick, in our feature on people who blog on assessments. To help teachers understand what students are learning in class, TeachThought has rounded up five tools that support data-based teaching. These tools help you track, analyze, and report the plethora of data generated every day in your classroom.

Data Quality Campaign

Data Quality Campaign is a nonprofit organization that works to improve student learning through the use of data. They are based in Washington, DC and work on a national level championing awareness and best practices of data use in education. In this post, they emphasize the importance of standards, assessments, and data in education, described as “three legs of a stool.” A successful school stool needs all three legs to stand.

Infographics

Infographics on data-driven instruction are worth a thousand explanations. Using data in school isn’t something that can be explained quickly, and sometimes it’s easier to convince (those who need convincing) of data’s power through pictures. eLearninginfographics is a great place to start. Here is a great one by OpenColleges about leveraging educational data, or “Learning Analytics 101.”

Vinny Cho

Professor Cho is an assistant professor at Boston College who studies data use, technology, and leadership in education. His focus is the role of administrators and central office figures in rolling out innovations. Like all initiatives, having the right people at the table, clearly defined goals, and effective communication is key to getting edtech off the ground. For more of his leading thoughts, check out his blog.

Resources

Researchers and support organizations are out there to help educators fine-tune their craft. Here are a few to get you started:

  • Marzano Research Center tips archive: http://www.marzanoresearch.com/resources/tips
  • ANET for interim assessments, coaching, and professional analysis http://www.achievementnetwork.org/
  • Research for Better Teaching is dedicated to supporting a sustainable school environment: http://www.rbteach.com/rbteach2/about.html

For more information on how Gradeable can help you use data to drive your teaching, visit us at www.gradeable.com

The Case for Low-Stakes Assessments

low stakes formative assessments

Low-stakes assessments are our favorite way to keep up with the real-time progress of your students. But that word—assessments—triggers a complex in every student’s and teacher’s brain. Assessments, tests, evaluations… judgment. Low-stakes assessments, like formative assessments, aren’t meant to be scary or judgmental. In fact, this brand of assessment is something you probably do on a daily basis without breaking a sweat. They include exit tickets, homework, asking for head nods—anything that checks in with the students about their knowledge. The beauty of low-stakes assessments is that they are low-stakes. Non-threatening. Sans punishment.

Formative assessments like do-nows and exit tickets are based on feedback as a way to drive learning. As Paul Bambrick-Santoyo says in his book Driven by Data, “Assessments are not the end of the teaching and learning process; they’re the starting point.” Instead of finding out that a student didn’t grasp a concept when they get to the big test, teachers can catch the misunderstanding early and pivot students in the right direction. The key is using frequent, low-stakes assessments. It’s like going to the doctor for regular checkups instead of waiting until you’re pretty sure you have a kidney infection.

Knowing that a student doesn’t understand a concept while it’s still being taught allows teachers to adjust their reteaching appropriately. To make formative assessments formative, feedback must be done in a timely fashion. Returning homework after students take the big test is not helpful for anyone. The beauty of formative assessment is that it’s the teacher, not just the student, who is getting feedback on what’s working.

Susan Brookhart, in her book How to Give Effective Feedback to Your Students, says, “Feedback needs to come while students are still mindful of the topic, assignment, or performance in question. It needs to come when they still think of the learning goal as a learning goal… that is, something they are still driving for, not something they already did. It especially needs to come when they still have reason to work on the learning target. Feedback about a topic they won’t have to deal with again all year will strike students as pointless.”

Not only do low-stakes assessments give prescriptive, real-time insight, the feedback that goes with it can engage students. “Once students understand what they need to do and why, most students develop a feeling that they have control over their learning,” Brookhart writes. Students begin to take ownership of their learning process once they have an idea of the bigger picture and understand the doable steps for improvement. Simply put, a good feedback loop helps lessons gain traction with students.

One drawback of low-stakes assessments is that they must be done frequently to be effective. And for anyone who has a large Excel file of grades, you know how tedious it is to keep track of all those grades, concepts, and suggestions. But the saving grace of a formative assessment teaching strategy is that the benefits far outweigh the work that goes in, especially when there are tools out there to ease the process.

Another counterargument to low-stakes assessments comes from those who fear the “Big Brother” effect. As we collect more information on our students, who gets to see all that data? Right now, laws are being passed to protect student information from corporate interests. The perception is that ed-tech, an $8 billion industry, is foaming at the mouth to get its hands on student information. We’ll be discussing more on that perception next month.

Still, we at Gradeable are completely behind the formative lifestyle. On Wednesday, a blog post by Kattie will go over the different types of feedback. On March 6, we’re hosting our third Gradeable Social that will serve as an assessment support group of sorts. We’ll be gathering once again to talk shop on education best practices, so sign up here. Gradeable users get in free, so email bon@gradeable.com if you need a promo code.

Ready to get formative and enjoy the glorious data-driven instruction that comes with it? Come see us at www.gradeable.com to learn more.