A few months ago, I asked for some input regarding your experience with photocopiers as I prepare for a session on error analysis. Thanks for all your help and input. As I continue my preparation, I asked Twitter to share three things you noticed about these three pics.
With over 70 responses, the results have been fun to look at. My session is about effectively using error analysis. One small part I'm curious about is whether we can make any generalizations about spotting things that seem out of place. Are we conditioned to look for flaws, or is it something we innately do?
Next phase: I'm curious what students see. If you can put this questionnaire in front of your students, I'd be super appreciative. It's anonymous. Just give them the survey and don't tell them anything other than to note two things they notice about each picture. Maybe it'll foster some discussion in your classrooms after they take it.
A shortened URL is: goo.gl/8XuN8z
Friday, February 27, 2015
Thursday, February 19, 2015
Elevator Speech by Steve Leinwand (Reprise)
You might remember I did seven days' worth of elevator speeches back in September 2014. I was honored to receive two contributions from Steve Leinwand. I was so inspired by his Day 5 elevator speech that I created an audio/visual representation. Check it out.
Spoken by: Andrew Stadel
Audio/visual by: Andrew Stadel
Feel free to share.
It’s only one of eight Common Core Standards for Mathematical Practice, but we can change schools and change lives if we truly implement Mathematical Practice 3: “Construct viable arguments and critique the reasoning of others.” In many ways, these nine words may be the most important words in the entire Common Core effort. We can’t expect students to construct viable arguments unless we ask them “why?” and “how do you know?” and “can you convince us?” When we ask such questions we are laying the foundation for the reasoning and justifying that represent the thinking that schools need to develop in all students. Similarly, we can’t expect students to critique the reasoning of others unless we create classrooms where student thinking is valued and students contribute to their own learning within communities of learners. Moreover, this isn’t just mathematics, but what needs to happen in English language arts, social studies and science as well. So when one cuts through all of the misrepresentations and politics that surround the Common Core, these powerful nine words transcend our differences and capture what every parent and every citizen should be demanding from their schools and for their children.

Words by: Steve Leinwand
Spoken by: Andrew Stadel
Audio/visual by: Andrew Stadel
Feel free to share.
Monday, February 16, 2015
iPad Summit 2015
Last week, I had the extreme pleasure of co-presenting with my friend and colleague, JR Ginex-Orinion, who tweets at @gochemonline <---Instant follow! Our presentation was at the San Diego iPad Summit, put on by EdTechTeacher.
That's a lot of links in those last two sentences, right? Here's one more: our presentation slide deck. Feel free to look through it and contact one of us if you have any questions.
Our session, Quick and Powerful Assessment Tools for the iPad, had the following goals:
- share why it's important to use assessment tools
- present four assessment tools
- give attendees a hands-on experience with each tool
- mention shortcomings and suggested uses for each app
- provide time for Q & A
With about 120 attendees and an hour, we had our work cut out for us. However, I think JR will agree that we effectively met our goals and had fun with all the attendees. My intention is to share a few highlights from the session and not recap the whole thing.
Why?
My good buddy Robert has encouraged me to address the "why" part of a presentation, and I'm really appreciative of this encouragement. As we told the attendees, it's easy for us to show you what the assessment tools are and how to use them, but let's first focus on why we should use them.
We talked about why it's important to use assessment tools because they:
- are a window (not a glass house) into student thinking
- can tell a story about your students
- should be informative (to guide your instruction) and
- should be efficient
The tools we presented were:
I won't deny it. I think we saved the best for last. Pear Deck was definitely a crowd favorite. As it happens, we had to throw it in at the last minute because our originally planned tool, Infuse Learning, will be stopping its service in April. Pear Deck is such a great tool for capturing student thinking. I highly recommend you check it out and give it a test run in your class.
All in all, it was a fun conference. I picked up some inspiration from Doug Kiang's keynote and Sabba Quidwai's session. More importantly, I look forward to presenting with JR again sometime in the future!
Sunday, February 1, 2015
Piloting
I'm curious where your district (or site) is with piloting math materials with the intention of adopting.
I'm fortunate to be able to compare three different surrounding districts: my wife's, my sister's and my own. I'm not here to say which is best or to say that any district's process is better than the other. Furthermore, as I state questions, I'm not implying that any of the aforementioned districts are sufficient or insufficient, competent or incompetent, correct or incorrect.
Therefore, here's something I'm wondering when a district pilots a math program:
Are teachers given a comprehensive list of metrics regarding the effectiveness of piloting a math program in their classroom? For example, are teachers asked to pay attention to any of the following (and more)?
- How are lesson objectives structured?
- How are content standards unpacked?
- How are lessons/activities launched?
- What's the level of student engagement?
- What are students doing during the lesson/activity?
- What are teachers doing during the lesson/activity?
- What conclusion do students make at the end of the lesson/activity?
- Is the practice (homework/classwork) effective and meaningful?
- Are the assessments a fair representation of the lesson objectives and content standards?
- What's the distribution of application, procedural, and conceptual understanding mixed with problem-solving or performance activities?
- How applicable are the statistics and probability standards in your grade level, and are they embedded in the other grade-level content standards?
If a veteran teacher and first-year teacher are both piloting the same program, how can they both objectively measure the quality of a pilot?
How is any teacher expected to give meaningful feedback to their district if they're not given direction ahead of time?
I've noticed that districts are giving their teachers a chance to voice their opinion on the pilot program, but if there's no common metric, how does one make it a fair comparison?
Again, I'm not saying that my district, my wife's district, or my sister's district have the math adoption process right or wrong. I'm curious if it makes sense for any district to front-load their teachers with ways to measure the effectiveness of a program.
Thankfully, I've noticed the most patient participants (or bystanders) in this transition (including adoption) are the students. Be sure to thank them for their patience and perseverance as we work hard to do our best getting it right. How long will that patience last?
P.S. Chris Hunter shared a very thorough post by our NCTM president.