Ponder School Feed

A Ponder feed for the learning community

Ponder Reading Groups Pilot: Inquiry is finally cool!?

We are excited to announce a new pilot: Ponder Reading Groups, our first major move outside the boundaries of specific classes! Over the coming school year we will be working closely with a number of pilot schools to iterate on the experience, but we are looking for a few more.

Therefore, I am happy to further announce that in our search for a diverse spread of schools with a basic level of infrastructure and a passion for discussion, we will be giving away a full-year Ponder site license for up to 10 schools selected for the pilot! If you’re interested, read on and then fill out the pilot request form.

That’s right, a free site license for up to 10 selected schools!

Reading Groups were inspired by repeated anecdotes from our Ponder classes about student usage exceeding all expectations, and then students asking their teachers if they could continue using it after the end of the semester. ‘How often do students ask to keep using educational software?’ we thought. With the further encouragement of the preliminary results from the Ponder efficacy study at San Francisco State University, we began bouncing ideas around about other roles Ponder could play while still boosting student engagement, perseverance and educational outcomes.

How do we make research, reading and thinking a fun, social activity for the entire school community?

Or in the words of the College and Career Readiness Anchor Standards for Reading, how can Ponder make building “a foundation for college and career readiness” by reading “widely and deeply from among a broad range of high-quality, increasingly challenging literary and informational texts” and acquiring “the habits of reading independently and closely” fun?

We grew even more excited when the Robin Hood College Success Prize was announced early this spring, with a goal in the same spirit: allowable solutions for the prize cannot depend on instructors or the classroom, but must instead be “student focused.” (See rules Section 5.2.) If you are not already familiar, the goal of the College Success Prize is to create a technology-only tool that dramatically increases the percentage of remedial students across the country who complete their community college degree in a “timely fashion.” It’s understandable why they structured the rules this way – they don’t want solutions to get blocked on instructor adoption or institution-specific implementation details that might slow them down – they want something that will help students independent of those variables. For us it was further confirmation that there is a need for what we are working on, and even better that it dovetailed nicely with the efforts already underway here. We just had to wrap up the idea in a two-minute video to enter the competition.

The Robin Hood testing process for solutions is as simple as it is imposing: run a three-year evaluation of the selected proposals, and simply measure the difference in graduation rate between students randomly assigned to each solution and a control group of students not assigned any solution. This month the judges will announce which technologies move to the next phase of the program. Of course, regardless of how the judging cookie crumbles, Ponder will be proceeding with Reading Groups!

So how will they work? As demonstrated by our new interview series, Ponder is used in different ways by almost every instructor who picks it up. There are common themes, of course, but the broad spread of applications is born of the simple fact that students love Ponder. What would you do differently if you knew your students would be debating last night’s reading on the way to class?

Ponder --> Articulate --> Share --> Listen

The goal of Ponder Reading Groups is to support students in their personal reading and research independent of their classes or direct instruction, building the habits they will then employ in class. Here’s how it works:

  1. Deploy Ponder to teachers and students across your school
  2. Students pick reading groups that match their interest areas
  3. Students read, watch and share with micro-responses within the school’s Ponder community
  4. Schools can build upon the online momentum by creating in-person study and discussion periods

Of course, adding students to classes continues to work as it always has – a click creates a private shared space for any class, club, group project, etc. that needs one.

Your school may be looking for a similar solution: How do we make research and reading a fun, social activity for the entire school community? How do we better understand student interests to bridge them to the curriculum? How do we expand usage of Ponder beyond the early adopters without gating participation on over-taxed teachers and curricula? How do we let the students run with it?

We will be iterating on the experience with our pilot schools over the coming school year, and we are looking for a few more. If you’re interested, please tell us a bit more about your school. If you have any questions, contact our support team via chat or ticket.

SFSU Study Outcomes Detail

Controlled study shows Ponder engages students at 12 times the rate of a control discussion forum.

We are fortunate that seven of our teachers and professors agreed to sit down with us to talk about their experiences teaching, and teaching with Ponder.

We have a lot of anecdotal evidence that Ponder has a noticeable impact on student engagement and class discussion. But how big is the impact? And compared to what? Many of our teachers had tried using blogs or discussion forums, but many more had never tried any technology to support student reading. In short, we wanted to turn subjective, observed correlations into objective, reproducible evidence of causation.

A Controlled Study

To measure Ponder’s effect more rigorously, we needed a controlled environment where, to the extent possible, the only difference between the two groups of students was the specific online discussion-focused tool they were using for class. Only then could we draw any credible conclusions about Ponder’s effectiveness.

xkcd: Correlation (thanks to xkcd)

Last summer, two researchers at San Francisco State University designed a long-term controlled study of Ponder’s impact on the classroom learning environment. Geoff Desa, Ph.D., and Meg Gorzycki, Ed.D., completed a review with the SFSU Institutional Review Board to ensure that the study met its guidelines. For those who are curious, or who would like to replicate the process, take a look at the IRB Protocol Review Documents.

Everything but the name…

The study involved four classes of ~30 students each, all taking the identical course with the same professor during an intensive, 5-week summer semester. Two of the classes were instructed to use SFSU’s iLearn discussion forums to post and discuss articles relating to the class – the “control” for the experiment – and two were instructed to use Ponder. In both cases, identical scripts were used by the professor to introduce the tools, with only the name of the tool changed. The “quantity and quality” of students’ contributions in these tools were to be incorporated into their class participation grades.

The professor used one script to introduce the tools to all 4 classes. The only difference was the name of the tool.

For those of you who are familiar with both discussion forums and Ponder, you might be wondering how the same script could be used to describe both, since they are functionally quite different from one another. You are right to wonder! Not only are they quite different, but most students were probably familiar with iLearn or something very much like it, and completely unfamiliar with Ponder or anything remotely like it. Still, it was essential to keep the number of differences between the two experimental cohorts as small as possible. Consequently, students in the Ponder classes were left to figure out how to use it on their own. Despite this, students encountered few problems getting up and running with Ponder.

Preliminary Results

SFSU Study Outcomes Detail

Over 90% of Ponder students participated, and on average each student in the Ponder group contributed 26.9 responses while spending 329 minutes reading 208 documents – all of it self-directed reading that fell outside the assigned materials for the course. By contrast, students in the control group posted only 2.2 times on average, and a quarter never contributed at all. The Ponder classes were not only more inclusive, but generated over 12 times the participation per student (26.9 ÷ 2.2 ≈ 12.2). In iLearn, there were a total of 123 posts. In Ponder there were 1747.

In addition, preliminary analysis of the data shows that:

  • The average final grade for Ponder students was B+ as compared to B for the control.
  • There was a positive correlation between Ponder activity and performance on quizzes, class participation, and group projects.
  • There was a negative correlation between iLearn activity and class participation.
  • More students participated in class discussion in the Ponder sections, and each participated more.

iLearn Posts: number of students who made a given number of iLearn forum posts during the study (e.g., 4 students made 3 forum posts).

Ponder Posts: number of students who made a given number of Ponder posts during the study (e.g., 10 students made 20-25 posts).

What conclusions can we draw from these numbers?

Ponder is either more engaging and fun to use than iLearn, or simply easier to use, so students are more likely to use it. It’s hard to tell which, because no student experienced both tools, so we can’t ask for subjective feedback comparing the two.

Either way, what’s important from the perspective of educational impact is that the students in the Ponder classes produced higher quality work: when the final grades came in, Ponder students averaged a statistically significant half grade higher than the students in the control group.

The study itself has many more components to it, and in the qualitative survey data more students complained about being confused by Ponder’s interface – unsurprising given the circumstances. The survey results also showed that students in the Ponder group appreciated the online component of the course more, said it made them want to read more, and were more likely to recommend it to other professors than students in the iLearn control group.

So what’s next?

These early results are very encouraging, but what we’re really curious about is forthcoming analysis of Ponder data that will start to paint a picture of how student reading habits change over time with Ponder.

  • Do students read from a broader range of sources by the end of the course? Do they read longer articles? Do they read more deeply around individual topics?
  • How do students affect each other’s reading? Do students discover new areas of interest through their peers? In other words, we don’t just want to know if their topic clouds grow over the course of the semester, but whether the students’ topic clouds increase in overlap.
  • Which students are the best at engaging other students in reading?

In short, what we’re interested in measuring with Ponder is not only student engagement but “intellectual curiosity” as defined by the reading behaviors listed above.

The second phase of the study is running right now, and we are looking to repeat the study and broaden the data to other grade levels and institutions. If you would be interested in collaborating with us or the SFSU researchers, get in touch!

Actionable Data: Survey Responses from EdSurge Baltimore

Companies love conducting surveys. Most of them are pretty awful to the point of uselessness.

So it was a nice surprise to find a few gems amongst the questions asked in the survey from the latest EdSurge event in Baltimore. Three that I liked in particular:

1. If administrators were looking to purchase this product for their school, how strongly would you advocate for this product?

The question was not just another “Rate this product on a scale of 1 to 5” – a purely theoretical exercise that’s meaningless in comparison to the much more concrete task of imagining how far you’d stick your neck out to bring a new product into your school.

2. Forget about the current price of the product (if it’s free, forget about that too). If you were given an extra $100 to spend per user (e.g. student, teacher, etc.) how much would you be willing to spend for this product?

A more confusing alternative might have been “How much would you pay for this product?” Instead, teachers are given a hypothetical that strips away the issues that often muddy the question of pricing (e.g. variations in teachers’ discretionary budgets, how much was granted and how much remains, whether the product could be had for free) and focuses on the teacher’s sense of a fair price. This question replaced my previous favorite, “How much would you expect to pay for a tool like this?”, which Desmos founder Eli Luberoff shared a few months ago.

Last but not least, I want to mention this question not so much because it’s particularly well-worded, but because I rarely hear it discussed:

How often would you use this product?

Room for Improvement?

Now that I’ve gotten the praise out of the way, here are four specific ideas on how the surveys could be even better.

1. Enable companies to respond to feedback through a craigslist-style mail-relay mechanism, which would enable direct communication while keeping teacher identities anonymous. (See the sketch after this list.)

2. Provide teachers more of an incentive for quality (of feedback) over quantity (of responses) by allowing each company to enter one survey response in a “Most Useful Feedback” raffle.

3. Don’t allow teachers to respond to surveys before the event starts! A teacher showed up before the start of the Baltimore Summit and told me she had already looked at Ponder online and decided it was a terrible idea, but that if I wanted, I could try to convince her otherwise. So I spent ten minutes walking her through our product, after which she said, “Well, this is pretty great actually. I’m going to recommend it to the teachers at my school – you should put all of what you said to me on to the web site.” Which is fine, since we definitely need to improve the story on our site, and I was glad she was excited. Then she said, “I already filled out the long form survey yesterday, so can you give me the tickets?” I gave her the tickets, and as she walked away I realized that her initial reaction was now captured in our public survey results, with no mechanism for her to update it or comment on her misunderstandings.

4. Add some sort of tech-savviness question for teachers, to provide context for their answers: How often do you use tech in your class today? Have you tried other tools similar to this one in purpose and functionality?
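
For what it’s worth, here’s a minimal sketch of what that craigslist-style relay could look like. Everything in it is hypothetical – the alias format, the relay.example.org domain and the deliver hook are stand-ins, not anything EdSurge has built – but it shows how a random alias lets a company reply to a survey response without the relay ever exposing the teacher’s address:

```python
import secrets

# Hypothetical relay table: each survey response is keyed by a random alias,
# and only the relay knows which real address an alias maps to.
aliases = {}  # alias -> real email, kept server-side only

def create_alias(real_email):
    """Mint a one-off reply address for a survey response."""
    alias = f"feedback-{secrets.token_hex(4)}@relay.example.org"
    aliases[alias] = real_email
    return alias

def relay(alias, message, deliver):
    """Forward a company's reply to the hidden recipient. `deliver` stands in
    for whatever mail-sending function the relay uses (SMTP, an API, etc.);
    setting reply_to to the alias keeps the whole thread anonymous."""
    real_email = aliases.get(alias)
    if real_email is None:
        raise KeyError("unknown or expired alias")
    deliver(to=real_email, body=message, reply_to=alias)

teacher_alias = create_alias("teacher@school.example.edu")
relay(teacher_alias, "Thanks for the feedback - could you say more?",
      deliver=lambda **kw: print(kw))  # stand-in for a real mailer
```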

An even bigger idea: The EdSurge Census

Here’s more of a project-sized proposal for EdSurge: a survey of educators, schools and infrastructure – The EdSurge Census. A grassroots, lay-of-the-land sort of thing, which could be independent of the summits. I think it would be valuable to ed-tech companies and the schools themselves, as well as funders and foundations. In the spirit of the new “50 States Project,” but a bit more data-oriented.

A few obvious questions we’d love answers to readily come to mind:

Devices – What schools have, what they think they’ll have, what they wish they’d had, etc.

eBooks – In our experience most K12 schools seem to be lost at sea when it comes to ebook platforms: they don’t like being locked in, it complicates their device story, they have physical books which are easy to manage, etc.

Adoption – What tools are teachers using today and how often are they using them? Tools should be categorized by function (e.g. Administration and Logistics, Planning and Instruction), subject matter and grade level so it’s easy to see where the gaps are.

EdSurge could charge companies to fund the data collection and reporting. Or perhaps one of the edu heavyweights would be willing to step up to fund it.

A statistically accurate portrait of the state of ed-tech in schools is unlikely to emerge from such a survey. But so long as the results were positioned honestly as an informal sampling, they would be undeniably useful.

Whatever happens, I’m looking forward to seeing the next incarnation of the EdSurge surveys in Nashville!

Implementing Flip: Why higher-order literacy is not just about text

Within the field of “instructional” EdTech, Ponder is often described as a “literacy” tool – a label that, while accurate, encompasses a much broader spread of pedagogical challenges. We usually describe our focus as “higher-order” literacy: the ability to extract meaning from, and think critically about, information sources.

A couple months ago we began our pilots of Ponder Video, bringing our patent-pending experience to the medium more often associated with the flipped classroom. From our experience with text over the past two and a half years, we knew this would be an iterative process, and as expected we are learning a lot from the pilots and the experimentation – see part 1 and part 2 of our interface studies.

During this process, some people have asked whether Ponder Video is, in startup terminology, a “pivot” – a change in the strategy and focus of our organization. The question: do we still consider Ponder a literacy tool?

After a bit of reflection the answer is a resounding YES! And the process of reflection helped us gain a deeper appreciation for what “literacy” actually means. This is not a change of strategy, it is an expansion of Ponder to match the true breadth of literacy.

Literacy > Text

The term literacy is most often taken to mean the ability to decode words and sentences. That is, of course, the first level of literacy, but there is a shifting focus in many of the new pedagogical and assessment debates, from the Common Core to the SAT: a shift away from memorizing facts and vocabulary and towards students developing a higher-order literacy. Still, higher-order literacy is a vague concept, and at Ponder we are always searching for ways of articulating our vision more clearly.

One line I like, from a now-deprecated page of ProLiteracy.org with no byline, does a really nice job of concisely capturing the significance of a broad definition of literacy: “literacy is necessary for an individual to understand information that is out of context, whether written or verbal.”

The definition is so simple, you might miss its significance. So let me repeat it:

“literacy is necessary for an individual to understand information that is out of context, whether written or verbal”

I like it because “understand information” goes beyond mere sentence decoding, and “out of context” unassumingly captures the purpose of literacy – to communicate beyond our immediate surroundings. The “or verbal” I would interpret broadly to include the many forms that information comes in today – audio, video and graphical representations.

The 21st century, at least so far and for the foreseeable future, is the interconnected century, the communication century, the manipulated statistics century, the Photoshopped century, perhaps the misinformation/disinformation century, and I would posit that if there is one “21st century skill” that we can all agree on, it is literacy, in the broad sense:

Understanding information out of context.

A text or video is inherently out of context, so a student at home is not only one step removed from the creator of the content, but also removed from the classroom. So a question immediately jumps to mind:

Are your students ready to learn out of context?

The answer to this question varies dramatically and is not easily delineated by grade level; defining that readiness so we can provide an appropriate scaffold requires care, and is something we have worked to understand empirically through student activity in Ponder.

The National Center for Education Statistics, part of the US Department of Education, has put a lot of effort into defining and measuring this skill, and has twice performed a survey it calls the National Assessment of Adult Literacy (NAAL), providing a useful jumping-off point for thinking about your students.

This is not like the surveys you usually read about. The NAAL is uniquely thorough, consisting of a background questionnaire and screening process followed by an interview.

The NAAL is made up entirely of open-ended, short-answer responses – not multiple choice – and focuses on the participant’s ability to apply what they have read to accomplish “everyday literacy goals”. You read something, then answer a question that depends on something you had to extract from the reading.

As you might imagine, this is not a quick process.

Administering the NAAL takes 89 minutes per person, and in 2003 it was administered to 18,000 adults sampled to represent the US population. That’s 89 × 18,000 = 1,602,000 minutes of interviewing – 26,700 person-hours, or roughly three years of round-the-clock work.

This thoroughness is important given that they are trying to measure a broad definition of literacy.

The NAAL breaks literacy into three literacy skill types:

  • Prose
  • Document
  • Quantitative

You can read the details on their site, but given that it turns out American adults have roughly comparable prose and document literacy scores, I would lump them together under a general heading of “reading.” Examples of quantitative literacy tasks are reading math word problems, balancing a checkbook or calculating interest on a loan.

They delineate four literacy levels:

  • Below basic
  • Basic
  • Intermediate
  • Proficient

Again, they go into a lot of detail mapping scores on to these names, but I think what’s most useful are the “key abilities” that distinguish each level in their definitions.

My interest in higher-order literacy immediately draws my eye to the key distinction between “Basic” and “Intermediate.” An intermediate skill level means the individual is capable of:

“reading and understanding moderately dense, less commonplace prose texts as well as summarizing, making simple inferences, determining cause and effect, and recognizing the author’s purpose”

(NAAL Overview of Literacy Levels)

That list of skills captures the starting point of what we think of as higher-order literacy. (If you’re curious, the highest level of literacy, modestly labeled “Proficient,” seems to be distinguished mostly by the ability to do this sort of analysis across multiple documents.)

For me, the NAAL provides a useful framework for breaking down the literacy problems that instructional techniques (and technologies) are trying to address.

Ponder supports teachers who are trying to move their students from a level of basic literacy to being able to make inferences, determine cause and effect, and recognize the author’s purpose.

…but our goal is to go an important step beyond even the NAAL’s definition of literacy.

Because what is the point, really, of making inferences and identifying cause and effect if ultimately you are unable to probe with your own questions and evaluate with your own conclusions?

In the end, the endgame of literacy is the ability to “think for yourself.”

Flip is a great way to practice literacy. But you need literate students to flip.

The flipped classroom model is typically used for students to dig into and prepare for class discussion, and obviously presumes a basic student literacy level. But passively consuming a video or skimming a text isn’t enough to drive discussion back in class. As we all know from our own student days, technically meeting the requirements of having “done the reading” does not comprehension make.

Flipping, more so than traditional classroom lectures, requires students to be able to dig beneath the surface of the content, question its credibility, ask clarifying questions and make their own inferences.

Such are the makings of a classic Chicken and Egg conundrum. Flipping requires students to have the skills they are still trying to learn and master through…flipping.

I don’t think anyone has claimed to have answered this question yet, and neither have we – but the first step is realizing what you don’t know, and we do claim to have done that! We will continue to share what we learn from our video research as we iterate on Ponder Video, and we welcome more ideas and discussion from teachers everywhere.

Population by Prose Literacy Level (Courtesy NAAL)

Curious about the numbers? The NAAL has been run twice – once in 1993 and again in 2003 – and scores changed little in those ten years, apart from a slight increase in quantitative literacy. We do, however, have a pretty serious higher-order literacy problem: between 34% and 43% of adult Americans lack the higher-order literacy skills needed to be classified as “Intermediate” or above by the NAAL.

From ponderi.ng to ponder.co!

We were quite enamored with our domain hack when we launched last summer – we felt that http://ponderi.ng captured what students are doing when using Ponder, and even expressed one of our favorite sentiments.

Unfortunately, while in Europe we appeared on the first page of search results for “ponder”, in the US we were stubbornly missing. We thought the fact that the “.ng” domain is technically for Nigerian companies would wear off as various education resources began linking to us. But it didn’t. It turns out that the major search engines take these country designations seriously, and we needed a generic domain (.com, .co, .net, etc.) in order to appear in searches globally.

www.ponder.co

Luckily, we discovered that the owners of ponder.co were trying to set up a Ponder family website and would have preferred ponder.net – so we bought ponder.net and traded it for ponder.co.

We miss ponderi.ng, but will save it for some creative purpose down the road.

In the meantime, please welcome our new domain www.ponder.co, and update your links to us so we’re easier to find!

Realizing Flip: Ponder for Video and eBooks

Pondering Video and/or eBooks for your class? Sign up for a pilot to alpha test.

Pondering Video?

If you’re flipping your classroom or simply have a lot of video content you’d like your students to watch outside of class, Ponder will soon be a way for you to engage and track student activity around video.

Just like Ponder for reading, Ponder for Video doesn’t require you to upload anything – it will work on YouTube, Vimeo, Dropbox or Google Drive. Just like Ponder for reading, you will be able to review visualizations of your students’ responses along the video timeline. Even better than Ponder for reading, you will be able to manage a question queue and see where students are getting stuck, watching and re-watching the same segment.
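
We haven’t settled on (or published) the mechanics of that tracking yet, so treat the following as a rough sketch of the general idea rather than our implementation: if the video player reports the (start, end) intervals a student actually watched, finding the “stuck” segments is just a matter of counting how many times each second of the timeline was covered.

```python
from collections import Counter

def rewatch_hotspots(watch_intervals, min_views=2):
    """Count how many times each second of a video was watched, given the
    (start_sec, end_sec) intervals a player reported, and return the
    contiguous segments viewed at least `min_views` times."""
    views = Counter()
    for start, end in watch_intervals:
        for second in range(int(start), int(end)):
            views[second] += 1

    # Collapse "hot" seconds into contiguous (start, end, peak_views) runs.
    hotspots, run = [], None
    for second in sorted(views):
        if views[second] < min_views:
            continue
        if run is not None and second == run[1]:
            run = (run[0], second + 1, max(run[2], views[second]))
        else:
            if run is not None:
                hotspots.append(run)
            run = (second, second + 1, views[second])
    if run is not None:
        hotspots.append(run)
    return hotspots

# A student watches 0:00-1:00 straight through, then replays 0:40-0:55 twice:
events = [(0, 60), (40, 55), (40, 55)]
print(rewatch_hotspots(events))  # [(40, 55, 3)] -> a segment worth discussing
```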

Pondering the Ring of Gyges

Pondering eBooks?

Ponder already works on any text that renders in a browser (including PDFs!), but we’ve been hankering after a way to organize Ponder activity around chapters and sub-sections of longer documents like books. So we were excited to discover an EPUB-lishing service called Thuze that integrates nicely with the Ponder browser add-on. Ponder + Thuze means you will be able to read eBooks in the Thuze web reader and organize Ponder activity around the chapters and sections of long texts.

With Thuze, Ponder works on ePubs – here, the Federalist Papers – just as you would expect.

Sign up!

We are looking for teachers and professors interested in trying out Ponder in these new contexts and providing us with feedback on the myriad ways it works and doesn’t work with your class.

For more information, fill out this short Google Doc form.

Learnings from the pilots will (of course!) be incorporated into the product and released for everyone.

Calling all Teachers: Ponder eBooks with Thuze and EPUB

Until now, Ponder has worked on any text that renders in the browser (including PDFs), but for longer texts we have been hankering after a way to organize Ponder activity around the chapters and sub-sections of a document. EPUB, a widely used open standard for publishing structured documents, suits our needs well.

At the OpenEd 2013 conference, I met Victoria Kinzig from Bridgepoint Education who introduced me to Bridgepoint’s versatile new EPUB reader and textbook library, Thuze.

Thuze offers over 100 peer-reviewed e-textbooks across 23 disciplines from Health Care Administration to Ethics to Criminal Justice to various History texts, all of which you can read on the device of your choosing (Android, iPhone, iPad, and browser) for $35/textbook.

Signing up for a Ponder + Thuze pilot means you will get free access to a Thuze account (no textbook purchase necessary).

With Thuze, Ponder works on ePubs – here, the Federalist Papers – just as you would expect.

DIY Textbooks

But wait, there’s more. In addition to hosting its own texts, Thuze built a platform that allows instructors to publish their own compilations of text (e.g. EPUBs of any works in the public domain available on Project Gutenberg, Feedbooks or MobileRead), or simply author documents through its editing interface.

Ponder works on Thuze in the web browser as you would expect, with the same in-context aggregations of student reactions to the text.

The big new feature in Ponder + Thuze is that we can now roll up student activity by chapter and section. The nicely paginated reading experience on Thuze doesn’t hurt either.
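
Conceptually the roll-up is simple. The sketch below is only an illustration – the record fields are made up, and the actual Thuze integration surely differs – but because every micro-response in an EPUB can be attributed to the spine item (chapter file) it was made in, summarizing by chapter boils down to a group-by:

```python
from collections import defaultdict

# Hypothetical annotation records; "chapter" holds the EPUB spine item
# (chapter file) in which the micro-response was made.
annotations = [
    {"student": "amy", "chapter": "OEBPS/federalist-10.xhtml", "sentiment": "What's the evidence?"},
    {"student": "ben", "chapter": "OEBPS/federalist-10.xhtml", "sentiment": "I'm persuaded."},
    {"student": "amy", "chapter": "OEBPS/federalist-51.xhtml", "sentiment": "I'm skeptical."},
]

def roll_up_by_chapter(annotations):
    """Group micro-responses by chapter and tally activity per section."""
    by_chapter = defaultdict(list)
    for note in annotations:
        by_chapter[note["chapter"]].append(note)
    return {
        chapter: {
            "responses": len(notes),
            "students": sorted({n["student"] for n in notes}),
        }
        for chapter, notes in by_chapter.items()
    }

for chapter, summary in roll_up_by_chapter(annotations).items():
    print(chapter, summary)
```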

Sign up to try it out!

If you’re interested, tell us a bit about your class and some examples of the EPUB texts you will be using.

Learnings from the pilots will (of course!) be incorporated into the product and released for everyone.

Semester End: Class Archiving and Cloning

Happy New Year!

We just made a small but important feature release that Ponder teachers who are preparing for their spring semesters will appreciate:

Class archiving and class cloning!

You can now archive your classes!

You will notice a couple of new buttons in your class settings. The first is the “Archive” button. At the end of each semester, you will want to archive your classes. You and your students will still have access to all of the activity from the semester, but it will be put into storage, frozen in time, to make room for new classes for the upcoming semester.

Archiving a class does the following:

  • The class is moved to the “Archived” section of the class selector
  • The class is marked “Archived” in the class settings control panel
  • Students can no longer join the class
  • The class is no longer available in the response box for either students or teachers
  • The class no longer counts against your maximum simultaneous class count

The group selector now has a separate section for archived classes.

We have additional plans for archived classes which we will ship in the coming months, but for now you have the basics. And if you accidentally archive the wrong class, you can of course un-archive it by clicking the “reinstate” button.

Have no fear, you can always reinstate an archived class.

The second important feature we released is class cloning. Many of you have patiently re-created your classes from one semester to the next, or even for multiple sections of the same class. We have heard your calls for help and answered!

Once you have the reading list and themes configured for a class, the clone button will let you create as many copies of that class as you need. For example, many of our teachers teach 4, 5, sometimes 6 sections of the same class. Now, once they have the reading list, course packet and themes configured the way they’d like, they can clone the class for each additional section in seconds, then rename the clones to match section numbers or class periods.

You can now instantly clone all of the settings of an existing or archived class!

Of course, if you accidentally clone a class too many times, you can delete the extras by clicking the “remove” button. As before, if a class has any students or teachers other than you joined to it, we will prevent you from deleting it and losing your data. Once others have joined your class, archive it instead to make room or tidy things up.
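
To make the semantics concrete, here is a toy model of the rules described above – not our actual code, just the behavior in miniature: archiving freezes a class and is reversible, cloning copies settings but never the roster, and deletion is refused once anyone else has joined.

```python
from dataclasses import dataclass, field

@dataclass
class PonderClass:
    name: str
    reading_list: list = field(default_factory=list)
    themes: list = field(default_factory=list)
    members: list = field(default_factory=list)  # students and co-teachers
    archived: bool = False

def archive(c):
    """Freeze a class at semester's end; its activity stays readable."""
    c.archived = True  # no new joins, hidden from the response box

def reinstate(c):
    """Undo an accidental archive."""
    c.archived = False

def clone(c, new_name):
    """Copy settings (reading list, themes) but never the roster."""
    return PonderClass(name=new_name,
                       reading_list=list(c.reading_list),
                       themes=list(c.themes))

def delete(c):
    """Refuse deletion once others have joined; archive instead."""
    if c.members:
        raise ValueError("class has members; archive it instead of deleting")

period_1 = PonderClass("Global Studies", reading_list=["article-1"], themes=["claims"])
period_2 = clone(period_1, "Global Studies - Period 2")
archive(period_1)
```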

Thanks and let us know if you have any questions!

In Flexibility, an Opportunity for Foreign Language Learning

Designing software is a balancing act between generality and specificity.

The more specific you make your tool, the clearer its purpose and how to start using it.

The more generic you make your tool, the more flexible and customizable it is, and the broader the range of scenarios it can support.

En El Año 2019

Someone has translated many of the xkcd comics into Spanish!

For example, right now there’s actually nothing to stop a teacher from using Ponder micro-responses as a way to grade and provide feedback on their own students’ written assignments. Why not? Highlight a claim in their essay and ask for more examples to explicate and support it.

Similarly, Ponder could also be used to guide students through a structured peer-review process where they evaluate and respond to each other’s written assignments.

However, we know that unless we bake an explicit workflow into Ponder to guide students and teachers through these scenarios, it will occur to precisely no one to use Ponder in these ways. We know because no one’s done it!

In the grand scheme of things, Ponder is more on the generic, customizable end of things. Student management is loose (just send your students a link and it’s up to them to join a class!). There is no explicit assignment workflow. (We’re depending on teachers to broadcast assignments and students to receive them “out-of-band.”)

In this respect, Ponder feels more like a general-purpose tool (Google Docs) or a nerdy Twitter than educational software.

xkcd comics, on the other hand, are specific and often piercingly insightful, pretty much the opposite of generic, and quite hard to translate, not just linguistically, but culturally. (Amazingly, someone has done it anyway!)

Software people love generality. However, Ponder’s open-endedness has posed challenges to us on the adoption front. We’re continuously working on ways to get teachers up and running, both in the form of tutorials and FAQs as well as continuous iteration on design.

But we proceed with caution, careful not to undermine the flexibility that teachers appreciate and enjoy once they’ve figured out how to create that first Ponder assignment that gets the class reading together.

Perhaps Ponder starts out as a way of practicing finding topic sentences and supporting evidence in class. Then it becomes a way to identify examples of key concepts in reading assigned for homework. Then it becomes a way for students to bring in examples for class from outside of assigned reading, from reading they’re doing on their own that’s driven by their own interests. Then it becomes a way for students to do research for a paper, or work together on a group project. We’re still waiting for someone to start using it to support content-area literacy in Science and Math classes.

The sky’s the limit, anything could happen…including apparently, teaching Spanish.

Ponder Sentimientos!

We just received our first Spanish sentiment set from teacher Federico Moreno at Sea Crest School in Half Moon Bay, California! As many teachers have said to us, what better way to learn the colloquialisms of a new language than to practice when to say you’re “on the fence” or when someone or something is “over the hill”? It’s early days, of course – the English sentiments have gone through many, many iterations, and we’re only just beginning to understand subsets by student level and subject area. But this Spanish set is a big first, and that’s very exciting!

So, Foreign Language Learners (FLL) and teachers, let us know if you’d like to work with us on sentiments for your favorite language! We have teachers interested in applying it to Latin and French.

We’ve also had teachers point out the now-obvious fact that there’s really nothing to stop people from using Ponder to teach Spanish-speaking students ELA in their native tongue. (Duh, how obvious!)

Spanish teachers, let us know if you’d like to give these a trial run, and send us your feedback – when you create your class, just send a note to support through the “Ask Us” tab on the left of the site, and we’ll drop in the Spanish sentiments for you.

We’re excited for Ponder’s foreign-language learning potential and grateful for all of the teacherly enthusiasm!

Identifying Teaching Moments at the NYC DOE Shark Tank

On Friday we were invited to present at NYC DOE’s Teacher Shark Tank, one event in a series where three edtech startups get 30 minutes each to present and answer questions from DOE teachers.

The Teacher Shark Tank is hosted by iZone, NYC DOE’s Office of Innovation, which “supports schools in personalizing learning to accelerate college and career readiness among our students.”

Ponder is running in many schools across the country this semester, but in our hometown of New York, we are in one NYC DOE school (Stuyvesant H.S.), as well as one NYC charter (WHEELS) and one NYC private school (Trinity School). This was our first opportunity to formally present to DOE educators at a DOE-organized event, so we were excited to be there!

Other presenters included Quill, which has figured out a way to blend grammar learning into an interactive reading experience, and Fast Fig, a word processor for math that enables teachers to cleanly and easily create equations and graphs online – a long-sought-after solution with many applications!

We had a late start, but this didn’t deter the great group of interested and engaged teachers, clearly the vanguard of technology users at their schools (City as School, the High School of Telecommunications Arts and Technology, and P.S. 64, the Robert Simon School).

We wanted to impress this audience in particular. Fortunately, over the past two years of watching classes use Ponder (first graduate business classes then undergraduate philosophy classes then 12th grade English classes and 9th grade global studies classes and now 2nd grade ELA classes!!) we’ve evolved how we present and explain Ponder.

In our presentation Friday, Ben and I focused on one key concept: the speed at which a teacher can review student micro-reading responses. How fast can a teacher review Ponder micro-reading responses, you ask? Real fast. Fast enough that teachers can encourage their students to make as many responses as they’d like, knowing they will have time to grade them all and provide meaningful feedback. In fact, our conceit (which has proven true in higher ed and is starting to prove itself in K12 as well) is that not only will the instructor be able to review everyone’s responses, they’ll be able to do so *before* class starts, and actually use their students’ responses as the basis for in-class discussion.

To prove the point, Ben and I put up four different Ponder micro-reading responses from a single 8th grade class in the Chicago Public Schools system and asked the teachers in the room how quickly they could assess each one.

Number 1: A solid response.

No. 1: Coherent and appropriate.

The excerpt the student chose is coherent, though it doesn’t make a particularly controversial or insightful point. The sentiment s/he applied (“I empathize.”) is appropriate, though not particularly nuanced, nor does it exhibit deeper insight or independent thinking.

Number 2: Exemplary!

No. 2: Real insight and independent thinking!

The excerpt is coherent and interesting, making a surprising, counter-intuitive argument. The sentiment applied is spot on, demonstrating the student clearly understands that the author is making a claim and now needs to substantiate it with supporting evidence.

Number 3: Red Flag!

No. 3: Incoherent and inappropriate.

The selection itself is incoherent. And the sentiment is clearly inappropriate. Either the student is completely lost and doesn’t understand the point of the assignment or is simply not trying at all.

Number 4: A Teaching Moment.

No. 4: What is there to agree about?

This is where things start to get interesting. This is an opportunity for what would pedagogically be referred to as a “teaching moment” – an invitation for further discussion in class. First of all, the selection itself is interesting: the author describes an interaction that is clearly intended to provoke some sort of emotional reaction from the reader. However, the student chose to agree with it – not the reaction the author probably intended! So, why did you concur? What are you agreeing with? What is the idea that you thought emerged from this quote? Or, perhaps, you’ve identified a moment in which the student wasn’t reading very carefully at all, which is valuable in and of itself.

We maintain a long list of ideas on how to better support this process of evaluating reading responses. It changes week to week as we watch our K12 classes settle into how to use Ponder while discovering new uses for it as well.

Still, I think we’ve reached an important milestone in delivering on the promise of providing a way for students to “practice critical reading” while giving teachers a way to respond to and build on that practice.

And, let it not go unsaid: we are lucky to have such thoughtful students and teachers using Ponder that we can so easily find a mountain of interesting responses!