EdSurge Silicon Valley Summit: Gamify the Conversation

For better or worse, the Valley has become a rather glam place, in an Iron Chef! kind of way.

The grumpy old man in me is often wistful for the days when credibility was measured in the age and wear of your company t-shirts (and when everyone got the same tent-cut XXL Hanes t-shirt regardless of the size and shape of the wearer).

EdSurge’s Tony Wan, making it look easy!

I’m happy to report that this past Saturday the get-your-hands-dirty pragmatism I fell in love with when I first arrived in the Valley was out in full force in Mountain View, CA, albeit with better fitting t-shirts. The t-shirts were green and they read: “Keep Calm and Read EdSurge.”

A few weeks ago, I wrote about a courageous first effort at enabling real educator-edtech conversations in Rhode Island. There were problems, potentially event-killing catastrophes, but by thinking on their feet, the EdSurge, Highlander Institute and EdTechRI crews adapted and iterated in real time and pulled off a successful event. There were clear lessons coming out of that first event, and I was excited to see how “agile” EdSurge could be in how they ran events.

This was the basic setup…

Ready to Ponder at 8AM

  • A spacious room of large round tables, one per company, each surrounded by chairs and loosely grouped by product focus
  • Power strips at each table, with tape for the inevitable cable undergrowth
  • Surprisingly solid wifi, especially given the attendance
  • Efficient lightning talks in a separate, yet adjacent, auditorium

These structural changes got the whole thing moving…

  • No parallel workshops!!! More on this in a moment.
  • A team of easily identifiable green-shirted people assigned to specific companies to match-make educators with technologies.

One final stroke of genius put it over the top…

A raffle for teachers, with real prizes (a bunch of Surfaces, iPads, Chromebooks, Lynda.com subscriptions, no keychains, no squeezyballs). But this was no fundraising raffle. The way teachers obtained raffle tickets was by providing feedback to the companies. Let me just repeat that so you can fully absorb what a great idea it was. (I’ll make the font size really big too.)

The way teachers obtained raffle tickets was by providing feedback to the companies.

Technically, it worked like this: I was supposed to give them one ticket for filling out a short 1-2 minute survey at my table, and they could get three more tickets by filling out a longer survey at the main EdSurge table island in the center of the room.

In reality, EdSurge had successfully gamified cross-discipline conversation. Once the game was clear, everyone knew how to play.

Gone was the awkward mingling, the Who’s-going-to-say-something-first? interactions. Everyone had a non-stop flow of teachers coming up to them with a clear agenda: “Tell me about your product. I teach ____ to grades ____ and I want to hear how you think it could be relevant to me and my students.”

Really, no workshops?

One of the concerns I had as the event planning was coming together was the ballsy decision to not have any PD-type workshops at the summit. I had a secret fear that without that incentive many teachers would not make the trek to come to the conference at all. My fear was all for nought. I’m curious what the final EdSurge numbers will be, but I spoke with well over 100 educators in a constant stream from 9AM to 4PM with just one break for the lunch panel. I’m not sure how to extrapolate that over the 30 vendors, but the place was packed and filled with energetic voices and enthusiastic discussion.

My admittedly self-serving theory is that by putting a stake in the ground and saying:

“This event is dedicated solely to bridging the communication gap between educators and edtech so ‘Conversation’ is going to be its sole activity.”

EdSurge showed everyone involved that they really meant it. And in my experience teachers are some of the best Do-you-mean-it? detectors out there. Still, I think that took guts, and it paid off – “Conversation” was center stage the whole day.

The kids will be alright.
There’s just one last thing I need to sing the praises of, and then I promise I’ll get more critical. The lunch panel featured 10 kids from roughly 1st through 12th grade. It’s a “stunt” which I have seen attempted at various other events, and I call it a “stunt” because that’s how it often comes off. EdSurge, however, managed to pull it off. The moderator, Chris Fitz Walsh from Zaption, did a great job of asking questions, and the kids they picked provided real insights beyond the usual “I like to tweet so schools should tweet too.” Judge for yourself:

“My teacher has a little microphone clipped to her shirt so everyone can hear her clearly even when her voice is tired.”

Huh, I would never have thought of that.

So, no room for improvement?

Just to shore up my plummeting credibility, here’s a list of complaints:

  • There were so many teachers that it became unrealistic for them to both talk to me and then fill out the 1-2 minute survey at my table in exchange for more raffle tickets, so I simply started handing them out to whomever I spoke with. I like the idea of eliciting more frank, quantifiable feedback through the survey. But there simply wasn’t enough room for teachers to fill out the survey without creating a bottleneck for conversations. Perhaps if the raffle ticket was attached to the survey itself, then I wouldn’t need to play “enforcer” for whether a teacher “earned” a raffle ticket. The problem with that would be people filling out the survey without even coming to the table…maybe there isn’t a solution.
  • We were almost shouting over one another to be heard. (I know, a good problem to have!) It was definitely a function of attendance levels, but paying attention to the acoustics of the room might have helped manage noise levels. Still, a noisy room is an energetic room, so in the end, it probably did more good than harm.
  • I didn’t have a great way of keeping track of all the teachers and schools who came by to talk, other than giving them hand-outs or having them scribble emails on a sheet. Because we’re still in pilot mode for K12, we’re trying to be extra hands-on with our teachers, and I was worried the whole time that I wouldn’t be able to follow up with all the people I was talking to. This seems like a very solvable problem though.

So what’s next?

Now that EdSurge has a template for this kind of event, I’m hearing whispers that lots of other cities around the country are requesting their own version, and my guess is that EdSurge will deliver. But what about new templates to address other gaps in edtech? Creating a “conversation space” where educators and technologists can talk is just a first step.

Chatting afterwards with Idit Harel Caperton from Globaloria and EdSurge’s Mary Jo Madda, I heard Idit suggest a principal/administrator/budget-decision-maker focused event, which at the very least this edtech startup would find incredibly useful.

We are a few weeks from sending a survey out to our K12 pilot schools about our pricing plans. We’re still struggling with a conundrum: teachers love the idea of using Ponder, but rarely have any personal budget to pay for it at a price that would sustain the service longer-term. I’m sure we’re not alone.

Unlike professors in higher ed, K12 teachers lack the personal agency to purchase tools. Yet they, more than principals and administrators, know which tools would actually be useful. I also suspect edtech companies need to address structural issues to be convincing to budget-decision-makers:

  • We need a tidy answer to the question “Does it work?” in the form of objective efficacy validation. (We’re working on it now!)
  • We need tried-and-true best practices to help manage the risks of trying new technologies in classrooms that have little room for wasted time.
  • We need to reward teachers and schools for taking risks and being open to experimentation. From our perspective, we learn far more from failures (e.g. the technology made no difference) than from unmitigated successes.

I know, this should really be a separate post.

I’ll just wrap up and say: Thank you, EdSurge – your hard work and attention to detail showed. We’re not just reading, we’re looking forward to more, and following your lead!

Facilitating Conversations with Educators: Learnings from EdSurge RIDE 2013

This past Saturday was the 2013 Rhode Island Department of Education (RIDE) Technology Conference at the Rhode Island Convention Center in Providence, RI. It promised to be and, through the intervention of several individuals, became a unique forum for highly interactive ad-hoc discussions between educators and technologists.

In the days leading up to it, the logistics of the event sounded downright awful: 700 teachers already signed up for sessions parallel to the discussions, uncertain wifi, no electrical outlets, and we would have to close down our table for lunch, because the discussions were to take place where everyone was going to be eating and trying to watch the lunch panel.

Overblown fears of an empty, electricity- and wifi-free room (via Instagram)

It’s worth noting that at Ponder we have long since learned that trying to talk to a teacher meaningfully about their classes and students while mousing around on a tiny laptop screen really doesn’t work. Instead, we have a screencast that walks through some of the core functionality of Ponder on a fairly short loop, which we run on a 24″ display. This way, it is possible to have a conversation with someone and point out functionality to answer their questions when it comes around on the loop. So, the whole no-electricity issue meant that the conversations would be much less productive.

Even my educator friends who long to bridge the gap between educators and technologists asked me why we were bothering to go.

I explained that as crazy as it sounded, this was one of the best opportunities we had gotten to meet with a lot of teachers in a setting where we could actually learn about their classes and try to brainstorm how Ponder could fit into them.

Optimistic, I ordered a big battery pack from Amazon that promised 3.5 hours of running the screen, which was more than my dying laptop battery promised, and hoped for the best.

When I arrived, I walked around the third floor of the convention center, which featured the big company vendors’ displays, some schools talking technology integration, and the main “keynote” stage and seating area. Teachers were seated with coffees listening to the first panel. The wifi was working despite expectations to the contrary, but the seating was in a giant open convention space, no walls or power outlets in sight.

At the end of that first panel, Shawn Rubin from the Highlander Institute made a valiant attempt at explaining that if the teachers stayed where they were, they would get the opportunity to provide feedback to the edtech companies that had come to talk to them. Sadly, when he had finished, almost all of the teachers stood up and left to go to the workshops upstairs they had previously signed up for. Disappointing. There were a few holdouts who had us to themselves, and some teachers who waited at the periphery of the hall, seemingly unsure of how to engage.

To an audience largely of other edtech companies, the first half of the companies took turns giving three-minute presentations about their technologies.

But then the organizers stepped in to fill the gaps and things started to turn around.

As lunch approached, Dana Borrelli-Murray from Highlander Institute came around to the companies and told us to ignore the plan to shut down our tables for lunch. This was a huge time saver, and allowed us to engage teachers in the gaps during lunch and the lunch panel.

The second half of the companies, which we were lucky enough to be part of, were scheduled to present after lunch, and due largely to another impassioned plea from Shawn to the crowd of teachers, many more teachers stayed to listen. Also helpful was EdSurge bringing up one of the few teachers who had met with many companies in the morning to speak briefly to the crowd about her (positive) experiences meeting with us.

Working from what he had seen in the morning, Shawn spent the second half of the day personally corralling and categorizing teachers, then bringing them in small groups to see companies that were appropriate to their subject areas, grade levels, etc.

For the remaining 2-3 hours of the day, every company was engaged in discussion with group after group of teachers. HUGE SUCCESS.

And as luck would have it, I found an outlet inside the stand for some nearby speakers and snuck an extension cord to it to extend my power setup.

In the end, the event worked so well that, combined with the lessons learned around logistics, I think EdSurge RIDE provides a template for future events. (Perhaps even the upcoming EdSurge Silicon Valley Summit.)

Here is my template:

  • Dedicated time in the conference schedule for company-educator conversations (ideally not in competition with PD workshops)
  • A large room with discussion tables for each company clustered near the rear, with a presentation stage at the front and rows of audience chairs facing it
  • Each company table should have a sign visible at a distance
  • The tables might even ideally be organized by subject area and/or grade level
  • Lightning-talk style presentations to the audience chairs at the front of the room, with volume adjusted so as not to stifle the discussions in the back
  • Power outlets at each table and working wifi
  • Last, but possibly the single most important: A troop of Shawn Rubin-style matchmaking valets to take uncertain teachers from the audience chairs or the edges of the room to specific, appropriate companies.

U of Minnesota’s Active Learning Classroom (MPR Photo/Tim Post)

Bonus points for a setup along the lines of the University of Minnesota’s active learning classrooms.

And yes, the EdSurge RIDE Summit was absolutely worth attending. I met with teachers from at least a couple dozen schools, some of which were great fits, others that were not. I usually had several minutes to explain to 2-3 teachers at a time what Ponder does and what sorts of schools, levels, and subject areas we were looking to collaborate with. The teachers then took 5-10 minutes to ask questions and talk about their classes and the tools they had tried using.

Key pieces of intelligence from the field:

  • There are still Internet-Explorer-only schools. (I tried to give teachers some arguments to offer their IT departments for trying Chrome or Firefox; Internet Explorer no longer allows most types of extensions, and developing extensions for older versions is a huge pain.)
  • There are also no-browser-updates-allowed schools. Teachers had already run into issues where some sites wouldn’t work because they were stuck on old versions (an issue I wrote about a few weeks ago).
  • We also discussed issues around complying with COPPA when doing class-list management for younger students.

New and noteworthy Pondering:

  • Content area literacy: I think we found our first K12 teachers interested in using Ponder for Math and Science literacy!
  • Latin! A Latin teacher is interested in using Ponder with his students to help break apart Latin grammar. Looking forward to brainstorming about this further.

All in all, thank you Rhode Island Department of Education, thank you Highlander Institute and thank you EdSurge, for a great effort in bridging the educator-technology gap and allowing us lots of great conversations!

 

The Risk of Data Monopolies in Education

In EdTech, we take for granted the drive for efficiency and innovation generally associated with private industry. Occasional hiccups are often dismissed as the short-term cost of longer-term innovation. The recent surge of investment has escaped being lumped in with still-warm education privatization failures.

Everyone working in education right now agrees that the primary goal of EdTech as a business is to improve educational outcomes; profit is secondary, a means to that end and not an end in itself. Despite these good intentions, the devil is in the details, and a key challenge is the lack of consensus on how to measure student, teacher and school performance.

Recently I wrote about the idea of standardizing the certification of educational technologies, and on Tuesday, September 17th, an audience question came up during the NYT Schools for Tomorrow conference that raised what may be a key next discussion in this debate:

Who will have access to all of the educational data being captured through online learning services?

At Ponder, we have debated this issue at great length. Our answer? Something we call Fair Trade Data. But I’ll save that for a later blog post.

Time to think about monopoly in education data (courtesy of xkcd)

Instead, let’s examine that question, which was posed at the end of the “Gamechangers” panel, moderated by Ethan Bronner from the New York Times.

Here are the panelists, from left to right, with the bios provided by the conference:

Daphne Koller, co-founder, Coursera
Michael Horn, co-founder, The Clayton Christensen Institute for Disruptive Innovation
Alec Ross, senior advisor on innovation and former senior advisor to Secretary Hillary Clinton at the U.S. State Department
Paula Singer, C.E.O. Global Products and Services, Laureate Education

At about 40:20 in the video, you will hear a question from Lee Zia of the National Science Foundation.

“One of the promises on the Big Data side and connecting [sic] directing to learning is the opportunity to share that data. And I worry a little bit that there aren’t any incentives for the various aggregators and collectors to do that sharing. So what is the Federal role if there is one, to try to move us in that direction, not to set the standards, but to get the community together to form those standards, so you might even see something like a national weather service, where the weather data is free, but commercial entities can compete on the services they provide on that data.”

The moderator, Ethan Bronner, passes this question to Alec Ross “…because he works for the Federal Government…”

[~41:10 in the video] Alec Ross responds that (paraphrased) ‘the federal government should keep out of it’; we need to let the market work on the “supply-chain” for delivering education, where those that “build the broadest and strongest partnerships will win the standards war.”

I can’t be sure what Lee Zia was looking for from the panelists, but I think it was more than ‘Should the federal government create data standards?’ And Alec Ross’ answer felt dismissive. It’s also possible that I simply wish he had asked the question that I wanted to ask:

Will all this amazing education data, often generated by private entities, go the way of the giant troves of insurance, credit card and internet data that are kept out of public hands in the name of trade secrets and competitive advantage?

Contrary to Alec Ross’ statement, I think the federal government should get involved, if only to steer the industry in the right direction. Ownership of this data is a pressing concern for everyone involved: students, teachers, schools, parents, policy makers, investors and technologists.

The NYT conference was laced with generalizations about the masses of priceless data being captured that would soon revolutionize learning. While the value of that data is still to be proven in education, I think we would all agree that

  • We want students to be able to switch freely between educational providers (something that is already a challenge in offline education);
  • We need education providers (public and private) to provide their data openly to outside parents, evaluators, auditors, researchers and the like to validate their claims;

I hope, though I’m less sure, that we could all agree that

  • We want companies to compete on how they make use of student data, not on merely having the largest stockpile of it.

So, where is that data, right now?

To pick on the conference panelists, the data is sitting on their companies’ servers.

While it is early days yet in terms of the accumulation of truly “Big” education data, I’m not aware of data from any of these services being generally available to the public.

We need to start a serious national conversation about educational data, and set expectations on transparency before too many investment dollars get put behind ventures whose perceived value may rest significantly on the proprietary data they will accumulate. We ought to be strategic about managing the control of this data before it’s too late.

EdTech Integration Troubleshooting: Software Updates

Ponder is excited to have the Washington Heights Expeditionary Learning School (or WHEELS) as one of our K12 pilot schools. InsideSchools.org was right when they described their faculty as “dedicated.” WHEELS is an impressive school. Yesterday, as Ponder’s Founder and CEO, I went to WHEELS to help troubleshoot some challenges their teachers faced when using Ponder on school technology resources. I think the challenges they face are common for academic technologists trying to deploy new tools. Kudos to our friends at WHEELS for working through this rather than deciding it would be simpler to give up! Hopefully our experience yesterday can be helpful to all of you as you confront similar situations.

xkcd: All Adobe Updates

Courtesy xkcd

 

Pacing school hallways around the world there are solitary school IT managers asking themselves “To Update, or Not to Update?”

In my experience, schools rarely have the IT support they need, and teachers are often left to deal with updates, security warnings, unnecessary toolbars, draconian web filters, and more, all interfering with basic uses of technology in class. This post illustrates the problems caused by not updating, while hopefully not ignoring why updates are annoying, and then offers some recommendations for schools and EdTech software developers.

The faculty at WHEELS had their classes set up, had planned integration with the curriculum, and their students had successfully registered their Ponder accounts in the classes – everything went as planned. Then they discovered that on the shared student laptops, Ponder’s add-on wouldn’t install. Instead they got the following cryptic message: “An error has occurred: There was a problem adding the item to chrome. Please refresh the page and try again.” See screenshot. Needless to say, refreshing the page and trying again had no effect.

“An error has occurred: There was a problem adding the item to chrome. Please refresh the page and try again.”

 

Ponder has run on various hardware and OSes for over 1,700 students at more than a dozen institutions, so it was most likely a configuration issue specific to their setup rather than a general bug. Remotely it was difficult to tell what was going on. Aside from the error message itself, we knew:

  • Ponder was working and installing on the faculty computers without issue
  • The school had separate student email accounts with Google

We were not sure:

  • What hardware they had
  • What OS version or Chrome version they were using
  • Whether the machines were managed with a Google admin account
  • Whether we would be able to load an “unpacked” or debug version of our extension on their machines to test

Tony and I brainstormed together the following possible explanations:

  • Extension installation at the machine level completely blocked
  • Chrome version so old that we didn’t support it (<17)
  • Google account management in place with restrictions that prevented our installation
  • A bug in our client

For some reason I latched on to the idea that the school had simply disallowed the installation of browser add-ons for student accounts, but I got on the A train to 181st Street with my laptop and a USB stick with:

  • a link to google’s simple “high contrast” add-on which we surmised had fewer requirements specified in it’s manifest, and therefore might be allowed to install.
  • a link to a fairly complex add-on that would probably trip more restrictive management settings – – if this worked, it might suggest that there was in fact a Ponder bug.
  • The uncompiled “unpacked” source of the latest version of our extension
  • A slightly older version of the Ponder extension that was missing some recent functionality (around extending PDF support) that we thought could conceivably cause a problem.

When I got there, I quickly learned the following:

  • The school tech guy confirmed that the students’ Google accounts were not managed, and that they didn’t depend on students logging into Chrome. This ruled out the Google-account-level admin configuration issue.
  • They were running on recent Apple laptop hardware.

I attempted the install of our client with the Chrome console open (Command-Option-J on a Mac) and saw the following: “[INVALID MANIFEST]”.

We had seen this before, and it’s Chrome’s way of saying that the extension has a configuration element that it doesn’t recognize. This immediately suggested that they were on an old version of Chrome.

I checked, and indeed they were running version 20. In the past we had only seen issues when people had versions below 17, but the current version is 29, so an outdated browser still seemed a likely culprit. The tech guy explained that the laptops were imaged by the NYC Department of Education before they arrived, which explained why the browser was so old. So, no big deal – we could just update them.

I was instant messaging with Tony, and he had a great idea which I put into motion while we were waiting for the machines to update. I loaded the debug version of our add-on locally and got a more detailed error:

Could not load extension from ‘/Users/student/Desktop/0.10.0.13.prod’. Invalid value for ‘content_security_policy’.

Following Tony’s plan, I removed the content_security_policy line from our manifest and re-installed. This time it worked without a problem! The “invalid value” error message suggested that their version of Chrome interpreted “content_security_policy” differently than the current version does. Ah-hah!
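For readers who haven't poked at extension internals: the field in question lives in the extension's manifest.json. The diagnostic step above can be sketched in a few lines of Python; the manifest contents here are illustrative stand-ins for the shape of the problem, not Ponder's actual manifest.

```python
import json

# Illustrative stand-in for an extension's manifest.json -- not
# Ponder's real manifest, just the shape of the problem.
manifest = {
    "manifest_version": 2,
    "name": "Ponder",
    "version": "0.10.0.13",
    "content_security_policy": "script-src 'self'; object-src 'self'",
}

# Their Chrome (version 20) rejected our content_security_policy value
# as invalid, so dropping the field let the unpacked extension load.
manifest.pop("content_security_policy", None)

print(json.dumps(manifest, indent=2))
```

Dropping the field was only a diagnostic step, of course; shipping without a content security policy is not a fix we would want to keep.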

Unfortunately, Chrome wouldn’t update, and gave us “error 12”. I investigated a little bit, but the tech guy was more pragmatic – he simply downloaded the latest Chrome .dmg and ran the installer successfully. With the latest version of Chrome, Ponder installed successfully and they were on their way.

Problem solved! WHEELS was now back on track. ;-)

While WHEELS’ students were now ready to go, this was obviously an important lesson to learn so that we could avoid this issue with other schools. When I got back to the office, I had two follow-ups I was curious about.

First, earlier this week I had read an article on EdSurge explaining to teachers how to minimize logistical hassles when implementing blended learning. One bullet point that caught my eye was a pet peeve that I had flagged on Ponder to other EdTechers almost as a joke. It said, “Avoid interrupted class time by making sure you turn off automatic updates on each of your computers,” and recommended using Deep Freeze.

I wanted to write this blog post to shed light on the challenges this mentality creates – it isn’t as simple as turning off updates.

Minor tangent alert: Unpatched machines mean security holes, security holes lead to infected machines, and infected machines infect more machines. Infected machines become part of zombie computer botnets, which are used to steal people’s identities, send spam, execute denial-of-service attacks, etc. Having worked on the Microsoft service that manages patching all Windows computers over the Internet, I find this a bit of a sore spot, but I digress.

Perhaps more relevant to this scenario, it also means that developers making software that works with Chrome will primarily be testing against the most recent version. The likelihood of them testing against the last dozen versions of your browser is small, because it’s a LOT of work. That said, because updating school laptops (which are closed and asleep on laptop carts when they aren’t actively being used in class) is a logistical problem I don’t see a good answer for, one can’t really fault the schools for struggling here.

For school tech folks, some starting points that may or may not be useful: for Windows, check out WSUS (Windows Server Update Services) and System Center; for iOS, the Apple Configurator might help; for Macs, OS X Server has NetInstall, though I don’t know much about it.

As far as I know, none of these tools will open each of the laptops on your cart, patch them and then put them back to sleep for you.

The second thing I wanted to investigate is how, for the umpteenth time, we could try to find out ahead of time when Google makes changes that are not backward compatible, and better manage our release process. This is especially relevant to the software update issue, because

Google assumes the exact opposite of the EdSurge recommendation: that everyone is updating constantly all the time.

It turns out, the addition of the “content_security_policy” manifest field is called out on the “what’s new” site – but at version 14! So something had changed about that field between version 14 and version 20, where the WHEELS laptops were, without any indication on the developer status page. And, now that I looked more closely, the page hadn’t been updated since version 26, and we’re now at version 29.

What especially got me about this issue is that all extensions are installed through Google’s servers. That means that Google’s servers happily passed our extension to an old version of Google Chrome, which couldn’t read the manifest, with ZERO FEEDBACK.

No feedback to the schools trying to install it, and no feedback to us, the developer, that our extension was failing installation on a certain version of Chrome.

Clearly it would be too much to ask that when we submit our app to the store, Google preemptively validate which versions of Chrome it will work with. By contrast, the gate-keeping mechanism in the Apple App Store at least has a concept of per-platform versioning.

My recommendation to schools is a twist on the EdSurge recommendation: update your computers on a predictable schedule. If your IT folks can prevent updates from popping up in class, and instead take a “Patch Tuesday” approach to getting machines updated, maybe on a rotation (for example, one cart per week), you will hopefully minimize short-term disruption and also avoid these larger compatibility issues.

And, fellow Chrome developers, remember,

1. Google assumes everyone is updating Chrome all the time.

2. People, especially schools and businesses, hate updating because it is constantly interrupting them, and they see little value in it; generally they try to prevent it from happening. In short, schools are NOT updating Chrome all the time.

3. Because of the previous two lessons, this is a royal pain for developers. Google should figure out how to make updating less painful and more pervasive in schools, and should work harder in the meantime to help developers manage compatibility with its ever-changing versions. Until then, as a developer, be very aware of what APIs you are using and what you require in your install manifest – the less surface area for change, the better!
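One manifest-level safeguard worth knowing about is Chrome’s “minimum_chrome_version” key, which tells the Web Store to refuse installation on browsers older than the stated version with an explicit message, rather than failing cryptically the way our install did. (The cutoff below is illustrative, and this assumes the field behaves as documented.)

```json
{
  "minimum_chrome_version": "21"
}
```

It doesn’t solve the underlying compatibility churn, but at least the failure becomes legible to the school on the other end.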

(The cynic in me has to imagine that Google’s education folks are focused more on addressing Chromebook deployment issues.)

Hopefully by drawing some attention to these issues we can avoid them in the future. Good luck!

Certifying EdTech for Educational Efficacy: An FDA for EdTech

Today I want to share some thoughts we’ve had recently about measuring educational efficacy. The goal of the EdTech industry is to harness technological advances that have sped many other industries forward to improve educational outcomes. This requires collaboration between education practitioners and technology practitioners; a sharing of their separate expertise, in a complex, iterative back and forth.

An FDA-approved blackboard. (Einstein’s blackboard By decltype [CC-BY-SA-3.0])

Proof Needed!

As you may know, Ponder is a critical reading tool designed to scaffold the individual critical reading process as well as support a collaborative, social group work experience. Our guiding hypothesis is that we can help students improve these skills with a service that enhances the reading they are already doing for class. Ponder scaffolds their critical thinking and facilitates collaboration around those readings. But the fact that the service performs those functions doesn't necessarily mean it achieves the intended goals; only a carefully designed study can demonstrate that.

Proof is needed for any educational tool.

We all must prove that the student's learning experience is improved by the addition of tool X into their classroom. Agreeing on what "improving the student's learning experience" means exactly, or how to measure it, is outside the scope of this blog post. However, I think we can all agree that when a school is considering purchasing a new tool, they need to know that it does what it says it does (e.g. makes a positive academic impact).

Currently there is too much marketing mumbo-jumbo flying around, and districts don't have the expertise or the resources to certify companies' claims. That certification needs to be derived from a test. Considering how many unproven tools there are, we urgently need a good testing and certification process.

(On a side note: because it's easier to measure, logistical concerns around tech integration often trump educational impact when comparing technologies. "Will this tool work with the other tools my school uses?" is not an unreasonable question, but it should not be confused with educational effectiveness.)

So, how do we create a good testing and certification process?

Testing and Certification

There are two big challenges with running studies in K12:

  1. Minimizing learning disruptions to the current students who are part of the test.
  2. Maintaining the objectivity of the organizations and individuals involved in administering the study.

First, testing a tool inevitably brings with it the risk of disrupting learning. While to a certain extent that is unavoidable, schools need help managing that risk so that the testing process doesn't do more harm than good. Without testing, no one can find out what works; without a testing facilitator to take responsibility for controlling the chaos, schools will rightfully remain reluctant to try new technologies. As a school, I would be skeptical of any company's claims of minimizing disruption in my classrooms. After all, the company is likely more concerned with running its study than with the trouble it may cause the students in it.

Second, administering a controlled study requires disinterested, skilled professionals to design the study and then evaluate the results. It requires time from school personnel and may require new infrastructure to support the study. Those requirements cost money. Schools should also be skeptical of a study that was paid for by the company behind the product being tested. Even well-intentioned people running a test on a product they have a financial interest in will have difficulty being objective; not-so-well-intentioned people may use money to influence the outcome of the study. To address these two challenges, I think there needs to be an objective third party whose ultimate success is measured in student performance improvements.

We Need an Objective Third-party

Let’s call that entity an EdTech “broker”. It’s a bit like an FDA for EdTech: the US Food and Drug Administration (FDA) provides essential objectivity to the food and drug industries, and a similar body is essential to the success of EdTech.

This broker will create a level, merit-driven playing field that focuses corporate education investments on impact rather than marketing.

So the next question is, who covers the costs of the broker? At first blush one might think it should be the companies. After all, the study is on their product, they are potentially wasting the school’s time, and they stand to make money if it is proven effective. I don’t think that will work. If, as a society, we have decided to apply technological innovation to improving the free education provided to all children in the US, then we should invest in creating the political, technical and managerial infrastructure to facilitate the evaluation of these educational technologies.

Some core tenets of such an entity:

  • Standard evaluation processes
  • Transparency of funding sources
  • Shared governance by educational and technological stakeholders
  • Political non-partisanship

Some process ideas:

  • Low processing fee to filter out spurious test applications but not so high as to make venture backing a pre-requisite
  • Guidelines around maximum participation for a given student to minimize disruptions
  • Streamlined re-certification process for technology updates
  • Non-financial benefits (access to additional infrastructure?) to participating schools to encourage participation
  • Publicly available study data for public learnings
  • Low-friction certification verification process (and/or policing of certification claims)
  • Marketing/distribution opportunities for certified products

Some ideas for who should be involved:

Who or what are we missing? We’d love to hear your thoughts!