Presentations Involving Physical Things

Let's assume you're in a situation where you have to show a physical object during a presentation to 20 or more people. The current default is showing a static picture of the thing on the projector and possibly holding up the real thing so the first few rows of the audience might get a glimpse. What if I told you there's a better way?

Working in the Internet of Things segment, I'm constantly dealing with, well, a lot of things. Those things tend to be small enough to fit in my hands. Since we're "agile" (whatever that means to you), we present our work in two-week intervals, and since we're a distributed team, we live-stream those presentations all over the place. People may be sitting in one of three offices or even at home while passively watching - or actively giving - the presentation.

In our biweekly reviews we use WebEx; for ad-hoc presentations we may use Google Hangouts, appear.in, Skype, or whatever tool we can settle on. Most of us use Macs, some are on Windows, and everyone has an iPhone or some Android phone.

The goal is to stream what's happening in the room, along with the actual presentation content, to both the projector and the remote clients. The constraints are an uncertain video conferencing tool and a lack of specialized hardware. The solution is so simple, it took us ages to figure it out.

We stream our phones' camera image to our computer screens, which in turn end up on the projector and at the remote clients. In some scenarios you can manage this with built-in mechanisms; in others you'll want to buy a $15 license for ReflectorApp. All you need to do is start the application on your computer, make your phone mirror its screen (using AirPlay, Google Cast, …) and open the phone's camera app. Et voilà: an instant wireless, handheld video camera that lets you include the real world in your presentation. It's so simple, it almost hurts.
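One variation worth sketching: the stock camera app brings its own on-screen controls along for the ride. If that bothers you, a bare web page that shows nothing but the camera feed can stand in for the camera app - open it in the phone's browser and mirror the phone's screen exactly as described above. What follows is a minimal, hypothetical sketch (not the setup we actually used), assuming a modern mobile browser with getUserMedia support and a page served over HTTPS; the file name and function name are made up for illustration.

// fullscreen-camera.ts - hypothetical helper: a viewport-filling camera
// feed with no UI, meant to be mirrored to the computer via AirPlay etc.

async function showRearCamera(): Promise<void> {
  // Ask for the environment-facing (rear) camera where available.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: { ideal: "environment" } },
    audio: false,
  });

  // A borderless, viewport-filling <video> element is the entire UI.
  const video = document.createElement("video");
  video.srcObject = stream;
  video.autoplay = true;
  video.muted = true;
  video.playsInline = true; // keep iOS Safari from jumping to native fullscreen
  Object.assign(video.style, {
    position: "fixed",
    top: "0",
    left: "0",
    width: "100vw",
    height: "100vh",
    objectFit: "cover",
    background: "black",
  });
  document.body.replaceChildren(video);
}

showRearCamera().catch((err) => {
  document.body.textContent = `Camera unavailable: ${err}`;
});

The design choice is deliberate: the phone still does all the camera work and the existing mirroring path stays untouched; the page merely replaces the camera app's chrome with a clean, full-screen feed.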

Granted, we haven't used this technique particularly often, and not at all in months. It's at conferences that I get reminded of this idea time and time again. After shows that feature little things on stage that nobody past the fifth row could see, I try to talk to the speakers and explain that I couldn't see what they were showing off. Never mind that the people watching the recorded talks will likely not see much either. The discussion usually ends in a good laugh about missing the obvious: using your phone as a mobile camera to get your physical interactions onto the big screen.

The same thing happened again just now - finally prompting me to write this post. I saw Tim Pietrusky showing his NERDDISCO at KarlsruheJS; luckily I was sitting in the first row and filmed a bit of what I was seeing.

The show was partly about what was being projected - a web browser's canvas reacting to audio and taking additional input from a MIDI device - and partly about the comparatively tiny LED cube Tim built and programmed to visualize sections of the browser's canvas. Tim created some fantastic stuff right there. His show was a far cry from bad. But people sitting 15m away from the pedestal couldn't enjoy what I was seeing.

Here's a screenshot of how streaming your phone's camera to your computer screen might look in real life.

Phone's camera mirrored onto computer screen

Ignore the camera app's interface (or swap in the bare camera page sketched earlier) and focus on the content. Ignore that you'll have to leave your fullscreen presentation for a moment. Ignore the little imperfections and focus on what you just gained: everyone is able to see. A screencast will contain not only the presentation, but also the real-life thing.
