SXSW 2013: You Suck, Cleantech: How Design Can Help

David Merkoski
Partner & Chief Designer
Greenstart

Presentation Description

Cleantech is dead. Investors have dropped out and entrepreneurs are more interested in making apps. What happened? Why didn’t the same spirit and capital that created the internet grow renewable energy and other “clean technologies” to scale? With 93% of the world’s fuel supply still coming from burning stuff we dig out of the ground – it’s a question that matters. So what do we do now?

David Merkoski, the former Executive Creative Director of frog design and current Chief Designer at Greenstart, has a plan to “bring sexy back” to cleantech by ultimately using design (service, product, brand and business) to fight climate change and blow apart our energy constraints. Join him for a look back at the mistakes made in the cleantech design track record, and leave with a vision of how we can design a cleaner world. Long live clean tech.

Presentation Notes

The problem with Cleantech is that nothing as dramatic as hoped has happened in the last 10 years. It has done poorly because it failed to capture the imagination.

The Cleanweb: a term invoked last year at SXSW 2012. It’s the idea that software can help clean up the world. Cleantech is a collection of things; Cleanweb is a connection of things, and it matters because it is built on the hardware and software of the internet. The argument is that the same services that gave us Facebook and the like have the potential to do this.

Cleantech is there, but until people are aware of its uses and actually start proactively pursuing saving the world, nothing is going to happen. That’s where Cleanweb comes in: it’s our way, as designers and developers, to build innovative ideas that help people make the jump. We’ve been saying that simply turning off a light switch will save the world, but in all seriousness it won’t. However, automation that turns off every light in your entire house just might. Think Cleanweb.

SXSW 2013: Extreme GPS: Limits of Security & Precision

Todd Humphreys
Assistant Professor
The University of Texas at Austin

Presentation Description

GPS has its limits. My students and I at the University of Texas Radionavigation Lab work to find them. For 20 years, GPS was so reliable it became navigation and timing crack for engineers. We all got addicted. We put it in our phones, planes, power grid, comms networks. But there are limits.

My students and I bought an $80k helicopter drone a few months back and pushed its embedded GPS receiver to the extreme. Turns out, you can hijack one of these drones by perfectly aligning fake GPS signals with the real ones. And you can do it from miles away. We grabbed the world’s attention at White Sands in June. Our demo has changed the national conversation about integrating civil drones into the national airspace.

We want to probe the extremes again, only this time in precision. Surveyors already have hyper-precise GPS; we want to show how this can be commoditized, put in your cell phone, overlaid on the world. We want hyper-precise augmented reality.

Presentation Notes

On May 2, 2000, President Clinton ordered that civilian GPS accuracy be improved from roughly the size of a football field to what it is now.

GPS jammers are one way to keep people from tracking you, since nowadays you can easily slap a tiny GPS tracker on anyone or anything. It is illegal to use a GPS jammer (to turn it on), but not illegal to own or build one. They draw about as much power as a 30-watt light bulb.

Texas has sensors along major bus routes to detect whether drivers are using jammers. Strangely, they see about five different people per day driving through Texas with jammers running. Jammers are sometimes used by people smuggling drugs or other illegal goods.

On December 4th, 2011 a missing drone was reported. It later turned up in a gymnasium in Iran. So what happened? An engineer who claimed involvement in the attack said it was an electronic attack that spoofed the drone into thinking it was somewhere else.

GPS spoofers don’t just jam GPS signals; they feed a receiver counterfeit signals that convince it that it is somewhere else. This is possible because the government decided not to encrypt or cryptographically sign the civil GPS signals, so they are fully open.

Todd and his team of students got the go-ahead from the Department of Homeland Security to build a GPS spoofer and take over the coordinates a UAV thinks it is at. However, DHS did not provide the UAV, so they had to find funds to buy a used $80K UAV helicopter. They were able to do this and take over the UAV; without that permission it would have been illegal. They now have a contract with the US government to work on “spoof-proof” specifications for any UAV over 18 lbs.

Augmented Reality

Augmented reality is a good concept, but current systems all require some sort of label or marker on an object to actually work. Even Google Glass just slaps layers on top of your view of the real world, and doesn’t meet these requirements:

  • A true 3D immersive experience
  • Virtual 3D elements that look and behave like real elements
  • Absolute cm-level registrations
  • Global reach, outdoors and indoors
  • Available soon

Handheld is fine; wearable is harder to implement and socially awkward anyhow. If someone looks at their watch while talking to you, you assume the conversation is about over. We recognize that the sensor suite in existing smartphones and tablets may be inadequate.

Carrier-Phase Differential GPS Positioning: lets you measure position down to the centimeter, or even the millimeter if the receivers are close enough together.
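
As a rough textbook-style sketch (not from the talk) of why carrier phase gets you there: the carrier-phase measurement from receiver r to satellite s, expressed in meters, is approximately

    \lambda\,\phi_r^s = \rho_r^s + c\,(\delta t_r - \delta t^s) + \lambda N_r^s - I_r^s + T_r^s + \epsilon

where \rho is the geometric range, \delta t are the receiver and satellite clock biases, N is an unknown integer number of carrier cycles, and I and T are the ionospheric and tropospheric delays. Differencing the measurements of two receivers (user u and base b) against two satellites (j and k),

    \nabla\Delta\phi = (\phi_u^j - \phi_b^j) - (\phi_u^k - \phi_b^k),

cancels the clock terms and, over short baselines, most of the atmospheric error, leaving only relative geometry plus an integer ambiguity \nabla\Delta N. Once that integer is resolved, the leftover phase noise is a small fraction of the roughly 19 cm L1 carrier wavelength, which is where the centimeter-to-millimeter numbers come from.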

PTAM (Parallel Tracking and Mapping): lets you determine distance and position very accurately on a local grid without GPS. The coordinate frame is locally defined and not natively shared between devices.

SXSW 2012: JavaScript Performance MythBusters (via JSPerf)

Chris Joel
CloudFlare, Developer

John-David Dalton
Uxebu, JavaScript Developer

Kyle Simpson
Getify Solutions, Owner

Lindsey Simon
Twist, Developer

Presentation Description

JavaScript is everywhere from mobile phones and tablets to e-readers and TVs. With such a wide range of supported environments developers are often looking for an easy way to compare the performance between snippets, browsers, and devices. jsPerf.com, a site for community driven JavaScript benchmarks, was created to help devs do just that.

Join Mathias Bynens and John-David Dalton from jsPerf.com, Chris Joel from Cloudflare.com and Lindsey Simon from Google/Browserscope in this panel discussion on some of the best dev-created benchmarks and most interesting practices debunked by real-world tests.

Presentation Notes

Browserscope and jsPerf

Open-source, community-driven projects for profiling browsers. They are really good at informing developers by crunching numbers and providing actual data. The whole idea is that anyone can reproduce results on any type of hardware (crowdsourcing).
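
A jsPerf test case is just a set of small snippets run head-to-head; under the hood jsPerf runs on the Benchmark.js library. As a rough sketch of what a test case boils down to (this particular comparison is the stock Benchmark.js example, not one of the panel’s):

    // minimal Benchmark.js suite: two competing snippets, results in ops/sec
    var Benchmark = require('benchmark');

    var suite = new Benchmark.Suite();

    // each add() is one "snippet" competing in the benchmark
    suite.add('RegExp#test', function () {
      /o/.test('Hello World!');
    })
    .add('String#indexOf', function () {
      'Hello World!'.indexOf('o') > -1;
    })
    // report ops/sec for each snippet as it finishes
    .on('cycle', function (event) {
      console.log(String(event.target));
    })
    .on('complete', function () {
      console.log('Fastest is ' + this.filter('fastest').map('name'));
    })
    .run({ async: true });

On jsPerf itself you only type the snippet bodies into the test-case form; the site wraps them in a suite like this and crowdsources the results across visitors’ browsers and hardware.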

Explicit Goals:

  • Foster innovation by tracking functionality
  • Push browser innovation, uncover regressions
  • Historical resource for web developers

Myths

  1. Your for loops suck: rewrite all your code to use a decrementing while (--i) loop instead. BUSTED. (A benchmark sketch for this one follows the list.)
  2. Double your localStorage read performance in Chrome by getting items by index. TRUE.
  3. The arguments object should be avoided. BUSTED (though it isn’t as good in Opera).
  4. Frameworks (like jQuery) are always better at managing performance than you are, so just use what they provide. BUSTED.
  5. Converting a NodeList to an array is expensive, so you shouldn’t do it. For instance, document.getElementsByTagName() returns a NodeList, not an array; the claim is that iterating over the NodeList directly beats taking the performance hit of converting it to an array and iterating over that. BUSTED (also see static node lists, whose performance is closer to an array’s).
  6. Script concatenation and/or <script defer> is all you need to load JS performantly (aka “Issue 28”). POSSIBLY. The average website has over 300K of JavaScript. The best thing to do is concatenate all your files, but then split them into chunks of about 100K; downloading those chunks in parallel greatly increases speed. Also, separating never-changing JavaScript from frequently-changing JavaScript helps with browser caching, and lazy loading (pulling in the important files first and the rest later) helps too.
  7. Eval is evil: it’s too slow and quirky to be considered useful. BUSTED. Its performance is pretty much equal across all benchmarks.
  8. Regular expressions are slow and should be replaced with simple string method calls using indexOf(). BUSTED. Engines are getting much faster at regular expressions (this is essentially the comparison in the Benchmark.js sketch above).
  9. OO API abstraction means you never have to worry about the details (including the performance). BUSTED. Your API design matters more than it simply being OO.
  10. Strict comparison (===) takes more processing power than loose comparison (==). BUSTED. There is a difference, but it’s so tiny you shouldn’t be concerned.
  11. Caching “nested property lookup” or “prototype chain lookup” helps (or hurts) performance. BUSTED. In most cases the browser engine already caches these, so it won’t matter at all.
  12. Operations which require an implicit primitive-to-object cast/conversion will be slower. BUSTED. For instance, calling toString() on a number primitive (an implicit primitive-to-object conversion) doesn’t noticeably affect performance.
  13. Scope chain lookups are expensive. BUSTED.
  14. Use switch statements instead of if/else if for better performance. POSSIBLY. In most cases this is true, except in Safari and Mobile Safari; the panel recommended just using what you need.
  15. Use native methods for better performance. BUSTED.
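
For instance, myth #1 boils down to comparing two snippets like these in a jsPerf test case (the array size and loop bodies are made up for illustration and are not the panel’s actual test cases; on jsPerf the array creation would go in the setup block):

    // hypothetical setup block: the data both snippets iterate over
    var arr = [];
    for (var k = 0; k < 1000; k++) {
      arr.push(k);
    }

    // snippet A: the plain forward for loop
    var sumA = 0;
    for (var i = 0; i < arr.length; i++) {
      sumA += arr[i];
    }

    // snippet B: the "clever" decrementing while loop
    var sumB = 0;
    var j = arr.length;
    while (j--) {
      sumB += arr[j];
    }

On modern engines the two variants typically land within measurement noise of each other, which is the result behind the panel’s BUSTED verdict.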

SXSW 2012: DIY Mobile Usability Testing

Belen Barros Pena
Open Source Technology Center (Intel), Interaction Designer

Bernard Tyers
Nokia Siemens Networks, Packet Core Engineer

Presentation Description

Usability testing is an interaction designer’s bread and butter, but applying it to the study of mobile applications and websites brings considerable challenges. Which device should we use for testing? Can we use an emulator? How do we prototype for mobile? Can we just recycle the tasks we use for desktop software tests? Do we test in the lab or in the wild? How do we record screen, fingers and facial expressions?

We don’t intend to answer all those questions in just one session: that would be madness! We’ll focus instead on the last one.

Follow us in our quest to set up a mobile usability testing environment on a tight budget. We’ll show you how others do it. We’ll roam around electronics and professional video stores searching for brackets and webcams. We’ll put our DIY skills to the test and waste a lot of silicon trying to build our mobile recording device. We’ll scour the Internet for free software, and we’ll finish off building the lab and running a usability test in front of your eyes.

If we can do it, so can you! You’ll come out of this session knowing exactly what you need to do to run and record usability tests with mobile devices.

Presentation Notes

Slides: diymobileusabilitytesting.net/doku.php?id=diy-mobile-usability-testing-sxsw2012

Record mobile interactions both as a memory aid and as a powerful communication tool: footage proves to the clients/owners of the software that people visibly struggle using their product. Intel records both what participants intend to do and what they actually do, as well as the participant’s reaction.

Usability tests are pretty much the same on mobile devices as they are on desktop computers, except that before you run them you need to answer the following questions to shape the goals of your test:

  • Which Phone?
  • Which Context?
  • Which Connection?

Handset usability affects test results. If a user is used to an iPhone and you hand them an Android, you introduce a learning curve and will cause issues. To get around this, always run tests with the phone the participant is used to. If you can’t, use training and warm-up tasks that let the participant get used to the device first.

Should we run tests in the field or the lab? With desktop usability testing it doesn’t really matter, but we use mobile phones everywhere: on the toilet, in well-lit rooms, in the dark. In all seriousness, no one really knows yet which approach is best. We do know that testing in the field is resource-intensive and expensive. Even if you only test in the lab, that’s better than not testing at all.

If you must do field testing:

  • Do it late, because your in-lab tests will get most of the usability concerns first
  • Plan and run pilot tests
  • Be prepared for contingencies, such as rain

Recommendations:

  • Never test over wi-fi, as you’ll lose a lot of the test’s value by not seeing the product over a slower mobile network
  • Cover the data costs of the participants who are doing the tests for you

So how do we record the experience?

  • Wearable equipment like hat-cameras
  • Document cameras; but those are not cheap, and they have the disadvantage of requiring participants to keep the phone within the camera’s range, which doesn’t feel natural
  • Mountable cameras which allow for natural interaction with the phone, if they don’t get too heavy
  • Screen capture software; but no one likes you installing stuff on their phones, and no application will support all platforms
  • Remote tools such as mouseflow.com (which records visits to your website without people noticing). It supposedly also works on mobile devices, but it doesn’t seem to fully work on all phones yet

If it were possible, you’d want:

  • Easy to put together
  • Cheap
  • Repeatable
  • Allows holding the device
  • Allows one-handed use
  • Supports all form factors
  • Runs tests with participants’ phones
  • Captures screen, face and fingers
  • Gives enough video quality

Intel tried the 5 recording methods and found that mounted devices were the best solution. But off-the-shelf mounts were too expensive, so they built their own using Erector Sets, cheap webcams, poster putty (Blu-Tack, which also helps protect the phone), and bolts. They then run this through a Windows machine with both cameras showing up, and simply screen capture.

SXSW 2012: The Science of Good Design: A Dangerous Idea

Ben McAllister (@benmcallister)
Frog Design, Assoc Creative Dir

#SXDangerous

Presentation Description

The business world is increasingly enamored with design. Business leaders look to designers for guidance on everything from product innovation to corporate strategy. While designers and business people may bring different perspectives to the table, they share one common language: research.

But research can be dangerous. It often provides easy answers that go unquestioned because the research feels like science. What if we’ve put too much trust in research? What about the aspects of design and product development that are important, but hard to measure? Where does research end and design judgment begin?

In this talk, frog Associate Strategy Director Ben McAllister explores these questions and takes a hard look at the role of research in design. Drawing from not only design, but also economics and the philosophy of science, Ben confronts the conventional wisdom around design research, offering a new vision of how research can inspire creativity and guide decision making.

Presentation Notes

“Strategy” is a pretentious word and idea. Ben used to have a title with the word “Strategy” in it, and it was always a challenge. The word strategy comes from the Greek word for “general,” which breaks down into “to lead” and “that which is spread out.” But strategy is really just about leadership and uncertainty. Humans do not like uncertainty, but without uncertainty there is no need for leadership. If you know what is going to happen and have perfect information, you have no need for strategy.

Research is about informing decisions, but not everyone will agree. In regards to Mad Men, the following is still true: agency life hasn’t changed, but agency work has. Advertising agencies used to be a much more creative world, and they were highly trusted for their advice; now that creative world has been marginalized.

The phrase “the research” bothers Ben. It implies the research has its own voice and cannot be interpreted any other way; it isn’t about ambiguity, it presents itself as one clear, settled answer. Scientism is the act of using scientific-sounding language to trick people and manufacture a sense of certainty (think of advertisements showing people in lab coats smoking cigarettes). Scientism is a con; it is cartoon science that misleads you about what science really is.

With science you have certainty, objectivity, and progress. The problem is that we take Science and easily lump it in with Research although not all Research is Science. On one end of the spectrum of Research we have “Hard Sciences” (Laws), in the middle is “Social Sciences” (Experiments), and on the other end “Looking at Stuff” (Design World). But even Hard Science Scientists are not absolutely sure of anything (See: Richard Feynman, who admitted this). Even with the Great Depression people are still asking why it started, and why it ended.

Confirmation bias is when you do research to find results that match your beliefs, then you find more and more, and you count it as fact even though there is a whole slew of other science on the other side.
The flip-flop rhythm is when one person says something is good for you, and then someone else says it is bad for you. This happens a lot in nutrition and medicine.

We need to approach everything with a level of skepticism, and not take it to heart. As well, always keep an open mind that anything you do could be wrong. We need to be honest about where the value of design comes from. It’s dangerous whenever a client asks us to prove why we design the way we do; sure, science can provide us with an easy answer, but the value of what research provides comes from the person doing the research or the person interpreting the data. Research should be used to inform decisions, not to make them for us.

What kind of business do you want to be in? Do you want to be in the business of leading people through uncertainty, or in the business of following directions?