Life Connected to Life: How to Revolutionize Environmental Education

[This post also appears on Wiggle Planet’s blog.]

There are many educational games out there that do their best to teach people about the environment.  And many of them do a great job.

For example, I really like how Earth Day Canada put together their EcoKids website.  The games on EcoKids are mostly simple simulations with engaging action- and puzzle-based mechanics, and it's great how they blend computer-based games with physical-world activities (e.g., play a game on the computer, then go outside and do some recycling).  These are games that encourage people to make positive changes to their physical world, improving the environment for everyone.

In fact, both the United States Environmental Protection Agency and the National Oceanic and Atmospheric Administration have sites that promote environmental education through games.  It’s inspiring for me to see large governmental agencies exploring innovative ways to protect the environment and educate the public.

But there’s another level of immersive environmental education we haven’t even touched yet.


First, a key fact about Nature that we often forget.

Nature likes to hide things.  Particularly when something is wrong.  

It’s a fundamental trait that has developed in pretty much every species on the planet.  Are you sick?  Weak?  Injured?  Well, you better hide it as much as possible, otherwise something will come along, notice you’re indisposed, and then eat you for lunch.  This trait also manifests itself in entire networks of interdependent and related organisms (i.e., ecosystems).  By the time it’s easy to observe a systemic problem, the damage is often irreversible.

So, it’s not enough for us to be well educated and observant.  We need superhuman powers to help us visualize what’s really happening in Nature.

I believe artificial life combined with augmented reality is the magic key.  We can help Nature tell us her secrets by creating artificial life forms directly connected to the repositories we've already built for collecting and tracking environmental data.  Imagine the appearance and behaviors of these artificial life forms changing based on those data, generating powerful human-observable moments.  And finally, imagine these artificial life forms living in an augmented reality space overlaid on the natural world.

For example, take the beautiful concept of the Kodama from the movie Princess Mononoke.


Kodama are small mystical creatures living in the forest that represent the spirits of the trees.  Their behavior and appearance in the movie are directly tied to the health of the trees they inhabit.  For example, when the trees get sick, the kodama can be seen falling from the air and dissolving into the ground.

Now, imagine walking up to a tree in the physical world.


Is that tree really healthy?  Not sure, since trees (like most life forms) are pretty good at hiding things (until it’s too late).  Is the forest in which this tree lives getting enough water?  Is the water table polluted?

Sure, you could pull environmental data up on your smartphone and look at graphs and charts and summarized reports.

But those are all cold data, with no sense of life to them.

Rather, imagine watching the data express itself through a family of Kodama living around the tree.  Imagine looking through your smartphone into an augmented reality space full of artificial life with which you can interact and communicate.

Oh no, all the Kodama are brown and withered!  That means drought!   Oh, they’re all walking over to that other tree.  There must be water over there.  Wait, they’re mutating into something weird.  Some kind of pollution?  The imaginative possibilities, let alone the entertaining and engaging gaming scenarios, are endless.
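As a toy sketch of the idea in Python: the sensor fields, thresholds, and creature states below are all invented for illustration, not drawn from any real environmental data source.

```python
# Toy sketch: map (hypothetical) environmental readings to the
# appearance and behavior of a virtual Kodama.  All field names and
# thresholds here are made up for this example.

from dataclasses import dataclass

@dataclass
class Reading:
    soil_moisture: float   # 0.0 (bone dry) .. 1.0 (saturated)
    toxin_ppm: float       # pollutant concentration, parts per million

def kodama_state(r: Reading) -> dict:
    """Translate raw data into something a person can see at a glance."""
    if r.toxin_ppm > 50:
        return {"color": "mottled", "behavior": "mutating"}   # pollution
    if r.soil_moisture < 0.2:
        return {"color": "brown", "behavior": "withering"}    # drought
    return {"color": "green", "behavior": "playing"}

print(kodama_state(Reading(soil_moisture=0.1, toxin_ppm=2)))
# -> {'color': 'brown', 'behavior': 'withering'}
```

The point is the translation step: raw numbers go in, and an emotionally legible creature state comes out for the augmented reality layer to render.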

Effective learning and true understanding come from emotional resonance.

And nothing resonates more with human beings than life connected to life.

Take care,
-John “Pathfinder” Lester

My Keynote at e-LEOT 2014 – “Augmented Mind: The Evolution of Learning Tools from Language to Immersive Reality”

The 1st International Conference on e-Learning, e-Education and Online Training is being held September 18-20 in Bethesda, Maryland.  This conference will assess a wide range of progressive ideas for the future of e-Learning, focusing on technology as a means to education rather than an end in itself.  The conference organizers have lined up a wonderful range of interdisciplinary speakers and are planning to attract a diverse group of scholars and practitioners.

I’ll be attending the entire conference, and I’m honored to be giving the opening keynote presentation.   Here’s what I’ll be talking about:


“Augmented Mind: The Evolution of Learning Tools
from Language to Immersive Reality”

Innovative educators constantly face the challenge of matching pedagogical goals with complementary technological tools.  Unfortunately, given the wide range of technologies and devices vying for consumer attention, the right choices are not always clear and are typically obscured by media hype.  In this presentation, John Lester will describe how focusing on the way the human mind interacts with the world and with other human beings can help identify the right tools for the right jobs.  From a mind-augmentation perspective combining constructivist and behaviorist approaches, John will explore web-based tools ideal for knowledge management, augmented-reality-based self-animated autonomous agents, and finally the unique (and sometimes over-hyped) affordances of perceptually immersive multiuser 3d virtual worlds for collaborative learning.

My goal will be to tell an interesting story with examples and demos of technologies that I think really leverage how our minds naturally embrace the world around us.  One such technology that I'm currently exploring, and that you've probably never heard of, is Wiglets.

Visit Wiggle Planet to learn a lot more about Wiglets.


Wiglets are autonomous, evolving, self-animated and self-motivated agents that can exist in both completely virtual and augmented reality environments.  They exist at a wildly creative intersection of artificial life, art and gaming.  And perhaps best of all, you can interact with them directly through touch and gestures.

Another topic of discussion will be the affordances of multiuser 3d virtual worlds, especially how one can reduce the barrier to entry for people interested in leveraging them for educational purposes.  ReactionGrid has recently developed some new tools that integrate with the Unity3d-based Jibe platform to provide on-the-fly content editing in a simple yet powerful way.  I’ll be giving a sneak preview during my presentation.

Want to easily change this web-based 3d environment on the fly without having to muck around in Unity?  
Now you can. I’ve got some new tricks with Jibe to show you.

I’ll also be discussing and giving examples of innovative uses of commonly used virtual world technologies such as Second Life, Opensimulator and the Oculus Rift.  If you plan on attending and would like to connect with me at the conference, please drop me a line on Twitter or email.  And if you’re looking to interact with the organizers and other attendees and speakers, be sure to check out the e-LEOT LinkedIn Conference Group.

After my keynote I’ll be updating this blog post to include my slides and links to any recordings.

UPDATE Sept 19, 2014

Here are my slides:

Giving a Virtual Worlds Lecture – April 7 @ 6pm PDT in Second Life

On Monday April 7 at 6pm PDT I'll be giving a Virtual Worlds Lecture in Second Life.

The title of my talk is “Finding the Balance between Pedagogy and Technology.”  Here’s a summary:

One must always seek a thoughtful match between pedagogy and technology. Different virtual world platforms are suited for different uses, ranging from collaborative work environments to immersive goal-oriented simulations. The speaker will discuss current virtual world technological trends involving specific gaming technologies like Unity3D and the growth of Open Source platforms such as OpenSimulator. A discussion will focus on helping educators choose the right tool for the right job, matching pedagogical goals with technological affordances.

My presentation is part of an ongoing series of talks hosted by the School of Library and Information Science at San Jose State University.  Here’s more information about the colloquia, and here’s a SLURL for where I’ll be speaking.

Part of what I’ll be doing in addition to showing slides and speaking will be a live demo of some of the content import/export tools in the Singularity Viewer.  You’ll get so see how you can easily backup content you’ve created in Second Life or Opensim to your hard drive and how to get that content into other 3D platforms like Unity3d and Blender.

Content import and export in Singularity Viewer

Hope to see you there!

-John “Pathfinder” Lester

Rebooting the Hypergrid Adventurers Club and Thanking Latif Khalifa

If you're a fan of the Hypergrid, you should definitely check out the new 1.8.3 release of the Singularity viewer for OpenSim and Second Life.

In particular, take a look at this section in the update notes.  It's a single brief sentence, but its brevity belies the magnitude of its significance.

  • Fixed a problem with long teleports in OpenSim (“4096 bug” SVC-2941 FIRE-11593) (Latif)

Latif Khalifa has fixed the bug that, since the beginning of time, prevented Hypergrid explorers from jumping to places more than 4096 regions away.  No more mandatory intermediate hops!  No more “cannot reach destination – too far away” messages!
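To make the old limitation concrete, here's a toy sketch in Python.  This is not actual viewer or OpenSim code; the helper name, the coordinate pairs, and the per-axis comparison are my own illustration of the behavior described above.

```python
# Toy illustration of the old "4096 bug" limit (not actual OpenSim/viewer code).
# Region coordinates are (x, y) positions on the world map, one unit per region.

def needed_intermediate_hop(src, dst, limit=4096):
    """Before the fix, a direct teleport failed whenever the destination
    was more than `limit` regions away, forcing explorers to make
    intermediate hops."""
    return abs(dst[0] - src[0]) > limit or abs(dst[1] - src[1]) > limit

print(needed_intermediate_hop((1000, 1000), (6000, 1000)))  # -> True
print(needed_intermediate_hop((1000, 1000), (2000, 2000)))  # -> False
```

With Latif's fix in Singularity 1.8.3, that check is no longer a barrier: a single jump reaches any destination.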

I encourage all explorers of the Hypergrid to please take a moment and thank Latif on Twitter.  His hard work has resulted in a major improvement to the use of the Hypergrid and the evolution of OpenSim as a constellation of easily accessible interconnected grids.

Which brings me to the topic of the Hypergrid Adventurers Club.  Since my presentation at the OpenSimulator Community Conference, I’ve received a great deal of interest in possibly restarting our tours of the Hypergrid.  Many people reached out to me, and the outpouring of interest was very inspiring.

So I’m rebooting the tours!  Our next tour will be Saturday Sept 28 at 10pm EDT.  For all the details, please join and read our Google Group.

Take care,
-John “Pathfinder” Lester

Getting Real World Terrains into OpenSim: A Tutorial by Brian A. White

This is a recreation of a blog post by Brian A. White.  Brian's blog went offline sometime in 2009, and it was recently suggested that someone republish this useful tutorial to make sure it can be found by search engines and does not someday vanish from the Internet entirely.  So here it is.

Some of the links in this article are dead and not available in any existing archives.  I’ve left those in but crossed them out.  Fortunately these dead links are not critical to the tutorial, and I was able to update the other links that have changed since this article was written.  Brian, wherever you are, thank you again for writing this and I hope you are well.

-John “Pathfinder” Lester

Continue reading

How to create multiuser networked events in Jibe and Unity3d using iTween

In Jibe 2.0 we've included an easy system that gives you the power to use iTween to create multiuser networked events.  This allows you to create shared experiences between avatars using interactive and complex object animations.

Watch my tutorial to learn more!


Video: How to create multiuser networked events in Jibe and Unity3d using iTween

Take care,
-John “Pathfinder” Lester
Chief Learning Officer
ReactionGrid, Inc.

How to embed and play a video on an object in Unity3d and Jibe


Watching “Hedgehog in the Fog” in my Jibe world.

Note: You’ll need the Unity Pro editor if you want to work with Movie Textures in Unity3d.

Unity3d allows you to embed and play videos on any surface in a 3d environment.

This means you can easily create a web-based Jibe world where avatars explore a multiuser 3d virtual space while watching videos or movies playing on screens/signs/any surface you wish.

The most common way to add video to a Unity3d project is by adding a video file to your project's Assets folder, which automatically creates a Movie Texture (details here).

However, adding a video file directly to your project means the size of the video file will be added to the final size of your completed Unity webplayer file.  In other words, if your video clip is 50 megabytes, then your Unity webplayer file will have an extra 50 megabytes added on to it.

For folks creating Jibe worlds with Unity3d (or anyone creating Unity webplayer files for streaming on the Web), this is not good.  You always want your webplayer file to be as small as possible so it will finish downloading and start running as quickly as possible.

Fortunately, there’s a way you can download/stream a movie from the Web so it doesn’t add to the size of your Unity webplayer file.  Unity will immediately start playing the movie as soon as it has buffered enough of it, similar to how YouTube works.

Here’s a simple example:

Step 1: Get your video ready as an OGG file on the Web

If you have a video on YouTube that you want to use, you’ll have to download it.  I suggest using Flash Video Downloader.

Unity needs videos to be in OGG format (file extension .ogg).  If you need to convert an existing video file into OGG format, I suggest using VLC (it’s free and cross platform).  Take your OGG video, put it on a webserver somewhere and remember the URL.

Important Note: If you’re managing your own webserver, be sure it has the MIME type for Ogg Vorbis enabled.  For Extension use .ogg, and for MIME type use application/ogg.
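For example, on an Apache server (assuming mod_mime is enabled; whether this goes in the main config or an .htaccess file depends on your setup), a single directive does it:

```apache
# Serve .ogg files with the Ogg MIME type (Apache example)
AddType application/ogg .ogg
```

Other web servers have equivalent MIME type mappings; check your server's documentation.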

Here’s a sample 60 Megabyte OGG video I made and uploaded to WordPress.  Feel free to use this URL in your own tests.  You can also click on it to see how it plays in your browser.
http://becunningandfulloftricks.files.wordpress.com/2013/04/hedgehog_in_the_fog.ogg

Step 2: Create a Cube

In this example, we're going to make a basic cube and have the video play on its surface.  Of course, you could flatten the cube so it looks like a screen and then place it on a model of a TV or something.  I'm just being lazy.


Step 3: Create a new Javascript

I like the name of a script to remind me what the script actually does, so I’m going to call this new script ClicktoPlayWebMovie.


Here’s the code.  Copy and paste this into your new script and save it.

var url = "http://becunningandfulloftricks.files.wordpress.com/2013/04/hedgehog_in_the_fog.ogg";

function OnMouseDown () {
    // Start downloading the movie
    var www = new WWW(url);

    // Wait until enough of the movie has buffered to start playing
    var movieTexture = www.movie;
    while (!movieTexture.isReadyToPlay)
        yield;

    // Display the movie on the object's surface
    renderer.material.mainTexture = movieTexture;

    // Assign the movie's audio clip to the AudioSource
    // so playback stays in sync with the audio
    audio.clip = movieTexture.audioClip;

    // Play both movie & sound
    movieTexture.Play();
    audio.Play();
}

// Make sure we have an AudioSource component
@script RequireComponent (AudioSource)

You can see at the top of the script that I’ve included my demo URL as the default movie URL.  You can always change it later.

Step 4: Add ClicktoPlayWebMovie script to your cube

Drag the ClicktoPlayWebMovie script from your Project folder onto the Cube in your Scene view.  This will add the script to the cube.


Now select your Cube in the Scene view and look at the Inspector settings.  You can change the movie URL by simply editing the URL field in the Inspector.

Also notice that there is an Audio Source added to the Cube.  This was added automatically when you added the script to the Cube, since the script needs an Audio Source component to work.  Don’t delete or modify the Audio Source component.  Just leave it be.


Step 5: You’re done.  Test it out!

You can run your Jibe world locally in the Unity editor and test it out that way.  Walk up to the cube and click on it, and the movie will start playing on all surfaces of the cube.


You can also view an online version of this demo in my own Jibe world.

Enjoy!

-John “Pathfinder” Lester
Chief Learning Officer, ReactionGrid Inc.

How to Fix the Unity Asset Server error “Cannot Start Service” on Windows

The Unity Asset Server is a fantastic tool for version control and collaborating with a group of people on a project in Unity3D.  Our customers at ReactionGrid Inc. primarily use Windows servers, and some of them are running their own Unity Asset Servers to facilitate working together on Jibe projects.

I recently learned of a bug that might affect folks running their own Unity Asset Server on Windows.  Fortunately it’s not serious at all (just annoying as heck), and the fix is incredibly easy and permanent.

Are you running the Unity Asset Server on a Windows machine?

Has it suddenly stopped working, refusing to start no matter what you do?

Do you immediately see the following popup error when using the Unity Asset Server Control Panel?

Unity asset server error popup

Speaking at “Train for Success” Panel on the Future of Virtual Worlds – Nov 8 @ noon Eastern

The Gronstedt Group hosts a weekly “Train for Success” speaking series, and this week I’ll be participating in a panel discussion on the State and Future of Virtual Worlds.

The panel will be held in Second Life and starts on Thursday November 8 at noon Eastern. You can also watch and ask questions via the live stream on the web.

For more details, please see Facebook. Here’s a summary:

“The landscape of virtual worlds is changing. Social and game mechanics make virtual worlds more engaging. Browser-based virtual worlds make them more accessible to a wider audience. The panel will discuss the state and future of virtual worlds. Join this conversation about the emerging platforms and applications of virtual worlds in learning and business.”

Hope to see you there, and special thanks to Anders Gronstedt for inviting me to participate.

-John “Pathfinder” Lester

Video of panel on “Virtual Worlds Revisited” at the 2012 Chicago eLearning and Technology Showcase

I just participated in a wonderful "Virtual Worlds Revisited" panel discussion as part of the Chicago eLearning and Technology Showcase.  The panel was organized and moderated by Mike Kemmler, and participants included virtual world innovators Anders Gronstedt, Mark Jankowski and Karl Kapp.

The panel was held in Second Life, but the focus of our discussion was firmly on the future of new virtual world platforms and new modalities for immersive learning.  We were projected into the physical world meeting room in Chicago where about 30 people attended in person.  Here’s a summary:

Still deeply entrenched in Gartner’s Trough of Disillusionment, is it time to revisit virtual worlds?  Mike Kemmler hosts a virtual panel discussion via Second Life with a distinguished group of virtual world innovators, presenters, consultants, and authors, including Anders Gronstedt, Mark Jankowski, Karl Kapp, and John “Pathfinder” Lester. Panelists address the current state of learning in virtual worlds, explain platforms they see organizations using for immersive learning, and discuss current challenges and future possibilities of using virtual worlds for learning.

Thanks again, Mike, for this great opportunity.  It was an honor to be on a panel with such a stellar group of pioneers in virtual worlds and immersive learning.


NOTE: The audio from Second Life is a bit choppy for the first 30 seconds, but then clears up perfectly for the rest of the video.

-John “Pathfinder” Lester