My Interview on The Drax Files Radio Hour – Second Life and Community Development in Virtual Worlds

I was recently interviewed by Draxtor on his highly informative and entertaining radio show “The Drax Files Radio Hour.”

Check out the full show and hear me chat about Second Life, Jibe, Unity, WigglePlanet, and the role of community development in multiuser virtual world platforms.

Take care,
-John “Pathfinder” Lester

My old lighthouse in Second Life

I was meeting some educator friends in Second Life today when I looked across the virtual landscape and saw this nice view of my old lighthouse in the historic neighborhood of Nova Albion.

Amazing to think I’ve owned the parcel for over 10 years (since 2004).  So much has changed in both my life and technology over that time, yet this specific virtual place remains constant and very special to me.

Take care,
-John “Pathfinder” Lester

ACES: A new system from ReactionGrid that adds live inworld customization to your Jibe 3d multiuser virtual world

If you’re looking for a powerful system that lets you easily change in-world surface images/textures, make live modifications, and add customized web links within your Unity-based Jibe multiuser virtual world, then you might be interested in a new product from ReactionGrid.

It’s called ACES, and it’s now on sale in ReactionGrid’s Online Store.

Watch the above video for a complete walkthrough. Here are some screenshots from it.

Everything looks better with Wiener Dogs.

This beautiful 3d store model is made up of many high-quality 2d surface images/textures, and I could modify any of them with ACES.

Use the in-world menu system to change surface textures and weblinks.

Using ACES, I just changed the junk food aisle into one containing healthy fruit!

What is ACES?

  • ACES is a system that adds user-modifiable display boards to a Jibe world.  Any image on the web can be projected onto any display board, and boards can be configured to open any URL in a new browser window.  (A rough sketch of how such a board might work appears after this list.)
  • The content in an ACES display board can be changed in-situ (within the live published Jibe world) with no work required in the Unity editor.
  • All ACES display boards can be changed in-world by any user who has logged in with an account that has administrative-level access.
  • ACES display boards can also be configured so that any user can “claim” an unused board as their own and then have full admin control over it.
  • ACES display boards don’t have to look like boards!  Imagine changing the image textures of storefronts, buildings, any surface at all in your Jibe world.
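
Under the hood, a display board like this boils down to two operations: fetching an image from a URL at runtime and applying it as a texture, and opening a web link when the board is clicked. Here’s a minimal, purely hypothetical sketch of that idea as a Unity C# script. It is not the actual ACES implementation; the class and field names are my own inventions, and the `UnityWebRequest` API assumes a recent Unity version:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical sketch of a user-modifiable display board.
// This is NOT the actual ACES implementation; all names are illustrative.
public class DisplayBoard : MonoBehaviour
{
    [SerializeField] private string imageUrl; // texture to project onto the board
    [SerializeField] private string linkUrl;  // web page to open when clicked

    // Fetch an image from the web and apply it to this object's material.
    public void SetImage(string url)
    {
        imageUrl = url;
        StartCoroutine(LoadTexture(url));
    }

    private IEnumerator LoadTexture(string url)
    {
        using (UnityWebRequest req = UnityWebRequestTexture.GetTexture(url))
        {
            yield return req.SendWebRequest();
            if (req.result == UnityWebRequest.Result.Success)
            {
                Texture2D tex = DownloadHandlerTexture.GetContent(req);
                GetComponent<Renderer>().material.mainTexture = tex;
            }
        }
    }

    // Requires a Collider on the same GameObject to receive clicks.
    private void OnMouseDown()
    {
        if (!string.IsNullOrEmpty(linkUrl))
            Application.OpenURL(linkUrl);
    }
}
```

In a multiuser world like Jibe, a real system would also need to broadcast the new URL to all connected clients and verify the caller’s admin rights before applying the change; that networking layer is omitted here.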

Live Demo:
http://demos.jibemix.com

Demo Accounts:
ACES Admin / password
Demo User / demodemo

If you have any questions, please feel free to email me at pathfinder@reactiongrid.com.

Take care,
– John “Pathfinder” Lester

Life Connected to Life: How to Revolutionize Environmental Education

[This post also appears on Wiggle Planet’s blog.]

There are many educational games out there that do their best to teach people about the environment.  And many of them do a great job.

For example, I really like how Earth Day Canada put together their EcoKids website.  The games on EcoKids are mostly simple simulations with engaging action and puzzle-based mechanics, and it’s great how they blend the computer-based games with physical-world activities (e.g., play a game on the computer, then go outside and do some recycling).  These are games that encourage people to make positive changes to their physical world, improving the environment for everyone.

In fact, both the United States Environmental Protection Agency and the National Oceanic and Atmospheric Administration have sites that promote environmental education through games.  It’s inspiring for me to see large governmental agencies exploring innovative ways to protect the environment and educate the public.

But there’s another level of immersive environmental education we haven’t even touched yet.

First, a key fact about Nature that we often forget.

Nature likes to hide things.  Particularly when something is wrong.  

It’s a fundamental trait that has developed in pretty much every species on the planet.  Are you sick?  Weak?  Injured?  Well, you better hide it as much as possible, otherwise something will come along, notice you’re indisposed, and then eat you for lunch.  This trait also manifests itself in entire networks of interdependent and related organisms (i.e., ecosystems).  By the time it’s easy to observe a systemic problem, the damage is often irreversible.

So, it’s not enough for us to be well educated and observant.  We need superhuman powers to help us visualize what’s really happening in Nature.

I believe artificial life combined with augmented reality is the magic key.  We can help Nature tell us her secrets by creating artificial life forms directly connected to all the data repositories we’ve already created for collecting and tracking environmental data.  Imagine the appearance and behaviors of these artificial life forms changing based on these data, generating powerful human-observable moments.  And finally, imagine these artificial life forms living in an augmented reality space overlaid on the natural world.

For example, take the beautiful concept of the Kodama from the movie Princess Mononoke.

Kodama are small mystical creatures living in the forest that represent the spirits of all the trees.  Their behavior and appearance in the movie are directly related to the health of all the trees they inhabit.  For example, when the trees get sick, the kodama can be seen falling from the air and dissolving into the ground.

Now, imagine walking up to a tree in the physical world.

Is that tree really healthy?  Not sure, since trees (like most life forms) are pretty good at hiding things (until it’s too late).  Is the forest in which this tree lives getting enough water?  Is the water table polluted?

Sure, you could pull environmental data up on your smartphone and look at graphs and charts and summarized reports.

But those are all cold data, with no sense of life to them.

Rather, imagine watching the data express itself through a family of Kodama that live around the tree.  Imagine looking through your smartphone into an augmented reality space full of artificial life with which you can interact and communicate.

Oh no, all the Kodama are brown and withered!  That means drought!   Oh, they’re all walking over to that other tree.  There must be water over there.  Wait, they’re mutating into something weird.  Some kind of pollution?  The imaginative possibilities, let alone the entertaining and engaging gaming scenarios, are endless.
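
To make that idea concrete, here’s a toy sketch of how environmental sensor readings might drive a virtual creature’s observable state. Everything in it is hypothetical: the data fields, thresholds, and state names are mine, invented purely to illustrate the mapping from data to behavior:

```csharp
using UnityEngine;

// Illustrative only: a creature whose state is driven by environmental data.
// The states, fields, and thresholds below are all invented for this sketch.
public enum KodamaState { Healthy, SeekingWater, Withered, Mutated }

public class DataDrivenKodama : MonoBehaviour
{
    // In a real system these values would be pulled periodically from an
    // environmental data repository (e.g., a government agency's API).
    public float soilMoisture;   // 0 = parched, 1 = saturated
    public float pollutionIndex; // 0 = clean, 1 = heavily polluted

    public KodamaState State { get; private set; }

    private void Update()
    {
        State = Classify(soilMoisture, pollutionIndex);
        // A full implementation would swap animations, materials,
        // and movement behaviors here based on State.
    }

    private static KodamaState Classify(float moisture, float pollution)
    {
        if (pollution > 0.7f) return KodamaState.Mutated;     // "mutating into something weird"
        if (moisture < 0.2f) return KodamaState.Withered;     // "brown and withered" = drought
        if (moisture < 0.4f) return KodamaState.SeekingWater; // walking toward water
        return KodamaState.Healthy;
    }
}
```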

Effective learning and true understanding come from emotional resonance.

And nothing resonates more with human beings than life connected to life.

Take care,
-John “Pathfinder” Lester

My Keynote at e-LEOT 2014 – “Augmented Mind: The Evolution of Learning Tools from Language to Immersive Reality”

The 1st International Conference on e-Learning, e-Education and Online Training is being held September 18-20 in Bethesda, Maryland.  This conference will assess a wide range of progressive ideas for the future of e-Learning, focusing on the idea of technology as a means to education rather than an end in itself.  The conference organizers have lined up a wonderful range of interdisciplinary speakers and are planning to attract a wide group of heterogeneous scholars and practitioners.

I’ll be attending the entire conference, and I’m honored to be giving the opening keynote presentation.   Here’s what I’ll be talking about:

“Augmented Mind: The Evolution of Learning Tools
from Language to Immersive Reality”

Innovative educators are constantly facing the challenge of matching pedagogical goals with complementary technological tools.  Unfortunately, given the wide range of technologies and devices that vie for consumer attention, the right choices are not always clear and are typically obscured by media hype. In this presentation, John Lester will describe how focusing on the way the human mind interacts with the world and other human beings can help identify the right tools for the right jobs.  From a mind-augmentation perspective combining constructivist and behaviorist approaches, John will explore web-based tools ideal for knowledge management, augmented-reality-based self-animated autonomous agents, and finally the unique (and sometimes over-hyped) affordances of perceptually immersive multiuser 3d virtual worlds for collaborative learning.

My goal will be to tell an interesting story with examples and demos of technologies that I think really leverage how our minds naturally embrace the world around us.  One such technology that I’m currently exploring, and that you’ve probably never heard of, is Wiglets.

Visit Wiggle Planet to learn a lot more about Wiglets.

Wiglets are autonomous, evolving, self-animated and self-motivated agents that can exist in both completely virtual and augmented reality environments.  They exist at a wildly creative intersection of artificial life, art and gaming.  And perhaps best of all, you can interact with them directly through touch and gestures.
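
To give a flavor of what “self-animated and self-motivated” means in code, here’s a deliberately tiny sketch of an agent that wanders on its own and gets excited when touched. It’s my own illustrative toy, not Wiggle Planet’s actual technology, which is far more sophisticated:

```csharp
using UnityEngine;

// A toy "self-motivated" agent: it picks its own destinations and
// reacts to touch. Purely illustrative; not Wiggle Planet's code.
public class WanderingAgent : MonoBehaviour
{
    private Vector3 target;
    private float excitement; // rises when touched, decays over time

    private void Start()
    {
        PickNewTarget();
    }

    private void Update()
    {
        // Excitement fades on its own; excited agents move faster.
        excitement = Mathf.Max(0f, excitement - 0.5f * Time.deltaTime);
        float speed = 1f + 2f * excitement;

        transform.position = Vector3.MoveTowards(
            transform.position, target, speed * Time.deltaTime);

        if (Vector3.Distance(transform.position, target) < 0.1f)
            PickNewTarget();
    }

    // Fires on tap/click; requires a Collider on this GameObject.
    private void OnMouseDown()
    {
        excitement = 1f;
    }

    private void PickNewTarget()
    {
        // Wander to a random point within 5 units of the origin.
        Vector2 p = Random.insideUnitCircle * 5f;
        target = new Vector3(p.x, transform.position.y, p.y);
    }
}
```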

Another topic of discussion will be the affordances of multiuser 3d virtual worlds, especially how one can reduce the barrier to entry for people interested in leveraging them for educational purposes.  ReactionGrid has recently developed some new tools that integrate with the Unity3d-based Jibe platform to provide on-the-fly content editing in a simple yet powerful way.  I’ll be giving a sneak preview during my presentation.

Want to easily change this web-based 3d environment on the fly without having to muck around in Unity?  
Now you can. I’ve got some new tricks with Jibe to show you.

I’ll also be discussing and giving examples of innovative uses of commonly used virtual world technologies such as Second Life, Opensimulator and the Oculus Rift.  If you plan on attending and would like to connect with me at the conference, please drop me a line on Twitter or email.  And if you’re looking to interact with the organizers and other attendees and speakers, be sure to check out the e-LEOT LinkedIn Conference Group.

After my keynote I’ll be updating this blog post to include my slides and links to any recordings.

UPDATE Sept 19, 2014

Here are my slides:

Breakdown of Oculus Rift Virtual Reality Headset and Integrating with Unity3d and Jibe

I’m eagerly awaiting my own developer version of the Oculus Rift, which should arrive in about a month.

My plans are to immediately start working on how to best integrate it with Jibe and Unity3d.

In particular, our newly released Jibe 2.0 has a built-in 1st-person perspective mode that is ideal for things like virtual reality headsets.
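
For anyone curious what such a mode amounts to mechanically, switching perspective in Unity is essentially a matter of swapping which camera renders the scene. Here’s a minimal hypothetical sketch; it is not Jibe’s actual implementation, and the key binding and field names are my own assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch of a 1st-/3rd-person camera toggle.
// Not Jibe's actual code; names and the key binding are illustrative.
public class PerspectiveToggle : MonoBehaviour
{
    [SerializeField] private Camera thirdPersonCamera; // behind the avatar
    [SerializeField] private Camera firstPersonCamera; // at the avatar's eyes

    private void Update()
    {
        // Swap the active camera when the player presses F.
        if (Input.GetKeyDown(KeyCode.F))
        {
            bool firstPersonActive = firstPersonCamera.enabled;
            firstPersonCamera.enabled = !firstPersonActive;
            thirdPersonCamera.enabled = firstPersonActive;
        }
    }
}
```

A headset integration would replace the first-person camera with the head-tracked stereo camera rig supplied by the device’s SDK.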

Exploring a multiuser Jibe 2.0 world in 1st-person perspective.

Keep an eye on this blog for future details.

Needless to say, I was very excited to see the folks at iFixit posting a great teardown of the developer version of the Oculus Rift headset.

If you have an Oculus Rift and would like to brainstorm with me on how it can be integrated with multiuser virtual world applications, please drop me an email (john.e.lester@gmail.com) or post in the comments.

Perhaps we can also schedule a Team Fortress 2 game while using our headsets!

-John “Pathfinder” Lester
Chief Learning Officer, ReactionGrid Inc.

P.S.  ReactionGrid’s Lead Developer Matthew Bertrand is also getting an Oculus Rift dev kit.  He’s pretty psyched about it, and we all expect amazing things from him!