Tips on Exporting/Importing Virtual World Content and Speaking at OpenSimulator Community Conference 2014

The OpenSimulator Community Conference (OSCC) is an annual conference that focuses on the developer and user community creating the OpenSimulator software. Organized as a joint production by AvaCon and the Overte Foundation, the virtual conference features two days of presentations, workshops, keynote sessions, and social events across diverse sectors of the OpenSimulator user base.

Last year’s conference was a fantastic experience, and I’m thrilled to be both attending and presenting again this year.  All the inworld venue tickets are sold out, but you can still register for a free streaming ticket and watch all the presentations live on Ustream.


Here’s what I’ll be presenting this year:

“You Only Own what you can Carry: How to backup and move your content between Second Life, Opensim and Unity”
Saturday Nov 8 from 9:00am – 9:45am PST

In this hands-on workshop, I’ll be demonstrating exactly how to export your own user-created objects (both prim and mesh based) and move them between Second Life, Opensim and Unity. Attendees will watch my desktop via a live TeamViewer screenshare and follow along on their own using freely-available software.

Requirements: Inworld attendees should be using the OSCC recommended 32-bit Singularity viewer and have pre-installed both the free version of the Unity Editor and the free TeamViewer application. No previous technical expertise required, just a willingness to learn.

The crux of my workshop will be a live demonstration of me creating something in both Opensim and Second Life and then walking through exactly how to get it into a scene in Unity.  I’ll also be demoing how to move content between Second Life and Opensim.  If you’re worried all this might be overly complicated, I promise it will be a lot easier than you expect.  Plus you’ll have the fun and “excitement” of watching me do all this live on my own desktop (what could possibly go wrong?).  The key takeaway will be that the whole process is easy enough for anyone to learn how to do, regardless of your level of technical expertise.

If you can’t watch it live, no worries.  My session will be recorded so you’ll be able to watch it later.  I’ll update this blog post with a link to the recording once it’s online.

[UPDATE - Nov 9 2014 - Here's the full video of my presentation]

I’ll also be a panelist later in the day on “The New Era of Content Protection in OpenSim” where I’ll be sharing my thoughts about DRM versus content licensing.


Hope to see you there!

Take care,
-John “Pathfinder” Lester

My Keynote at e-LEOT 2014 – “Augmented Mind: The Evolution of Learning Tools from Language to Immersive Reality”

The 1st International Conference on e-Learning, e-Education and Online Training is being held September 18-20 in Bethesda, Maryland.  This conference will assess a wide range of progressive ideas for the future of e-Learning, focusing on the idea of technology as a means to education rather than an end in itself.  The conference organizers have lined up a wonderful range of interdisciplinary speakers and are planning to attract a wide group of heterogeneous scholars and practitioners.

I’ll be attending the entire conference, and I’m honored to be giving the opening keynote presentation.   Here’s what I’ll be talking about:


“Augmented Mind: The Evolution of Learning Tools
from Language to Immersive Reality”

Innovative educators are constantly facing the challenge of matching pedagogical goals with complementary technological tools.  Unfortunately, given the wide range of technologies and devices that vie for consumer attention, the right choices are not always clear and are typically obscured by media hype. In this presentation, John Lester will describe how focusing on the way the human mind interacts with the world and other human beings can help identify the right tools for the right jobs.  From a mind-augmentation perspective combining constructivist and behaviorist approaches, John will explore web based tools ideal for knowledge management, augmented reality based self-animated autonomous agents, and finally the unique (and sometimes over-hyped) affordances of perceptually immersive multiuser 3d virtual worlds for collaborative learning.

My goal will be to tell an interesting story with examples and demos of technologies that I think really leverage how our minds naturally embrace the world around us.  One such technology that I’m currently exploring, and that you’ve probably never heard of, is Wiglets.

Visit Wiggle Planet to learn a lot more about Wiglets.

Wiglets are autonomous, evolving, self-animated and self-motivated agents that can exist in both completely virtual and augmented reality environments.  They exist at a wildly creative intersection of artificial life, art and gaming.  And perhaps best of all, you can interact with them directly through touch and gestures.

Another topic of discussion will be the affordances of multiuser 3d virtual worlds, especially how one can reduce the barrier to entry for people interested in leveraging them for educational purposes.  ReactionGrid has recently developed some new tools that integrate with the Unity3d-based Jibe platform to provide on-the-fly content editing in a simple yet powerful way.  I’ll be giving a sneak preview during my presentation.

Want to easily change this web-based 3d environment on the fly without having to muck around in Unity?  
Now you can. I’ve got some new tricks with Jibe to show you.

I’ll also be discussing and giving examples of innovative uses of commonly used virtual world technologies such as Second Life, Opensimulator and the Oculus Rift.  If you plan on attending and would like to connect with me at the conference, please drop me a line on Twitter or email.  And if you’re looking to interact with the organizers and other attendees and speakers, be sure to check out the e-LEOT LinkedIn Conference Group.

After my keynote I’ll be updating this blog post to include my slides and links to any recordings.

UPDATE Sept 19, 2014

Here are my slides:

Giving a Virtual Worlds Lecture – April 7 @ 6pm PDT in Second Life

On Monday April 7 at 6pm PDT I’ll be giving a Virtual Worlds Lecture in Second Life.

The title of my talk is “Finding the Balance between Pedagogy and Technology.”  Here’s a summary:

One must always seek a thoughtful match between pedagogy and technology. Different virtual world platforms are suited for different uses, ranging from collaborative work environments to immersive goal-oriented simulations. The speaker will discuss current virtual world technological trends involving specific gaming technologies like Unity3D and the growth of Open Source platforms such as OpenSimulator. A discussion will focus on helping educators choose the right tool for the right job, matching pedagogical goals with technological affordances.

My presentation is part of an ongoing series of talks hosted by the School of Library and Information Science at San Jose State University.  Here’s more information about the colloquia, and here’s a SLURL for where I’ll be speaking.

Part of what I’ll be doing, in addition to showing slides and speaking, will be a live demo of some of the content import/export tools in the Singularity Viewer.  You’ll get to see how you can easily backup content you’ve created in Second Life or Opensim to your hard drive and how to get that content into other 3D platforms like Unity3d and Blender.

Content import and export in Singularity Viewer

Hope to see you there!

-John “Pathfinder” Lester

Upcoming Conferences on Experiential Learning and Virtual Worlds: Let’s Meet!

Howdy folks,

I’ll be attending these two upcoming conferences.  If you’re planning to attend either of them or if you just happen to be in town when they occur, please contact me via my about.me page if you’d like to meet up and chat about learning in virtual worlds!

The main aims of this conference are to increase our understanding of experiential learning in virtual worlds, both formal and informal, to share experiences and best practices, and to debate future possibilities for learning in virtual worlds.  For full details, please see the conference website.

My panel presentation will be “Finding the Balance between Pedagogy and Technology.”  Here’s my abstract:

Next Generation virtual worlds will be tightly coupled to many other emerging technologies, leveraging modern knowledge management processes and providing platforms for broad use among teachers and learners.  As the technological landscape grows, it is becoming increasingly difficult for educators to identify the right platform (or mix of platforms) for their specific immersive learning needs.

In my current position at ReactionGrid and my previous work at Linden Lab and Harvard Medical School, I have explored the use of a wide range of gaming and virtual world platforms to augment education.  Today there are a number of very interesting virtual world technological trends involving specific gaming technologies like Unity as well as the growth of Open Source platforms such as OpenSimulator.  My ongoing work involves finding the right match between educational goals and technological affordances as well as identifying key synergies when virtual world technologies are interwoven with existing social media and web-based educational content.

Above all else, there must be a thoughtful match between pedagogy and technology.  Different virtual world platforms are suited for different uses, ranging from collaborative work environments to immersive goal-oriented simulations.  One of the most important and challenging goals for any educator exploring virtual worlds is simply finding the right tool for the right job.  Likewise, it is critical for virtual world platform developers to keep a firm focus on well established knowledge management principles when designing new technologies intended to advance the field of immersive learning.

I’m particularly thrilled about this panel because I’ll be participating with Dr. Bryan Carter from the University of Arizona.  Bryan is a true pioneer in using virtual worlds for experiential learning, and he’s been working with virtual environments since his dissertation project in 1997 when he created a virtual simulation of Harlem, NY as it existed during the 1920s Jazz Age and Harlem Renaissance.  Virtual Harlem was one of the earliest full virtual reality environments created for use in the humanities and certainly one of the first for use in an African American literature course.  The project continues to grow and evolve as Bryan explores new virtual world platforms.

1st International Conference on e-Learning e-Education and Online Training (e-LEOT)
September 18–20, 2014
Bethesda, Maryland, United States

This new conference will assess a wide range of progressive ideas for the future of e-Learning, focusing on the idea of technology as a means to education rather than an end in itself.  The conference organizers are lining up a wonderful range of interdisciplinary speakers and are planning to attract a wide group of heterogeneous scholars and practitioners.  For full details, please see the conference website.

I’ll be giving a keynote at this conference.  And if you’re looking to interact with the organizers and other attendees and speakers, be sure to check out the e-LEOT LinkedIn Conference Group.

Be seeing you!

How to convert a prim-based object in Second Life or Opensim into a mesh object on your hard drive using the Singularity viewer

ye olde prim

This is pretty cool.

The most recent version of the Singularity viewer (version 1.8.1) adds a particularly interesting feature:

  • Wavefront (.obj) and Collada (.dae) Export by Apelsin, Inusaito, and Latif Khalifa - Allows export of your creations into Blender, Unity3D and other modeling applications and game engines

This means you can now take a prim-based object from within Second Life or Opensim and export it to your hard drive as a mesh object (either .obj or .dae file format).
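Since a Wavefront (.obj) export is just plain text, you can sanity-check an exported file before importing it into Blender or Unity3d.  Here’s a quick sketch in plain JavaScript (runnable in Node) that counts vertices and faces; the tiny hand-written cube fragment is stand-in data, as a real Singularity export will be far larger:

```javascript
// A tiny hand-written fragment of a Wavefront .obj file,
// standing in for a real Singularity export.
const objText = `
# exported object
v 0 0 0
v 1 0 0
v 1 1 0
v 0 1 0
f 1 2 3 4
`;

// Count geometry lines by their leading keyword:
// "v " = vertex position, "f " = face.
function countObjElements(text) {
  const counts = { vertices: 0, faces: 0 };
  for (const line of text.split("\n")) {
    if (line.startsWith("v ")) counts.vertices++;
    else if (line.startsWith("f ")) counts.faces++;
  }
  return counts;
}

console.log(countObjElements(objText)); // { vertices: 4, faces: 1 }
```

If the counts look wildly wrong (zero vertices, for instance), the export probably failed and it’s worth re-exporting before spending time in the other application.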


How to create multiuser networked events in Jibe and Unity3d using iTween

In Jibe 2.0 we’ve included an easy system that gives you the power to use iTween to create multiuser networked events. This allows you to create shared experiences between avatars using interactive and complex object animations.
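Under the hood, a tween library like iTween animates by interpolating a value from a start to an end over time with an easing curve.  As a rough illustration of the idea (plain JavaScript, not iTween’s actual API), here’s an eased interpolation of the kind that drives such animations:

```javascript
// Sketch of tween-style interpolation: ease a value from start to end
// as a normalized time t runs from 0 to 1.  (Illustrative only --
// this is not iTween's API, just the math behind it.)
function easeInOutQuad(t) {
  // Accelerate through the first half, decelerate through the second.
  return t < 0.5 ? 2 * t * t : 1 - Math.pow(-2 * t + 2, 2) / 2;
}

function tween(start, end, t) {
  return start + (end - start) * easeInOutQuad(t);
}

console.log(tween(0, 10, 0));   // 0  (animation start)
console.log(tween(0, 10, 0.5)); // 5  (midpoint)
console.log(tween(0, 10, 1));   // 10 (animation end)
```

Networking such an event is then just a matter of broadcasting the same start, end, and timing parameters to every client so each avatar sees the identical animation.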

Watch my tutorial to learn more!


Video: How to create multiuser networked events in Jibe and Unity3d using iTween

Take care,
-John “Pathfinder” Lester
Chief Learning Officer
ReactionGrid, Inc.

How to create Avatar Sit locations on any object in a Jibe world in Unity3d

Here’s a short tutorial video that will show you how to create avatar sit locations on any object in your Jibe world.  It’s a very powerful and flexible system where you simply drag and drop sit locations onto anything in your multiuser Jibe world, allowing you to easily create collaborative meeting environments that encourage avatars to gather together in groups.

In this video, I also review how to avoid the accidental misuse of a script that could potentially cause your Avatar to fall through the floor.  Safety First!

Creating Sit Locations in Jibe and Unity3d from John Lester on Vimeo.

You can also find this tutorial video in our Knowledge Base.

Take care,
-John “Pathfinder” Lester
Chief Learning Officer, ReactionGrid

Breakdown of Oculus Rift Virtual Reality Headset and Integrating with Unity3d and Jibe

I’m eagerly awaiting my own developer version of the Oculus Rift, which should arrive in about a month.

My plans are to immediately start working on how to best integrate it with Jibe and Unity3d.

In particular, our newly released Jibe 2.0 has a built-in 1st-person perspective mode that is ideal for things like virtual reality headsets.

Exploring a multiuser Jibe 2.0 world in 1st-person perspective.

Keep an eye on this blog for future details.

Needless to say, I was very excited to see the folks at iFixit posting a great teardown of the developer version of the Oculus Rift headset.


If you have an Oculus Rift and would like to brainstorm with me on how it can be integrated with multiuser virtual world applications, please drop me an email (john.e.lester@gmail.com) or post in the comments.

Perhaps we can also schedule a Team Fortress 2 game while using our headsets!

-John “Pathfinder” Lester
Chief Learning Officer, ReactionGrid Inc.

P.S.  ReactionGrid’s Lead Developer Matthew Bertrand is also getting an Oculus Rift dev kit.  He’s pretty psyched about it, and we all expect amazing things from him!

How to embed and play a video on an object in Unity3d and Jibe

Watching “Hedgehog in the Fog” in my Jibe world.

Note: You’ll need the Unity Pro editor if you want to work with Movie Textures in Unity3d.

Unity3d allows you to embed and play videos on any surface in a 3d environment.

This means you can easily create a web-based Jibe world where avatars explore a multiuser 3d virtual space while watching videos or movies playing on screens/signs/any surface you wish.

The most common way to add video to a Unity3d project is by adding a video file to your project’s Asset Folder, which automatically creates a Movie Texture (details here).

However, adding a video file directly to your project means the size of the video file will be added to the final size of your completed Unity webplayer file.  In other words, if your video clip is 50 Megabytes large, then your Unity webplayer file will have an extra 50 Megabytes added on to it.

For folks creating Jibe worlds with Unity3d (or anyone creating Unity webplayer files for streaming on the Web) this is not good.  You always want your webplayer file to be as small as possible so your webplayer file will finish downloading and start running as quickly as possible.

Fortunately, there’s a way you can download/stream a movie from the Web so it doesn’t add to the size of your Unity webplayer file.  Unity will immediately start playing the movie as soon as it has buffered enough of it, similar to how YouTube works.

Here’s a simple example:

Step 1: Get your video ready as an OGG file on the Web

If you have a video on YouTube that you want to use, you’ll have to download it.  I suggest using Flash Video Downloader.

Unity needs videos to be in OGG format (file extension .ogg).  If you need to convert an existing video file into OGG format, I suggest using VLC (it’s free and cross platform).  Take your OGG video, put it on a webserver somewhere and remember the URL.

Important Note: If you’re managing your own webserver, be sure it has the MIME type for Ogg Vorbis enabled.  For Extension use .ogg, and for MIME type use application/ogg.
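If your webserver happens to be Apache, that MIME type can be added with a one-line directive (a sketch assuming you can edit .htaccess or the main server config; other webservers have equivalent settings):

```apache
# Serve .ogg files with the Ogg MIME type so clients
# such as Unity's WWW downloader handle them correctly.
AddType application/ogg .ogg
```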

Here’s a sample 60 Megabyte OGG video I made and uploaded to WordPress.  Feel free to use this URL in your own tests.  You can also click on it to see how it plays in your browser.
http://becunningandfulloftricks.files.wordpress.com/2013/04/hedgehog_in_the_fog.ogg

Step 2: Create a Cube

In this example, we’re going to make a basic cube and have the video play on its surface.  Of course you could flatten the cube so it looks likes a screen and then place it on a model of a TV or something.  I’m just being lazy.


Step 3: Create a new Javascript

I like the name of a script to remind me what the script actually does, so I’m going to call this new script ClicktoPlayWebMovie.


Here’s the code.  Copy and paste this into your new script and save it.

var url = "http://becunningandfulloftricks.files.wordpress.com/2013/04/hedgehog_in_the_fog.ogg";

// Make sure we have an Audio Source component
@script RequireComponent (AudioSource)

function OnMouseDown () {
    // Start downloading the movie
    var www = new WWW(url);

    // Wait until enough of the movie has buffered to start playing
    var movieTexture = www.movie;
    while (!movieTexture.isReadyToPlay)
        yield;

    // Use the movie as this object's main texture
    renderer.material.mainTexture = movieTexture;

    // Assign the movie's audio clip to the Audio Source
    // so the sound stays in sync with playback
    audio.clip = movieTexture.audioClip;

    // Play both movie and sound
    movieTexture.Play();
    audio.Play();
}

You can see at the top of the script that I’ve included my demo URL as the default movie URL.  You can always change it later.

Step 4: Add ClicktoPlayWebMovie script to your cube

Drag the ClicktoPlayWebMovie script from your Project folder onto the Cube in your Scene view.  This will add the script to the cube.


Now select your Cube in the Scene view and look at the Inspector settings.  You can change the movie URL by simply editing the URL field in the Inspector.

Also notice that there is an Audio Source added to the Cube.  This was added automatically when you added the script to the Cube, since the script needs an Audio Source component to work.  Don’t delete or modify the Audio Source component.  Just leave it be.


Step 5: You’re done.  Test it out!

You can run your Jibe world locally in the Unity editor and test it out that way.  Walk up to the cube and click on it.  The movie will start playing on all surfaces of the cube.


You can also view an online version of this demo in my own Jibe world.

Enjoy!

-John “Pathfinder” Lester
Chief Learning Officer, ReactionGrid Inc.

How to Fix the Unity Asset Server error “Cannot Start Service” on Windows

The Unity Asset Server is a fantastic tool for version control and collaborating with a group of people on a project in Unity3D.  Our customers at ReactionGrid Inc. primarily use Windows servers, and some of them are running their own Unity Asset Servers to facilitate working together on Jibe projects.

I recently learned of a bug that might affect folks running their own Unity Asset Server on Windows.  Fortunately it’s not serious at all (just annoying as heck), and the fix is incredibly easy and permanent.

Are you running the Unity Asset Server on a Windows machine?

Has it suddenly stopped working, refusing to start no matter what you do?

Do you immediately see the following popup error when using the Unity Asset Server Control Panel?

Unity asset server error popup