Thursday 8 October 2009

Educational tools in SL

I've just finished compiling a list of educational tools available in Second Life. You can view the Google Doc here:

http://docs.google.com/Doc?docid=0AbJhUXLbxPT9ZGY0NDNwM3pfNmhwbTVxZGMy&hl=en

Friday 2 October 2009

AIMLBot and Open Metaverse

To save myself the hard work of creating a chat bot from scratch with Open Metaverse, I performed a search for C# AIML parsers. Top of the list was AIMLBot (http://aimlbot.sourceforge.net/), and after a bit of poking around the two seem to work well together.


There was some head scratching because the chat example on the Open Metaverse page (http://lib.openmetaverse.org/wiki/Respond_to_inworld_chat) seems to be out of date - the LL prefix from LLUUID and LLVector3 seems to have been dropped. AIMLBot also doesn't come with the XML and AIML files it needs, but I just took those from Radegast.
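Before the callback below will work, the myBot and myUser objects need to be created and the AIML files loaded. Something along these lines should do it - a rough sketch following the standard AIMLBot pattern rather than my exact code, with the config and aiml folders being whatever you copied over from Radegast:

// Sketch of the AIMLBot setup; assumes the config and aiml folders from Radegast
// sit alongside the executable.
using AIMLbot;

public class BotBrain
{
    public static Bot myBot;
    public static User myUser;

    public static void Initialise()
    {
        myBot = new Bot();
        myBot.loadSettings();                 // reads Settings.xml from the config folder
        myBot.isAcceptingUserInput = false;   // don't accept chat while the brain is loading
        myBot.loadAIMLFromFiles();            // loads the .aiml files listed in the settings
        myBot.isAcceptingUserInput = true;

        myUser = new User("listener", myBot); // a single generic 'user' is enough here
    }
}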

As it stands the chat callback is fired whenever anyone starts typing nearby, and when the bot hears itself. My slap-dash solution is included below:


static void Self_OnChat(string message, ChatAudibleLevel audible,
    ChatType type, ChatSourceType sourceType, string fromName,
    UUID id, UUID ownerid, Vector3 position)
{
    // Empty messages are the "started/stopped typing" notifications - ignore them
    if (message != "")
    {
        // Don't respond to our own chat (the bot's name is hard-coded, slap-dash as promised)
        if (fromName != "Test Bot")
        {
            // Pass the incoming chat through AIML and say the reply in local chat
            AIMLbot.Request request = new AIMLbot.Request(message, myUser, myBot);
            AIMLbot.Result reply = myBot.Chat(request);

            Client.Self.Chat(reply.Output, 0, ChatType.Normal);
            Console.WriteLine(reply.Output);
        }
    }
}


That's the main component of the chat bot, so you can see that it is very easy to implement.

Friday 25 September 2009

Creating Bots for SL/OpenSim

If you have any exposure to Second Life culture, then you will hear discussions about 'bots' - which are simply avatars controlled by computers, rather than by real human beings. For example, when you perform a search for places in SL, the results are ordered based on traffic, with the places that have had the most visitors listed at the top. Some unscrupulous people used bots to inflate their traffic stats to move up the search list. Also, how do you know that the preset grinding sequence of the SL stripper you just paid was initiated by a person and not a computer? Do you care? Should you care? Why are you there?

Regardless of the controversies, bots are very useful as NPCs. Compared to the alternative of in-world modelled and scripted bots, out-world bots will generally look better (using the avatar mesh and any of the high-quality garments and animations you choose to purchase) and have potentially better AI. I will now share the knowledge gained from a brief investigation into how to create these bots, with specific consideration given to OpenSim.

Important note: Every bot needs a valid login account.

Easy-Peasy Method:

The Radegast client (http://radegastclient.org/wp/) is a non-graphical client for SL. It supports almost everything you might want to do in SL, except look at it, and also includes a built-in AI agent based on A.L.I.C.E. You just turn it on as a setting, and when anyone enters chat nearby that includes the avatar's first name (this is very important), Alice will respond.

For example, if you log in as Rupert Marmaduke and somebody says "Hello Rupert Marmaduke" or "Hello Rupert", then they will elicit some suitable response.

For OpenSim, you will need to set a URL and port in order to connect (e.g. http://127.0.0.1:9000).

Personalising Radegast (Maybe):

As mentioned, the Radegast chat-bot is built upon A.L.I.C.E, which uses AIML definitions. Basically these are XML files listing patterns (what the user might say) and templates (how the bot might respond). For example:

<category>
<pattern>TELL ME ABOUT YOURSELF</pattern>
<template>I am a natural language chatterbot, that talks to people via computer networks such as the Internet.</template>
</category>

You can conceivably modify the AIML files that come with Radegast (stored in Program Files\Radegast\aiml) as you see fit. Whatever you put in the template section is regurgitated whenever the pattern is matched. I have not tried this, but I'm 90% sure that it would work.

Getting Complex: OpenMetaverse

If you want to get your hands dirty then you can download a C# library called Open Metaverse - it is what Radegast was built upon. There are plenty of examples provided, and it is very easy to get an avatar to log in and say something (i.e. follow 5 instructions then copy and paste). Capturing the surrounding chat and responding intelligently is obviously the biggest challenge.

By default the library connects to SL, but you can switch to any grid using:

client.Settings.LOGIN_SERVER = "http://osgrid.org:8002/";

Full details are given here: http://lib.openmetaverse.org/wiki/How_to_create_a_basic_libopenmv_bot_for_osgrid%3F
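If you just want a starting point, something along these lines should log a bot in and make it say hello - a rough sketch based on the libopenmetaverse examples (the account details are placeholders):

// Minimal sketch: log a bot into a grid, say something in local chat, then log out.
using System;
using OpenMetaverse;

class MinimalBot
{
    static void Main()
    {
        GridClient client = new GridClient();

        // Point at the grid's login URI instead of the default (Second Life)
        client.Settings.LOGIN_SERVER = "http://osgrid.org:8002/";

        // first name, last name, password, user agent, author
        if (client.Network.Login("Test", "Bot", "password", "MinimalBot", "1.0"))
        {
            client.Self.Chat("Hello, world!", 0, ChatType.Normal);
            Console.WriteLine("Logged in and said hello.");
            client.Network.Logout();
        }
        else
        {
            Console.WriteLine("Login failed: " + client.Network.LoginMessage);
        }
    }
}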

Monday 21 September 2009

MCQs in SL

Shown in the video are various options for showing multiple choice questions (MCQs) in SL. The fundamental criterion was that the questions needed to be attached to an object in-world, and so there were three main options:
  1. Dialogs: MCQ presented through a series of dialogs. Only a certain amount of text can fit on a dialog button, and so longer options need an ABC-style system.
  2. HUD: Very similar to the dialogs, but allows for 'bigger buttons' to display options.
  3. Website: Displays a website using the in-built browser. In this case the website was made using software called Twine, but any html page would work (including javascript).




The dialog and HUD are roughly equivalent; with the HUD possibly just having the edge since its appearance can be customised, and it doesn't have the button size restrictions. The website option has many advantages, including support for images and larger passages of text, but requires a server somewhere to host the pages, and extra effort in authoring them.

In the video the Dialog questions load slightly faster because they are hard-coded, while the HUD is reading from a notecard. The notecard is the better option because it is easier to produce questions for. The system that I'm using requires lines to start with special characters, as shown below:

?Question
*Incorrect option
*Correct option<
*Incorrect option
+Positive feedback
-Negative feedback
#

I think that this is a fairly straightforward, human-readable format, so putting together quizzes shouldn't be too difficult.
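For the curious, here's a rough sketch of how the format maps onto a question structure. The in-world version is an LSL script reading the notecard line by line; the C# below is purely for illustration, and it assumes the trailing '<' marks the correct option and '#' closes a question:

// Illustrative parser for the quiz format described above.
// Assumptions: '<' at the end of an option flags it as correct; '#' ends a question block.
using System;
using System.Collections.Generic;

class QuizQuestion
{
    public string Text;
    public List<string> Options = new List<string>();
    public int CorrectIndex = -1;
    public string PositiveFeedback;
    public string NegativeFeedback;
}

class QuizParser
{
    public static List<QuizQuestion> Parse(IEnumerable<string> lines)
    {
        List<QuizQuestion> questions = new List<QuizQuestion>();
        QuizQuestion current = null;

        foreach (string raw in lines)
        {
            string line = raw.Trim();
            if (line.Length == 0) continue;

            if (line[0] == '?')                       // start of a new question
            {
                current = new QuizQuestion { Text = line.Substring(1) };
            }
            else if (current == null) continue;       // ignore anything before the first '?'
            else if (line[0] == '*')                  // an answer option
            {
                string option = line.Substring(1);
                bool correct = option.EndsWith("<");  // assumed marker for the right answer
                if (correct) option = option.TrimEnd('<');
                current.Options.Add(option);
                if (correct) current.CorrectIndex = current.Options.Count - 1;
            }
            else if (line[0] == '+') current.PositiveFeedback = line.Substring(1);
            else if (line[0] == '-') current.NegativeFeedback = line.Substring(1);
            else if (line[0] == '#')                  // end of this question block
            {
                questions.Add(current);
                current = null;
            }
        }
        return questions;
    }
}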

I want to briefly mention Twine, which is a fantastic tool for creating branching scenarios. It was used in the website example to create a linear quiz, but it can be used in much more creative ways. Full details can be found here: http://gimcrackd.com/etc/src/

Sunday 16 August 2009

Second Life Games Directory Update

15 more games added; many more still waiting.

http://thoughtfulmonkey.com/slgameslist/

Sunday 9 August 2009

Crime Scene - Fingerprints

This morning's addition to the forensics simulator was a Comparison Microscope, specifically for identifying fingerprints (although I'm not sure that's what the instrument is actually used for).

It is now possible to assign fingerprints to objects in the world, which can be retrieved while wielding the correct tool. Below we have the murder weapon hidden in some bushes. A print taken from this can be compared with one taken from the hand-prints in the bathroom.


Shown below is the comparison microscope itself. Fingerprint samples can be dropped onto the left or right location. If the print is on file then the corresponding record appears on the panel behind; and a match between two samples is also flagged.

Saturday 8 August 2009

Crime Scene - Blood Samples

The Ideal System for Evidence

I strongly believe that in training situations, virtual environments should only be used to 'fill in the gaps' for things that can't be done in the real world. For example in this case, it's unlikely that a department would be able to provide a restaurant and alleyway for students to investigate, and impossible to have 20 such environments so that all students can perform the exercise at the same time. This is what the virtual environment provides.

However, in using the environment I would argue for a system similar to a murder mystery event that I recently attended:

  1. Students explore the environment gathering virtual evidence.
  2. At the end of the exercise a list of reference codes is given for the virtual evidence.
  3. Tutors give students real evidence based on these references.

For example, in the simulation I take a swab from a pool of blood, and after the exercise the tutor gives me a real swab sample to analyse; or fingerprints to compare, or fibres to analyse, etc. Even if the students don't perform the actual analysis, they should at least be given a physical report of the findings, accurately matching what they would encounter in the real world.

I think that this system has a good clear division between gathering (virtual) and analysing (real); which would obviously need to be supported with real-world evidence gathering exercises - but these could be on a smaller scale focusing on skills rather than the environment.

Without wanting to waffle too much, it could be taken to the extreme, where students are given physical props when they encounter the virtual counterpart. Act-UK have a system in this style, developed by make-media, where CAVE-style technology provides the environment and real-life actors and props form the foreground.

Virtual Blood Sampling

However, using the above method wouldn't leave me with much to do, and so I've created a virtual blood sampling system as an alternative. As with the UV light, the student must equip the swab to the avatar, and can then click on the object to sample (in the image below it is the victim's blood).

The student is then given a sample, which is stored in their inventory. These can be given unique IDs for use in the aforementioned method, or the student can be tasked with assigning them names and recording where they were taken.

These samples can then be dropped onto the centrifuge, where various characteristics are reported back.

Friday 7 August 2009

Crime Scene - Visual Tools

Camera

To be honest my experience of forensics is based solely on documentaries and CSI, but something that seems important is taking lots of photographs. SL Viewers offer a free-roaming camera that you can position how you want, and a 'snapshot' feature to save screen-shots. So that's one tool ticked off.

UV Light

Another common tool seems to be the UV light, for highlighting various bodily fluids. A representation of this is fairly straightforward in SL; you simply control the switching of textures and emissive colour (see the images below of the suspected escape route - a bathroom window). Note that it was the ladies' bathroom, which already gives us a clue as to the murderer.

A major issue encountered is that OpenSim doesn't seem to allow objects in-world to hear messages from HUDs. In the example above the avatar is carrying a UV light, which you click on to toggle the light on/off. This method may be more realistic in that different tools must be worn in order to be used, but from a usability perspective a HUD is often the best choice. We'll see how this affects other tools.

Thursday 6 August 2009

Virtual Crime Scene

There has been some discussion at work about using SL as a tool for a forensics course; and having some free time I thought I'd look into it. The biggest initial surprise when working with the latest edition of OpenSim (with the Meerkat viewer) is that the maximum size limit for prims has been increased; which makes things considerably easier.

The first benefit is being able to lay out a whole floor plan in one piece (apparently produced in some software called SmartDraw).

The creation of buildings is also a lot simpler, since you're no longer restricted to 10m panels.

The scene is at the moment fairly basic, but will allow me to start investigating potential tools and issues. The first concern is the camera angle. In the field of computer games, if a third-person view is used then camera positioning is massively important, and a lot of effort is put into getting it right - with various algorithms used to reposition it so that the view of the main character is never obscured. In tightly enclosed environments a first person view is often chosen instead. In contrast SL's 3rd person camera is fixed behind the avatar, which in enclosed spaces means that the view is often obscured by walls. A 1st person view is included, but removes the possibility of interacting with a HUD. The enclosed architectural spaces appropriate for a forensic simulation may cause some design problems.

Next step is to look at some tools (see below for a sneak preview):

Monday 22 June 2009

Toy Interface

I added the video to YouTube, but forgot to comment on it here. Basically it's very easy to make all sorts of interesting user input devices if you can get hold of a keyboard encoder. A keyboard is in essence a lot of buttons connected to some electronics that communicate with the computer. A keyboard encoder is just a stripped-down keyboard, which allows you to add your own buttons.

In the example below I took a children's toy and wired it up to control the avatar in Second Life. For example, the accelerator pedal is wired up so that when you press it, the encoder tells the computer that the up arrow key was pressed. Very easy.

As a rule of thumb, if the toy makes a noise when you press/turn/squeeze/shake/whatever, then you can use that as an input. Just find the two wires that go to that part, and connect them to your keyboard encoder instead.

Friday 19 June 2009

Using the Meerkat viewer


Shown above is the Cannons and Castles game, ported from SL to a local OpenSim grid using the new Meerkat viewer (discussed in several places). It's fairly fast and pain free, but doesn't carry over scripts - and may never be able to.

It looks like a great option for people to backup work, create locally and then upload to SL, or to flee the sinking ship (2 years of sinking and still no change in the water line).

Interestingly the data is saved to an xml format, which is likely to lead to an explosion in tools for procedurally generating content, file conversion and import/export plugins. It looks to be a much more elegant solution than current systems.

Tuesday 7 April 2009

General update

The store at Wild Hollow has gone, but the one at Celebration remains.

I recently started a new job, which includes some occasional SL work but doesn't leave much time for game development. It has, however, given me the chance to look more closely at other virtual world platforms, of which the SL clones seem most promising at this point. Once you've replaced the default physics engine in OpenSim, it seems to work fairly well; RealXtend looks good, but its graphics rendering causes problems with my work and home PCs.

Expect to see some OpenSim-based game updates coming soon.

Tuesday 3 March 2009

llDetectedTouchUV function

I'm not sure when it was added, but the ability to determine where the user clicked on an object's surface is very useful. To try it out I threw together two examples, shown in the video below.



Subbuteo:
The HUD calculates the distance and angle from the centre to the clicked point, and applies an equivalent force to the Subbuteo player (I wasn't impressed by Ronaldo's finishing).
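For anyone wondering about the maths: the clicked point's offset from the centre gives a distance (how hard to kick) and an angle (which way to kick). The sketch below shows the idea in C# purely for illustration - the actual HUD is an LSL script using llDetectedTouchUV, and the scaling constant is invented:

// Illustrative only: the real HUD does this in LSL; names and constants are placeholders.
using System;

struct Kick
{
    public double Strength;   // how hard to flick the player
    public double Angle;      // direction of the flick, in radians
}

static class SubbuteoMaths
{
    const double MaxForce = 10.0;   // invented scaling factor

    // u and v are the clicked texture coordinates, each in the range 0..1
    public static Kick FromTouch(double u, double v)
    {
        double dx = u - 0.5;        // offset from the centre of the control
        double dy = v - 0.5;

        Kick kick;
        kick.Strength = Math.Sqrt(dx * dx + dy * dy) * MaxForce;   // further from centre = harder kick
        kick.Angle = Math.Atan2(dy, dx);                           // direction of the click
        return kick;
    }
}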

Star Fleet Command III turning:
You can click on the ring surrounding the ship to set a heading.

Monday 2 March 2009

SL Games List

There doesn't seem to be one around at the moment, so I've made one myself:

http://thoughtfulmonkey.com/slgameslist

This will be a steadily expanding list of the games available within Second Life. They won't be reviewed; just briefly described and linked to.

Wednesday 25 February 2009

Store Locations

**Update 2009-06-22: I ran out of Lindens and the stores closed. Now I'm back in the black I might open another, but nothing is decided yet. **

I've been looking into setting up a store for a while; and now I have two. They are 'stores' more in the literal sense of being places to store my games, rather than being shops, since everything is FREE.

The locations are:

Wild Hollow: SLURL


Celebration: SLURL

The Wild Hollow store has a higher prim allowance and so there are some extra decorations, but both will have identical content. Tell your friends.

Friday 13 February 2009

Pocket Battler: Faery Update

... and the faery graphics/model - which still requires some work. In fact it's taking the most work in all respects: control, attack effects, attack coding, and texturing. Let's just say it's not my favourite.

Pocket Battler: Pirate Ship update


... and here's the updated graphics/model for the pirate ship.

Wednesday 11 February 2009

Pocket Battler: Tank Update

Just a quick update to show the tailored HUD and new model for the tank Pocket Battler. Area effect damage is now also working.

Sunday 1 February 2009

Inventor's Show & Tell - Joint 1st place

Yesterday (31/1) I demonstrated Cannons & Castles at the Rivet Town Inventor's Show & Tell, and was honoured to take joint first place, alongside Edwardian Halberstadt (who won my vote). It was well received, with an enquiry as to whether it was for sale, and recommendations for a tournament to launch it. I should really try and do something with it.

Rivet Town is a Steampunk-esque role-play sim, and so all the items shown were in that theme. Yesterday's exhibitors were, in order and as far as I remember:

  • Sidney Arctor: a submarine in the Jules Verne style, packed full of features.
  • Edwardian Halberstadt: a highly detailed film projector, which could actually stream media (including a tiny version on the 'film' at the projection lens).
  • Torus Heliosense: a similarly detailed gramophone and 'steam compass'.
  • -- at this point I was starting to feel ashamed about the plain wood textures I'd used --
  • Maximus Ecksol: an ingenious crate that unfolded to four seats and a table.
  • Professor Sadovnycha: a complex suit of armour, in what I've come to recognise as the creator's unique style, with many hidden surprises.
  • Lakhesis Nikolaidis: well designed ear protectors and breathing apparatus.

The event is held every Saturday (1pm SLT - as shown in the image), and is a great place to see interesting creations, particularly if you have an interest in the steampunk genre.

*update*
The Show and Tell event is no longer taking place, and it seems that the actual Sim it was hosted in might be vanishing from the grid in the near future.

Saturday 24 January 2009

WAIN (Where Am I Now?)

Following yesterday's post, the navigation aide (now named WAIN) can load information from an object in the environment - specifically a UUID for the map, and a list of landmarks. Time for a video - including an unnecessary overlay that I just learnt how to add.



For the system to be useful, users would need to know where to get these updates, and so I've created a logo to identify WAIN info points. A user arriving in a new location and spotting one of these signs would then know that some map information is provided.

I have the code to take SL's internal Sim map and apply it as the texture (originally taken from a wiki example, I believe), but the Lindens seem to be in the process of changing the way maps work, so I might leave it for a while.

Friday 23 January 2009

Virtual Navigation Aide

Navigation in virtual worlds is a similar challenge to navigating through the real world. You often need to identify the purpose of buildings and environments through visual cues, and identify a route that leads to your goal location. The nature of Second Life makes lots of people complacent about supporting navigation, generally offering teleport coordinates to a specific location without considering how people will move around once there. Buildings are generally labelled (though often just with a vague company name), but there is a distinct lack of directions.

This is to be expected through most of the mainland where ownership is fragmented, but many times I've landed in private Sims and had little clue what was North/South/East/West. Where are the "You are here" maps, the paths with signposts, the guide leaflets, or other examples from the real world? A reasonable answer is "Explore", but to have the desire to do so a person must know that there is something there that they want. If you go to a website, and nothing on the homepage hints at what you're looking for, you'll go somewhere else. Sometimes wandering is the goal, but builders should ask themselves "do I want to help people find things?".

The waffle is due to a navigation HUD that I've been inspired to make (below), which shows your current position and orientation within a Sim (represented by the blue arrow). You're probably wondering why it might be necessary when there are map tools built into the client; but one feature of this tool is that a Sim owner can include their own annotated map, highlighting the different areas. And there's more...

The HUD includes the option for setting landmarks, which are shown on the map as the user cycles through them. These are hard-coded at the moment, but ultimately I aim to have them added dynamically. A user could arrive at a sim and be given "today's" landmarks. If you have generic meeting or teaching rooms, then the landmarks could be for the specific meetings or lessons of that day; if a commercial sim has changing store rentals, then the landmarks could update to the current sellers, etc.

Monday 19 January 2009

Pocket Battlers - Generation 1 prototypes

At the moment I'm just getting a few ideas out of my head and into working prototypes. One of these is Pocket Battlers, in which you control a small 'Battler' (as a sub-avatar?) that can move around and launch various attacks. A major feature is that these Battlers can be very different: in the first generation there will be a tank, a pirate ship and a faery, all with appropriate movement and unique attacks.

The second major feature is the power bar, through which attacks are activated. Power gradually accumulates over time, and each attack requires a certain amount of power before it can be used; when an attack is used, a portion of the power is consumed. You can see it in action at the end of the video below (there's no sound, so don't go fiddling with your volume control).
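As a rough sketch of the power-bar logic (C# purely for illustration - the real scripts are LSL and the numbers are invented):

// Power trickles up over time; attacks only fire if enough has built up, and then spend it.
using System;

class PowerBar
{
    const double MaxPower = 100.0;      // invented cap
    const double RechargeRate = 5.0;    // invented: power gained per second

    double power = 0.0;

    // Called regularly (e.g. from a timer) to accumulate power
    public void Tick(double secondsElapsed)
    {
        power = Math.Min(MaxPower, power + RechargeRate * secondsElapsed);
    }

    // An attack fires only if enough power has accumulated, and consumes a portion of it
    public bool TryAttack(double cost)
    {
        if (power < cost) return false;
        power -= cost;
        return true;
    }
}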



They still need a bit of dressing up, but after a few small tweaks they'll be at the stage where they can be fought against each other (I just need to add in area-effect damage).

Credits: The particle systems used for the faery are based on free examples available at 'The Particle Lab' in SL, and the faery's wings are by Jane Mc Carthy.

Monday 12 January 2009

Cannoneer turn controller

Shown above is the turn controller for Cannoneers. Players can click on it to register; the owner of the device can use a dialog menu to join/start/reset the game; and player order is randomised.

It was fairly easy to lock player controls so that only one person can move at a time, and to make a turn end either after 10 seconds have passed, after the player has moved 5 metres, or (soon) after they have fired (I still have it allowing multiple shots for final testing).
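A rough sketch of that turn-end check (again C# purely for illustration, with invented names - the actual controller is an LSL script polling on a timer):

// A turn ends after 10 seconds, after 5 metres of movement, or once the player has fired.
using System;

class TurnMonitor
{
    readonly DateTime turnStart = DateTime.UtcNow;
    readonly double startX, startY, startZ;
    public bool HasFired;               // set elsewhere when the player takes their shot

    public TurnMonitor(double x, double y, double z)
    {
        startX = x; startY = y; startZ = z;
    }

    public bool IsTurnOver(double x, double y, double z)
    {
        double dx = x - startX, dy = y - startY, dz = z - startZ;
        double moved = Math.Sqrt(dx * dx + dy * dy + dz * dz);

        return (DateTime.UtcNow - turnStart).TotalSeconds >= 10.0   // time limit
            || moved >= 5.0                                         // movement limit
            || HasFired;                                            // already fired this turn
    }
}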

Also shown in the image is the HUD for switching between the two ammo types: shells that explode on impact (small area, large damage), and grenades that explode after 5 seconds (wider area, low damage).

Friday 9 January 2009

Cannoneers - Early Development

One of my current projects has the working title of "Cannoneers". Ultimately it will be a turn-based game where players run around shooting at each other, along the lines of the 3D versions of Worms. Shown below is the Avatar I've built for it, which should give an idea of the style I'm aiming for.

I decided to go with "tiny" avatars, because it adds a sense of fun to things. It's also good practice with the anim overrides, squashed anims and invisi-prims that tiny avatars rely on.

The animations are done, and cannon power is controlled by how long you hold the mouse down. The next big feature will be controlling turn taking.