Thursday, September 27, 2012

End-to-End

I was sitting in a conference room last week with the IT Den Mother (also known as the Executive Assistant to the mucky muck) waiting for the mobile video teleconferencing unit to be wheeled in for a meeting.

I was trying to cancel one night of a two-night reservation for a contractor we have coming in. I couldn't do it - so I called up the reservation desk of the hotel. The reservationist was using the same system and she couldn't do it either. Something that once would have taken 5 minutes with a well-placed phone call now took 30 - and still had to be done by hand. Why do software designers design in pieces? Why can't they just make it easy for me to go through a process?


I ask myself this with almost every application I train people on.
It's actually one of the most important parts of my job - translating the individual pieces of the application (many of which were designed and coded by teams that don't talk to each other) into something resembling a workflow.

It's the piecemeal cobbling together of different bits of code from different teams (and possibly different companies, as organizations merge or get eaten) that creates the little quirks that make a number of applications so hard to use.

  • Search in this field using this technique
  • Search in that field using this other technique
  • Use tabs and drop-down menus to navigate in this section
  • Use left hand menus and separate pages to navigate in another section
  • Use the standard model for navigating the application - unless you are trying to do this special, but super-important, thing that requires an entirely different workflow.
This next year - I am on a couple of projects that promise to change the way the University does things.
Big strategic and cultural change projects.

I'm going to have to be the IT Den Mother's voice.

How can we make it easy for the folks we work with to do what they need to do?

Tuesday, September 25, 2012

TDRp: The Reports

In the TDRp - once you have the statements, you can then generate reports.
Essentially - re-packaged statements.

The white paper currently recommends three reports:
  • The Summary Report - goes to high-level executives and shows progress against high-level goals
  • The Program Report - individual reports for each program for the learning execs
  • The Operations Report - an overall view of course usage and L&D costs - pretty similar to the high-level efficiency statement, but with some project management numbers for items under development.
Looking at where our current gaps are - the Operations Report might be the easiest of the 3 to implement.
We have a project management system we can get those numbers from. And the LMS takes care of course utilization numbers. The biggest block to implementing this report is the cost numbers and where those might be housed. I am hoping that my discomfort over whether we can get this information in an automated way is more a result of my ignorance of our budgeting and purchasing system than of reality.
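
As a gut check on how automated this could be, here is a minimal sketch of assembling Operations Report rows by joining hypothetical exports from the LMS, the project management system, and whatever ends up holding costs. Every file and column name here is an assumption, not a real export:

    import csv

    # Hypothetical exports - stand-ins for whatever the LMS, the PM
    # system, and the budgeting system can actually produce.
    LMS_EXPORT = "lms_course_usage.csv"   # course_id, title, completions
    PM_EXPORT = "pm_projects.csv"         # course_id, status, pct_complete
    COST_EXPORT = "costs.csv"             # course_id, total_cost

    def index_by_course(path):
        """Index a CSV export by its course_id column."""
        with open(path, newline="") as f:
            return {row["course_id"]: row for row in csv.DictReader(f)}

    usage = index_by_course(LMS_EXPORT)
    projects = index_by_course(PM_EXPORT)
    costs = index_by_course(COST_EXPORT)

    # One Operations Report row per course: utilization from the LMS,
    # development status from the PM system, cost from the money system.
    for course_id, row in usage.items():
        print({
            "course": row["title"],
            "completions": int(row["completions"]),
            "dev_status": projects.get(course_id, {}).get("status", "n/a"),
            "cost": costs.get(course_id, {}).get("total_cost", "unknown"),
        })

If the three systems can each produce even a rough export with a shared course identifier, the report itself is the easy part.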

The Summary Report and the Program Report require defined programs.
Something I talked about in an earlier post.
Without those defined programs, both of these reports will be impossible to generate.
Once we DO have those programs - the money numbers and where those are housed come into play.

Thankfully - getting the cost numbers in a place where they can be easily extracted will take care of the gap in all 3 reports.

I suspect the strategy / program discussion will be a more difficult issue.

--------------------
So the heartening thing about this analysis is that we are not as far away from being able to implement this model as I feared.  We're working with more than I originally thought.

There is still a LOT of work to do in the meantime.

The exercise helped me.  I hope it helped you too.....


Thursday, September 20, 2012

TDRp: High Level Business Outcome Statement

This one will be the toughest nut to crack.

Mostly because we don't have a central decision-maker leading our training initiatives. 
And the greater strategic plan seems a bit vague from my view at the bottom.

No real "programs"
No solidly defined objectives
No quantitative goals that actually align to those objectives
Activities that (in the best case) only roughly move towards the ill-defined goal and objective.

I say this because I know for a fact that our organization is not alone.

To my current employer's credit - they are working hard to move in that direction.
Baby steps.... but any progress is good progress.

Until there is some sort of focused initiative or defined set of business outcomes that the entire institution is expected to work towards, we will be hard-pressed to even begin coming up with this statement.  Much less the appropriate measurements.

(I am expecting to hear an "I told you so" from the Data Whisperer any day now....)

Tuesday, September 18, 2012

TDRp: High Level Effectiveness Statement

The big thing stopping us from quickly implementing this statement is surveys.

Some groups are better at implementing surveys than others. And we don't have a standard survey tool.

Of course, our training "organization" is organized like Al Qaida without the single-minded focus (a whole 'nother issue).

We would also need to decide what programs are worth the time spent surveying. 

We can't just do blanket surveying, because our organization is very averse to distributing surveys without approval from various groups - groups that seem to appear out of the woodwork whenever anyone threatens to ask people something as simple as "did you like the course?"

There is also that "truthiness" factor.

How many of you have circled 4s or 5s on a survey and over-estimated how "important" the class was to your job, just to a) get out of there and b) justify taking more classes later?

Or am I alone in this one......

-----------------------
Level 5 numbers may need to be collected using a combination of duct tape, string and chewing gum... er, data resources.

Again - I need to find out where these numbers are stored and which systems we would need to pull the data from. I fear that there may not be a real "system" at all.

I'll probably spend more time calculating numbers once I find them.
----------------------

ROI numbers can be calculated using the formulas I noted in the post
Calculating Return on Investment and Benefit / Cost Ratio
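
For quick reference, those two formulas reduce to one-liners - a minimal sketch, assuming the program benefits have already been monetized:

    def benefit_cost_ratio(benefits, costs):
        """BCR = program benefits / program costs."""
        return benefits / costs

    def roi_percent(benefits, costs):
        """ROI (%) = net program benefits / program costs * 100."""
        return (benefits - costs) / costs * 100

    # Example: a program costing $50,000 that produced $90,000
    # in monetized benefits.
    print(benefit_cost_ratio(90_000, 50_000))  # 1.8
    print(roi_percent(90_000, 50_000))         # 80.0

The math is trivial. Finding defensible benefit and cost numbers is the hard part.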

Other useful learning metric posts:
- Monetizing Program Effect
- Another Way of Looking at Learning Metrics

Thursday, September 13, 2012

TDRp: High Level Efficiency Statement

This should be the easiest statement for us to implement.  Many of these numbers are already available in our LMS.  The big challenge will be capturing learning object types since we need to better differentiate between classroom instructor-led training, virtual instructor-led training and blended training.

The bigger version of this document (found in Appendix II of the full white paper) also includes Utilization, Reach and Development metrics.  Utilization (how popular the courses are) and Reach (the percentage of employees who used the resources) can also be at least estimated from our LMS.
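
Both could be roughed out from a completion export - a minimal sketch, assuming a hypothetical lms_completions.csv (one row per completion record) and a headcount figure from HR. None of these names are real:

    import csv
    from collections import Counter

    TOTAL_EMPLOYEES = 5000  # assumed headcount - would come from HR

    completions_by_course = Counter()
    learners = set()

    with open("lms_completions.csv", newline="") as f:
        for row in csv.DictReader(f):
            completions_by_course[row["course_id"]] += 1
            learners.add(row["employee_id"])

    # Utilization: how popular each course is.
    print("Top 10 courses:", completions_by_course.most_common(10))

    # Reach: percentage of employees who used the resources at all.
    print(f"Reach: {len(learners) / TOTAL_EMPLOYEES:.1%}")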

The program management numbers can be estimated from our project management system - though these numbers are a bit rougher due to some project intake workflow issues my division is currently working through.
 ---------------------------
The TDRp also accounts for other types of learning interventions, such as social media, performance support tools, and other "non-course" interventions. The defined learning types in this example just looked like an easy kill since we already have them in our LMS. We also don't currently have tracking for the other intervention types - at least, not without begging our web people to help us out. And that, honestly, would be a pretty large project.

I have suckered... er, convinced one of my current project teams to let me use them as lab rats to experiment with ways of differentiating the virtual sessions from the classroom sessions. I have gotten some excellent feedback from them and hope this will help me get some reports separating classroom / virtual / blended.
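
Once each session record carries some kind of delivery-type attribute, the classroom / virtual / blended split itself is trivial - a sketch with made-up file and field names:

    import csv
    from collections import Counter

    # Hypothetical export where each session row carries the
    # delivery_type attribute we are experimenting with capturing:
    # "classroom", "virtual", or "blended".
    counts = Counter()
    with open("lms_sessions.csv", newline="") as f:
        for row in csv.DictReader(f):
            counts[row["delivery_type"]] += 1

    for delivery_type, sessions in counts.most_common():
        print(f"{delivery_type}: {sessions} sessions")

The hard part is getting that attribute captured consistently at intake, not the report.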

------------------------------
The second, more pressing, challenge is with costs.

I would like to think that our costs are tracked in some sophisticated system that exports in an easy-to-read, easy-to-manipulate format.

I strongly suspect that our costs are found in various pieces of paper, napkins, spreadsheets and electronic detritus.

I'm not the money person in our department, so I am not privy to how they track these numbers.

--------------------
The detailed version of the Efficiency statement separates this high-level information into various important programs.

I think once our organization actually defines what these important programs are, and what is in these programs, we might be able to create one of these statements.

That's not a data / computer issue. 

That is a people / decision-making issue.

Not sure what to do about that one.

Monday, September 10, 2012

Teaching a Thought Process


On a bit of an LCD Soundsystem kick.  Not sure this has anything to do with this post.  
My blog...I can do what I want.

------------------------
During one of my earlier zombie project reanimation efforts, the Data Whisperer asked me
"How can we teach people how to understand databases, how they work, how to design them and how to get information out of them?"

A year or so after that conversation, staring at a project intake document for a project called "LMS Data Integration v3," I thought "Guess we are just about to find out."  

Our content library has spiffy courses like "The Logical and Physical Database Design Methodologies" and "Introduction to Relational Databases".  Just reading the descriptions and objectives for these courses hurt my head.  And I have a lot more exposure to this sort of stuff than the audience we had in mind.  I don't see our audience making it through 30 minutes of these courses without throwing bricks at their monitors.

How do we get non-technical folks who have a vague idea of what they are looking for to think in terms of defining the information they need, the format they need it in, and where to get the pieces from?  If only to better help the technical folks help them.
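
The thought process I'd want to teach is exactly that translation - from a plain-English ask to the information needed, the format it should come back in, and the source it lives in. A toy example (completely made-up table and data) of what that looks like when it lands:

    import sqlite3

    # A toy database standing in for wherever the real data lives.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE completions (employee TEXT, course TEXT, year INT)")
    con.executemany(
        "INSERT INTO completions VALUES (?, ?, ?)",
        [("ana", "Safety 101", 2012), ("bo", "Safety 101", 2012),
         ("ana", "Excel Basics", 2011)],
    )

    # Plain-English ask: "How many people took each course this year?"
    # Translated: information = course + head count, format = one row
    # per course, source = the completions table.
    query = """
        SELECT course, COUNT(DISTINCT employee) AS learners
        FROM completions
        WHERE year = 2012
        GROUP BY course
    """
    for course, learners in con.execute(query):
        print(course, learners)

Non-technical folks don't need to write the query. They need to be able to state the three pieces in the comments - that alone would make the technical folks' job much easier.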

--------------
As I've mentioned in a previous post - I've been picking through the Talent Development Reporting Principles (TDRp) information.

Because I figured if I can define the rules, I can win the game :)

That and now I have a document I can point execs and co-workers to that is written by folks that are not me.

Familiarity breeds contempt, ya know.

Sometime this weekend, while watching RGIII pick apart the New Orleans Saints defense, it dawned on me...

Backwards design!

-----------------
The TDRp white papers have sample reports.  What if I looked at each of those reports and figured out where we would get the information for each section from?

Maybe if we figured out what our end-state reports would look like, we would have a road map towards developing data systems that would give us what we need.

The analysis would (hopefully) give me an idea of what I can do now, what needs to be changed, and where (or if) I can get the other information.

Over the next few posts - let's see whether this idea works.

Thursday, September 06, 2012

Live Streaming Test Results

(Update: UStream recording of our Bocce match)

--------------
Our results focused on the following:
- How aggravating was it to set up and administer
- How aggravating was it for the end-user to access and watch

We didn't really worry about the participation piece - but I made comments on that anyway.
Because I know someone will ask.

Blackboard Collaborate 12 - We have been using Elluminate v10 and decided to test the most recent version of Blackboard Collaborate. Better video streaming and audio quality than prior versions - by a lot. I wouldn't wish video conferencing in Elluminate v10 on my enemies. The user needs to know how to expand the screen. Otherwise, they are going to be looking at teeny tiny people in a small window - at least until we can figure out how to auto-expand the window for the viewer in the new meeting interface. The built-in interactivity features are still available.

Biggest bonus for us - it is currently our main web meeting tool, so we won't have to go through the security process. The other thing we like about Blackboard Collaborate is that we can more easily limit who can access the live meeting and the resulting recording. We were also already familiar with setting up and sending out meetings - so we can't really comment on the learning curve for video-meeting administration.

The Blackboard Collaborate 12 UI is very different from Elluminate v10 and requires some getting used to.  This slowed our test setup down a bit. 

UStream - I played with this at Innovations in eLearning 2012. Not bad with my netbook and cheap webcam. This time, we had a nicer (larger) laptop with better multimedia cards and (slightly) more stable wireless. On the broadcasting side, the video was choppy. However, the viewers reported that everything seemed smooth and audio/video quality was decent. Chat and social streams are available, but they are awkward to get to in the interface.

This would be best used for things that you want to be fully public and where you don't really care about interaction.  I know our HR team is planning on using this service for some of their Service Excellence keynotes.  There are limits on free UStream (100 viewer hours) and they are more aggressively marketing the paid services, so we need to keep an eye on that.

Google+ Hangouts - We looked at Google+ Hangouts since Google is adding the Google+ features to the enterprise services. Great for 12-person conversations and meetings. Not quite appropriate for the application we were considering. Setup was a little awkward through the Events feature in Google+. Straight Hangouts is interactive via video chat. Didn't see other types of collaboration.

I will also admit I am not very good at using Google+, so these comments may reflect a PICNIC (problem in chair, not in computer) error vs. an application design flaw.

Craig Wiggins was kind enough to join us at the beginning with his Android.  He reported that video and audio quality was pretty good (for a cell phone).  He also made us jealous by joining us from the Udvar-Hazy Air and Space Museum.

A couple of features we didn't test: the "Hangouts on Air" feature, which would have been closer to the application we are considering this tool for, and the collaboration features through Google Drive. If our organization decides to turn on enterprise Google+, we will investigate these features further.

Our in-house testers said that audio and video quality were good.

Update: I talked to Ben Fielden in person after he posted his comment. When he refers to above the line vs. below the line, he means what we have purchased (which appears in the administrative interface above a line) vs. the stuff Google wants us to purchase (which appears below that line). So if you wonder why your enterprise IT is not terribly excited about implementing that cool new feature you get for free, ask Google.

ooVoo - A non-starter. Requires a download that also adds one of those aggravating toolbars to all of your browser windows. No one could get into the link, and even if they could, I wasn't in a position to answer the call. A pain to set up. Just.... no thank you.

---------------------
I'm going to send the results to the rest of the team for next steps.

Again - this will be an ad hoc, temporary recommendation while we get our official, formal solution for this type of thing set up.

Tuesday, September 04, 2012

Ad Hoc Live Streaming Test

We are starting to see demand for Live Streaming services.
IT has a project underway that will address that formally.  High quality, fancy equipment and all that.

This is not that project.

We decided to take a little time to test out cheap/free solutions to suggest to our clients while we waited for the big "everything communications" project (somewhere around 2014).
-------------------

We tested 5 tools - Elluminate, Google Hangouts, vTok (for the iPad), UStream, and ooVoo.

Yes - I know that some of these tools aren't designed to do live streaming, but we had them lying around.

Also - Skype was taken out of consideration because of security issues.  
People use it anyway - but IT is reserving the right to point and laugh when they have a security issue.

We asked our audience for feedback on the following:
- Video latency - Does the video freeze?  Look jerky?
- Audio latency - Does the audio stop?  Do I get "chipmunking"?
- Ease of access
- Does it freeze the computer?
- Any comments regarding features and user experience.

Testing Protocol
We decided it would be fun to use a game of indoor bocce for the test.


This allowed us to test the following:
- Motion
- Visibility of faces AND (hopefully) PowerPoint presentations from standard "speaker" distance
- Audio from standard "speaker" distance


And how often do you get an entire abandoned space to play in that still has power and wireless?
We had to do SOMETHING fun before we all moved out ;)

Testing environment considerations - This space is in the basement of a dorm. With all the kids. And their movie streaming. We felt that this provided the best "worst-case scenario" short of a hotel ballroom (which this space used to be).

- Relative video quality - we are using different web camera setups. We plan to run another test focused on web cameras later.

- Relative audio quality - again, we are using different microphone setups.  We plan to run another test focused on audio input devices later.
----------------
We will have the results of these tests in a later post.  And maybe video :)