Thursday, June 07, 2012

#iel12 Teaching and Learning Lab

Had an a-ha moment during the keynote.
OMG - I'm trying to build a one-person Teaching and Learning Lab!

Most interested in the requirements process.....

Presenters: Judith Bayless, Janine LeBoeuf

Teaching and Learning Lab - current iteration is 14 months old
- Before: looked at hardware and system or physical setups
- Was looking at what was cutting edge

Which components of DAU's mission can we help meet?

Sometimes had conflicting requirements

Trying to provide leadership with information to make strategic decisions
- Before: I know the problem, here is a solution.  Let's buy!

Why build?
- Research and assess technologies, methodologies and best practices (the non-tangible pieces are important)
- Enable research with minimal cost
- Do you need a physical space? 
   + Are you swapping out equipment? (my answer ...)
   + Do you need a space where someone can dress-rehearse a classroom-training technique? (my answer ... no)

(My case - for hardware, we have a lab space within IT that I have easy access to, with folks who have done thorough testing and will ultimately support it.  Lucky me.)

How do we make sure our dollars last as long as possible?
How can we dress-rehearse a new learning technology (material and intellectual) at low/no risk?

- Maintain a high level of flexibility
- Separate network, but mirror DAU production
  + So you don't wreck production (and stay legal), BUT mirror enough that the transfer to production is smoother
- Speed up the evaluation / testing process
- Get results. The idea "feels like" the right thing to do the moment you have it.
  + As soon as there is budget support, the clock is ticking

Year 2 - they need results
- Everyone understands build-up time
- But after year one: whatdya got?

Classroom space is one of the scarcest resources at DAU
- Bigger stress on results
- They got funding for remodeling - all screens touch-enabled
- If need physical space and remodeling - learn duck and cover really fast.
- Just think of remodeling a house..... (thank goodness I don't need to do that)

Lab is for everyone - DAU institution resource
- Converting elearning
- Video
- Learning software or evaluating software.....

Can build on who needs you - and who needs you to do what

DAU likes to benchmark. Leadership understands this.
- Compare yourself to other institutions  (what other Universities have learning labs)
- MIT, Georgia Tech, Deloitte U

- Similar issues
  + Difficult finding the right performance metrics for success
  + Marketing challenges
  + Small pool of customers

- Dissimilar issues (but they apply to me)
  + Evaluating more COTS technology products than their benchmarks
  + Available SMEs too busy to support initiatives - we have a shortage of experts
  + Classroom space issues

Mission to practice
- Just curious
  + Find available software already in-house and demo it.
     -- Usually triggered by conferences
     -- Helps expand use of stuff we already have.  Better than reinventing the wheel and wasting time and money
     -- Make the right connections
  + Request a Demo - hey!  where can I download the demo copy?  (The "brochure test")
- Need a Business Case
  + Request an Evaluation
    -- Find solutions.
    -- Is this what you really need?
    -- Are you going to have to change the business process too?
    -- What do you really need to have satisfied?
  + Request Beta / Pilot Support
    -- Does it work in a learning environment
    -- (Does it work in our OWN environment?  What's gonna break?)

Evaluation Methodology
- Initiation
  + Request interview / form
  + This is the contract point for their service
  + Ask whether there is existing technology here

- Planning
  + This is where they find and bring in the stakeholders
  + Schedules and milestones
  + Also curricula development, implementation and updates

- Research Analysis
  + For when they're not sure where they are going; testing the decision
  + Or... looking for options / setting up criteria
  + Requirements collection / what to measure products against
     -- This can be painful (no kidding)
     -- But the bonus: you won't move forward on something you can't define well enough to create a business case
     -- Why are they chasing it?  Sounds good / politics
     -- The contract is then very important. CYA if they don't get the "result" they "really" want

- Evaluation
  + Stand up in environment
  + UAT / Participation Assessment

- Business Case is the end output, OR the determination that it is not viable.
  + A long process, but with cost avoidance
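The phase flow above (Initiation, Planning, Research Analysis, Evaluation, then either a business case or a "not viable" call) can be sketched as a tiny state machine. A minimal sketch in Python; the `EvalRequest` class, its field names, and the viability rule (no requirements collected means no business case) are my own illustrative assumptions, not DAU's actual process:

```python
# Minimal sketch of the evaluation pipeline from the talk.
# Phase names come from the session notes; everything else
# (class, fields, viability rule) is an illustrative assumption.
from dataclasses import dataclass, field

PHASES = ["Initiation", "Planning", "Research Analysis", "Evaluation"]

@dataclass
class EvalRequest:
    requester: str
    technology: str
    existing_in_house: bool = False   # asked during Initiation
    requirements: list = field(default_factory=list)  # gathered in Research Analysis
    phase: int = 0                    # index into PHASES

    def advance(self):
        """Move to the next phase; Initiation first checks for in-house tech."""
        if self.phase == 0 and self.existing_in_house:
            return "Redirect to existing technology"
        if self.phase < len(PHASES) - 1:
            self.phase += 1
            return PHASES[self.phase]
        # After Evaluation: business case, or a "not viable" determination
        return "Business Case" if self.requirements else "Not viable"

req = EvalRequest("faculty member", "eReader pilot")
for _ in range(4):
    print(req.advance())
```

The key design point, echoing the notes: a request that never produces collectible requirements cannot exit as a business case, which is exactly the cost-avoidance outcome the Lab counts as a success.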

Context shapes and changes each research project.
- Not going to drive someone through this who just needs a software demo

Use the intake process to find patterns and create templates for types of requests.
Also ensures transparency

No top secret info - nothing hidden going on in lab.
- Will need to respect need for confidentiality.

Once far in the process - the Lab is not hiding anything from anyone.
- Part of their process

Questions come in waves
- Darrell Hitton - eReader research
  + Pointing them to their own research (teaching them to help themselves)

They developed dashboard to measure performance
(they are measuring business collaborations, demos, formal training, leadership meetings, other and total events.  Also looking quickly at what is finished vs. what is in the pipeline)
(I may need to look at who gets in the process and who completes + outcomes.  Might be time to do a quick eval - shame I didn't collect the info in a more comprehensive place)
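A dashboard like the one described could start as a simple tally over an event log. A minimal sketch in Python; the category names come from the notes, but the `Counter`-based approach and the sample event data are my own illustrative assumptions:

```python
# Minimal sketch of the Lab's dashboard tallies described above.
# Category names are from the session notes; the event log and
# the finished/pipeline flag are illustrative assumptions.
from collections import Counter

CATEGORIES = ["business collaborations", "demos", "formal training",
              "leadership meetings", "other"]

# Hypothetical event log: (category, finished?) pairs
events = [
    ("demos", True),
    ("demos", False),
    ("formal training", True),
    ("leadership meetings", True),
]

by_category = Counter(cat for cat, _ in events)
finished = sum(1 for _, done in events if done)
in_pipeline = len(events) - finished

print("total events:", len(events))
for cat in CATEGORIES:
    print(f"{cat}: {by_category[cat]}")
print("finished:", finished, "| in pipeline:", in_pipeline)
```

Even this crude version answers the two questions the presenters track: volume by activity type, and finished work versus what is still in the pipeline.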

Lab - Fail Here!
- Great place for projects to come to die.
- Saving resource investments on unsupportable deployments
(Note to IT - we need a place like this with reward / not penalty for killing bad projects)

- Virtual worlds (they did a demo a few years back).  Not stable enough
- SocialText - collaborative environment; developer-intensive and did not accommodate business processes
- LATIST - requirements management / course development, but maintenance was cost-prohibitive

The research is labor-intensive
- May be dependent on someone else - see Virtual Worlds
- Go back to IT (and any other implementation and maintenance stakeholders, i.e. the folks who have to live with this stuff): "Do you have problems?"  Gotta wait for the answers before even TESTING

Have to keep asking - what do you REALLY want to do with this?
- Basic minimal things

People know what they need, know end game, but not how to get there.

Lab - did this help great ideas to thrive? (thanks Stephen Martin)
- Getting folks to think about what is already available
- Getting folks to think about real use cases
- Cost avoidance is considered a success
- Give voices that leadership listens to a chance to be heard.
- (from one of their faculty): the fact that stuff comes out TESTED vs untested is a huge plus
- (another faculty): Faculty gained a better understanding of IT solutions.  (The Lab essentially served as translator between Faculty and IT.)  Improved the relationship and reduced choices.  More consistent process for getting a product through

They are also a place for people doing courseware development to ask: which tool should I use?
- Allows hands-on evaluation.
- They don't yet measure the time / effort saved as a result

(Course designer) - they also assist with outreach and communication.
- Help them understand technology
- They allow folks to divide "nice to have" from "need to have"
- Saves him, personally, time.

Challenges and Discussion
- It is not REALLY about the data (but don't dare not have it)
- Sustainment - now, next, future (how quickly will you need to replace this?)
- Flexibility
- Leadership
