Tuesday, April 15, 2008

A View from the Trenches of Simulation Development

Presentation: A View from the Trenches of Simulation Development
Presenter: Alec Lamon, Senior Director, Wharton Learning Lab - University of Pennsylvania

Hoping to find another model for the type of instruction our team (and our soft-skills colleagues) hopes to develop over the next few years.

--------------------------------------------------------------

Their learning lab is the result of a donor gift. It is part of the computing group in the Wharton School.

- Partnering (with faculty authors)

- Exploration
+ They try different experiments.
+ Faculty just need to teach; they don't need to teach in a particular way.
+ They have faculty willing to experiment. Some successes and failures.

- Alignment with School strategy
+ Important to align the case with the goals of the organization.

- Technology
+ He comes from a technology background.
+ Make sure technology choices match what you're trying to accomplish. Don't let "cool" drive the decisions.

Exploration
- There is always constant pressure to believe technology will magically transform learning.
+ 1950s - TV. It didn't change the way people learn, just gave more access. And it was only one-way (a talking head).
+ 1990s - Distance learning. Tried a TV studio to beam faculty to learning centers: small, geographically dispersed learning groups with interactive videoteleconferencing. Still very lecture-based, with limited discussion. (Much like my MIDLN experience.)

- Before - unstructured experiments. Lots of individuals experimenting.
+ With the Learning Lab - help find the experiments that work and try to formalize them. See if they will bear fruit at a larger scale.
+ "Core" classes are the 1st-year MBA classes. When looking for impact, they look to the Core classes, which touch all MBA students.

Agreement: the faculty author gets resources from the Lab, but the school owns the intellectual property of the result.

Why some experiments don't work - they don't align with the strategy of the school.
- Your experiments should match the goals of the organization.

Wharton Business School / Learning Lab
- World-class faculty / Faculty acceptance and adoption
+ Don't want to embarrass the faculty.
+ Very important that what the Lab adds actually works. There's no point in students doing the pre-reading and then having the meat and potatoes of the course not work.

- World-class education / Increase student engagement
+ Intersperse cases with games, etc.

- Brand / External adoption
+ Having applications used at other schools under Wharton's name is a huge brand builder (and $$$$$$).

Match the strengths of the school - educational methods. They try to do this with their projects and measure against some of these design goals. (Not trying to match ALL design goals; they usually apply 2-3 goals per project. Wish the graphic was posted.)
- Open-ended outcomes - the winner may not necessarily post the greatest numbers; students are also graded on strategy.

- What's the point - don't give the point up front; allow the faculty member and students to analyze the resulting data together.

- More than meets the eye - don't give all the variables up front. Information hidden at the beginning may be revealed at a half-way point.

- Just do it - students actually do and participate in what the theory describes.

- Encourage dialog and discussion - can assign an item and have students discuss (student-generated) rules and how the team will accomplish the item. Can show group results. Can change the rules as results come in. All must work together.

Faculty Acceptance and Adoption
- Over 30 applications. MOST are in use. Of those that aren't, 50% are unused because the faculty member left and no one picked up the app; the other 50% didn't work (technical, pedagogical, or process failure).

Student Reaction
- 83% of students said computer/web-based tools increase attention and engagement.
- 78% were satisfied; 5% negative; the rest neutral.
- In the past 2-3 years, student clubs have approached them to use applications outside of the classroom.

(Most of the Wharton Lab's applications are PART of a class, not the entire class. You still need good instructional design and an overall course structure.)

Metrics they started with for satisfaction:
1) Getting into Core classes within the first 2 years.
2) No real metrics were set (i.e., no target like "we want 60% of students to be happy").
3) Faculty - wanted proposals to increase over time. In the beginning: 3. Now: 13. Success = more proposals each year.

External adoption = internal respect (esp. when others pay for it!!!)
- OTIS - a commercial application, bundled with a textbook. 9,000 students in the past 5 years.
- Stanford, ISB, and INSEAD are also using applications, leveraging faculty/author relationships.
- They haven't commercialized more because they realized it's a much larger undertaking than they thought. The external focus split the team too much during the OTIS experience.
- Don't forget your vendors (Adobe).
- You can also leverage vendors' expertise in the tools.

Partnering with Faculty
- If you can't get SMEs to join you, you have nothing.
- Where to start?
+ Core classes
+ Find the more developed experiments. Faculty are already doing things, just not in good form (paper-based games, Excel).
+ Find prototypes - low-hanging fruit.
- Extend depth, not reach. (Is this a problem most eLearning has had in terms of faculty adoption - focusing on reach rather than depth, i.e., rather than improving what faculty already do?)

Paper-based games and prototypes are your friends.

(Is your solution going to make the faculty member's life easier? This may be the primary question for adoption.)

Successful gating process
- A method to submit proposals, which are then refined. Academic directors and a faculty approval committee make the decision as to whether to pursue the project.

Worst thing you can do - be in the "wrong quadrant".
- Faculty expect the world (high expectations) but won't do anything about it (low engagement).
- Faculty must be highly engaged, while expecting a lot.
- Don't go begging for projects; faculty will feel like they are doing YOU a favor.
- Trust your technology instincts. You may not know the pedagogy, but you do know the technology. Do NOT let faculty make technology choices for you.
- Scope creep - you have to stick to the specs and make sure they are reasonable, especially the timeline.

Early Pitfalls
- From the 2nd round of proposals, when they begged for projects.
- Faculty said they wanted all 800 students to play this, in real time. There was no time in class, so it had to be done from home (on the web).
- The technologists knew real-time data over the web was not very doable at the time (2002). They said OK anyway.
- E-mails from the faculty member kept asking for more features, too. They kept saying OK.
- Conclusion - it didn't work pedagogically, it didn't work technically (crashed servers), and students were very upset.
- Never used again.

How much flexibility does a faculty member have after the "approval process" to make changes before beta?
- Try to have a signed letter with specifications early in the project.
- They learned to say "No" to faculty.
- You must demonstrate how what they're asking for will affect the timeline and/or functionality.
- Academic directors can also serve as mediators. Send the problem children to them.

As a result of learning to say "No," they gained a reputation for knowing what they're doing.
- Expanded consulting services (6 years later)
- Enhancements to 3rd-party apps
- Advice on simulation choices and management
- Earlier intervention to impact decision-making

Technology
- 3 main factors
+ Hardware - leverage the infrastructure you have (they are not a hardware shop).
+ Software - use standard tools unless you need something very particular. Tools should match what the organization already has: ColdFusion, Flash. Flash has flexibility, but most of their apps are data-driven and web-based, and they already had the infrastructure.
+ Human (DON'T IGNORE THIS!!!)

Team
- Strong technical skills: database design (from the ground up), PHP, ColdFusion.
- Strong dedication to customer service. No separate technical support line.
- Quick studies with diverse interests. You don't know where the next project is coming from and must quickly understand what the faculty member is trying to get across.

Software choices
- Phase 1 - ColdFusion + Flash. Gave more flexible web-app development and allowed them to build interactivity (a sketch of the pattern follows).
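
A minimal sketch of what the pairing might look like (written here in ActionScript 3 for consistency with the later phases): a Flash front end pulls game state from a hypothetical ColdFusion page that returns XML. The endpoint, field names, and class are my inventions for illustration, not Lab code.

    package {
        import flash.display.Sprite;
        import flash.events.Event;
        import flash.net.URLLoader;
        import flash.net.URLRequest;

        // Hypothetical example: ColdFusion owns the data and database access,
        // Flash owns the interactive front end.
        public class GameStateLoader extends Sprite {
            public function GameStateLoader() {
                var loader:URLLoader = new URLLoader();
                loader.addEventListener(Event.COMPLETE, onLoaded);
                // Invented endpoint: a CFM page emitting <game><round>3</round></game>
                loader.load(new URLRequest("/sim/gameState.cfm?gameId=42"));
            }

            private function onLoaded(event:Event):void {
                var state:XML = new XML(URLLoader(event.target).data);
                trace("Current round: " + state.round);
            }
        }
    }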

- Phase 2 - Flash + Remoting (sketched below). Finer control of interactivity. Caveats - interactivity creep, and it's too time-consuming. A build usually takes 5-6 months; Flash took longer. "Magpie development" - adding cool, shiny things that don't add to the application and may even hurt it (e.g., pegging the processor).
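
Remoting replaces XML-over-HTTP with direct method calls to the server over AMF. A sketch of the pattern, assuming ColdFusion's conventional gateway path; the service name, method, and arguments are invented:

    package {
        import flash.display.Sprite;
        import flash.net.NetConnection;
        import flash.net.Responder;

        // Hypothetical Flash Remoting call: the client invokes a server-side
        // method directly instead of fetching and parsing XML.
        public class DecisionSubmitter extends Sprite {
            public function DecisionSubmitter() {
                var nc:NetConnection = new NetConnection();
                // ColdFusion traditionally exposes an AMF gateway at this path
                nc.connect("http://sim.example.edu/flashservices/gateway");
                // Invented service and method: submit a team's pricing decision
                nc.call("GameService.submitDecision",
                        new Responder(onResult, onFault),
                        {teamId: 7, price: 19.99});
            }

            private function onResult(result:Object):void {
                trace("New team score: " + result.score);
            }

            private function onFault(fault:Object):void {
                trace("Call failed: " + fault.description);
            }
        }
    }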

Technology must be in service of the learning!!!

- Phase 3 - Flex + Data Services. A much faster way of developing rich apps (sketched below).
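
In Flex, the same round trip shrinks to a RemoteObject call plus event listeners. Another invented sketch, assumed to live in a Flex application's script block: "ColdFusion" is the stock destination name in a ColdFusion/Flex setup, but the component path and method are mine.

    import mx.rpc.events.FaultEvent;
    import mx.rpc.events.ResultEvent;
    import mx.rpc.remoting.RemoteObject;

    // Hypothetical Flex version of the Phase 2 call.
    var service:RemoteObject = new RemoteObject("ColdFusion");
    service.source = "edu.example.sim.GameService";   // invented CFC path
    service.addEventListener(ResultEvent.RESULT, function(e:ResultEvent):void {
        trace("New team score: " + e.result.score);
    });
    service.addEventListener(FaultEvent.FAULT, function(e:FaultEvent):void {
        trace("Call failed: " + e.fault.faultString);
    });
    // Method calls are proxied through to the server-side component
    service.submitDecision({teamId: 7, price: 19.99});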

The apps have little surface commonality. Underneath - similar techniques.
As much code reuse as possible (a sketch of what that might look like follows).
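
As an invented illustration (not their actual code) of what that shared underpinning could look like: one small class owns the remoting plumbing, and each simulation supplies only its gateway URL, service name, and handlers.

    package {
        import flash.net.NetConnection;
        import flash.net.Responder;

        // Hypothetical shared base: every simulation reuses the same call
        // pattern, differing only in service name, methods, and arguments.
        public class SimulationService {
            private var nc:NetConnection = new NetConnection();
            private var service:String;

            public function SimulationService(gatewayUrl:String, serviceName:String) {
                nc.connect(gatewayUrl);
                service = serviceName;
            }

            public function invoke(method:String, onResult:Function,
                                   onFault:Function, ...args):void {
                var callArgs:Array =
                    [service + "." + method, new Responder(onResult, onFault)].concat(args);
                nc.call.apply(nc, callArgs);
            }
        }
    }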

Everything they do is custom development. That choice was made early on; he's not sure it was the right one.
- Because the apps sit on a diverse set of databases, they can't really build them externally. Each application needs lots of faculty support, especially in the early going.

Revision process - communicating with faculty
- Built their own bug-tracking software (not sure why).
- Faculty - email or meetings. Staff are encouraged to get up and go over in person. Use the teaching schedule and meet after class.

The applications are designed to be run by the faculty member - you can't just use them out of the box. (Another problem for scalability.) Lots of faculty training before letting them loose.

7 developers + director
- Director still does some development
- Everyone else does everything (meeting with faculty, databases, ColdFusion coding, testing, support). Plus, most people like doing the design piece.
- With silos, it's harder to keep people happy.

Measurements - nothing formal, but there should be. They do just the one survey.

Conference handouts

1 comment:

DrBob said...

keep it coming -- all good stuff