Thursday, June 03, 2010

#IEL2010 Knowledge Management Interface

Presenters: Michael Lambert, DAU
Raimund Feldman, Fraunhofer Center

Challenges
- Increased push for 24/7/365 learning, performance support and reach-back for better program outcomes and success
- Infrastructure and human capital resources must be added to support a DoD acquisition workforce expanding from 126K to 186K
- Culture change from COURSE management to LEARNING ASSET management
+ Both formal learning assets and knowledge-sharing assets

Good News / Bad News
- DAU has world class assets
- But explosive expansion of formal and informal assets
- Inadequate faculty and staff hours to manage the assets
- Good news: LAMP created to focus on process and resource solutions

Trying to extend the asset management program's policies to more informal assets
- Better technology

“Best practices” are always recommended, but:
- Too many lists
- No basis for selecting specific practices
- Proof of effectiveness not often available
- How do you apply the practice?
- A practice’s success factors are not well understood or documented

BPCh Content “Flow”
- Clearinghouse for software acquisition best practices
- They help shape and vet
- Leads from various sources (seem to be encouraged)
- Put through vetting process
- Leads that fail the quality vetting are rejected
- The rest become publicly available content (a toy sketch of this flow follows)
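A minimal sketch of that flow in Python, assuming an invented Lead record and a stand-in vet() quality gate; the presentation did not spell out the actual vetting criteria, so treat the test as a placeholder, not the real BPCh implementation:

from dataclasses import dataclass

@dataclass
class Lead:
    title: str
    source: str       # e.g., a program office, a conference, an SME
    documented: bool  # enough detail to act on?

def vet(lead: Lead) -> bool:
    # Placeholder quality gate; the real criteria are richer than this.
    return lead.documented

def process(leads):
    # Split incoming leads: failures are rejected, the rest are published.
    published = [l for l in leads if vet(l)]
    rejected = [l for l in leads if not vet(l)]
    return published, rejected

leads = [Lead("Incremental acceptance testing", "PMO", True),
         Lead("Use more meetings", "hallway chat", False)]
public_content, rejects = process(leads)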

BPCh Content type
- Practice: Needs to be repeatable, actionable, documented.
+ May include commercial offering
+ Must include how to do it
+ Must be able to generate empirical data
+ Practice area: a general area (e.g., risk management)
+ Lesson learned: an after-the-fact report. Good advice without enough detail to be clearly repeatable

- Evidence: Context to the practice. Can be browsed
- Lead – potential information for inclusion in BPCh. Could be feedback, a story, a lesson learned, an experience, etc.
- Story – narratives or “war stories”; lessons learned

BPCh takes a different approach to leads
- Stores EVIDENCE information about how each practice was used in different situations and the results
- Acknowledges that not all good practices are best practices
- Content includes descriptions of past results in context

Context-sensitive search environment.
- Show me JUST the practices that programs like mine have used.
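A sketch of what “evidence in context” plus “programs like mine” could look like as data; every field name here is my invention for illustration, not the BPCh schema:

practice = {
    "name": "Incremental acceptance testing",
    "evidence": [
        {"program_type": "large software acquisition",
         "context": "spiral development",
         "result": "defect escape rate dropped"},
        {"program_type": "small IT program",
         "context": "fixed-price contract",
         "result": "no measurable effect"},
    ],
}

def evidence_for(practice, program_type):
    # Return only the results recorded for programs like mine.
    return [e for e in practice["evidence"]
            if e["program_type"] == program_type]

print(evidence_for(practice, "small IT program"))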

Situation: Find supporting material for specific task / situation
- Task: Write an essay
- Supporting material – PDFs, web sites, etc.

Traditional search and retrieval
- Library style (faceted classification). Users browse (like the SkillPort catalog)
+ Users need to know the facets and the classification

- Full text search (“Google search”)
+ Users must use keywords that appear in the materials. Are you confident your users are using the RIGHT keywords?
+ What about pictures, videos, etc.?
+ Will it show you what is IN the document?

- Tagging (additional keywords / information)
+ Allows adding keywords to help find materials. Often manual (pray the keywording is consistent)
+ Users must still know the “right” keyword
+ Some implementations (if not flexible) have a checkbox in the back end, then a drop-down for the user. That at least ensures keywords are relevant and known, but it is only a first step; the keyword can still be meaningless to the end user. (A toy comparison of all three approaches follows.)
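Here is that comparison as a Python sketch, using a two-item invented catalog; nothing here reflects DAU's or SkillPort's actual systems:

assets = [
    {"id": 1, "text": "Guide to earned value management",
     "facets": {"career_field": "program management", "type": "guide"},
     "tags": ["EVM", "cost"]},
    {"id": 2, "text": "Risk management checklist for software acquisition",
     "facets": {"career_field": "software acquisition", "type": "checklist"},
     "tags": ["risk"]},
]

# 1) Library-style facet browsing: the user must know the facet scheme.
def browse(facet, value):
    return [a for a in assets if a["facets"].get(facet) == value]

# 2) Full-text ("Google") search: the user must guess words in the text.
def full_text(query):
    return [a for a in assets if query.lower() in a["text"].lower()]

# 3) Tag lookup: the user must know the "right" keyword the tagger chose.
def by_tag(tag):
    return [a for a in assets if tag in a["tags"]]

print(len(browse("type", "checklist")))  # works only if you know the facets
print(len(full_text("risk")))            # misses text that says "threat" instead
print(len(by_tag("EVM")))                # fails if you type "earned value"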

Improved Keyword Tagging
- The one thing you know for sure is your current situation
+ The task / situation is KEY
+ Use the user’s context information as a search index

- Situation-specific Browsing Views
+ Guide user to material by task/situation specific information
+ Facet information combined with tagging
+ Works with any structure that describes a user-specific situation (e.g., process descriptions, handbooks, TOCs, career fields) – see the sketch below
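A sketch of using the user's context as the search index, per the bullets above; the situation fields and asset titles are invented for illustration:

user_context = {"career_field": "software acquisition",
                "task": "write acquisition strategy"}

assets = [
    {"title": "Acquisition strategy template",
     "situations": [{"career_field": "software acquisition",
                     "task": "write acquisition strategy"}]},
    {"title": "EVM primer",
     "situations": [{"career_field": "program management",
                     "task": "track cost"}]},
]

def for_situation(context, assets):
    # Show JUST the material tagged for situations like the user's.
    return [a for a in assets
            if any(all(s.get(k) == v for k, v in context.items())
                   for s in a["situations"])]

print([a["title"] for a in for_situation(user_context, assets)])
# -> ['Acquisition strategy template']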

(Without seeing the system at this point… it sounds like context-sensitive help in a computer program, like Office 2007.)

A list of browsing context views on the left.
- There is a search box.
- Browse content views if you know what you're looking for (multi-level – much like a folder structure)
- This brings up a list of practices.

Organized by “What is the position you are in…” (job)
- General processes across the organization

Could also be done as a “process chart”

Focus on the user's information and point of view. Guide the person through the knowledge.
- What is the specific situation of the user? This will help organize the flow.
- Tagging the information is secondary
- Offer structure / lists that the user knows.

Word of caution
- Browsing views should not be used alone – use them as an add-on
- Use this in ADDITION to common approaches such as full-text search and tagging
- As with all (manual) tagging approaches, this is cost-intensive

----------------------

Concerned with reaching the government workforce
- Already some standardization and standard steps
- Took most common process descriptions and used as prototype

Start with standard process handbooks

Talk to people. Get typical scenarios

Already have DoD / DA taxonomy
- Certification career fields
- Broke out software acquisition management
- CMMI acquisition module
- System Engineering Plan

Other browsing perspectives – incorporation for 2.0
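A sketch of how those existing structures (career fields, the CMMI acquisition module, process handbooks) could each become a browsing view over one shared asset pool; the node names and asset IDs are invented:

browsing_views = {
    "career_field": {
        "software acquisition management": ["risk_checklist", "acq_strategy_tmpl"],
        "program management": ["evm_primer"],
    },
    "cmmi_acq": {
        "Agreement Management": ["acq_strategy_tmpl"],
        "Risk Management": ["risk_checklist"],
    },
}

def browse_view(view, node):
    # One asset pool, many entry points: pick the structure the user knows.
    return browsing_views[view].get(node, [])

print(browse_view("cmmi_acq", "Risk Management"))  # -> ['risk_checklist']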

Also – user-assisted tagging.
- How is the user going to tag an asset?
- Getting help from the community of practice

Subject matter experts are the ones doing the tagging
- May come back to help flesh it out.

General idea: user-centric. How is your user thinking about the information?

Tools are just enablers.
- The 5 minutes you spend tagging saves time for everyone else!

Use the users' terminology – not SME terminology.
- Try to make it easy for both users and SMEs to tag.
- A museum also used popular search terms, adding them as tags as people found things.

They have clear systems, clear classifications, and access to SMEs.

Tough to build this sort of structure when the organization does not have stable processes.
- Can at least use Job Description

Use the masses for tagging – scan user statistics. “Most popular / auto-detect”
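A sketch of that “most popular / auto-detect” idea: mine the search log for terms that repeatedly led users to an asset and suggest them as tags, much like the museum example above. The log format and threshold are invented:

from collections import Counter

search_log = [  # (query the user typed, asset the user then opened)
    ("EVM", "evm_primer"), ("earned value", "evm_primer"),
    ("earned value", "evm_primer"), ("cost tracking", "evm_primer"),
]

def suggest_tags(log, asset_id, min_hits=2):
    # Return queries that repeatedly led users to this asset.
    hits = Counter(q for q, a in log if a == asset_id)
    return [q for q, n in hits.items() if n >= min_hits]

print(suggest_tags(search_log, "evm_primer"))  # -> ['earned value']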

(Audience: some work on personal profiles / portal)

Vetting process – currently building out network of SMEs
- SMEs can log into their subject matter clearinghouse
- Defined processes for governance:
+ what the SME needs to do and what the content manager needs to do
+ put into SharePoint (the base of BPCh) – assign tasks
+ takes leads and lets you know what you have to do
- After one year, SMEs will automatically be asked to review existing assets

(this seems very similar to what we are trying to build with our web site back-end)

Medals / Levels
- Initially a lead – being vetted
- Silver – more resources attached
- Gold – lots of evidence and resources, plus a written summary taking account of all the data: a “proven practice.” Will be piloting this process this year. (A toy version of these promotion rules follows.)
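Here is that toy version; the promotion thresholds are my guesses at how such rules might look, not DAU's actual criteria:

def medal_level(vetted, resources, evidence_count, has_summary):
    # lead -> silver -> gold, per the levels described above.
    if not vetted:
        return "lead"    # still being vetted
    if evidence_count >= 3 and has_summary:
        return "gold"    # lots of evidence + written summary: "proven practice"
    if resources >= 1:
        return "silver"  # vetted, with more resources attached
    return "lead"

print(medal_level(True, 2, 4, True))   # -> 'gold'
print(medal_level(True, 1, 0, False))  # -> 'silver'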

This is a proven practice vetting system.

The Army has established personal vetting of fighters
- Capture their experiences – oral history / lessons-learned clearinghouse

Tracking the quality of a process
- In 2.0 – discussing follow-up surveys, Likert scales, etc.
- Get more information about the end-user value of the asset
- Pull into a data mart – “are we actually providing value?”

Two sides of quality: quality of the document (accurate, easy to use) and value. (A toy aggregation along both dimensions follows.)
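A sketch of what those follow-up surveys could feed into: average Likert responses per asset along the two dimensions just named, so the data mart can answer “are we actually providing value?”. The schema is invented:

from statistics import mean

responses = [  # (asset, document_quality 1-5, value 1-5)
    ("evm_primer", 4, 5), ("evm_primer", 5, 4), ("risk_checklist", 2, 3),
]

def asset_scores(responses, asset):
    rows = [(q, v) for a, q, v in responses if a == asset]
    return {"doc_quality": mean(q for q, _ in rows),
            "value": mean(v for _, v in rows)}

print(asset_scores(responses, "evm_primer"))
# -> {'doc_quality': 4.5, 'value': 4.5}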

1 comment:

Anonymous said...

I look forward to your english-language translation of this one. I'm a DAU "graduate" so I'm curious what you think of them..