Thursday, December 03, 2015

Relating the Learning Ecosystem to other Ecosystems


Again, this is NOT the direction we have decided to head. No decision has been made as of this writing, and the picture does not reflect what is going on with my employer. I am putting this here as an example.
In the proposed architecture picture above, you may have noticed some colored rectangles.

I am following some ecosystems and roadmaps that impact our ecosystem.

  • Identity - Corporate / staff training relies heavily on the ability to divide people into groups: by org chart (who reports to whom, and which department), by job type (ex. all accountants across an organization), and by job task (ex. the person in this job needs to view and work with personal health information). Efforts in this area determine how easily we can assign and push materials to particular groups.
  • Document and File Management - Where we stash our stuff. I don't want to pay for and maintain an entirely separate solution. I want to work within where everyone else is stashing his / her stuff.
  • Communications - These tools are how we deliver live online training.  I pulled out WebEx since this is the primary tool we use.
  • Business Intelligence - This is not listed as a roadmap in this diagram, but is more of a catch-all for reporting. The current thinking is that we are going to try to maintain as much reporting as possible within the host system (in this case, the LMS). This thinking maps to the Business Intelligence unit's roadmap.
  • HR Roadmap - This is the roadmap of our primary staff-side stakeholder.
  • Academic Roadmap - This is the roadmap of our primary student-side stakeholder AND the group that, in this model, would serve as the Learning subject-matter expert.
The Identity, Document and File Management, Communications, and Business Intelligence roadmaps all impact the tools we have at our disposal.  One of the guiding principles of our ecosystem is that we use what everyone else uses as much as possible.  There is no point in duplicating functions in a special "learning" solution when everyone is working elsewhere.

The HR and Academic Roadmaps help me stay in alignment with the goals and direction of our primary stakeholders.  Through those roadmaps, I am able to see what is important to them, where this ecosystem needs to connect to their ecosystem, and what direction they plan to head. Ultimately, any Learning ecosystem needs to work for these two stakeholders.  They are the ones who will use the solutions coming out of this the most.
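The group-assignment idea in the Identity bullet above can be sketched in code. This is a hypothetical illustration, not anything from our actual identity system; the staff records, attribute names, and course IDs are all made up.

```python
# Hypothetical staff records with the attributes the Identity bullet describes:
# department (org chart), job type, and job task (here, handling PHI).
staff = [
    {"name": "Ana",   "department": "Finance", "job_type": "accountant", "handles_phi": False},
    {"name": "Ben",   "department": "Nursing", "job_type": "nurse",      "handles_phi": True},
    {"name": "Carla", "department": "Finance", "job_type": "accountant", "handles_phi": True},
]

def assign_training(people, predicate, course):
    """Return (person, course) pairs for everyone matching the predicate."""
    return [(p["name"], course) for p in people if predicate(p)]

# By job type: every accountant across the organization.
ledger_course = assign_training(staff, lambda p: p["job_type"] == "accountant", "GL-101")

# By job task: everyone who works with personal health information.
phi_course = assign_training(staff, lambda p: p["handles_phi"], "PHI-Privacy")
```

The better the identity data, the simpler the predicate; the pain starts when the attributes you need to group on do not exist anywhere.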

Tuesday, December 01, 2015

Architecture Vision - Developing Learning Ecosystem v2

This is just one option.  This does not reflect the direction we may actually head...

I have found that six months to a year out from the end of the LMS contract (or whatever the "core" of your ecosystem is) is a good time to re-evaluate the ecosystem and take a close look at the environmental factors surrounding it.

In our case, there are some major environmental changes afoot. 

1) The chance of us being able to have one learning ecosystem (vs separate ecosystems for staff and students) is greater than it has ever been.  It never made much sense to me that we had 2 separate and practically parallel systems doing pretty much the same thing.
  • The biggest risk we are taking by having just one ecosystem is that we are giving up control to another group. In this case, academics. Which is where it rightly belongs.
2) Our leadership has finally realized that how we do business is unsustainable.  We might be able to afford multiple systems doing the same thing now, but we won't be able to afford that much longer.  The willingness to make some hard and uncomfortable decisions is much higher than I've ever seen it.
  • In execution - there are going to be a lot of negotiations around people's processes and what is REALLY a mandatory requirement vs a nice to have.  I see those conversations (the ones that are forcing people to change) getting pretty ugly.  I hope everyone is ready.
3) The technologies that are available for the ecosystem are very different from the ones that were available when we first designed it.   The technologies we now have available are a lot more flexible and interchangeable than before.
  • Our lead architecture teams are very focused on flexibility and interchangeability.
  • These teams are also focused on "cloud" - which is going to force everyone to be a heck of a lot more responsible for learning their toolkits without leaning on excuses.  
  • Where our environment is headed is going to force a lot more accountability for one's own learning and mastery of his/her job. The people who whine that "they weren't trained on the tool" are the same people who are perfectly capable of figuring out how to use a new cell phone. "I wasn't trained on that" is not going to fly as an excuse much longer.
4) Our IT group is now thinking in terms of roadmaps for those technologies. As a result, I have a better idea of how the ecosystems that touch the Learning Ecosystem plan to evolve.  I can plan the next iteration of my ecosystem to account for and support that evolution.
  • The Learning Ecosystem is well-placed to relate multiple roadmaps.  Because, ultimately, people use a collection of technologies to work. Not just one type.
In my mind, these are positive environmental changes. 
  • The recognition that systems have not just a purchase cost, but a maintenance and operations cost.
  • The recognition that we do not have endless pools of money (or people, or people's time and energy)
  • The recognition that we need to plan and think longer-term.
  • The recognition that we need to start saying "no".
In a recent meeting, one of my favorite mid-level managers said the following:
"Probably the best thing we can do is take the checkbook out of everyone's hands. THAT'S going to force an awful lot of conversations, compromise and creative thinking. We've been too comfortable for too long."

I'm gearing up to get mighty uncomfortable.
And to give up a lot of what we've been doing for the past 7 years.

It's time.

Tuesday, November 17, 2015

Business Architecture - Validating Workflows

I feel like I don't use enough random clipart.....
This was found under "sharing" in MS Word 2010.
You're welcome.
Once I have a few process workflows completed and the members of my team are happy, the next step is to show them to others.

I don't need all of them.  I just want to show a few examples of the "least challenging" and get a better feel for how much these stakeholders want to participate.  And I want to do this really quickly so I get input before I get too far in the process.

The workflows are meant to be conversation-starters.

For the process itself -  "This is what we are doing. What's your process? What's working? What's not? What would you like to be able to do?"

For architecture in general - "Is this something you would like to participate in?  Is there something you would love to see coming out of this effort? Does this flowchart format make sense to you without me explaining it?  Are you interested in seeing some of the other work we are doing to make sense of our environment?"

If all goes well, these little bits of outreach and relationship-building are the start of a beautiful relationship.

When having discussions with the other trainers, I have to be really careful about couching this as just our process. I don't expect them to adopt it, want to use it, or even find that it works for them.  It's just a conversation.

This is where coming from the IT department, honestly, makes this conversation a bit more hazardous.  The last thing I want is to have the other training teams feel like they are being forced to do something because "that's the way IT does it."  That never goes well.

Thursday, November 12, 2015

Business Architecture Sample - Student Registration Into Archi

Last spring, after the Integration SWAT Team guy showed me Archi and walked me through how to develop a process using ArchiMate, he sent me off with a copy of the enterprise architecture as it currently exists (not including specific server IPs or other sensitive information that I don't need anyway), a link to ArchiMate, a couple of work samples, and an invitation to ask him any further questions.

THIS is why you want to make friends with your IT folks.

Huge swaths of things I had planned to do, he had already done.
All I needed to do was fill in the blanks.
I took my Visio version of the Classroom Trainee Registration workflow and began to convert it to Archi.

And here....I started to run into issues.....

A side benefit of learning another language - whether it be cultural, programming or modeling - is the chance to look at something in a completely different way.

In my attempts to fit this workflow into this tool, I'm having to think about other ways to communicate this workflow.

Unfortunately, I also ran into some big problems trying to cram a particular workflow into a particular modeling language.  Hence, this post has been sitting in my draft queue since last spring.

After much thought and head-banging, I came to the following conclusions:

1) Just because I am learning a particular modeling language and want to communicate in that language doesn't mean anyone else understands what I am trying to communicate.  It's like saying "I'm learning Spanish, therefore I am going to speak to everyone in Spanish whether they understand me or not."

2) Modeling languages have types of workflows that work best in that language. 
- ArchiMate is best suited to technical architectures
- BPMN (Business Process Model and Notation) is best for business process workflows

I have personally decided to prioritize speaking / modeling in the stakeholder's language. 
I want someone to be able to look at the picture and follow along easily.
I want someone to quickly see what is a human step and what is a computer step.
If that means using stick figures and little clip art people, then so be it. 
Besides - stick figures communicate a lot.
And, frankly, it's easier for ME to understand too.

Tuesday, November 10, 2015

Sample Solutions Survey or....How Training fits in

I've had a number of people ask me recently about what questions we are asking in our Pilot surveys.

The largest survey we did was for our Unified Communications rollout.

I have started to use some of these questions during other implementations to get a feel for how the training programs impact the adoption of a new tool.  These surveys are being sent out one month after the implementations. 

We have found that we get a better idea as to whether the solution and the training that supports the solution worked. People have had time to work with the new solution and processes. Patterns as to what is being used and what isn't are beginning to surface. Plus, I find I get more honest answers from people one month out. 

The general questions we ask:
  • How helpful did you find the training materials?
    • We then list each individual training object (each type of classroom training, each tutorial, each quick reference)
    • Helpful / Not Helpful / Did Not View or Attend
  • What other resources did you find helpful as you learned how to x?
    • This is a free-text field
  • I find x intuitive and easy to use - Yes / Somewhat / Not at all
    • x = the IT application
  • As a result of the training, I felt I could use x - Yes / Somewhat / Not at all
    • I have generally found that for Somewhat and Not at all - people will add comments without prompting
  • As a result of the training, I could use x to perform y tasks - Yes / Somewhat / Not at all
    • This question can be separated into the different types of tasks if the solution was modular
    • Again, people will generally add comments without prompting
  • As a result of the training, I felt I understood (any resulting new process / concept) - Yes / Somewhat / Not at all
    • We will add this question when there is an affiliated major process change. 
    • The question also helps us to see whether we did a good job putting the material in context
  • The training applied to how I need to use x to do my job - Yes / Somewhat / Not at all
    • Another question asking whether we got the context right
  • What features do you find you use most often? (Check all that apply.)
    • For new IT applications, this gives us a feel for what people are actually using
    • This also gives us information for where we need to do more training, or ask more questions as to why they are not using a particular feature or area we are expecting people to use
      • Depending on the sensitivity of the audience, this question is best asked with clear identification rather than anonymously.  This provides a better feel for whether we got the audience needs assessment right when we are working with multiple audiences.
  • Rate your understanding of the following subjects as a result of this implementation
    • Here we check to see how well we covered the individual areas 
    • We used a Likert scale ranging from "I understand" to "No understanding," plus "Does Not Apply to Me"
    • We are able to see what may require more training and whether we captured the appropriate audience.
  • What topics do you wish were better covered? - Free Text
  • How can we improve the training and support for implementations like this one? - Free Text
  • What improvements do you think we can make to (the resulting new process) - Free Text
    • We had one question for each major process that was affected by the implementation
  • Any other comments, or would you like someone to follow up with you? - Free Text
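As a rough illustration of what we do with the first question's responses, here is a minimal sketch of tallying the Helpful / Not Helpful / Did Not View or Attend counts for one training object. The response data is invented, and the "helpful rate" metric is my own framing, not a formal part of our survey.

```python
from collections import Counter

# Hypothetical raw responses for one training object, using the
# Helpful / Not Helpful / Did Not View or Attend scale from the survey.
responses = [
    "Helpful", "Helpful", "Not Helpful", "Did Not View or Attend",
    "Helpful", "Not Helpful", "Helpful",
]

counts = Counter(responses)

# Only count people who actually viewed or attended the training object.
answered = sum(v for k, v in counts.items() if k != "Did Not View or Attend")

# Share of actual users who found the object helpful.
helpful_rate = counts["Helpful"] / answered
```

Separating out "Did Not View or Attend" matters: a low raw Helpful count may just mean nobody opened the tutorial, which is a different problem than a tutorial people opened and disliked.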

These surveys have really helped us get a better feel for how the training solutions we design are helping (or hurting) implementations that (hopefully positively) impact the business.

If you see other questions you think we should be asking - please add them to the comments.

Thursday, November 05, 2015

Translating Capabilities to Strategy - Survey example

As you may have discovered, we have multiple tools that do similar things.

We have multiple tools that provide survey capabilities.

So now we can sit and figure out what tool is best for a particular scenario - and come up with a clear set of guidelines for the team.

This will allow us to better evaluate what is actually being used, whether we should be using one of the other tools instead, and what is not needed.


In this example, I have taken the capabilities and developed a strategy based on careful evaluation of those capabilities. 

The most salient requirements and capabilities I used to create this are:
- Reporting and my audience for reporting (the higher the level, the prettier the pictures need to be)
- Whether I have to (or can) connect a survey to a particular learning object
- Whether I can create anonymous surveys
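Those three criteria lend themselves to a simple decision rule. This is a hypothetical sketch of how such a strategy could be encoded, not our actual guidelines; the tool labels and the order of the checks are placeholders.

```python
# Hypothetical survey-tool selection rule built from the three criteria above.
# Tool names are illustrative, not an actual product recommendation.

def pick_survey_tool(needs_lms_object: bool, anonymous: bool, executive_report: bool) -> str:
    if needs_lms_object:
        # LMS survey tools force attachment to a particular learning object.
        return "LMS survey tool"
    if anonymous:
        # The LMS ties responses to enrolled users, so go standalone.
        return "Standalone survey tool (anonymous mode)"
    if executive_report:
        # Higher-level audience: prettier pictures, so route through BI.
        return "BI-integrated survey tool"
    return "Standalone survey tool"
```

Writing the rule down, even this crudely, is what turns a pile of capabilities into a set of guidelines the team can actually follow.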

The most difficult part of this exercise is defining the scenarios. I just listed the ones I either encounter frequently or have seen recently. 

Like other educators, we commonly use survey tools for smile sheets, testing and pre-testing.

However, we've also been using surveys to help us see measures of whether particular solutions have been creating change and the role of training in that change (the Solution surveys).  Those surveys are not connected to a particular course - so using the LMS survey tools (which forced me to connect a survey to a particular item) was out of the question. 

I also have a Skills survey scenario. One of my projects last year was to do a skills inventory for a segment of our division.  This helped us see what human capability we had in-house and how it potentially measured up to certain planned activities. From that exercise, I learned why vendors have expensive skills evaluation solutions. 

Again, the way I approached this (requirements to capability to strategy) generally occurs in Wendy's Utopian Fantasyland.
The reality normally looks like this.

Tuesday, November 03, 2015

Translating Requirements to Capabilities

We have talked about Capability Matrices before.

Application Architecture: Capability Matrices
Where do I stash my files?
Capability Matrices: Functionality Replacement

Doing the translation between requirements and capability is actually fairly simple:
1) Copy the actor/verb/object columns to a new sheet within your spreadsheet
2) List the tools you have available across the top
3) Ask - does it have the feature? Yes / No.  This also allows me to comment if the answer isn't so black and white.
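The three steps above can be sketched as a data structure. Everything here is a hypothetical example of the matrix's shape, not our real inventory; the requirements, tools, and comments are invented.

```python
# Step 1: actor/verb/object rows copied from the requirements sheet.
requirements = [
    ("trainer", "create", "anonymous survey"),
    ("trainer", "attach", "survey to course"),
]

# Step 2: the tools available, across the top.
tools = ["Google Forms", "SkillPort LMS"]

# Step 3: each cell is (has_feature, comment) so answers that aren't
# black and white can carry a note.
matrix = {
    ("trainer", "create", "anonymous survey"): {
        "Google Forms": (True, ""),
        "SkillPort LMS": (False, "surveys must be tied to an item"),
    },
    ("trainer", "attach", "survey to course"): {
        "Google Forms": (False, "no LMS link"),
        "SkillPort LMS": (True, ""),
    },
}

def tools_with(requirement):
    """List the tools that satisfy a given actor/verb/object requirement."""
    return [t for t in tools if matrix[requirement][t][0]]
```

Once every cell is filled in, queries like "which tools can do x?" fall out for free, which is exactly what the use strategy needs.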

An example of an incomplete capability matrix is below.

This can be used either for applications that you have existing in the environment (like the one shown here) or ones that you are evaluating.

In this instance, we are looking at the capabilities that are in our environment.

Notice that this is incomplete.  I still need to sit down for a quality day and go through each line for each tool.  It'll happen....eventually :)
You will notice in this particular requirements / capability matrix, I have also added any survey / assessment / evaluation capabilities housed in LMSs.

Mostly because, when training groups use surveys, they are often in the context of a particular course (smile sheets, tests, etc.).   Once I get a full evaluation of what each one of these systems has, I can then start putting together a use strategy.

i.e. - I use Google Forms when I am trying to do x for y audience.  I use SkillPort when I am trying to create a test attached to a particular course for staff, etc.  I'll flesh that out later.