Saturday, February 25, 2017

EDDE 806 - Post XI - Get your Waldorf on...

Statler & Waldorf, muppet critics
This past week's presenter was Angie Parkes of Cohort 3, who is a fellow instructional designer!  Angie presented her (potential?) dissertation proposal, which has to do with testing the hypothesis that the DACUM process can be done effectively online.  More specifically, her three hypotheses are that (1) an online asynchronous DACUM can produce a comprehensive and rigorous competency analysis; (2) the online asynchronous DACUM can be completed in less than 6 months (two financial quarters); and (3) the online asynchronous DACUM can be completed for less than $1000.

Angie is coming at this problem through a corporate instructional design lens: a lot of money is spent on training in corporate environments, yet 50%-90% of that training is deemed ineffective.  Because of this, training departments are among the first things to get cut when a company needs to tighten the budget (which explains a lot of the angst that friends who are corporate IDs feel).  I do wonder, though, what it is about corporate training that makes it ineffective.  Being a bit of a Waldorf (or am I more of a Statler?), it seems to me that fellow instructional designers in corporate settings do what's expected of them (self-paced, drill-and-kill interventions), but those don't work because they are usually compliance training (and everyone seems to hate that).  If instructional designers were more integral to the talent development cycle, the interventions might be more effective.  Anyway, I digress.

So, one might ask, what is DACUM?  DACUM was new to me, and it is defined as:
Developing a Curriculum (DACUM) is a process that incorporates the use of a focus group in a facilitated storyboarding process to capture the major duties and related tasks included in an occupation, as well as, the necessary knowledge, skills, and traits.  This cost-effective method provides a quick and thorough analysis of any job.   
It seems to me to be one of the tools used by instructional designers in the needs analysis phase to determine what needs to be accomplished by the learning intervention. Apparently DACUM is currently only done in person, which can be quite expensive when done face to face and synchronously, for the same reason that training itself is deemed expensive at times: you need to pull employees away from their work to do it.  Angie is looking at employing Design Based Research (DBR) with a Delphi approach. Her expert informants will be six PhD psychometricians at her company, distributed over a geographic distance (some are in the same office, but some are not). She will have one group of senior psychometricians and one group of junior psychometricians (it will be interesting to see if there are differences between the two groups).

On another note, it's interesting that this is not Angie's first idea.  She's had several over the years, but opportunities dry up and doctoral students are left trying to pick up the pieces.  I often wonder what happens if you've passed your dissertation proposal defense (and hence are formally an EdD candidate), but that opportunity dries up and you need to do something else.  Does your committee ask you to re-defend something new?  Do you try to salvage what you have with what little is left?  Do you put together a new proposal with just your advisor?  With coursework it's pretty cut and dried: you do the work, you get a good grade, you pass.  The dissertation can be a year-long project (or longer) after you defend the proposal.  What happens when stuff hits the fan while you're in the thick of it?

If any cohort 1 or cohort 2 folks are reading this, advice is definitely welcomed :-)

Thursday, February 23, 2017

Are MOOCs really that useful on a resume?


I came across an article on Campus Technology last week titled 7 Tips for Listing MOOCs on Your Résumé, which cites the CEO of an employer/employee matchmaking firm.  One piece of advice is to create a new section for MOOCs taken and list them there. This is not all that controversial, since I do the same.  Not on my résumé, but rather on my extended CV (which I don't share with anyone), and it serves a purpose of self-documentation more than anything else.

The first part that got me thinking was the piece of advice that says "only list MOOCs that you have completed".  Their rationale is as follows:

"Listing a MOOC is only an advantage if you've actually completed the course," Mustafa noted. "Only about 10 percent of students complete MOOCs, so your completed courses show your potential employer that you follow through with your commitments. You should also be prepared to talk about what you learned from the MOOC — in an interview — and how it has helped you improve."  

This bothered me a little bit.  In my aforementioned CV I list every MOOC I signed up for(†) and "completed" in some way, shape, or form. However, I define what it means to have "completed" a MOOC.  I guess this pushback on my part stems from having started my MOOC learning with cMOOCs, where there (usually) isn't a quiz or some other deliverable that is graded by a party other than the learner. I signed up for specific xMOOCs for a variety of reasons, including interest in the topic, the instructional form, the design form, the assessment forms, and so on. I've learned something from each MOOC, but I don't meet the criterion of "completed" if I am going by the rubrics set forth by the designers of those xMOOCs.  I actually don't care what those designers set as the completion standards for their MOOCs, because a certificate of completion carries little currency anywhere. Simple time-based economics dictate that my time shouldn't be spent on activities leading to a certificate that carries no value, if I don't see value in those assessments or activities either. Taking a designer's or professor's path through the course is only valuable when there is a valuable carrot at the end of the path. Otherwise, it's perfectly fine to be a free-range learner.

Another thing that made me ponder a bit is the advice about linking to badges and showcasing your work.  Generally speaking, in the US at least, résumés are a brief window into who you are as a potential candidate.  What you're told to include in a résumé is a brief snapshot of your relevant education, experience, and skills for the job you are applying for.  The general advice I hear (which I think is stupid) is to keep it to one page.  I ignore this and go for one sheet of paper (two pages if printed on both sides).  Even that is constraining if you have been in the workforce for more than five years. The cover letter expounds on the résumé, but that too is brief (one page, single spaced). So, a candidate doesn't really have a ton of space to showcase their work, and external links (to portfolios and badges) aren't really encouraged. At best, a candidate can whet the hiring committee's appetite enough to get an interview. This is why I find this advice a little odd.

Your thoughts on MOOCs on résumés?


NOTES:
† This includes cMOOC, xMOOC, pMOOC, iMOOC, uMOOC, etcMOOC...

Wednesday, February 22, 2017

Course beta testing...


This past weekend a story came across my Slashdot feed titled Software Goes Through Beta Testing. Should Online College Courses? I don't often see educational news on Slashdot, so it piqued my interest. Slashdot links to an EdSurge article where Coursera courses are described as going through beta testing by volunteers (unpaid labor...).

The beta tests cover things such as:

... catching mistakes in quizzes and pointing out befuddling bits of video lectures, which can then be clarified before professors release the course to students.

Fair enough; these are things that we tend to catch in developing our own (traditional) online courses as well, and that we fix or update in continuous offering cycles.  The immediate, quite explicit, comparison in this EdSurge article is between xMOOCs and traditional online courses.  The article mentions rubrics like Quality Matters and SUNY's open-access OSCQR ("oscar") rubric for online 'quality'. One SUNY college is reportedly paying external people $150 per course for such reviews of their online courses, and the overall question seems to be: how do we get people to beta test online courses?

This article did have me doing a bit of a Janeway facepalm when I read it (and when I read the associated comments). The first reason I had a negative reaction to this article is that it assumes such checks don't happen.  At the instructional design level there are (well, there are supposed to be) checks and balances for this type of testing. If an instructional designer is helping you design your course, you should be getting critical feedback as a faculty member on that course.  In academic departments where only designers do the design and development (in consultation with the faculty member as the expert), the entire process is run by IDs, who should see to this testing and control. Even when faculty work on their own (without instructional designers), which is often the case in face-to-face courses, there are checks and balances there.  There are touch-points throughout the semester and at the end where you get feedback from your students, and you can update materials and the course as needed. So, I don't buy this notion that courses aren't 'tested'.†

Furthermore, a senior instructional designer at SUNY is cited as saying that one of the challenges "has been figuring out incentives for professors or instructional designers to conduct the quality checks," but at the same time is quoted as saying “on most campuses, instructional designers have their hands full and don’t have time to review the courses before they go live.”  You can't say (or insinuate) that you are trying to coax someone to do a specific task, and then say that these individuals don't have enough time on their hands to do the task you are trying to coax them into. When will they accomplish it?  Maybe the solution is to hire more instructional designers? Maybe look at the tenure and promotion processes at your institution and see what can be done there to encourage better review/testing/development cycles for faculty who teach. Maybe hire designers who are also subject matter experts to work with those departments.‡

Another problem I have with this beta-testing analogy is that taught courses (not self-paced courses, which is what xMOOCs have become) have the benefit of a faculty member actually teaching the course, not just creating course packet material. Even multimodal course materials such as videos, podcasts, and animations are, in the end, a self-paced course packet if there isn't an actual person there tutoring or helping to guide you through the journey.  When you have an actual human being teaching/instructing/facilitating/mentoring the course and the students in it, there is a certain degree of flexibility.  You do want to test somewhat, but there are a lot of just-in-time fixes (or hot-fixes) as issues crop up.  In a self-paced course you do want to test the heck out of the course to make sure that self-paced learners aren't stuck (especially when there is no other help!), but in a taught course, extensive testing is almost a waste of limited resources.  The reason is that live courses (unlike self-paced courses and xMOOCs) are meant to be kept up to date and to evolve as new knowledge comes into the field (I deal mostly with graduate online courses).  Hence, spending a lot of time and money testing courses that will have some component change within the next 12-18 months is not a wise way to use a finite set of resources.

At the end of the day, I think it's important to critically query our underlying assumptions.  When MOOCs were the new and shiny thing they were often (and wrongly) compared with traditional courses; they are not the same, and they don't have the same functional requirements.  Now that MOOCs are 'innovating' in other areas, we want to make sure that these innovations are found elsewhere as well, but we don't stop to ask whether the functional requirements and the environment are the same.  Maybe for a 100-level (intro) course that doesn't change often, and that is taken by several hundred students per year (if not per semester), you DO spend the time to exhaustively test and redesign (and maybe those beta testers get 3 credits of their college studies for free!), but for courses that have the potential to change often and have fewer students, this is overkill.  In the end, for me, it comes down to local knowledge and the prioritization of limited resources.  Instructional designers are a key element in this, and it's important that organizations utilize their skills effectively for the improvement of the organization as a whole.

Your thoughts?




NOTES:
† Yes, OK, there are faculty out there who have taught the same thing for the past 10 years without any change, even with the same typos in their lecture notes! I hope that these folks are the exception in academia and not the norm.

‡ The comparison here is to the librarian world, where you have generalist librarians and librarians who also have subject matter expertise in the discipline they serve. Why not do this for instructional designers?

Wednesday, February 15, 2017

Institutional Memory



It's been a long time since I've blogged about something educational, other than my classes of course.  With one thing down (and a million more to go), I decided to take a little breather to see what's accumulated on Pocket over these past few months.  I saw a post by Martin Weller on Institutional Memory, and it seemed quite pertinent to my day-to-day work existence these past six or so months.  Martin points to a BBC article indicating that the optimal time in a specific job is around three years.

This isn't the first time I've heard this.  About 11 years ago (wow!) I was working for my university library.  I was new to the Systems Department (the IT department in a library) and my supervisor was new.  When we were getting to know each other's work histories (before you could look at LinkedIn profiles), she told me that she aimed to stay there for a few years and then move on; in her view, people should only stay in their current job for three years. At the time I found this advice a little odd. After all, I had stayed with my previous department for eight years before moving to the library, and even then I stayed within the institution.

From my own experience I can say that if institutions were perfectly running machines, with perfectly documented procedures and good version histories that we could reference to get insight into why things are done the way they are done, then "short" three-year stays at a job (or an institution) might (in theory) make sense.  You come in, the institution benefits from your expertise, you benefit from the experience, and you (metaphorically) hug and go your separate ways at the end of your tour. However, institutions are complex organisms. The reasons why things are the way they are might not be documented. Sometimes the procedure was a backroom deal between one academic dean and another.  Sometimes it's the duct tape and paper clips that hold everything together, because at the time the organization didn't have the ability to break everything down and rework it from scratch.  Other times it's good ol' fashioned human-to-human relationships that make things work (i.e., bypassing parts of the system where things are bottlenecked but no one will change them).

Given this reality, I think three years is a rather short time to spend at a job or an institution.  I know that when I've changed jobs it's taken me up to a year to fully "get" all the connections, the people, and the systems in place, not only to do my job but to do it effectively and efficiently. Leaving before you can make a lasting impact at the institution is a little selfish: the employee gets good exposure to new skills and ideas, but leaves before they can really put those to use on anything more than a band-aid†.

Sure, even when you stay at an organization for more than three years, after a little while you will reach a plateau of efficiency in what you are doing. It may take you three years, it might take two, it might take more.  Sooner or later you will get there.  At that point the organization has a responsibility to keep things fresh for its employees. This benefits both the organization and the employees: employees feel challenged in good ways (think of it as a ZPD for work), and organizations get to retain and employ the talent they've incubated.  If people leave because they feel bored, that's a shortcoming of the organization.

I know from my own experience working at my university (19 years now) that even though my jobs have changed, and my departments have changed, institutional knowledge follows me, and I share it with other people. Just because something might not be of particular use to me right now doesn't mean that it's not useful to another colleague who is newer at the institution.  Having this oral history, and a means of passing it down to others, is valuable.  Leaving your post and contributing to a high turnover rate is detrimental to an institution‡.

Your thoughts?



NOTES:
† Don't get me wrong: private-sector companies, especially ones that vehemently refuse union organization and use globalization as a way to use and abuse employees by not paying them a living wage, not providing good benefits, and shirking their responsibilities in their social contracts, are not worthy of employee loyalty of this nature. We just can't afford, as people, to say "I am only looking out for myself".

‡ Another thing that came to mind as I was writing this has to do with hiring. Hiring isn't as simple as posting a job on the university's "help wanted" site. Between the time a need for someone arises and the time someone is hired, it can take a very (very) long time.  Just as an example, there are two jobs that come to mind that I applied for.  For my current job, I applied in March, interviewed in December, and started in February.  For my job in library systems, I applied in February (I think), got the call for an interview in November, heard that I got the job in December, and started in January. All of this is considered "fast", so when it takes that long to get hired, I would say that three years somewhere is a rather short time.

Saturday, February 11, 2017

EDDE 806 - Post X - it marks the spot!

This past Thursday we had our official EDDE 806 session (on Monday, Norine did a mock proposal defense, which I wasn't able to attend, but luckily it's archived for later viewing). In any case, in this session we heard from Renate, who reported on her ideas for a dissertation topic, and there were a ton of interesting things about the process that were shared by Susan and others.

Renate is looking to do a study to understand the lived experience of pre-licensure (nursing?) students attending their final clinical practicum after they have been exposed to an IPE (interprofessional education) didactic curriculum. To do this she will use a qualitative, phenomenological approach in her research design.  Phenomenology seems to be quite popular among the current cohorts (I wonder why). She aims to recruit about 15 participants from a variety of healthcare professions (in Canada).  I am looking forward to reading this research when it's done. It reminds me a little of other professions where there is professional education, but we haven't necessarily seen whether former students' practices connect with what they have learned, and how well.

In terms of tips for the dissertation process (and the proposal process for that matter), Susan and Peggy Lynn shared the following (my comments are in italics):

  • Get yourself into a routine.  Even if you are not doing much on your proposal (or your dissertation), do spend 10-15 minutes on the document anyway.  Re-read, copy edit, make notes. Just keep the process going, even if you're not actively working on it. I have not been doing this this semester, but I think that next week I'll start.  Maybe grab a cup of coffee and spend 15 minutes editing (and look at what Debra commented on from EDDE 805, lol)
  • Once the changes to your dissertation (or dissertation proposal) are made (based on the committee's feedback) and you have an oral defense scheduled, do not edit the document, not even for copy edits!  The committee will use this document as a reference when they quiz you, so it's best if you are all on the same page.
  • Once you pass your dissertation proposal defense, make a copy of the proposal file for archival purposes.  File it away (I would add: maybe in PDF format!). Then use another copy to build out your dissertation.  This is good versioning practice, and it allows us to share successful proposals with other cohort members who might want to see a sample of what is good.
  • The runtime for a defense is about 2 hours.  There are three members on the committee, and the order of questioning is: 1) the external member, 2) the other member from AU, and 3) your supervisor.  Each gets about 15 minutes of Q&A.  Your presentation at the start is 20 minutes, so I guess it's good to practice the heck out of our presentations to make sure that we are on the mark with the points we want to make, and on time!
  • The examiners need to see your face when you start your defense to verify visually that it is you defending.  So... make sure that you wear appropriate clothing and present a professional environment. Also make sure that if you are at home, cats, dogs, birds, and rodents are somewhere else and don't provide their own soundtrack to your defense.
  • Finally, a good point by Peggy Lynn: look for articles that report the opposite of what you are proposing. This stuff might come up in your defense, so you need to know how to rebut it!


By the way, if you are reading this and you are in one of the cohorts, please feel free to add to this wiki page. We are putting together a list of topics that we are all working on (or have worked on, in the case of previous cohorts) for our dissertations.  This will give others in future cohorts (as well as our own) a sense of what people have worked on in the past :-)


And, since it was a phenomenology sort of talk... for your learning pleasure, the muppets!