Surely everyone knows that process improvement has changed dramatically over the last 20 years, and certainly there are generational differences between the practitioners of then and now.
The first post in this series, The Next Generation of Lean Six Sigma (Part 1): Then and Now, was about the generational shifts in process improvement. Part 2, Old Tools for a New World, compared old and new tools and how we've arrived at the next generation of Lean Six Sigma.
Like many inhabitants of the digital age, Dodd and James occasionally work together from different cities, and so today they are collaborating by sending a file back and forth in review mode to write about Part 3 of this Series: Time Studies Get an Update.
The Tools of Time Studies
Dodd: Last time we talked about our generational differences, we were discussing my favorite "old" tool, Excel, and how we sometimes don't realize how much capability it truly has.
James: And I was saying that other applications like PowerQuery, PowerBI, and Tableau bring even more speed and capability now, ever since that meteor recently hit the Earth and the dinosaurs got wiped out.
Dodd: Hmmm. We might end up arguing that one a bit more later! Anyway, I hope we can agree that we said we'd discuss time studies next. Rather than debating which tools work best, though, perhaps we should start with a few things we can agree upon, like the original purpose of the time study. As the old guy here, I guess I'll start with the history.
The History of Time Studies
James: From what I understand, there was a caveman lying around a cave, and another cave-person told him, "You're not working hard enough." An argument ensued and neither of them could prove their assertions: one went out to invent the time study to manage her colleague's performance while the other one made his work easier by inventing the wheel. Thus engineering and management consulting were born, competing to become the world's oldest professions.
Dodd: And your point is that "Laziness is the mother of invention?" I'm not sure that's exactly how the time study got invented… but industrial engineers did start doing them long ago. As factories and other businesses began looking for more efficient ways to work, they hired industrial engineers to measure their processes.
After figuring out what a process really is (Standard Work), they needed to calculate how long it took to accomplish (Standard Time), so that they could look for improvements, plan for scalable growth, hire the right numbers of people, and hold those people accountable for appropriate performance. Voila: The Time Study!
The Challenges of Accurate Measurements
James: All kidding aside, we know the stories of the Hawthorne Effect, where the measurements of factory performance were impacted by people simply knowing they were being measured. For you scientists out there, the timing coincided (though not exactly) with the discovery of the Heisenberg Uncertainty Principle, which in turn inspired chemist Walter White to cook.
Okay, maybe I am kidding a little about Walter White, and the words "not exactly" are a pun too: the Heisenberg Uncertainty Principle and the Hawthorne Effect are both about how measuring something impacts the reality of the thing being measured, and about accepting that your measurement will never be exactly accurate.
Anyway, my point is that time studies are hard to conduct accurately with small samples, mostly because of those measurement effects: the fact that you're measuring people will impact their behavior and therefore influence the results, so you need to focus lots of energy on getting accurate and representative data.
Accordingly, some of the sampling techniques we teach for other Lean Six Sigma project situations don't fit as well for this one, and we end up using larger-scale and more automated methods to collect more data, while also trying very hard to prevent and filter out inaccuracies.
Dodd: True, those measurement issues really are the key challenge. And that's an unfortunate consequence of the people-related issues that measuring work can uncover. When we run time studies, the purpose is generally very pure: finding out the time it takes to do each type of work will lead to quantification of opportunity, investment in technology, justification of staffing models, and the ability to grow and scale.
All of those things are essential to running and improving a business. But as soon as the stopwatches come out, some people think, "They're here to take away our jobs." Sad, and untrue!
James: Yes, and as we always say, "Efficiency is the ultimate form of job security." Hiding or covering up inefficiencies will simply expose your business to more competition. Even though it's scary to find out where hidden capacity may be lurking in your company, it turns out to be liberating: we can streamline processes and take time to right-size teams by moving people into new roles or investing in growth, which then leads to safer, more fulfilling jobs for everyone.
Manual Collection vs Time-Saving Macros
Dodd: I know it's true, but that sounds almost utopian. Next you'll be asking for universal income or telling me I have to pay for your college. Oh, right, I already did…
But I agree with you that the world will keep marching forward, and trying to resist becoming more efficient will just doom your company (and everyone's jobs there!) eventually. So we communicate our positive intentions and then enlist the entire operation in gathering the time study data, which we call a "self-reported" time study. While we may end up with more data than we need from a sampling perspective, engaging the whole team in the data collection has additional benefits for change management as well.
Self-reported doesn't mean inaccurate or undisciplined, though; it just means every employee on the team becomes a data collector while they do their own tasks. We ask that all team members fill in an Excel worksheet and record a row immediately as they do each and every task. When we first unveil the approach, of course, everyone's greatest fear is that it will slow them down.
To mitigate that concern, we first put a substantial effort into streamlining the collection system. After studying the team's processes and designing a spreadsheet with drop-down menus for every task type as well as other important attributes of the work (product type, customer, work outcome, interruptions, etc.), we create a template for each person to collect the data, often tailored by role type.
Data validation, conditional formatting, and other built-in functionality to look up attributes from other systems help team members fill in only the information that's needed for each task, which makes the tracking both faster and more accurate. Excel is really capable of a lot.
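To make that concrete, here's a rough sketch in Python (using openpyxl) of the kind of self-reporting template we mean. The task names, columns, and cell ranges are placeholders for illustration, not a real client's list, and in practice we build these templates directly in Excel and tailor them to each role.

```python
from openpyxl import Workbook
from openpyxl.worksheet.datavalidation import DataValidation

# Build a simple self-reporting worksheet with drop-down lists so each entry
# is fast to record and consistent with the task categories we defined.
wb = Workbook()
ws = wb.active
ws.title = "Time Study"
ws.append(["Date", "Task", "Product Type", "Start", "Stop", "Interrupted?", "Notes"])

# Drop-down of allowed task types (placeholder names for illustration).
task_list = DataValidation(
    type="list",
    formula1='"Underwrite policy,Issue policy,Phone call,Email,Rework"',
    allow_blank=False,
)
ws.add_data_validation(task_list)
task_list.add("B2:B500")

# Simple Yes/No drop-down for interruptions.
yes_no = DataValidation(type="list", formula1='"Yes,No"', allow_blank=True)
ws.add_data_validation(yes_no)
yes_no.add("F2:F500")

wb.save("time_study_template.xlsx")
```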
James: At this point, many in my generation might ask why we're not using technology to make this data collection easier. We put significant effort into making the worksheet easy to understand and fill out, but we still require people to manually select categories and input numbers in certain fields. Why so manual? Couldn't we simply create macros within Excel to automate even more of the data collection?
For instance, some ask why we don't have macro formulas input the stop time for each task based on when the task was entered into the worksheet. Or why we don't create functions to automatically determine the tasks based on the time elapsed. This would be rather simple, and it would certainly make the data collection easier. It might even increase quality, since we could better control more of the variables. Now you're going to tell me that the simplification of tracking will come with some unintended consequences, right?
Dodd: James, your experience in workflow automation is showing; I remember us having this exact conversation back in 2017. As you suggest, it's a reasonable question. But can you imagine a person accidentally forgetting to record a stop time for a task before going to lunch and then remembering to finish filling in their sheet when they returned? In this case, if we automated the capture of stop times, it would look like a task lasted much longer than it really did.
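A toy example with made-up times shows the problem: if the automation stamps the stop time whenever the row gets entered, the whole lunch break gets booked to the task.

```python
from datetime import datetime

# Hypothetical times for illustration: the task really ended at 11:55, but the
# person only finished filling in the row after lunch, so a "stop = time the
# row was entered" rule inflates the recorded duration.
task_start     = datetime(2024, 5, 1, 11, 40)
actual_stop    = datetime(2024, 5, 1, 11, 55)   # when the work really ended
row_entered_at = datetime(2024, 5, 1, 13, 5)    # when the sheet was updated

print("Actual duration:   ", actual_stop - task_start)      # 0:15:00
print("Automated duration:", row_entered_at - task_start)   # 1:25:00
```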
What we find is that people can sometimes make mistakes in tracking, but the vast majority really don't want to lie on their forms. Trusting people to track honestly (and then checking for less-than-truthful entries) can be more effective than trying to automate everything, even though it takes a little bit of extra work for the people doing the tracking.
We have done more-automated versions of these time studies in the past, using some of the exact same concepts you just outlined. While these streamlined automations made the data collection easier, we reverted to the more-manual version after seeing some of the unintended consequences that impacted accuracy.
Every time we automate the population of data fields, we lose flexibility in the process and, as you know, these studies can look quite different in different companies or industries. We don't want to bias the data collection by having our opinions of what people should be doing get coded into rules that the computer uses to measure the work. That results in self-fulfilling answers that miss the true performance!
We deliberately use limited automation so that we can accurately capture all of the work people are doing. I'm sure there is some combination of automation that could get us across the finish line more easily, and someday we'll figure out how to implement it. But until that happens, we'll continue using what we have and try to find ways to augment and improve our current Excel-based process.
James: So, I guess we should plan on using this process for the next 10 years then, right?
Dodd: Let's not get carried away…
Engaging the Human Factor
James: Thank goodness. While an Excel spreadsheet can certainly be programmed to make that data collection easier, we're still ultimately relying on people to make the additional effort to record their work properly. We end up having to review the submissions and look for indicators of errors or intentional fudging of the data, such as rounded start and stop times or work times that seem biologically impossible, like when people appear to work a whole day without a break. That goes back to people distrusting the purpose of the study, but it's a big challenge.
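If it helps to picture that screening step, here's a rough sketch in Python with pandas; the file name, column names, and thresholds are stand-ins for illustration, not our standard template.

```python
import pandas as pd

# Read the combined self-reported entries (file and column names are assumed).
df = pd.read_excel("time_study_entries.xlsx", parse_dates=["start", "stop"])
df["minutes"] = (df["stop"] - df["start"]).dt.total_seconds() / 60

# Suspiciously tidy entries: start and stop both on an exact 15-minute mark,
# often a sign the times were reconstructed from memory rather than recorded live.
df["rounded"] = (df["start"].dt.minute % 15 == 0) & (df["stop"].dt.minute % 15 == 0)

# "Inhumane conditions": more than eight recorded hours in a day with no gap of
# at least 15 minutes between consecutive tasks.
def worked_without_break(day):
    day = day.sort_values("start")
    gaps = (day["start"] - day["stop"].shift()).dt.total_seconds().div(60).fillna(0)
    return day["minutes"].sum() > 8 * 60 and gaps.max() < 15

suspect_days = df.groupby(["person", df["start"].dt.date]).apply(worked_without_break)
print(suspect_days[suspect_days])  # person-days that deserve a follow-up conversation
```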
Dodd: Yeah, we jokingly call that effect "inhumane conditions" and suggest that unless they were handed a protein bar, a bottle of water, and a catheter at the start of the work day, they probably didn't work all that time without a break. Certainly a self-reported time study has its drawbacks. Of course, now you'll tell me the modern world has a technological solution for this one!
James: Of course. I buy your point about empowering people to give us good data when we have to use Excel to track their work, but occasionally nowadays all of the data that we need is already captured in another system. Sometimes we don't need to have people track anything in Excel at all.
In 2017, we launched a project with a team that had already been operating with a Business Process Management (BPM) workflow system for almost a decade. This team underwrites and issues life insurance policies, and over the years their information technology team fielded a useful workflow system that tracked all of their work and even automated some of their key activities. The system was able to track the outcomes, key attributes, and start and stop times of every touch, down to the nearest hundredth of a second. The time study data was already there.
That was the easy part. The hard part was getting that data out of the system in a form we could understand.
For example, the system displayed work activities at the policy level for users but actually tracked them as separate tasks. Because a person could work on a policy file that had multiple open activities, the system credited that touch time to every open task at the same time. So the original reports that we made in Tableau from the system weren't trusted, because they double-counted activities that happened concurrently.
The answer for the time study was to download the data into an Excel file where we could see those duplication issues, then write "if" statements and use pivot tables to remove the duplicates. Once we made the data make sense in Excel, the technology team was quickly able to follow the logic we had used, filter the data properly in Tableau, and automate the standard time calculations.
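In code terms, the logic looked roughly like the pandas sketch below. The column names are stand-ins for the insurer's actual fields, and deciding which activity a shared touch should count toward is a simplification here, standing in for the "if" statements and pivot tables we actually used in Excel.

```python
import pandas as pd

# Touch-level extract from the BPM system (file and column names are assumed).
touches = pd.read_csv("bpm_touch_extract.csv", parse_dates=["touch_start", "touch_stop"])

# When several activities were open on the same policy, the system wrote the same
# touch window once per open task, so the same minutes appeared several times.
# Keep one record per person / policy / touch window so time is only counted once;
# which activity the surviving record is attributed to is a judgment call.
deduped = touches.drop_duplicates(subset=["user_id", "policy_id", "touch_start", "touch_stop"])

deduped = deduped.assign(
    minutes=(deduped["touch_stop"] - deduped["touch_start"]).dt.total_seconds() / 60
)

# The pandas equivalent of the Excel pivot table: average handling minutes by activity.
standard_time = deduped.pivot_table(index="activity_type", values="minutes", aggfunc="mean")
print(standard_time)
```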
Dodd: So wait, you're telling me that we mined data from a sophisticated workflow system and eventually made sophisticated reports in Tableau, but we used simple old Excel to figure out how to troubleshoot the queries? The old-school ways really are still the best. I knew it!
New & Old Tools Working Together
James: How about we agree that both are essential? As it turns out, Excel wasn't able to handle the multi-million-row files of touch-level data from the system very well, and so yeah, we looked at small segments of the data in Excel in order to figure it out, but the brute-force Excel method was never going to be a sustainable solution at full scale. The new technology was critical for the eventual solution.
Dodd: "And we would have gotten away with it too, if it weren't for you meddling kids."
James: Haha. Even I'm old enough to recognize a Scooby Doo quote.
Dodd: Okay, you win this time.
Next time, let's look at the data that those time studies produce and talk about how to use it to make improvements and enhance performance, particularly in the Control Phase. Some of those concepts are in the Lean Six Sigma manuals and some are brand new. Talk soon!