Welcome to the October, 2017 edition of the Carnival of Quality Management Articles and Blogs.
Our topic for October 2017 is World Standards Day. Each year on 14th October, the members of the International Electrotechnical Commission (IEC), the International Organization for Standardization (ISO) and the International Telecommunication Union (ITU) celebrate World Standards Day, paying tribute to the collaborative efforts of the thousands of experts worldwide who develop the voluntary technical agreements that are published as international standards.
The theme for World Standards Day, 2017 was ‘Standards make cities smarter.’ Sufficient fresh water; universal access to cleaner energy; the ability to travel efficiently from one point to another; a sense of safety and security: these are the kinds of promises modern cities must fulfil if they are to stay competitive and provide a decent quality of life to their citizens.
See More about the WSC and Information on previous celebrations (1998-2015) for all the previous World Standards Day posters.
Setting standards is the key to building smarter cities: Eswaran Subrahmanian
What are Smart Cities? | Larissa Suzuki | TEDxUCLWomen
How we design and build a smart city and nation | Cheong Koon Hean | TEDxSingapore
Smart Cities – The Untold Story: Mischa Dohler at TEDxLondon City 2.0
Benefits of Smart Cities – #WorldStandardsDay2017 – Gabriel Hernández from Mexico is the winner of the video contest
We will now turn to our regular sections:
For the present episode we have picked one article: Be Data Literate: Understanding Why Aggregated Data Misleads, Misinforms, Misdirects (Part 1 & Part 2) @ the column Measuring Performance (People & Enterprise) @ Management Matters Network.
Not a day goes by that we are not being subjected to cheating charts, meaningless statistics, improper comparisons, and erroneous conclusions.
Worse, when we fail to apply what might be called elementary statistical analysis to a variety of societal and management problems, it becomes nearly impossible to separate a problem’s symptoms from its causes.
Arriving at a definition of the real problem and developing effective alternative solutions requires an approach thoroughly grounded in scientific and statistical thinking.
From this point forward, we ask you to internalize this basic truth: Overly-aggregated data misleads, misinforms, and misguides.
Any manager looking to flex their leadership acumen must not only be able to read data, but also be able to detect the forces that skew the accuracy of its results.
The key concept here is homogeneity.
Simply put, homogeneity of data refers to whether or not the total data set from which measurements were computed conceals important differences between or among what statisticians call “rational subgroups”, or just plain subgroups.
To Sum It All Up:
- An aggregated performance measurement is of limited diagnostic value.
- Through the process of isolating and analyzing variation among relevant subgroups, you can locate the “root cause” of the problem.
- Management action is required to deal with the “root cause” of the problem. (A reminder: A decision is not an action. A decision is a good intention. Decisions must be converted into action).
- Faulty conclusions and/or policies inevitably flow from a dataset that is not homogeneous with respect to the performance measurement under investigation. In other words, the wrong problem is being solved.
- Statistical procedures detect significant variation among subgroups. If significant differences in a performance characteristic (because of thoughtful subdivision of a data set) are found to exist, the reasons for the variation must be investigated and the underlying causes eliminated from the process.
- After the “causes” of the variation are discovered and eliminated, the performance measurement under investigation improves.
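To see why an aggregated measurement has such limited diagnostic value, here is a minimal Python sketch. The line names and defect counts are invented purely for illustration: one pooled rate looks unremarkable, while the rational subgroups behind it differ by a factor of four.

```python
# Made-up data: two production lines ("rational subgroups") whose
# results are rolled into one aggregated defect rate.
defects = {"Line A": (12, 1000), "Line B": (48, 1000)}  # (defects, units)

total_defects = sum(d for d, _ in defects.values())
total_units = sum(n for _, n in defects.values())
print(f"Aggregated defect rate: {total_defects / total_units:.1%}")  # 3.0%

# Disaggregating by subgroup reveals where the problem actually lives:
# Line B runs four times the defect rate of Line A.
for line, (d, n) in defects.items():
    print(f"{line} defect rate: {d / n:.1%}")
```

Acting on the 3.0% aggregate alone would spread corrective effort across both lines, whereas the subgroup view points the investigation straight at Line B.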
ASQ CEO Bill Troy’s section has one interesting article on how to Apply Design Thinking to Quality Practices. The subject of Design Thinking calls for a full-fledged post in the blog carnival series, so we will take that up in our November, 2017 issue.
For the present, we continue with the practice of picking up one article from the ASQ.org site. For our present edition we will fall back upon a 1991 interview – Statistical Quality Control in World War II Years – with Eugene L Grant (1897–1996) that translates important memories into historical documentation… Although Eugene L Grant is best known for Statistical Quality Control, his contributions extend beyond the boundaries of the quality profession. Industrial quality control was only one of the areas in which he specialized. He authored books in several other areas, including engineering economy, depreciation, and accounting, and one of those books outsold Statistical Quality Control.
We now watch one of the latest ASQ TV episodes:
- Likert Scales and Data Analysis presents advice on gathering and analyzing data in organizations, tips on using Likert Scales, and a case study on leveraging data to help the bottom line.
- Chris McMillan’s Full Interview
- Full Case Study by Sivaram Pandravada and Thimmiah Gurunatha
- “Likert Scales and Data Analyses”, I. Elaine Allen and Christopher A. Seaman, QP, 2007
If we search for Likert Scales and Data Analysis on YouTube, we will find quite a few more informative videos on the subject.
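A recurring caution in the Likert-scale literature cited above is that Likert responses are ordinal, not interval, data. As a small illustration, here is a Python sketch with invented responses showing the median and mode alongside the more debatable mean:

```python
from collections import Counter
from statistics import mean, median

# Invented responses on a 5-point Likert scale
# (1 = strongly disagree ... 5 = strongly agree).
responses = [5, 4, 4, 5, 2, 3, 4, 5, 1, 4, 4, 3]

# Likert data is ordinal, so the median and mode are the safer
# summaries; the mean implicitly treats the scale as interval data.
print("median:", median(responses))                      # 4
print("mode:", Counter(responses).most_common(1)[0][0])  # 4
print("mean (interpret with caution):", round(mean(responses), 2))
```

Whether the mean is ever legitimate for Likert data is exactly the debate the 2007 QP article by Allen and Seaman works through.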
Jim L. Smith’s Jim’s Gems for the month of September, 2017:
- The Role of Disposition Limits – This is in continuation of the previous articles – The role of specification limits in manufacturing and The Role of Specification Limits. The first article dealt with Specification Limits, within which a product would be expected to perform its stated and intended function for customer use. The second article dealt with Process Control Limits, which provide information about process behavior with a view to permitting simple detection of events that are indicative of actual process change.
Fundamentally, disposition limits are focused on product, not process, control. The decisions they drive are focused on what to do with product that has already been processed through a specific process step or set of steps. The basic decision involved is whether a specific group of product should be allowed to move on for further processing and eventually become finished product worthy to be shipped…
Specifically, disposition limits differ from process control limits in three areas.
- Disposition limits are applied to a finite group of product that has already been manufactured. Control limits, on the other hand, are applied to the manufacture of current and future operations of a process for variable amounts of time and processed product.
- Disposition limits are focused on product control to minimize overall producer and customer costs. Control limits are focused on process control and are ideally determined by appropriately balancing false signal rates with required levels of sensitivity.
- Disposition limits and process control limits differ in the amount of risk they impose on a manufacturing operation. It sounds strange, but the risk associated with determining the fate of a finite lot of product outside the appropriate limits is often perceived as much less than the risk of determining the fate of the associated process. Something has to be done with the product that has already been produced outside the appropriate limits, but that decision is only applied to that finite lot. However, adjusting a process will potentially impact all future product through the affected process step.
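The product-versus-process distinction above can be sketched in a few lines of Python. All numbers here are invented for illustration: the control limits are a simplified centerline ± 3 sigma band over subgroup means (a rough X-bar-chart idea), while the disposition band is an assumed accept/hold range for one finished lot, not something Smith's article prescribes numerically.

```python
from statistics import mean, stdev

# Invented subgroup means from an in-control process step.
subgroup_means = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]

# Process control limits (applied to current and future production):
# centerline +/- 3 sigma of the subgroup means, a simplified X-bar chart.
center = mean(subgroup_means)
sigma = stdev(subgroup_means)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# Disposition limits (applied to a finite, already-produced lot):
# a hypothetical accept/hold band chosen to balance producer and
# customer costs, not derived from process behaviour.
disp_lo, disp_hi = 9.5, 10.5

lot_mean = 10.4  # measurement from one finished lot
print(f"control limits: [{lcl:.2f}, {ucl:.2f}]")
print("lot disposition:", "release" if disp_lo <= lot_mean <= disp_hi else "hold")
```

Note how the two decisions decouple, as the article argues: the lot is dispositioned once against its own band, while the control limits keep governing every future run of the process.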
I look forward to your active participation in enriching the blog carnival as we pursue our journey in exploring the happenings across quality management blogs.
Note: The images depicted above are courtesy of the respective websites, which hold the copyrights for them.