IEEE Reliability Society Newsletter     Vol. 60, No. 2., May 2014

Table of Contents

Front page:

President's Message

From the Editor

Society News:

In Memoriam: Harold E. Ascher (1935-2014)

Candidates Sought for Reliability Society Admin Committee (2015-2017)

 

Members & Chapters

Chapter Outreach Xi'an, China

Chapter Outreach Taiwan

Chapter Outreach UK/Ireland Chapter

Announcement: Reliability Outreach, Milan, Italy

Dallas Chapter

Boston-NH-Providence Chapter

 

Meetings & Conferences

2014 IEEE PHM Conference in Spokane

RAMS 2015, Palm Harbor, FL

Letters in Reliability

Thoughts on the Power of Questioning

Microsecond PHM


Links:

Reliability Society Home

RS Newsletter Homepage

Thoughts on the Power of Questioning

Samuel Keene, PhD, FIEEE
Six Sigma MBB
Past President, IEEE Reliability Society

"Six Sigma is the process of asking questions that lead to tangible, quantifiable answers, that ultimately produce profitable (trustworthy) results." -- Mikel Harry, Six Sigma pioneer (Motorola)

Then I heard a related quote recently: "When you ask a Jew a question, he will answer with another question" - which I deem to be a good process. Indeed, a very good process.

When teaching Six Sigma classes, I put questions to the class, and I always deemed a class smarter when the students, in turn, asked me refining questions. Those classes had better knowledge and insight. For example, in teaching estimation, I would ask each student to write down, on a paper to pass forward, an estimate of the population size of Moscow, including an average and a low and high bound.

Then I would tabulate their answers on a flip chart. Typically, I found that 75% of their bounding estimates would not contain the actual value.
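That coverage check is easy to reproduce. The sketch below uses hypothetical student intervals and an illustrative population figure - not actual class data:

```python
# Hypothetical bounding estimates (low, high) for Moscow's population.
# The numbers are invented for illustration, as is the "actual" value.
ACTUAL = 12_000_000  # illustrative figure for Moscow, Russia

estimates = [
    (1_000_000, 5_000_000),
    (8_000_000, 15_000_000),
    (2_000_000, 6_000_000),
    (10_000_000, 20_000_000),
]

# Count how many intervals actually contain the true value.
covered = sum(low <= ACTUAL <= high for low, high in estimates)
print(f"{covered}/{len(estimates)} intervals contain the actual value")
```

With well-calibrated 90% bounds, roughly nine in ten intervals should cover the true value; 25% classroom coverage is the usual signature of overconfidence.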

Some refining questions that I received relative to the population size of Moscow:
1. Moscow, Russia, or Moscow, Idaho?
2. What year?
3. City limits or metro area?
4. How tight a bound: 90%, 95%, or 99%?

The point to be made: ASK QUESTIONS.

Another supporting point: my BIGGEST professional contributions came from my asking questions. I have always worked on the development side of engineering, but I found reasons to make regular excursions into the manufacturing areas. I did this a lot more than my development engineering colleagues; I tend to reach out more, as is my general nature.

Once, while I was in the manufacturing area, I questioned a process that was reworking component boards returned from the field. The rework was intended to make these cards "equivalent to new". Mostly this involved replacing the solid tantalum capacitors on the boards. I also found out that 13 outside contract organizations were supporting this rework effort. When I previously worked for NASA Goddard, I had tested and qualified solid tantalum capacitors for space flight applications. I found that the dielectric of these capacitors heals, or anneals, over time. This healing reduces the porosity of the tantalum pentoxide dielectric, reducing the capacitor leakage current. These capacitors' reliability actually improves over their application time period. Replacing these capacitors was equivalent to replacing "aged wine with new wine". Eliminating this unnecessary process step resulted in a multi-million dollar savings and preserved better card reliability.

Another point. I was part of the IBM team that took the HeNe laser (like those we see in supermarket checkout stands) from a research curiosity to an industrial application. In their early application, these lasers were exhibiting erratic lifetimes. Sometimes they were likened to the Blue Dot flashbulb, failing very early in life - and these were $10,000-apiece prototypes. We wanted a 5,000-hour laser; we found some sample lasers failed in just a few hours. Failure analysis revealed variations in the laser materials and construction across the samples we tested. For example, some cathodes were made with 2024 aluminum, some with 6061 aluminum; some with a T-4 temper, some with a T-6 temper. Was it a problem? Could be. Variation is always a concern for product quality.

Too often, problems like this are approached by intuitively "fiddling" with the design or manufacturing parameters of the laser, usually changing one variable or factor at a time. We label this process OFAT, for one factor at a time.

Fortunately, we had an executive manager who had seen the benefit of using "Design of Experiments" (DOE) to methodically and scientifically improve a design and manufacturing process. We also had a DOE expert on staff who had worked with the executive manager: Mr. Bill Diamond, author of "Practical Experiment Designs for Engineers and Scientists". This experimental process began with asking a lot of questions to identify current and planned variations in the laser construction and its component materials.
  
We applied the DOE analysis technique to find "God's formula for the laser", to use Bill Diamond's terminology. We teamed with the laser manufacturer and identified over 25 possible variables that could impact laser life and performance. We honed this list down to thirteen key laser parameters. These parameters were initially evaluated in a 32-sample, 13-variable factorial experiment, which allowed us to evaluate the main effects as well as the variable interaction effects and to identify the key factors among the initial 13 variables considered. Subsequent tests focused on those key variables. The final, optimized HeNe laser never caused us a significant problem in deployment.
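Thirteen two-level variables in 32 runs corresponds to a 2^(13-8) fractional factorial. The sketch below is not the original IBM design: the generator columns, the simulated "laser life" response, and the two active factors are all illustrative assumptions, showing only how main effects are estimated from such a screening design.

```python
import itertools
import numpy as np

# Base design: full 2-level factorial in 5 factors, coded -1/+1 (32 runs).
base = np.array(list(itertools.product([-1, 1], repeat=5)))

# Eight additional factors formed as products of base columns.
# These generator choices are illustrative, not the original design's.
generators = [(0, 1, 2), (0, 1, 3), (0, 1, 4), (0, 2, 3),
              (0, 2, 4), (0, 3, 4), (1, 2, 3), (1, 2, 4)]
extra = np.column_stack([base[:, g].prod(axis=1) for g in generators])
design = np.hstack([base, extra])  # 32 runs x 13 factors, columns balanced

# Simulated response: hypothetical laser life (hours) driven by two
# factors plus noise -- stand-ins for, say, cathode alloy and temper.
rng = np.random.default_rng(0)
life = 5000 + 800 * design[:, 2] - 600 * design[:, 7] + rng.normal(0, 100, 32)

# Main effect of each factor: mean(life at +1) - mean(life at -1).
effects = 2 * design.T @ life / len(life)
for i, effect in enumerate(effects):
    print(f"factor {i:2d}: main effect {effect:8.1f}")
```

In a real screening study, the large estimated effects (here factors 2 and 7) flag the variables worth carrying into focused follow-up experiments, while the remaining effects fall near zero.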

One last point. The biggest problems (reliability, cost, and schedule) in developing large systems stem from requirements deficiencies. I saw a critical military system that was to be redesigned from a single host-based system to a distributed architecture. It took three years to design the replacement system, and when it was ready to deploy, the system was scuttled because it was deemed that it could not be switched over from the existing architecture to the new architecture without disrupting operations. This violated the system requirement that the system could never go dark or blind. Apparently, this requirement was not sufficiently accounted for during the requirements or development phases. Early program questioning could have revealed this deployment gap.

Six Sigma has a plethora of tools that can promote comprehensive requirements review; e.g., Quality Function Deployment (QFD) cross-checks the design capability against the requirement needs. Six Sigma tools provide a systematic baseline for system reviews and component reviews. They systematize the desired questioning process, promoting collaboration among the stakeholders. This "teaming collaboration" promotes discovery and reduces the defensiveness of the parties involved. Further, this process creates traceability of product requirement and design decisions. It is a high-leverage analysis process that documents the planning process and captures the requirements and design rationale.