Many statistical tests and methods require that the input data be “normally distributed.” Typical examples include Student’s t-tests, ANOVA tables, F-tests, Normal Tolerance limits, and Process Capability Indices.
Because correct results from these calculations depend on the raw data being normally distributed, the FDA’s expectation that a company’s “valid statistical techniques” be “suitable for their intended use” makes assessing whether data is normally distributed a critical activity.
So, which types of data are normally distributed? Dimensional data, such as length, height, and width, are typically normally distributed. Which kinds of data are usually non-normal? Typical examples of non-normal data are burst pressure, tensile strength, and time or cycles to failure. Some non-normal data can be transformed into normality, which validates statistical calculations that are run on the transformed data.
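As a small illustration of such a transformation (the dataset, distribution parameters, and sample size below are invented for this sketch, not taken from the webinar), taking the logarithm of right-skewed time-to-failure data typically brings it much closer to normality:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical time-to-failure data: lognormally distributed,
# hence strongly right-skewed (non-normal).
raw = rng.lognormal(mean=3.0, sigma=0.8, size=500)

def skewness(x):
    """Sample skewness: near 0 for symmetric (e.g., normal) data."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 3))

# A log transformation often normalizes right-skewed data.
transformed = np.log(raw)

print(f"skewness of raw data:         {skewness(raw):.2f}")
print(f"skewness of transformed data: {skewness(transformed):.2f}")
```

The transformed values show far less skewness than the raw values, which is exactly the kind of improvement a normality transformation aims for.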
Understand normally distributed data and normality transformations
Statistics professionals can gain a thorough understanding of what it means for data to be normally distributed, and of how to transform non-normal data into normality, at a webinar being organized by Compliance4All, a leading provider of professional training in the areas of regulatory compliance. The speaker at this webinar will be John N. Zorich, a senior consultant to the medical device manufacturing industry. To register, please visit
John will take participants through all the aspects of “normally distributed” data and how to transform non-normal data. He will explain:
o What it means to be “normally distributed”
o How to assess normality
o How to test for normality
o How to transform non-normal data into normal data
o How to justify the transformations to internal and external quality system auditors.
Normality tests and normality transformations are a combination of graphical and numerical methods that have been in use for many years. These methods are essential whenever one applies a statistical test or method whose fundamental assumption is that the input data are normally distributed. John will explain these methods.
What goes into normality testing and normality transformation?
Normality “testing” involves creating a “normal probability plot” and calculating simple statistics for comparison against critical values in published tables. A normality “transformation,” by contrast, makes a simple change to each raw-data value (for example, taking its logarithm) so that the resulting values are more normally distributed than the original raw data.
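A minimal sketch of both activities, using SciPy (the dataset and distribution parameters below are assumptions chosen for illustration, not the speaker’s material):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Hypothetical right-skewed raw data (e.g., cycles to failure).
raw = rng.lognormal(mean=2.0, sigma=0.7, size=200)

# Normality "testing": the Shapiro-Wilk statistic plays the role of a
# simple statistic compared against published critical values; SciPy
# reports a p-value directly (small p = evidence of non-normality).
stat_raw, p_raw = stats.shapiro(raw)

# Normality "transformation": Box-Cox chooses a power transformation
# (lambda) that makes the data as close to normal as possible.
transformed, lam = stats.boxcox(raw)
stat_t, p_t = stats.shapiro(transformed)

# The probability-plot correlation coefficient (closer to 1 = more
# normal) quantifies the "normal probability plot" assessment.
(_, _), (_, _, r_raw) = stats.probplot(raw, dist="norm")
(_, _), (_, _, r_t) = stats.probplot(transformed, dist="norm")

print(f"raw:         Shapiro p = {p_raw:.4f}, probplot r = {r_raw:.4f}")
print(f"transformed: Shapiro p = {p_t:.4f}, probplot r = {r_t:.4f}")
```

The transformed data scores markedly better on both the numerical test and the probability-plot correlation, which is the kind of evidence one would present to justify a transformation to an auditor.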
Professionals whose work involves normality testing, such as QA/QC Supervisors, Process Engineers, Manufacturing Engineers, QA/QC Technicians, Manufacturing Technicians, and R&D Engineers, will find this session very helpful. John will offer guidance on both the objective and the somewhat subjective decision-making that goes into evaluating the results of “tests” and “transformations”.
John will cover the following areas at this webinar:
o Regulatory requirements
o Binomial distribution
o Historical origin of the Normal distribution
o Normal distribution formula, histogram, and curve
o Validity of Normality transformations
o Necessity for transformation to Normality
o How to use Normality transformations
o Normal Probability Plot
o How to evaluate Normality of raw data and transformed data
o Significance tests for Normality
o Evaluating the results of a Normality test
o Recommendations for implementation
o Recommended reference textbooks.