Corrective and Preventive Action (CAPA) is vital to the life sciences

Corrective and Preventive Action (CAPA) is essential in any industry, and its importance is all the more pronounced in the life sciences, given the discipline's bearing on human lives. It is at the very core of a sound quality management system. CAPA, as the name suggests, is a set of corrective and preventive measures taken to ensure that the product meets its quality and regulatory expectations.

The essence of CAPA for the life sciences is that it should build the ability to respond to problems as they arise, but more importantly, a CAPA system should help the life sciences organization to anticipate and thus prevent problems from happening.

CAPA in the life sciences is an amalgamation of the following core ingredients:

o  Change Control

o  Continuous improvement

o  Complaint management

The need for understanding and implementing CAPA

CAPA is the very edifice of Quality Management. It is a sine qua non for meeting regulatory requirements. Its importance to the life sciences industry vis-à-vis the FDA can be gauged from the fact that in just four years starting in 2012, the number of Warning Letters the FDA issued increased more than twelvefold. Understanding and implementing the right CAPA system is critical to meeting regulatory requirements and preventing such actions from the FDA.

Common pitfalls of CAPA implementation

Many life sciences organizations face a few obstacles when it comes to CAPA implementation. The most common ones among these are:

o  Failure to achieve an integrated view of the whole process, one that can be managed regardless of where an issue originates

o  Lack of understanding of the highly intricate and difficult processes

o  Not documenting every step of the CAPA process

o  Lack of clarity about what distinguishes a corrective action from a preventive action

o  Not implementing a uniform process across the product’s or company’s numerous sites.

An important line of thinking that needs to go into approaching CAPA is that understanding failures is as important as understanding a product's successes if it is to meet quality and regulatory requirements in the life sciences industry. Understanding and correcting problems before they become critical and impede the product's progress is vital to ensuring quality and meeting regulatory requirements.

Given the extremely high importance CAPA has in the regulated industries, it is only natural that a number of regulations govern this area of the life sciences. How does an organization get its understanding of these regulations right? How does it enforce them in the right manner, so that its products meet the regulatory and quality requirements?

Learn to get these aspects right

All these will be the topics of a highly valuable webinar from Compliance4All, a leading provider of professional training for all areas of regulatory compliance.

At this webinar, Charles H. Paul, President of C. H. Paul Consulting, Inc., a regulatory, manufacturing, training, and technical documentation consulting firm, will be the speaker. To gain proper and complete insights into all the areas of CAPA for the life sciences, just register for this webinar by visiting

At this session, Charles will explain how to apply core aspects of the CAPA process, which include:

o  The critical steps

o  The timing of CAPA

o  Who in the organization needs to take part in the process

o  Their roles, responsibilities and functions, and

o  The snags and hazards of the CAPA investigation process.

The content of this webinar is designed to help participants achieve a highly effective CAPA system. It will help explain the purpose and function of CAPA, by which they will be able to:

o  Identify and explain the relevant CAPA regulations

o  Define exception/deviation reporting and explain the process of executing the reporting process

o  Explain and trace the CAPA flow from problem identification to resolution

o  Explain the challenges and pitfalls of the CAPA process and how they are overcome.

o  Explain CAPA’s role in risk mitigation.

o  Explain how root cause analysis is executed.

At this very valuable session, Charles will cover the following areas of CAPA for the life sciences:

o  CAPA defined

o  CAPA relevant regulations

o  Exception/deviation reporting

o  CAPA process flow

o  CAPA process steps explained

o  Challenges and pitfalls of CAPAs

o  CAPA and risk mitigation

o  Root Cause Analysis.

International Financial Reporting Standards (IFRS) 6

The International Financial Reporting Standards (IFRS) are a set of standards pertaining to different industries and their activities and practices. IFRS 6 provides guidance on the accounting practices of the extractive industries, such as oil, gas and mining. The IFRS 6 accounting standard states the requirements, as well as the disclosures, that apply to accounting for the exploration and evaluation expenditures a company incurs.

Until the enactment of IFRS 6, regulations on the accounting practices of the extractive industries were fragmented and piecemeal. The major change IFRS 6 brought about is that it consolidated these practices. Also, with the passage of IFRS 6, entities that had been using accounting practices for exploration and evaluation assets prior to its enactment could integrate those earlier practices with its provisions.

Core accounting requirements

One of its core requirements is the issuance of IFRS-compliant financial statements by companies that have assets used for the exploration and evaluation of mineral resources.

So, it is imperative for accounting professionals to have full knowledge of IFRS 6. Succeeding in the oil, mining or gas areas of companies that hold assets used for the exploration and evaluation of mineral resources entails complying with the requirements set out by IFRS 6.

A proper understanding of the IFRS 6

A learning session being organized by Compliance4All, a leading provider of professional training for the areas of regulatory compliance, will offer the learning needed to comply with the requirements set out in IFRS 6.

At this webinar, the speaker will be Mike Morley, a Certified Public Accountant and business author who organizes various training programs, such as IFRS, SOX, and Financial Statement Analysis, that focus on providing continuing education opportunities for finance and accounting professionals.

Professionals who work in the oil, mining or gas areas of companies that have assets used for the exploration and evaluation of mineral resources can gain insights into what IFRS 6 means for them by enrolling for this webinar. To register, please visit

Familiarization with all the aspects of the IFRS 6

The aim of this presentation is to help oil, mining and gas professionals become knowledgeable about the latest information on IFRS 6. The speaker will familiarize participants with the unique accounting and reporting issues, particularly with regard to the evaluation of assets, revenues and expenditures, that professionals in the extractive industries face in the search for mineral resources, including oil, gas, minerals, and similar exhaustible resources.

Accounting professionals who work in these industries and need an in-depth understanding of how IFRS 6 is structured and how to apply its standards correctly, such as Auditors, Accountants, Financial Managers, Financial Controllers, Company Executives, and anyone involved in the SOX compliance process, will benefit immensely from this webinar on the accounting practices set out by IFRS 6.

At this session on the IFRS 6, the speaker will cover the following areas:

o  Why the accounting for this sector is different

o  How resource assets are evaluated

o  Special rules for measuring revenues and expenditures

o  How revaluation rules apply to the Oil, Gas, and Mining industries

o  Other specific requirements of IFRS 6

o  Required disclosures.

The Attribute Agreement Analysis

Humans can be calibrated, although most people like to think otherwise. The commonly used standard, Attribute Agreement Analysis, or what is called AAA, is a handy tool in helping to do this. At its barest, Attribute Agreement Analysis is a method in which the level of agreement or conformance between the appraisal made by the appraiser(s) and the standard is assessed. Then, the elements used for the appraisal that have the highest levels of disagreement with the standard are identified.

The two methods of the Attribute Agreement Analysis

The Attribute Agreement Analysis uses two primary methods of assessing the agreement of the attribute with the standard:

–  The percentage or extent to which the appraisals agree with the standard

–  Kappa statistics, which measure the agreement between the appraisals and the standard after adjusting for the percentage of agreement that would occur by chance
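As a rough sketch of these two metrics, the following Python snippet computes both percent agreement and a chance-adjusted agreement in the style of Cohen's kappa. The appraisal data here is made up purely for illustration:

```python
from collections import Counter

def percent_agreement(appraisals, standard):
    """First AAA metric: fraction of appraisals that match the standard."""
    return sum(a == s for a, s in zip(appraisals, standard)) / len(standard)

def cohens_kappa(appraisals, standard):
    """Second AAA metric: agreement with the standard, adjusted for the
    agreement that would be expected purely by chance."""
    n = len(standard)
    p_observed = percent_agreement(appraisals, standard)
    count_a, count_s = Counter(appraisals), Counter(standard)
    # Chance agreement: sum, over categories, of the product of marginal rates
    p_chance = sum(count_a[c] * count_s[c]
                   for c in set(count_a) | set(count_s)) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical appraisals from one appraiser against the reference standard
appraiser = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
standard  = ["pass", "pass", "fail", "fail", "fail", "pass", "pass", "pass"]
print(percent_agreement(appraiser, standard))         # 0.75
print(round(cohens_kappa(appraiser, standard), 3))    # 0.467
```

In a full Attribute Agreement Analysis, the same calculation is repeated for within-appraiser (repeatability), appraiser-versus-peer, and appraiser-versus-standard comparisons.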

The three aspects of Attribute Agreement Analysis

Attribute Agreement Analysis has three aspects: agreement with oneself, agreement with a peer, and agreement with the standard. When calibrating humans, the use of Attribute Agreement Analysis calls for control plans to be put in place for “MSA” analysis on key processes. An AAA may be described as a “Measurement Systems Analysis” (MSA) for attributes.

The Attribute Agreement Analysis method is useful to auditing professionals, for whom it makes sense to understand the effectiveness of these methods when they are used by their clientele and/or in their own organization.

Gain learning of Attribute Agreement Analysis

How Attribute Agreement Analysis can be understood and used effectively will be the learning offered at a webinar being organized by Compliance4All, a provider of cost-effective regulatory compliance training for a wide range of regulated industries.

The speaker at this webinar is Jd Marhevko, Vice President of Quality and Lean for Accuride Corporation, who has been involved in Operations and Quality/Lean/Six Sigma efforts across a variety of industries for more than 25 years. To gain insights into the inner aspects of Attribute Agreement Analysis, please register for this webinar by logging on to

The “Statistical AAA” and the Kappa value

At this webinar, Jd will review both the “Statistical AAA” and the Kappa value, as well as the confidence levels for the result bands and incorporation of AAA into the Control Plan and frequency of calibration.

She will assess the pros and cons while discussing the general benefits: reductions in disagreements (about what is good or not), internal/external rework, returns, premium freight, etc.

A number of uses from the Attribute Agreement Analysis method

This explanation will help participants understand ways by which they can apply this tool while learning how to bring down business costs. Jd will evaluate the benefits of human calibration by reviewing the three basic types of agreements.

The important learning this session will give is that it will enable participants to develop, create, execute and interpret an Attribute Agreement Analysis so that an accurate and repeatable disposition can be made, and rework and returns can be effectively reduced.

At this webinar on Attribute Agreement Analysis, which will be highly useful to professionals such as Quality and Engineering system practitioners, Directors, Engineers, Analysts and Managers, Jd will cover the following areas:

o  Understand how AAA can be effectively utilized for mitigating business loss

o  Gain an increased understanding of how to actually perform the analysis

o  Build confidence in the ability to calibrate a human operator.

Demonstrating Product Reliability

Product Reliability is among the most important attributes of a product, no matter what kind of product one is considering. Product Reliability can be defined as the probability of the product performing its stated purpose, under the conditions in which it will be used, for a defined period of time.

The parameters used for quantification of Product Reliability are:

o  MTBF (Mean Time Between Failures) for products that can be repaired and used, and

o  MTTF (Mean Time To Failure) for products that cannot be repaired.

If these are the parameters used for quantifying Product Reliability, how does one predict it? Quality professionals use a relatable analogy to explain how Product Reliability can be made measurable. This is known as the bathtub curve, which traces the failure rate over the product's life: early life sits at one end of the bathtub, after which the product moves into its useful-life phase, and from there into its wear-out period.

The failure rate across each of these phases is the indication of Product Reliability. The bathtub curve shows that the failure rate is relatively high at the very beginning of the lifecycle (early, "infant mortality" failures), drops to a low, roughly constant level during the useful life, and rises again as the product wears out. This understanding is the root of predicting Product Reliability, and mathematical formulae are used to describe each of these phases.
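As a minimal illustration of one such formula, the flat, useful-life region of the bathtub curve is often modeled with a constant failure rate, under which reliability follows the exponential model R(t) = exp(-t / MTBF). The MTBF value below is made up for illustration:

```python
import math

def reliability_exponential(t, mtbf):
    """Survival probability at time t under a constant failure rate,
    i.e., the flat useful-life region of the bathtub curve:
    R(t) = exp(-t / MTBF)."""
    return math.exp(-t / mtbf)

# Hypothetical repairable unit with an MTBF of 10,000 hours
mtbf = 10_000.0
for t in (1_000, 5_000, 10_000):
    print(t, round(reliability_exponential(t, mtbf), 3))
```

Note that at t equal to the MTBF itself, reliability is exp(-1), about 37 percent: a product is more likely than not to have failed by its MTBF, a point the exponential model makes explicit.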

How does one get Product Reliability right?

The ways of choosing the right reliability parameters for the failure rates of respective products, and of estimating their lifecycles in the light of those failure rates, will be the topic of a webinar being organized by Compliance4All, a leading provider of professional training for all areas of regulatory compliance.

At this webinar, Steven Wachs, Principal Statistician at Integral Concepts, Inc. where he assists manufacturers in the application of statistical methods to reduce variation and improve quality and productivity, will be the speaker. To benefit from the experience Steven brings into Quality, please register for this webinar by logging on to

A description of the various approaches to Product Reliability

Steven will offer and explain several approaches that can be used to verify whether reliability targets or specifications have been achieved at the desired level of confidence. In particular, he will describe the approaches using time-to-failure data to estimate reliability metrics.

Also taken up at this session, which will be of immense use to anyone with a vested interest in product quality and reliability, such as Product Engineers, Reliability Engineers, Design Engineers, Quality Engineers, Quality Assurance Managers, Project/Program Managers, and Manufacturing Personnel, are demonstration tests, in which minimum reliability may be demonstrated with zero or few failures.

A description of the methods that increase the risk of failures

Steven will discuss the kinds of methods which, when used, increase the risks of field failures, due either to inadequate designs or to misjudged product use conditions, and how those risks need to be managed. He will also offer options for verifying and demonstrating that customer reliability requirements have been achieved.

Steven will cover the following areas at this webinar:

o  Overview of Reliability

o  Reliability Metrics and Specifications

o  Estimating Reliability with Time-to-Failure Data

o  Confidence Intervals and Bounds

o  Demonstrating Reliability with zero or few failures

o  Tradeoffs between Testing Time and Sample Size

o  Impact of Assumptions on Test Plans

o  Improving Demonstration Test Power

Alternatives to AQL sampling plans do exist

Alternatives to AQL sampling plans do exist, but companies need to be aware of them and explore them. The Acceptance Quality Limit, or AQL, is applied as a benchmark in most manufacturing organizations to inspect the quality of products they purchase. It is only when the product meets the AQL that receipt is acknowledged and payment made.

So, what is AQL?

What is AQL? In simple terms, AQL, which expands to Acceptance Quality Limit, is the worst tolerable process average that can be accepted for the quality of a product. It is the percentage of defectives beyond which quality can no longer be termed acceptable.

The Acceptance Quality Limit is accepted as a standard business practice by most medical device companies. Attribute sampling based on ANSI/ASQ Z1.4 and Zero Acceptance Number Sampling Plans by Nicholas L. Squeglia continue to be the most common applications used by companies.
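To make the mechanics of attribute acceptance sampling concrete, here is a hedged sketch of the probability that a lot is accepted under a single-sampling plan; the plan parameters (n = 125, c = 3) are invented for illustration, not taken from any ANSI/ASQ Z1.4 table:

```python
from math import comb

def accept_probability(n, c, p):
    """Probability that a lot with true fraction defective p passes a
    single-sampling plan: inspect n units, accept on c or fewer defects.
    (Each value of p gives one point on the plan's OC curve.)"""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

# Hypothetical plan: sample 125 units, accept the lot on 3 or fewer defects
n, c = 125, 3
for p in (0.01, 0.025, 0.05):
    print(p, round(accept_probability(n, c, p), 3))
```

Sweeping p over a range and plotting the result reproduces the plan's Operating Characteristic (OC) curve, which shows how sharply (or not) the plan discriminates between good and bad lots.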

Considering a viable alternative to AQL sampling plans

Although popular, these common methods are not always the best approaches. This is not to belittle their effectiveness, but to point out that they are, by themselves, insufficient. Medical device companies need to be aware of a variety of methods and of when and how to use them.

Establishing “processes needed to demonstrate [product] conformity” is a requirement of ISO 9001 and ISO 13485. Similarly, the FDA’s GMP (21 CFR 820) requires that “sampling methods are adequate for their use”. Further, an FDA guideline states that “A manufacturer shall be prepared to demonstrate the statistical rationale for any sampling plan used”.

However, an AQL sampling plan does not provide what is needed to meet either of those requirements. Using only Attribute sampling based on ANSI/ASQ Z1.4 and Squeglia’s Zero Acceptance Number Sampling Plans, it is not possible to actually “demonstrate” that an AQL sampling plan ensures product quality.

This is where “Confidence/reliability” calculations come in as alternatives to AQL sampling plans. They are a better way to assess the quality of purchased parts. It is easy to make such calculations using tables and/or an electronic spreadsheet. It is also easy to use confidence/reliability calculations to provide evidence of product quality. The statistical rationale for such calculations is easy to explain and demonstrate, which is why these calculations constitute strong and reliable alternatives to AQL sampling plans.
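One widely used confidence/reliability calculation for attribute data is the success-run theorem, n = ln(1 - C) / ln(R): the number of units that must all pass, with zero failures, to claim reliability R at confidence C. The sketch below assumes zero-failure sampling; it is an illustration of the general technique, not the specific spreadsheet method any particular speaker may present:

```python
import math

def sample_size_zero_failures(confidence, reliability):
    """Success-run theorem: units that must all pass (zero failures) to
    claim the given reliability at the given confidence level:
    n = ln(1 - C) / ln(R)."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

def demonstrated_reliability(confidence, n):
    """Reliability demonstrated at the given confidence when n units all pass."""
    return (1 - confidence) ** (1.0 / n)

# 95% confidence that reliability is at least 95% ("95/95")
print(sample_size_zero_failures(0.95, 0.95))  # 59
```

The result, 59 units with zero failures for a 95/95 claim, is the kind of explicit statistical rationale that an AQL sampling plan alone does not supply.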

A learning session on the alternatives to AQL sampling plans

These alternatives to AQL sampling plans will be the core of a learning session that Compliance4All, a leading provider of professional training for all areas of regulatory compliance, will be organizing. John N. Zorich, a senior consultant for the medical device manufacturing industry, will be the speaker at this webinar; to enroll, all that is needed is to visit

A complete heads-up on the alternatives to AQL sampling plans

At this webinar on the alternatives to AQL sampling plans, the speaker will explain the pros and cons of ANSI Z1.4 and Squeglia’s C=0 in detail. He will highlight the weaknesses of such plans vis-à-vis meeting regulatory requirements. John will offer real-world examples of how using such sampling plans leads to the production of non-conforming product, to reinforce the learning on the alternatives to AQL sampling plans.

He will also examine ISO and FDA regulations and guidelines regarding the use of statistics, especially with regard to sampling plans. As part of the alternatives to AQL sampling plans, John will explain the advantages of “confidence/reliability” calculations. Such calculations are demonstrated for attribute data (pass/fail, yes/no data) as well as for variables data (i.e., measurements). If variables data is Normally distributed, the calculations are extremely simple. The webinar explains how to handle non-Normal data and provides the methods, formulas, and tools to handle such situations.

The webinar on alternatives to AQL sampling plans ends with a discussion of how one OEM manufacturer has implemented “confidence/reliability” calculations instead of AQL sampling plans for all of its clients. The speaker will offer suggestions for how to use “confidence/reliability” QC specifications instead of “AQL” QC specifications. The use of “reliability plotting” for assessing product reliability during R&D is also discussed.

The speaker will talk on the following topics during this session:

·  AQL and LQL sampling plans

·  OC Curves

·  AOQL

·  ANSI Z1.4

·  Squeglia’s C=0

·  Confidence/Reliability calculations for

     o  Attribute data

     o  Normally-distributed variables data

     o  Non-Normal data

·  Transformations to Normality

·  K-tables

·  Normal Probability Plot

·  Reliability Plotting

Getting a grasp of the FDA’s New Enforcement of 21 CFR Part 11

That the FDA has become more rigorous in the enforcement of Part 11 through its new Part 11 and ongoing data integrity inspection and enforcement program is evident from this: in just the last three years, it has issued more than 30 Warning Letters citing Part 11 and data integrity violations. The most common citations relate not only to inadequate integrity, security and availability of electronic records, but also to the validation of software and computer systems.

Among the most important questions about the program is what major findings inspectors are looking for. Companies therefore need to be even more vigilant than before in implementing their electronic records. In view of these developments, it is essential to have in place a proper and foolproof process for ensuring the integrity, authenticity and availability of electronic records.

A learning session on understanding the matter

A webinar from Compliance4All, a leading provider of professional training for all areas of regulatory compliance, will explain how to achieve all this. Ludwig Huber, Ph.D., Director of Labcompliance, will be the speaker at this session. To understand how to get a grasp of the FDA’s thinking on Part 11 enforcement, please register for this webinar by visiting

Practical lessons

Dr. Huber will use industry-proven case studies to show how to avoid 483 inspectional observations and Warning Letters. With this learning, participants will be able to prepare their organizations for trouble-free Part 11 related inspections.

Dr. Huber will set out a detailed, six-step plan for helping companies maintain these records. The webinar will also feature several other strategies and learning experiences to make sure that there will be no surprises should the FDA visit a participant’s company.

As an additional bonus to enable easy implementation, Dr. Huber will provide participants of this webinar with three documents:

o  Checklist: Part 11 compliance

o  Case Studies: How to avoid Part 11 related 483’s and Warning Letters

o  SOP: Electronic Audit trail: Specifications, Implementation, Validation

These are the areas this webinar will cover:

o  FDA’s current inspection and enforcement practices

o  FDA’s new interpretation: learning from FDA inspection reports

o  Strategy for cost-effective implementation of Part 11: A six step plan

o  Recommended changes to existing Part 11 programs to reduce costs

o  Justification and documentation for the FDA and your management

o  Going through case studies from laboratories, offices and manufacturing, with graphical workflows of records, step-by-step descriptions, and recommendations for individual Part 11 requirements, with justifications and documentation for the FDA and your management.

o  Case studies on how to avoid or respond to Part 11 related observations, with corrective actions to fix current issues and preventive actions to prevent recurrence of the same or similar issues.

o  How to prepare your company for Part 11 Inspections.

Making Big Data big in terms of effectiveness

Our world today is unthinkable without data. We seem to be flooded by it. Day in and day out, the world processes trillions of bytes. Big Data seems to be everywhere and has compounded our already heavy reliance on data.

But has this proliferation of data made any significant difference to our lives? Has it made our business decision-making any more effective or insightful? What has this surfeit of data meant in terms of usefulness and value? Does reliance on Big Data necessarily mean better business decisions?

Making sense of data

Translating reams of this data into something useful is a big challenge in today’s world. This capability needs to be translated into tangible competitive strength if quality and compliance are to be improved. Data Management and Quality metrics are important tools that can help in a host of important functions, such as forecasting, resource allocation, risk management, decision making, and continuous improvement.

Susanne Manz, an accomplished leader in the medical device industry with an emphasis on quality, compliance, and Six Sigma, will impart the insight needed to make sense of Big Data to participants of a webinar that Compliance4All, a highly popular provider of professional training for all areas of regulatory compliance, is organizing.

To understand what perceptiveness can be brought to understanding and analyzing Big Data to aid decision-making, just register by logging on to

Absolutely useful aid in decision-making for management review

Management Review, among the fundamental requirements of a suitable Quality System, relies on timely, accurate and complete information to make risk-based decisions. An organization immersed in data that leads nowhere in providing the critical information needed to ensure product safety and effectiveness gets no use from this kind of Big Data.

Organizations need not heaps of data, but the ability to draw from that data the measures that let them understand quality, compliance, and customer satisfaction. This is the test of the accuracy, completeness and timeliness of their data. The aim of this webinar is to help participants develop the data management processes that optimize their quality system’s efficiency and effectiveness.

Susanne will cover the following areas at this webinar:

o  What metrics are needed for quality and compliance success

o  Sources of data

o  Analytics capabilities

o  Descriptive and Predictive Data

o  Structure and process for managing data

o  Data Governance

o  Data Preparation

o  Using data for forecasting, continuous improvement, and management review.