Medical device hazard analysis, the core of medical devices

Medical device hazard analysis is of vital importance to a medical device. It is at the heart of medical devices because if a device is not analyzed thoroughly for the hazards, or dangers, that it poses, it is likely to cause problems of one kind or another for the user. Many a time, it becomes a matter of life and death. This is why medical device hazard analysis is of foremost importance.

So, what is medical device hazard analysis? Medical device hazard analysis may be defined as a structured method of analyzing the inherent and potential problems that a medical device could have at any stage of its production or after it is released into the market.

The need for medical device hazard analysis

Medical device hazard analysis has to be done for a number of reasons. It is required by the FDA as part of a product development Design Control Program. The FDA recommends ISO 14971 as the standard for medical device hazard analysis. This is because the ISO 14971 hazard analysis standard is considered the most comprehensive of all medical device hazard analysis tools.

What makes this so is that ISO 14971 takes risk into consideration in the normal (non-fault) state as well, as opposed to other tools such as Fault Tree Analysis (FTA) and Failure Modes and Effects Analysis (FMEA), which only consider fault conditions. The latter two, in relation to the ISO 14971 standard, are more suitable as tools for reliability than for product safety.

However, despite its uses, medical device hazard analysis in relation to ISO 14971 is considered quite complex, because a few important terms are interpreted freely and broadly in practice, even though they are fairly straightforward in terms of their definitions. Some of the terms that cause confusion over their interpretation when they are put to practical use include:

Hazard: Generally described as a potential source of harm

Hazardous situation: A circumstance in which people, property or the environment are exposed to one or more hazards

Harm: Any degree of physical injury or damage caused by the medical device to the people working with it, or to property or the environment

Causative event: An event that can be identified as the source or cause of an adverse event involving a medical device

ALARP: As Low As Reasonably Practicable, a judgment of how to weigh a risk against the benefits of, and effort involved in, reducing it further. This involves having to take a tricky decision in many situations

Risk index: A score, typically combining severity and probability, that helps determine the course of action based on the category into which it falls (a minimal sketch follows this list)

Residual risk: The level or extent of risk that remains after all the measures for risk control have been implemented.
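
To make the risk index idea concrete, here is a minimal sketch in Python. ISO 14971 does not prescribe particular scales or cutoffs, so the 1-5 severity and probability scales, the category boundaries and the function names below are all illustrative assumptions, not a prescribed method:

# Illustrative sketch only: the scales and cutoffs are hypothetical examples.
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "critical": 4, "catastrophic": 5}
PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4, "frequent": 5}

def risk_index(severity: str, probability: str) -> int:
    """Combine severity and probability scores into a single risk index."""
    return SEVERITY[severity] * PROBABILITY[probability]

def risk_category(index: int) -> str:
    """Map a risk index to an action category (cutoffs are illustrative)."""
    if index <= 4:
        return "acceptable - no action required"
    if index <= 12:
        return "ALARP - reduce as low as reasonably practicable"
    return "unacceptable - risk control measures mandatory"

idx = risk_index("serious", "occasional")   # 3 * 3 = 9
print(idx, "->", risk_category(idx))        # 9 -> ALARP - ...

In practice, each organization defines and justifies its own scales and acceptance criteria in its risk management plan.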

There is more to medical device hazard analysis than these terms

Having to take all these and other factors into consideration makes medical device hazard analysis a challenging matter, because it should be done in such a manner that nothing is left to chance.

Want to understand the intricacies of medical device hazard analysis? Then, enroll for a highly relevant and absorbing session on medical device hazard analysis. This webinar is being organized by Compliance4All, a highly recognized provider of professional trainings for all the areas of regulatory compliance. Just log on to http://www.compliance4all.com/control/w_product/~product_id=501207?Wordpress-SEO to register.

Clearing the confusion about terms in medical device hazard analysis

At this session, Edwin Waldbusser, who has been consulting in the US and internationally in the areas of design control, risk analysis and software validation, will be the speaker. He will throw light on all the confusing terms listed above, and will offer clarity on how to prepare a thorough medical device hazard analysis, which will help those participating in this learning session.

Edwin will go step by step through a template for hazard analysis to help clear the confusion about the meaning of these terms and make the process clear. He will discuss examples of hazards and hazardous situations and explain how to deal with residual risk. He will walk participants step by step through a typical medical device hazard analysis.

This session on medical device hazard analysis will also explain how to integrate Human Factors studies into the Hazard Analysis, and how to integrate the Hazard Analysis into the design program.

In the course of this lively discussion on medical device hazard analysis, Edwin will cover the following areas:

o  Explanation of Hazard Analysis terms

o  Explanation of the hazard analysis process using a template

o  Examples of the terms

o  Step-by-step hazard analysis examples.


Today’s Pre-control and Statistical Process Control (SPC)

Pre-control and Statistical Process Control (SPC) are key tools for monitoring and controlling the processes that go into making a product.

SPC is a key ingredient of Quality. It is an important step in reducing nonconformities and defects in any manufacturing process. SPC charts help to detect assignable causes of a process change in a timely fashion. SPC helps to identify root causes and take corrective actions before the process has drifted to a stage at which carrying out changes is neither practicable nor useful.

Examples of process changes that SPC helps to detect include trends, shifts and variation. Three items are needed for SPC to meet its goal:

o  A system that measures effectiveness in real-time

o  Tolerances that are practical and connected to customer requirements and satisfaction

o  A dial indicator (or similar gauge) with an anticipated, predictable response.

All these help SPC to determine whether a process is stable and requires no adjustment, is drifting but still capable, or is incapable of performing its function altogether. A rough illustration of how an SPC chart flags such a change follows.
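
As a rough, hypothetical illustration of how an SPC chart detects a shift, here is a minimal Python sketch. The data, the 3-sigma limits and the single run rule shown are simulated textbook conventions, not material from the webinar:

import random

random.seed(42)

# Phase I: estimate the center line and 3-sigma control limits from
# historical, in-control data (simulated here purely for illustration).
baseline = [random.gauss(10.0, 0.2) for _ in range(100)]
mean = sum(baseline) / len(baseline)
sigma = (sum((x - mean) ** 2 for x in baseline) / (len(baseline) - 1)) ** 0.5
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

# Phase II: monitor new measurements; here the process has shifted by +0.5.
new_points = [random.gauss(10.5, 0.2) for _ in range(10)]
for i, x in enumerate(new_points, 1):
    if x > ucl or x < lcl:
        print(f"point {i}: {x:.3f} is outside the control limits")

# A simple run rule: eight consecutive points on one side of the center
# line also signal a shift, even if no single point breaches a limit.
if all(x > mean for x in new_points[:8]):
    print("run of 8 points above the center line: process shift suspected")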

And now, pre-control

Pre-control, on the other hand, inspects individual units and adjusts the process and the succeeding sampling procedures based on where the measurements fall in relation to the specification limits. The focus of pre-control is individual measurements.

It uses a set of probabilities, based on assumed distributions and the location of the process, to estimate when a process adjustment is justified. Since pre-control decisions are based simply on the zone in which a measurement falls, it obviates the need for charting, and it is very responsive to process signals right from the start. A minimal sketch of the classic zone rules follows.
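
The following minimal Python sketch illustrates the classic pre-control zoning: the green zone is the middle half of the tolerance, the yellow zones are the outer quarters, and red lies outside the specification limits. The two-unit decision rule is simplified here (the full rules distinguish two yellows on the same side from two on opposite sides), so treat this as an assumption-laden sketch rather than the presenter's method:

def precontrol_zone(x: float, lsl: float, usl: float) -> str:
    """Classify one measurement into the classic pre-control zones:
    green = middle half of the tolerance, yellow = outer quarters,
    red = outside the specification limits."""
    width = usl - lsl
    green_lo, green_hi = lsl + width / 4, usl - width / 4
    if green_lo <= x <= green_hi:
        return "green"
    if lsl <= x <= usl:
        return "yellow"
    return "red"

def decide(a: float, b: float, lsl: float, usl: float) -> str:
    """Simplified two-unit pre-control rule: run on two greens (or one
    green and one yellow); adjust on two yellows; stop on any red."""
    zones = {precontrol_zone(a, lsl, usl), precontrol_zone(b, lsl, usl)}
    if "red" in zones:
        return "stop and investigate"
    if zones == {"yellow"}:
        return "adjust the process"
    return "continue running"

print(decide(10.02, 9.97, lsl=9.8, usl=10.2))   # continue running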

SPC or pre-control?

There are arguments for and against the use of SPC and pre-control as an effective means of ensuring that the process is right and that it results in the desired quality for the product.

At a webinar that is being organized by Compliance4All, a leading provider of professional trainings for all the areas of regulatory compliance, the speaker, Jd Marhevko, who has been involved in Operations and Quality/Lean/Six Sigma efforts across a variety of industries for more than 25 years, will explain all the aspects of pre-control. To hear her perspective on pre-control, please register for this webinar by visiting http://www.compliance4all.com/control/w_product/~product_id=501074?Linkedin-SEO

What makes this webinar special is that it has consistently ranked in the top 1-5% at previous conferences at more than five venues. It was featured in the 2015 Spring edition of ASQ QMD’s Quality Management Forum (ASQ-QM.org). A webinar on this topic was delivered in 2015, via the ASQ QMD Linkage Technical Committee, to over 1300 respondents through the IMA and ASQ QMD.

Tools needed for pre-control

At this hour-long session, Jd will explain all the elements of pre-control in Quality. She will show participants how to draft and create a pre-control chart, and will run through a process model of the next steps and decisions.

The aim of this session is to equip participants with the knowledge needed for reducing the complexity of the system and bringing about an improvement in the effectiveness and efficiency of their Quality Management Systems.

A session packed with interaction and practical application of principles

A major component of this webinar on SPC and pre-control is that Jd will share the results of case studies. Participants can apply the knowledge gained at this webinar immediately in their work with respect to the following:

–       Measurement System Analysis (MSA): Jd will conduct a high level overview of MSA. This will help the participants get a grasp of the need for putting an effective measuring system in place ahead of implementing pre-control

–       Cpk Overview: To help participants gain baseline capability in advance of implementation of pre-control, a high level Cpk overview will be conducted

–       Normal Distribution: How the cumulative distribution function of the normal distribution is used to estimate and establish the zones on a pre-control chart (see the sketch after this list)

–       Pre-Control Chart: Jd will show participants how to apply the concepts listed above using a mock pre-control chart, where the process will be demonstrated based on the established “go/no go” zones.
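
As a hedged illustration of the Cpk and normal-CDF points above, the following Python sketch computes a process capability index and the textbook zone probabilities for a centered, just-capable process (Cp = Cpk = 1). The numbers are standard theory, not figures from the webinar:

from statistics import NormalDist

def cpk(mean: float, sigma: float, lsl: float, usl: float) -> float:
    """Cpk = min(USL - mean, mean - LSL) / (3 * sigma)."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# For a centered, just-capable process (Cp = Cpk = 1), the tolerance spans
# +/- 3 sigma, so the green zone (middle half of the tolerance) spans
# +/- 1.5 sigma. The normal CDF then gives the expected zone fractions.
nd = NormalDist(mu=0.0, sigma=1.0)
p_green = nd.cdf(1.5) - nd.cdf(-1.5)          # ~0.866
p_red = 2 * nd.cdf(-3.0)                      # ~0.0027
p_yellow = 1 - p_green - p_red                # ~0.131

print(f"Cpk = {cpk(10.0, 0.0667, 9.8, 10.2):.2f}")   # ~1.00
print(f"green {p_green:.3f}, yellow {p_yellow:.3f}, red {p_red:.4f}")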

At this webinar, Jd will cover the following areas of pre-control:

o  Reduce process complexity and minimize risk

o  Increase the effectiveness of a Core Tool

o  Increase personnel compliance in proactive process management.


Corrective and Preventive Action (CAPA) is vital to the life sciences

Corrective and Preventive Action (CAPA) is of the essence in any industry. Its importance is all the more pronounced in the life sciences industry, given that industry's direct bearing on human lives. It is the very core of a sound management system. CAPA, as the name suggests, is a set of corrective and preventive measures that need to be taken to ensure that the product meets its quality and regulatory expectations.

The essence of CAPA for the life sciences is that it should build the ability to respond to problems as they arise; more importantly, a CAPA system should help the life sciences organization anticipate, and thus prevent, problems from happening.

CAPA in the life sciences is an amalgamation of the following core ingredients:

o  Change Control

o  Continuous improvement

o  Complaint management

The need for understanding and implementing CAPA

CAPA is the very edifice of Quality Management. It is a sine qua non for meeting regulatory requirements. Its importance to the life sciences industry vis-à-vis the FDA can be gauged from the fact that in just four years starting in 2012, the number of Warning Letters the FDA issued shot up by over 12 times. Understanding and implementing the right CAPA system is critical to meeting regulatory requirements and preventing such actions from the FDA.

Common pitfalls of CAPA implementation

Many life sciences organizations face a few obstacles when it comes to CAPA implementation. The most common ones among these are:

o  Failure to achieve an integrated view of the whole process that can be managed from any source

o  Lack of understanding of the highly intricate and difficult processes

o  Not documenting every step of the CAPA process

o  Lack of clarity about the difference between a corrective and a preventive action

o  Not implementing a uniform process across the product’s or company’s numerous sites.

An important line of thinking that needs to go into approaching CAPA is that, if a life sciences product is to meet quality and regulatory requirements, understanding its failures is as important as understanding its successes. Understanding and correcting problems before they become critical and impede the product’s progress is vital to ensuring quality and meeting regulatory requirements.

Given the extremely high importance CAPA has in the regulated industries, it is but natural that a number of regulations govern this area of the life sciences. How does an organization get its understanding of these regulations right? How does it implement them in the right manner, so that its products meet the regulatory and quality requirements?

Learn to get these aspects right

All these will be the topic of a highly valuable webinar from Compliance4All, a leading provider of professional trainings for all the areas of regulatory compliance.

At this webinar, Charles H. Paul, who is the President of C. H. Paul Consulting, Inc., a regulatory, manufacturing, training, and technical documentation consulting firm, will be the speaker. To gain proper and complete insights into all the areas of CAPA for the life sciences, just register for this webinar by visiting http://www.compliance4all.com/control/w_product/~product_id=501216?Wordpress-SEO

At this session, Charles will explain how to apply core aspects of the CAPA process, which include:

o  The critical steps

o  The timing of CAPA

o  Who in the organization needs to take part in the process

o  Their roles, responsibilities and functions, and

o  The snags and hazards of the CAPA investigation process.

The content of this webinar is designed to help participants achieve a highly effective CAPA system. It will explain the purpose and function of CAPA, equipping participants to:

o  Identify and explain the relevant CAPA regulations

o  Define exception/deviation reporting and explain the process of executing the reporting process

o  Explain and trace the CAPA flow from problem identification to resolution

o  Explain the challenges and pitfalls of the CAPA process and how they are overcome.

o  Explain CAPA’s role in risk mitigation.

o  Explain how root cause analysis is executed.

At this very valuable session, Charles will cover the following areas of CAPA for the life sciences:

o  CAPA defined

o  CAPA relevant regulations

o  Exception/deviation reporting

o  CAPA process flow

o  CAPA process steps explained

o  Challenges and pitfalls of CAPAs

o  CAPA and risk mitigation

o  Root Cause Analysis.


A look into the 21st Century Cures Act

Considered one of the most significant changes introduced into the American healthcare sector since the passage of the Affordable Care Act, the 21st Century Cures Act was one of the last legislative acts signed by outgoing president Barack Obama. Signed into law in December 2016, the 21st Century Cures Act seeks to strengthen medical research, foster innovation and accelerate the development of innovative treatments for chronic ailments such as cancer.

The 21st Century Cures Act aims to strengthen funding for the National Institutes of Health (NIH) by allocating over $6 billion to them. Of this, nearly $5 billion will go towards biomedical research funding. One of the highlights of the 21st Century Cures Act is the allocation of nearly $2 billion for the “Beau Biden Cancer Moonshot” initiative, named in honor of Vice President Joe Biden’s son, who succumbed to a brain tumor.

A different take on health improvement

This approach is a significant one, considering that researchers from some of the nation’s best-known science universities depend on NIH funding for their research. Nearly two thirds of the major drugs that have entered the market since 2000 have been the result of NIH research.

In addition, the 21st Century Cures Act will also have a major impact on mental health, one of the most notable features of the Act. It allocates over a billion dollars for addressing opioid and other addictions in the US, and the health-related complications that arise from them, which are significant contributors to the fall in national health standards and in the productivity of the population in its prime.

Will the 21st Century Cures Act change the FDA approval process?

While addressing this core aspect of scientific research in helping to treat chronic diseases, the 21st Century Cures Act also focuses on another very critical gatekeeper for medicines: the FDA. Since no drug can enter the market without FDA approval, and the FDA approval process is very lengthy, expensive and cumbersome, the 21st Century Cures Act seeks to address this fundamental issue by introducing changes into the approval process for new drugs, as well as medical devices.

Concerns and criticisms

Reservations have been expressed about the effectiveness of the 21st Century Cures Act. The main concern is that the FDA’s regulatory approval process for drugs could get diluted, putting patients’ lives at risk and reducing the ability of the FDA to protect lives with its regulations.

There are many issues at stake in this highly important legislation. A webinar that is being organized by Compliance4All, a leading provider of professional trainings for all the areas of regulatory compliance, will discuss the prospects and downsides of this very important law.

At this webinar, John E. Lincoln, a very experienced medical device and regulatory affairs consultant, will offer a complete insight into the provisions of the 21st Century Cures Act. This webinar will be highly meaningful and useful for research institutions and organizations that are directly concerned and connected with the FDA and its administration of emerging technologies and general health. Enroll for this webinar by registering at http://www.compliance4all.com/control/w_product/~product_id=501213?Worpress-SEO

Analysis of all the major aspects of the 21st Century Cures Act

John will explain and analyze all the main areas of the 21st Century Cures Act. He will scrutinize how the Act will concern the FDA, the impact it will have on other areas of the Federal government, and the possible benefits and drawbacks for industries and research institutions that will get directly and indirectly affected by the Act, and the general public.

John will cover the following areas at this webinar:

o  The Act’s 3 Main Areas

o  Increased Funding for Medical Research

o  Speeding the Development and Approval of Experimental Treatments

o  Overhauling Federal Policy on Mental Health Care

o  Stated Advantages

o  Concerns Expressed

o  What Has Happened So Far


Lyophilization: Validation and regulatory approaches

Lyophilization and its validation and regulatory approaches form an important aspect of parenteral drugs.

Lyophilization is a process in which a product is frozen and placed in a vacuum, and water is removed from it by sublimation, so that the ice passes directly from solid to vapor without going through a liquid stage. The process consists of three phases: freezing, primary drying (sublimation) and secondary drying (desorption).

Lyophilization is considered complex

Parenteral products such as anti-infective drugs, many biotechnological products and in-vitro diagnostic products are some of the areas in which lyophilization is used. However, no matter the field in which it is used, lyophilization is considered rather tough.

This is mainly because obtaining the finished product from the lyophilization process is rather complex. It has to be carried out through a number of minute and delicate steps, and complex technology goes into the entire process.

Solution formulation, the filling of vials and the validation of that process, sterilization, engineering, and the scaling up of the lyophilization cycle and its validation are some of the issues that make lyophilization difficult. Moreover, the processing and handling time needed for lyophilization is quite high, as are the cost and complexity of the machinery and equipment needed for it.

Regulatory issues as well

On top of all this, the process of lyophilization has to meet FDA compliance requirements, which adds to its complexity. However, despite these difficulties, lyophilization is a process that cannot be done away with, since it has tangible benefits for parenteral products.

Is there a way out of this difficulty? Should the complexity associated with lyophilization put manufacturers off? No. A webinar from Compliance4All, a leading provider of professional trainings for the areas of regulatory compliance, will suggest ways of simplifying lyophilization while meeting the required regulatory expectations.

Simplified learning about the complex art of lyophilization

At this webinar, the speaker, John Godshalk, will seek to simplify the science of lyophilization. Currently a Senior Consultant at the Biologics Consulting Group, John brings his many years of experience in the regulatory areas to this webinar. To make lyophilization simpler and to derive the benefit of this teaching, please register for this webinar by logging on to http://www.compliance4all.com/control/w_product/~product_id=501111?Wordpress-SEO

Understanding the FDA’s line of thinking

At this session on lyophilization, which will be highly useful for professionals such as Compliance Managers, Process Engineers, Validation Managers, Validation Engineers and Regulatory Managers, John will explain the way the FDA thinks when it comes to inspection and regulation of the lyophilization process, equipment, and controls.

The speaker will offer insights into what the FDA and other regulatory bodies consider as important while inspecting lyophilizers, and in the validation process. He will devote a major portion of the presentation to the regulatory aspects of lyophilization.

While explaining the science and art of developing lyophilization cycles and the way in which lyophilizers work and are controlled, John will take up other important areas of lyophilization, such as lyophilization controls, of which computer controls and validation are a part, and quality-related aspects, such as how to obtain a quality end product.

During the course of the presentation, the speaker will cover the following areas:

o  Science of Lyophilization

o  Cycle development and tools

o  Validation of the Lyo Cycle

o  Lyo equipment validation

o  Regulatory requirements

o  How the lyo process and equipment are inspected

o  The science and the art


Understanding and handling payment issues

A financial organization, or an organization involved in any business for that matter, faces the prospect of making or receiving duplicate, fraudulent or late payments. These are the typical payment issues an organization is likely to face at some point in its business.

Payment issues are something almost no organization is likely to be free from. Duplicate invoice payments, just one of the payment issues an organization is likely to face, account for losses of something like $100 million over a three-year period for medium-sized organizations alone. The amount is likely to be several times higher for large companies and those in the public sector, which most likely deal with billions of dollars in transactions.

Payment issues have their implications

The consequences of payment issues, be they duplicate, fraudulent or late payments, can be enormous. They are likely to lead to business losses: late payments, for instance, hinder businesses from investing in other productive activities. Although payment issues can arise at a number of points, it usually takes an organization quite a while to detect one, whether a duplicate, fraudulent or late payment, and it can take herculean efforts to get to the bottom of the issue.

Payment issues can happen due to a number of reasons

There are any number of reasons for which payment issues could arise. Manual data entry and processing, characters overlooked during Accounts Payable (AP) entry or by Automated Clearing House (ACH) processors, oversights in manual checks, and the overlapping or duplication of payments made from varied sources are just some of the reasons payment issues can occur in businesses. A minimal sketch of one simple check against duplicates follows.
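
As a minimal, hypothetical illustration of catching one class of duplicate payments, the Python sketch below normalizes vendor names and invoice numbers so that trivial data-entry differences do not hide a repeat payment. The record layout, field names and 30-day window are all illustrative assumptions:

from datetime import date

# Hypothetical invoice records; field names are illustrative.
invoices = [
    {"vendor": "ACME Corp", "invoice_no": "INV-1001", "amount": 1250.00, "paid": date(2017, 3, 1)},
    {"vendor": "acme corp", "invoice_no": "INV1001",  "amount": 1250.00, "paid": date(2017, 3, 9)},
    {"vendor": "Widget Co", "invoice_no": "88-421",   "amount": 310.50,  "paid": date(2017, 3, 2)},
]

def key(inv):
    """Normalize vendor and invoice number so trivial data-entry
    differences (case, punctuation) do not hide a duplicate."""
    vendor = "".join(ch for ch in inv["vendor"].lower() if ch.isalnum())
    number = "".join(ch for ch in inv["invoice_no"] if ch.isalnum()).lower()
    return vendor, number, round(inv["amount"], 2)

seen = {}
for inv in invoices:
    k = key(inv)
    if k in seen and abs((inv["paid"] - seen[k]).days) <= 30:
        print("possible duplicate payment:", inv)
    else:
        seen[k] = inv["paid"]

Real AP audit tools use far richer matching (fuzzy amounts, transposed digits, vendor aliases); this only shows the basic idea.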

Although the Sarbanes-Oxley (SOX) Act has put a number of checks and balances into the payment practices of corporations, there are still a good number of loopholes that need to be plugged if payment issues are to be addressed. How do organizations, especially those in finance, mitigate payment issues? What steps do they need to take to understand the requirements set out by the SOX Act, or to take their own measures to prevent payment issues arising out of duplicate, fraudulent or late payments?

Learn the aspects of payment issues at a learning session

All these will be addressed at a very valuable learning session on this topic. The webinar, being organized by Compliance4All, a leading provider of professional trainings for all the areas of regulatory compliance, will have Ray Graber as speaker.

Ray is a senior BFSI professional who brings a deep and thorough understanding of banking, technology, and finance. To hear from him on how to understand and address payment issues such as duplicate, fraudulent or late payments; just register for this webinar by logging on to http://www.compliance4all.com/control/w_product/~product_id=501138?Wordpress-SEO

Insights for understanding payment issues

At this webinar, Ray will help participants understand how to foresee payments issues and strategize solutions. He will offer suggestions about how to put risk management plans in place to do this. The suggestions Ray will offer at this webinar will help participants from banks and corporations to get a clearer understanding of each other’s concerns and constraints, and ways of addressing them.

This session will arm them with the tools necessary for accurately auditing their existing processes and limiting the potential for fraud. He will teach them how to understand the settlement process, which is part of the banking business. In other words, attending this session will equip participants with the insight needed for understanding payment issues and tackling duplicate, fraudulent or late payments.

At this session, Ray will cover the following areas:

o  Payment System Risk Policy

o  FFIEC Action Summary for Retail Payments

o  Areas of Risk

o  Risk Assessment Activities

o  People, Processes, and Products

o  Is there an optimal organizational structure for managing payments strategy?

o  Are there best practices that apply to my institution?

o  What are the hurdles in establishing an organization focused on the payments business?

o  Are there common pitfalls?


Understanding normality tests and normality transformations

Many statistical tests and methods require that the input data be “normally distributed”. Typical examples include the methods used for Student’s t-tests, ANOVA tables, F-tests, Normal Tolerance Limits, and Process Capability Indices.

A core criterion for ensuring correct results is that the raw data used in such calculations be “normally distributed”. It is in view of this that assessing whether or not data is “normally distributed” is a critical component of assuring the FDA that a company’s “valid statistical techniques” are “suitable for their intended use”.

Now, which types of data are normally distributed? Dimensional data, such as length, height and width, are typically normally distributed. Which kinds of data are usually non-normal? Typical examples of non-normal data are burst pressure, tensile strength, and time (or cycles) to failure. Some non-normal data can be transformed into normality, so that statistical calculations run on the transformed data remain valid. A minimal sketch of such a test-and-transform step follows.
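
As a hedged illustration of this test-and-transform idea (assuming NumPy and SciPy are available), the sketch below applies the Shapiro-Wilk test to simulated time-to-failure data, which is typically right-skewed, and then to its log transform. The simulated data and the choice of a log transform are illustrative assumptions only:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated time-to-failure data: lognormal, hence right-skewed and
# non-normal, as is typical for reliability measurements.
ttf = rng.lognormal(mean=3.0, sigma=0.5, size=60)

w, p = stats.shapiro(ttf)
print(f"raw data:  W={w:.3f}, p={p:.4f}")   # small p -> evidence against normality

# A log transform often normalizes lognormal-ish data; statistical
# calculations can then be run validly on the transformed values.
w, p = stats.shapiro(np.log(ttf))
print(f"log data:  W={w:.3f}, p={p:.4f}")   # large p -> consistent with normality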

 

Get an understanding of normally distributed data and normality transformations

Statistics professionals can get a thorough understanding of what it means for data to be normally distributed, and of how to transform non-normal data into normality, at a webinar that is being organized by Compliance4All, a leading provider of professional trainings for the areas of regulatory compliance. At this webinar, John N. Zorich, a senior consultant for the medical device manufacturing industry, will be the Speaker. To register, please visit

http://www.compliance4all.com/control/w_product/~product_id=501103?Wordpress-SEO

John will take the participants of this webinar through all the aspects of “normally distributed” data and how to transform non-normal data. He will explain the various elements of data, such as:

o  What it means to be “normally distributed”

o  How to assess normality

o  How to test for normality

o  How to transform non-normal data into normal data

o  How to justify the transformations to internal and external quality system auditors.

Normality tests and normality transformations are a combination of graphical and numerical methods that have been in use for many years. These methods are essential whenever one has to apply a statistical test or method whose fundamental assumption is that the input data is normally distributed. John will offer an explanation of these methods.

What goes into normality testing and normality transformation?

Normality “testing” involves creating a “normal probability plot” and calculating simple statistics for comparison with critical values in published tables. A normality “transformation”, by contrast, involves making simple changes to each of the raw-data values, such that the resulting values are more normally distributed than the original raw data. A sketch of the probability-plot idea follows.
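
Here is a minimal sketch of the probability-plot idea, assuming SciPy and Matplotlib are available; the simulated burst-pressure data and the use of r-squared as a rough fit measure are illustrative assumptions, not the speaker’s published tables or critical values:

import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
burst = rng.weibull(1.5, size=50) * 100    # skewed, non-normal sample

# probplot orders the data and plots it against normal quantiles; normal
# data falls close to a straight line, and r^2 quantifies the fit.
(osm, osr), (slope, intercept, r) = stats.probplot(burst, dist="norm", plot=plt)
print(f"r^2 = {r**2:.3f}")                 # low r^2 suggests non-normality
plt.title("Normal probability plot")
plt.show()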

Professionals whose work involves normality testing, such as QA/QC Supervisors, Process Engineers, Manufacturing Engineers, QA/QC Technicians, Manufacturing Technicians, and R&D Engineers, will find this session very helpful. John will offer guidance on both the objective and the somewhat subjective decision-making that go into evaluating the results of “tests” and “transformations”.

John will cover the following areas at this webinar:

o  Regulatory requirements

o  Binomial distribution

o  Historical origin of the Normal distribution

o  Normal distribution formula, histogram, and curve

o  Validity of Normality transformations

o  Necessity for transformation to Normality

o  How to use Normality transformations

o  Normal Probability Plot

o  How to evaluate Normality of raw data and transformed data

o  Significance tests for Normality

o  Evaluating the results of a Normality test

o  Recommendations for implementation

o  Recommended reference textbooks.