Lyophilization: Validation and regulatory approaches

Lyophilization, along with its validation and the regulatory approaches to it, is an important aspect of parenteral drug manufacturing.

Lyophilization is a process in which a product is frozen and placed under vacuum so that water can be removed: the ice passes directly from solid to vapor, bypassing the liquid stage. The process consists of three phases: freezing, primary drying (sublimation), and secondary drying (desorption).
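
To make the three phases concrete, here is a minimal, purely illustrative Python sketch of how a lyophilization cycle recipe might be represented during cycle development; every phase set point and duration below is a hypothetical placeholder, not a validated recipe.

from dataclasses import dataclass

@dataclass
class LyoPhase:
    name: str
    shelf_temp_c: float            # shelf temperature set point, deg C (hypothetical)
    chamber_pressure_mtorr: float  # chamber pressure set point, mTorr (hypothetical)
    duration_h: float              # hold time in hours (hypothetical)

# Illustrative cycle: freezing, then primary drying (sublimation),
# then secondary drying (desorption). Values are placeholders only.
example_cycle = [
    LyoPhase("freezing", -45.0, 760000, 4.0),
    LyoPhase("primary drying (sublimation)", -20.0, 100, 24.0),
    LyoPhase("secondary drying (desorption)", 25.0, 50, 8.0),
]

for phase in example_cycle:
    print(f"{phase.name}: shelf {phase.shelf_temp_c} C, "
          f"{phase.chamber_pressure_mtorr} mTorr, {phase.duration_h} h")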

Lyophilization is considered complex

Lyophilization is used for parenteral products such as anti-infective drugs, many biotechnology products, and in-vitro diagnostic products. Whatever the field, however, the process is considered demanding.

This is mainly because obtaining a finished product from lyophilization involves a series of delicate, precisely controlled steps, supported by complex technology throughout.

Solution formulation, vial filling and its validation, sterilization, engineering, and scale-up of the lyophilization cycle and its validation are among the issues that make lyophilization difficult. In addition, the processing and handling time is long, and the machinery and equipment required are costly and complex.

Regulatory issues as well

On top of all this, the lyophilization process has to comply with FDA requirements, which adds to its complexity. Yet despite these difficulties, lyophilization cannot be done away with, since it offers tangible benefits for parenteral products.

Is there a way out of this difficulty? Should the complexity associated with lyophilization put manufacturers off? No. A webinar from Compliance4All, a leading provider of professional trainings for the areas of regulatory compliance, will suggest ways of simplifying lyophilization while meeting the required regulatory expectations.

Simplified learning about the complex art of lyophilization

At this webinar, the speaker, John Godshalk, will seek to simplify the science of lyophilization. Currently a Senior Consultant at the Biologics Consulting Group, John brings his many years of regulatory experience to this webinar. To benefit from this learning and make lyophilization simpler, please register for this webinar by logging on to http://www.compliance4all.com/control/w_product/~product_id=501111?Wordpress-SEO

Understanding the FDA’s line of thinking

At this session on lyophilization, which will be highly useful for professionals such as Compliance Managers, Process Engineers, Validation Managers, Validation Engineers, and Regulatory Managers, John will explain the way the FDA thinks when it comes to inspection and regulation of the lyophilization process, equipment, and controls.

The speaker will offer insights into what the FDA and other regulatory bodies consider important when inspecting lyophilizers and validating the process. He will devote a major portion of the presentation to the regulatory aspects of lyophilization.

While explaining the science and art of developing lyophilization cycles and the way lyophilizers work and are controlled, John will take up other important areas of lyophilization, such as lyophilization controls (including computer controls and their validation) and quality-related aspects, such as how to obtain a quality finished product.

During the course of the presentation, the speaker will cover the following areas:

o  Science of Lyophilization

o  Cycle development and tools

o  Validation of the Lyo Cycle

o  Lyo equipment validation

o  Regulatory requirements

o  How the lyo process and equipment are inspected

o  The science and the art

https://www.fda.gov/ICECI/Inspections/InspectionGuides/ucm074909.htm

Understanding and handling payment issues

A financial organization, or indeed any business, faces the prospect of duplicate, fraudulent, or late payments. These are the typical payment issues an organization is likely to encounter at some point in its operations.

Few organizations are entirely free of payment issues. Duplicate invoice payments, just one such issue, account for losses on the order of $100 million over a three-year period for medium-sized organizations alone. The figure is likely to be several times higher for large companies and public-sector bodies, which typically handle billions of dollars in transactions.

Payment issues have serious implications

The consequences of payment issues, be they duplicate, fraudulent, or late payments, can be severe. They lead to business losses; late payments, for instance, hinder a business from investing in other productive activities. And although payment issues can arise at any number of points, it usually takes an organization quite a while to detect them, and sometimes herculean effort to get to the bottom of them.

Payment issues can arise for a number of reasons

There are any number of reasons payment issues can arise: manual data entry and processing, characters overlooked when entering Accounts Payable (AP) records or by Automated Clearing House (ACH) processors, oversights in manual checks, and overlapping or duplicated payments made from different sources, to name just a few.
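
As a purely illustrative example of the kind of automated check that can catch one of these issues, the short Python sketch below groups payment records by vendor, normalized invoice number, and amount to flag potential duplicates; the record layout, field names, and matching rule are assumptions made for the sake of the example, not a standard.

from collections import defaultdict

# Illustrative payment records; field names are assumptions for this sketch.
payments = [
    {"vendor": "Acme Corp", "invoice": "INV-1001", "amount": 5200.00, "date": "2017-03-01"},
    {"vendor": "ACME CORP", "invoice": "inv1001",  "amount": 5200.00, "date": "2017-03-15"},
    {"vendor": "Globex",    "invoice": "INV-2002", "amount": 310.50,  "date": "2017-03-02"},
]

def normalize(invoice_number):
    # Strip punctuation and case so "INV-1001" and "inv1001" compare equal.
    return "".join(ch for ch in invoice_number.lower() if ch.isalnum())

groups = defaultdict(list)
for p in payments:
    key = (p["vendor"].strip().lower(), normalize(p["invoice"]), round(p["amount"], 2))
    groups[key].append(p)

for key, records in groups.items():
    if len(records) > 1:
        print("Potential duplicate payment:", key, "->", len(records), "records")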

Although the Sarbanes-Oxley (SOX) Act has built a number of checks and balances into corporate payment processes, there are still plenty of loopholes that need to be plugged if payment issues are to be addressed. How do organizations, especially those in finance, mitigate payment issues? What steps do they need to take to understand the regulations set out by the SOX Act, or to take their own measures to prevent duplicate, fraudulent, or late payments?

Learn about payment issues at a learning session

All of these questions will be addressed at a very valuable learning session on this topic. The webinar, organized by Compliance4All, a leading provider of professional trainings for all the areas of regulatory compliance, will have Ray Graber as speaker.

Ray is a senior banking, financial services, and insurance (BFSI) professional who brings a deep and thorough understanding of banking, technology, and finance. To hear from him on how to understand and address payment issues such as duplicate, fraudulent, or late payments, just register for this webinar by logging on to http://www.compliance4all.com/control/w_product/~product_id=501138?Wordpress-SEO

Insights for understanding payment issues

At this webinar, Ray will help participants understand how to foresee payment issues and strategize solutions. He will offer suggestions on putting risk management plans in place to do this. These suggestions will help participants from banks and corporations gain a clearer understanding of each other's concerns and constraints, and of ways to address them.

This session will arm them with the tools necessary for accurately auditing their existing processes and limiting the potential for fraud. He will also teach them how to understand the settlement process, which is part of the banking business. In other words, attending this session will equip participants with the insight needed to understand and tackle payment issues relating to duplicate, fraudulent, or late payments.

At this session, Ray will cover the following areas:

o  Payment System Risk Policy

o  FFIEC Action Summary for Retail Payments

o  Areas of Risk

o  Risk Assessment Activities

o  People, Processes, and Products

o  Is there an optimal organizational structure for managing payments strategy?

o  Are there best practices that apply to my institution?

o  What are the hurdles in establishing an organization focused on the payments business?

o  Are there common pitfalls?

http://www.infor.com/content/whitepapers/detecting-prev-dup-invoice.pdf/

Understanding normality tests and normality transformations

Many statistical tests and methods require that the input data be "normally distributed." The calculations used for Student's t-tests, ANOVA tables, F-tests, Normal Tolerance limits, and Process Capability Indices typically include this requirement.

A core criterion for ensuring correct results is that the raw data used in such calculations be "normally distributed." It is in view of this that assessing whether or not data is "normally distributed" is a critical component of assuring the FDA that a company's "valid statistical techniques" are "suitable for their intended use."

Which types of data are normally distributed? Dimensional data, such as length, height, and width, are typically normally distributed. Which kinds of data are usually non-normal? Typical examples are burst pressure, tensile strength, and time or cycles to failure. Some non-normal data can be transformed into normality, so that statistical calculations run on the transformed data remain valid.
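
As a hedged illustration of that last point, the Python sketch below simulates skewed time-to-failure data, applies a Box-Cox transformation (one common choice; the simulated data and the transform are assumptions made for illustration), and uses the Shapiro-Wilk test to compare normality before and after.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
time_to_failure = rng.lognormal(mean=3.0, sigma=0.6, size=60)  # skewed, non-normal by design

_, p_raw = stats.shapiro(time_to_failure)          # a small p-value suggests non-normality
transformed, lam = stats.boxcox(time_to_failure)   # Box-Cox chooses lambda by maximum likelihood
_, p_transformed = stats.shapiro(transformed)

print(f"Shapiro-Wilk p-value, raw data: {p_raw:.4f}")
print(f"Shapiro-Wilk p-value, Box-Cox (lambda = {lam:.2f}): {p_transformed:.4f}")
# A larger p-value after the transform supports running normal-based methods
# on the transformed values rather than on the raw data.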

Get an understanding of normally distributed data and normality transformations

Statistics professionals can get a thorough understanding of what it means for data to be normally distributed, and of how to transform non-normal data into normality, at a webinar that is being organized by Compliance4All, a leading provider of professional trainings for the areas of regulatory compliance. At this webinar, John N. Zorich, a senior consultant for the medical device manufacturing industry, will be the speaker. To register, please visit

http://www.compliance4all.com/control/w_product/~product_id=501103?Wordpress-SEO

John will take the participants of this webinar through all the aspects of “normally distributed” data and how to transform non-normal data. He will explain the various elements of data, such as:

o  What it means to be “normally distributed”

o  How to assess normality

o  How to test for normality

o  How to transform non-normal data into normal data

o  How to justify the transformations to internal and external quality system auditors.

Normality tests and normality transformations are a combination of graphical and numerical methods that have been in use for many years. These methods are essential whenever one applies a statistical test or method whose fundamental assumption is that the input data are normally distributed. John will offer an explanation of these methods.

What goes into normality testing and normality transformation?

Normality "testing" involves creating a "normal probability plot" and calculating simple statistics for comparison against critical values in published tables. A normality "transformation", by contrast, involves making simple changes to each of the raw-data values so that the resulting values are more normally distributed than the original raw data.
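
The following Python sketch illustrates both elements described above: a normal probability plot, plus a statistic compared against published critical values (the Anderson-Darling test is used here as one illustrative choice of such a statistic); the simulated data are an assumption made for the example.

import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
data = rng.normal(loc=10.0, scale=2.0, size=50)   # illustrative dimensional-type data

# Normal probability plot: points falling near a straight line suggest normality.
stats.probplot(data, dist="norm", plot=plt)
plt.title("Normal probability plot")
plt.savefig("probplot.png")

# Anderson-Darling statistic compared against tabulated critical values.
result = stats.anderson(data, dist="norm")
for critical_value, significance in zip(result.critical_values, result.significance_level):
    verdict = "reject normality" if result.statistic > critical_value else "no evidence against normality"
    print(f"{significance:.0f}% significance level: critical value {critical_value:.3f} -> {verdict}")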

Professionals whose work involves normality testing, such as QA/QC Supervisors, Process Engineers, Manufacturing Engineers, QA/QC Technicians, Manufacturing Technicians, and R&D Engineers, will find this session very helpful. John will offer guidance on both the objective and the somewhat subjective decision-making that goes into evaluating the results of "tests" and "transformations".

John will cover the following areas at this webinar:

o  Regulatory requirements

o  Binomial distribution

o  Historical origin of the Normal distribution

o  Normal distribution formula, histogram, and curve

o  Validity of Normality transformations

o  Necessity for transformation to Normality

o  How to use Normality transformations

o  Normal Probability Plot

o  How to evaluate Normality of raw data and transformed data

o  Significance tests for Normality

o  Evaluating the results of a Normality test

o  Recommendations for implementation

o  Recommended reference textbooks.

The importance of Design of Experiments (DoE)

Design of Experiments (DoE) is an important tool in many industries. It is a series of tests or runs, carried out repeatedly and consistently over a period of time, in which inputs are varied and the outputs or responses observed. DoE is very important in industry because it helps build an understanding of the predictability and reproducibility of an experiment.

Design of Experiments bears directly on key elements of a product, such as quality, reliability, and performance. It helps examine and investigate the inputs that lead to poor quality, and this insight enables the organization carrying out the experiments to improve its quality standards.

Ruling out chance

Design of Experiments does not rely on chance or providence to achieve the quality required of an experiment. Through a series of tests and experiments, it arrives at the optimal set of procedures needed to meet the required quality standards, so that the result is reflected in the process that goes into the product.

Fundamentally, Design of Experiments helps put a system of control in place for a product. The inputs needed to obtain a product of a defined standard or quality are determined scientifically and precisely, and that precision is arrived at by carrying out as many runs, or series of experiments, as needed.

An introduction to Design of Experiments

Understanding Design of Experiments and applying it in production will be the topic of a webinar that is being organized by Compliance4All, a leading provider of professional trainings for all areas of regulatory compliance. At this webinar, the speaker, William Levinson, the principal of Levinson Productivity Systems, P.C. and an ASQ Fellow, Certified Quality Engineer, Quality Auditor, Quality Manager, Reliability Engineer, and Six Sigma Black Belt, will explain the fundamentals of Design of Experiments.

To gain a proper understanding of the principles of Design of Experiments and to get a grasp of how to implement this concept into your systems, please register for this webinar by logging on to http://www.compliance4all.com/control/w_product/~product_id=501202?Wordpress-SEO

An understanding of the significance level in hypothesis testing

William will help participants understand how to use Design of Experiments to identify the particular items or inputs that affect quality and rule out those that do not. He will explain the concept of the significance level in hypothesis testing, which serves as a basis not only for DoE, but also for Statistical Process Control and acceptance sampling.

The webinar will also describe other uses of DoE, such as supporting Corrective and Preventive Action (CAPA) and process improvement, where it helps to identify and optimize the factors that influence a Critical to Quality (CTQ) characteristic.
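
As a rough illustration of what the significance level mentioned above means in practice, the Python sketch below simulates many two-sample comparisons and counts how often an identical process is wrongly flagged as different (the Type I, or alpha, risk) and how often a real shift is missed (the Type II, or beta, risk); the sample size, shift, and standard deviation are assumptions chosen only for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha, n, trials = 0.05, 10, 5000   # significance level, sample size, simulated experiments

# Control and experiment drawn from the SAME process: any "difference" found is a false alarm.
false_alarms = sum(
    stats.ttest_ind(rng.normal(0, 1, n), rng.normal(0, 1, n)).pvalue < alpha
    for _ in range(trials)
)

# The experiment really is shifted by one standard deviation: failing to detect it is a miss.
misses = sum(
    stats.ttest_ind(rng.normal(0, 1, n), rng.normal(1.0, 1, n)).pvalue >= alpha
    for _ in range(trials)
)

print(f"Observed Type I (alpha) rate: {false_alarms / trials:.3f} (target {alpha})")
print(f"Observed Type II (beta) rate: {misses / trials:.3f}")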

Levinson will cover the following areas at this webinar:

·        Economic benefits of DOE

·        Hypothesis testing: the foundation of DOE, SPC, and acceptance sampling

o  Null and alternate hypothesis

o  Type I or alpha risk of concluding wrongly that the experiment differs from the control (or that a process is out of control, or that an acceptable production lot should be rejected)

o  Type II or beta risk of not detecting a difference between the control and the experiment, not detecting an out of control condition, and accepting a production lot that should be rejected

·        Factors, levels, and interactions

o  Interaction = “the whole is greater or less than the sum of its parts”. One-variable-at-a-time experiments cannot detect interactions (a worked sketch of a two-factor design follows this list).

·        Randomization and blocking exclude extraneous variation sources from the experiment.

·        Replication means taking multiple measurements to increase the experiment’s power.

·        Interpret the experiment’s results in terms of the significance level, or quantifiable “reasonable doubt” that the experiment differs from the control.
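
The short Python sketch below (referenced in the list above) works through a replicated two-factor, two-level experiment and estimates the two main effects and the interaction; the factor coding and the response values are invented purely for illustration.

import numpy as np

# Coded levels: -1 = low, +1 = high; two replicates per treatment combination.
runs = [
    # (factor A, factor B, observed responses)
    (-1, -1, [71.2, 70.8]),
    (+1, -1, [74.9, 75.3]),
    (-1, +1, [72.1, 71.7]),
    (+1, +1, [80.0, 79.4]),
]

means = {(a, b): np.mean(ys) for a, b, ys in runs}

# Main effect = average response at the high level minus average at the low level.
effect_A = np.mean([means[(+1, b)] for b in (-1, +1)]) - np.mean([means[(-1, b)] for b in (-1, +1)])
effect_B = np.mean([means[(a, +1)] for a in (-1, +1)]) - np.mean([means[(a, -1)] for a in (-1, +1)])

# Interaction = half the difference between the effect of A at high B and the effect of A at low B.
effect_AB = ((means[(+1, +1)] - means[(-1, +1)]) - (means[(+1, -1)] - means[(-1, -1)])) / 2

print(f"Main effect of A:  {effect_A:.2f}")
print(f"Main effect of B:  {effect_B:.2f}")
print(f"A x B interaction: {effect_AB:.2f}  (nonzero: the whole differs from the sum of its parts)")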

http://support.minitab.com/en-us/minitab/17/topic-library/modeling-statistics/doe/basics/what-is-a-designed-experiment/

International Financial Reporting Standards (IFRS) 6

The International Financial Reporting Standards (IFRS) are a set of standards pertaining to different industries and their activities and practices. IFRS 6 provides guidance on the accounting practices of the extractive industries, such as oil, mining, and gas. It sets out the requirements, as well as the disclosures, for accounting for the expenditures a company incurs in the course of exploring for and evaluating mineral resources.

Until IFRS 6 took effect, regulation of accounting practices in the extractive industries was fragmented and piecemeal. The major change IFRS 6 brought about is that it consolidated these practices. It also allowed entities that had been using earlier accounting practices for exploration and evaluation assets to integrate those practices with the provisions of IFRS 6.

Core accounting requirements

One of its core requirements is that companies with assets used for the exploration and evaluation of mineral resources issue IFRS-compliant financial statements.

It is therefore imperative for accounting professionals to have full knowledge of IFRS 6. Working successfully in the oil, mining, or gas areas of companies that hold assets used for the exploration and evaluation of mineral resources means complying with the requirements set out by IFRS 6.

A proper understanding of the IFRS 6

A learning session that is being organized by Compliance4All, a leading provider of professional trainings for the areas of regulatory compliance, will offer the training needed to comply with the requirements set out in IFRS 6.

At this webinar, Mike Morley, a Certified Public Accountant and business author who runs training programs on topics such as IFRS, SOX, and Financial Statement Analysis, focused on providing continuing education opportunities for finance and accounting professionals, will be the speaker.

Professionals who work in the oil, mining, or gas areas of companies that have assets used for the exploration and evaluation of mineral resources can gain insights into what IFRS 6 means for them by enrolling for this webinar. To register, please visit

http://www.compliance4all.com/control/w_product/~product_id=501194?Wordpress-SEO

Familiarization with all the aspects of the IFRS 6

The aim of this presentation is to help oil, mining, and gas professionals become knowledgeable about the latest information on IFRS 6. The speaker will familiarize participants with the unique accounting and reporting issues, particularly regarding the evaluation of assets, revenues, and expenditures, faced by professionals in the extractive industries who are involved in the search for mineral resources, including oil, gas, minerals, and similar exhaustible resources.

Accounting professionals who work in these industries and need an in-depth understanding of how IFRS 6 is structured and how to apply its standards correctly, such as Auditors, Accountants, Financial Managers, Financial Controllers, Company Executives, and anyone involved in the SOX compliance process, will benefit immensely from this webinar on the accounting practices set out by IFRS 6.

At this session on the IFRS 6, the speaker will cover the following areas:

o  Why the accounting for this sector is different

o  How resource assets are evaluated

o  Special rules for measuring revenues and expenditures

o  How revaluation rules apply to the Oil, Gas, and Mining industries

o  Other specific requirements of IFRS 6

o  Required disclosures.

http://www.accaglobal.com/in/en/student/exam-support-resources/dipifr-study-resources/technical-articles/ifrs6.html

http://www.icaew.com/en/library/subject-gateways/accounting-standards/ifrs/ifrs-6

https://www.iasplus.com/en/standards/ifrs/ifrs6

Analyzing financial statements gives managers indispensable insight

Financial statements are the ultimate indicator of a company's financial health. Number crunching is an important exercise that executives at all levels of an organization need to be familiar with. Yet, given the heavy jargon and complexity of most financial statements, many managers feel put off and do not generally like to pore over them.

A company's financial statements are intended to provide managers and executives at all levels and in all disciplines with insight into the most important aspect of the business: the financial one. Marketing, finance, HR, customer service, and sales all need financial statements to grasp and gain perspective on the financial health of the organization.

Financial statements are critical for helping understand the business

Despite financial statements being the surest indicator of the most important aspect of any business organization, its finances, most managers lack the perceptiveness needed to understand and analyze the meaning of the numbers. Often they devote so much time to running their business that understanding financial statements gets buried and takes a backseat.

A perceptive analysis of financial statements is the foundation for getting the business in order. Wading through the numbers helps the organization dig into market trends, understand where it is getting things right or wrong, and then draw proper conclusions and take appropriate action. There is another critical reason to understand financial statements: the competition should not understand them faster and better than we do!

Trend and ratio analysis of financial statements

But how does one make sense of heaps of seemingly unintelligible numbers? Numbers in themselves, without the nous to decipher them, make little sense to any executive. A few techniques exist to help draw out their meaning. An effective model for assessing the financial condition and results of operations of any business is trend and ratio analysis. Getting a grasp of this model empowers financial and other executive teams to derive the maximum benefit from a crystal-clear understanding of financial statements.
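
As a small illustration of trend and ratio analysis, the Python sketch below computes a liquidity ratio, a leverage ratio, and an operating-performance ratio for two years of simplified, invented figures, plus a year-over-year revenue trend; the numbers are not from any real company, and the ratio definitions are the common textbook forms.

# Two years of simplified, invented figures (in thousands of dollars).
figures = {
    2016: {"current_assets": 420, "current_liabilities": 250,
           "total_liabilities": 600, "equity": 550, "revenue": 1400, "net_income": 98},
    2017: {"current_assets": 465, "current_liabilities": 240,
           "total_liabilities": 580, "equity": 610, "revenue": 1540, "net_income": 121},
}

for year, f in sorted(figures.items()):
    current_ratio = f["current_assets"] / f["current_liabilities"]   # liquidity
    debt_to_equity = f["total_liabilities"] / f["equity"]            # leverage
    net_margin = f["net_income"] / f["revenue"]                      # operating performance
    print(f"{year}: current ratio {current_ratio:.2f}, "
          f"debt/equity {debt_to_equity:.2f}, net margin {net_margin:.1%}")

# Horizontal (trend) analysis: year-over-year percentage change in revenue.
revenue_growth = figures[2017]["revenue"] / figures[2016]["revenue"] - 1
print(f"Revenue growth, 2016 to 2017: {revenue_growth:.1%}")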

Imparting this understanding is the intent of a webinar that is being organized by Compliance4All, a leading provider of professional trainings for all areas of regulatory compliance. Miles Hutchinson, an experienced CGMA and business adviser, will be the speaker at this session.

In easily comprehensible terms, he will explain how participants can acquire the acumen needed to quickly and thoroughly analyze the financial condition and results of operations of any publicly traded company. All that is needed to gain this highly useful understanding of financial statements is to register for this webinar by logging on to

http://www.compliance4all.com/control/w_product/~product_id=501197LIVE?wordpress-SEO

Attending this highly useful session on financial statements gives Financial Executives, HR Managers, Accounting Managers, Department Managers, and Business Unit Managers the ability to discern the numbers and understand where they are leading the organization.

These are the areas this webinar on financial statements will cover:

o  Review the components of the annual report of a prominent publicly traded company and learn how to use this wealth of information

o  Use the annual report to perform a fundamental financial analysis

o  Learn the various types of financial analysis and their purpose

o  Learn the key ratios to evaluate a company’s liquidity, leverage and operating performance

o  Identify the key benchmarks to help determine whether a company’s ratios are in line with competitors

o  Understand horizontal and vertical analysis and how they can be used to identify key trends

o  Bonus: receive our advanced Excel-hosted financial model complete with all ratios, horizontal and vertical analysis

o  Use our model to perform financial analysis on other company financial statements, including yours

o  Receive benchmark information to use in determining the quality of your analyses

o  Learn about resources available to perform comparative studies between companies in the same economic sector – even private companies.

How do laboratories deal with Out of Specification (OOS) results?

Laboratory testing is at the heart of a drug maker's successful operation. It is required under current Good Manufacturing Practice (cGMP) regulations to confirm that all the elements that go into a drug product, namely raw materials, in-process materials, finished materials, and containers, conform to set specifications. When a laboratory test throws up an Out of Specification (OOS) result, how do laboratories deal with it?

The FDA takes a very serious view of Out of Specification results

The FDA is very stern in dealing with laboratories that come up with Out of Specification results. It inspects laboratory operations very closely and has clear guidance on how a laboratory should investigate Out of Specification and Out-of-Tolerance observations.

cGMP regulation Sec. 211.165 specifies that finished products which fail to conform to set specifications, safety standards, and other quality standards are to be rejected. The cGMP regulations also state that any unexplained deviation from set specifications by a batch or any of its components must be thoroughly investigated when test results are Out of Specification, whether or not the batch has already been distributed to the market.

Steps to deal with Out of Specification results

cGMP regulation makes testing compulsory before a batch can be released. Whenever an Out of Specification result is confirmed, the batch is rejected; if there is ambiguity in the result and the batch is nevertheless released, the company's Quality Assurance (QA) unit has to state the reasons for the release and justify it.

Section 501(a)(2)(B) of the Food, Drug, and Cosmetic Act, on which the FDA's Out of Specification guidance rests, requires that current Good Manufacturing Practices go into the manufacture of both active pharmaceutical ingredients and finished pharmaceuticals. Further, active pharmaceutical ingredients, raw material testing, in-process and stability testing, and process validation all come under the purview of the cGMP guidelines.

The FDA guidance on Out of Specification covers the following products:

o  Human drugs

o  Biological and biotechnological products

o  Combination products

o  Veterinary drugs

o  Type A medicated articles

o  Human tissues intended for transplantation

o  Medicated feed

o  Finished products & active pharmaceutical ingredients

o  Dietary supplements

Need for understanding Out of Specification

The complexity and depth of the issues relating to Out of Specification results need to be fully understood if a laboratory is to meet regulatory requirements. Key laboratory personnel should have complete knowledge of the FDA's expectations for Out of Specification results.

They have to use this knowledge to put in place procedures that define a complete, scientifically sound investigation of each Out of Specification and Out-of-Trend laboratory observation and to establish evidence that laboratory personnel are following the procedures.

A complete understanding of Out of Specification results and dealing with them

This will be the content of a training session that is being organized by Compliance4All, a highly popular provider of cost-effective professional trainings for all the areas of regulatory compliance.

At this session, Jerry Lanese, an independent consultant who focuses on Quality Systems and the components of an effective Quality System, will be the speaker. To understand the concept and workings of Out of Specification results and the ways of dealing with them, register for this webinar by logging on to  http://www.compliance4all.com/control/w_product/~product_id=501214?Wordpress-SEO

Tools that help deal with Out of Specification results

At this webinar, Jerry will help participants build the foundation for the implementation of adequate procedures that help avoid Out of Specification results, and will review existing procedures and practices. This webinar is aimed at helping participants develop an understanding of the steps a compliant laboratory needs to take to handle the investigation of Out of Specification test results.

Jerry will also explain the ways in which the laboratory has to interface with other units through the laboratory investigation process. The FDA guidance on handling OOS laboratory results will be the foundation for this webinar, which will offer a clear process for compliant laboratory Out of Specification investigations.

Jerry will cover the following areas at this webinar:

o  Why the regulators are concerned about the handling of OOS investigations

o  The FDA model for handling OOS investigations

o  Commonly accepted terminology such as repeat testing and retesting

o  How the laboratory can meet regulatory expectations for OOS investigations.

o  The interaction between the laboratory and other units in the organization.

http://www.fda.gov/downloads/drugs/guidances/ucm070287.pdf

http://sphinxsai.com/2013/JulySept13/phPDF/PT=11(943-948)JS13.pdf