How to deal with dynamic source data and the “refresh” concept

Microsoft Excel comes with a myriad of tools, such as sorting, filtering, and subtotals, for managing large lists of data. Yet when it comes to analyzing all that data quickly, the MS Excel PivotTable is a particularly useful feature. It is especially valuable when the user needs to create a compact summary report (based on lots of data) quickly, without writing complex formulas or relying on lengthy techniques.


The MS Excel PivotTable is very versatile. It is considered Excel’s best analytical tool because, in addition to speed, it offers remarkable flexibility and dynamism: the data interrelationships being viewed can be changed at any time. Because it is a visually oriented feature based on placing fields in different locations, it is easier to learn the PivotTable by actually working with it than by poring over instructions on a printed page. It offers the ability to create a complete summary report from heaps of data in very little time, without having to write complex formulas or rely on obscure techniques.

Learn about all the features and aspects of PivotTable from an MS Excel expert

Complete training on the numerous PivotTable capabilities and its many tools and features will be offered at a webinar organized by Compliance4All, a leading provider of professional training for all areas of regulatory compliance.


Dennis Taylor, an Excel expert who has worked extensively with Microsoft products (especially spreadsheet programs) since the mid-1990s, has taught hundreds of workshops, and has authored numerous works on the program, will be the speaker at this session. Just visit Excel PivotTables to register for this webinar and gain complete insights into how to put the full array of MS Excel PivotTable functionality to use.

Simplifying the use of PivotTable

The main objective Dennis Taylor has for this webinar is to familiarize participants with the quickest and best ways to create PivotTables and Pivot Charts. These include the following capabilities (a short illustrative sketch follows the list):

  • How to compare two or more fields in a variety of layout styles
  • How to sort and filter results
  • How to perform ad-hoc grouping of information
  • How to use Slicers instead of filters to identify which field elements are displayed
  • How to drill down to see the details behind the summary
  • How to categorize date/time data in multiple levels
  • How to create a Pivot Chart that is in sync with a PivotTable
  • How to add calculated fields to perform additional analysis
  • How to hide/reveal detail/summary information with a simple click
  • How to create a PivotTable based on data from multiple worksheets
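
To make the layout idea concrete for readers who think in code, here is a minimal sketch of what a pivot summary does conceptually. It uses Python’s pandas library purely as an illustration and is an assumption of this write-up, not part of the webinar, which covers Excel’s built-in PivotTable feature; the data and column names below are hypothetical.

    # A minimal sketch, in Python/pandas, of the idea behind a pivot summary.
    # Illustration only (hypothetical data); the webinar itself covers
    # Excel's built-in PivotTable feature, not code.
    import pandas as pd

    # Hypothetical sales records standing in for an Excel source range
    sales = pd.DataFrame({
        "Region":  ["East", "East", "West", "West", "East", "West"],
        "Product": ["Widget", "Gadget", "Widget", "Gadget", "Widget", "Widget"],
        "Revenue": [120, 200, 150, 90, 175, 60],
    })

    # Rows = Region, Columns = Product, Values = sum of Revenue: the same
    # Row/Column/Values layout you drag fields into in Excel.
    summary = sales.pivot_table(index="Region", columns="Product",
                                values="Revenue", aggfunc="sum", margins=True)
    print(summary)

The index, columns, and values arguments play the same roles as the Rows, Columns, and Values areas of an Excel PivotTable, and margins=True corresponds to the grand-total row and column.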

Dennis Taylor will cover these capabilities in detail. In addition, he will address the following areas at this session on MS Excel PivotTable:

  • Pre-requisites for source data – preparing data so that it can be analyzed by PivotTables
  • Creating a PivotTable with a minimum number of steps, including the Recommended PivotTables option
  • Manipulating the appearance of a PivotTable via dragging and command techniques
  • Using Slicers to highlight which fields are currently being shown (and which are not)
  • Using the new (in Excel 2013) Timeline feature
  • Creating ad hoc and date-based groupings within a PivotTable (see the sketch after this list)
  • Quickly creating and manipulating a Pivot Chart to accompany a PivotTable
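
As a second illustrative sketch (again in pandas, an assumption of this write-up rather than webinar material, with hypothetical data), date-based grouping in a PivotTable amounts to rolling individual date-stamped records up to a coarser period such as months:

    # A minimal pandas sketch (hypothetical data) of date-based grouping:
    # daily order records rolled up to monthly totals.
    import pandas as pd

    orders = pd.DataFrame({
        "OrderDate": pd.to_datetime(["2023-01-05", "2023-01-20", "2023-02-03",
                                     "2023-02-28", "2023-03-15"]),
        "Amount": [250, 100, 320, 80, 410],
    })

    # Grouping the date field by month mirrors what Excel's "Group Field"
    # command (or a Timeline slicer) does for date/time data in a PivotTable.
    monthly = orders.pivot_table(index=pd.Grouper(key="OrderDate", freq="M"),
                                 values="Amount", aggfunc="sum")
    print(monthly)

In Excel this roll-up is done through the Group Field dialog or a Timeline slicer rather than through code; the sketch only shows the underlying idea.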

MS Excel users who are familiar with PivotTable concepts but need expanded techniques for analyzing lists of data are the primary beneficiaries of this webinar; however, anyone needing to know how to create PivotTables from multiple sources and use Slicers, Timelines, Calculated Fields, and Conditional Formatting will also benefit from this course.

The right way to choose sample sizes based on valid statistical rationale

Performing at least some verification testing or validation studies of design outputs and/or manufacturing processes is essential for almost all manufacturing and development companies. Yet it is sometimes difficult to explain the rationale behind the selection of the sample sizes used in such efforts.

Removing this doubt and showing participants a way out of this quandary will be the content of a webinar that Compliance4All, a leading provider of professional training for all areas of regulatory compliance, will be organizing. To understand the rationale for the selection of these sample sizes, just log on to http://bit.ly/2cObvdM

This webinar will explain the logic behind sample-size choice for several statistical methods that are commonly used in verification or validation efforts, and will describe how to express a valid statistical justification for a chosen sample size.

The speaker at this webinar, John N. Zorich, who has spent 35 years in the medical device manufacturing industry, will offer the all-important guidance on how to justify such sample sizes. This will lead indirectly to guidance on how to choose sample sizes.

The justification for choosing a suitable sample size:

He will also explain the justifications for the selection of such sample sizes and show how to document them in Protocols or regulatory submissions. These justifications can also be shared with regulatory auditors who may ask for them during a company’s onsite audits. Overall, this learning session will help participants understand how to avoid regulatory delays in product approvals and prevent an auditor from issuing a nonconformity notice.

Point to note:

Participants are requested to note, however, that this webinar does not address rationales for sample sizes used in clinical trials.

This webinar will discuss the following statistical methods:

  • Confidence intervals
  • Process Control Charts
  • Process Capability Indices
  • Confidence / Reliability Calculations (a brief illustrative sketch follows this list)
  • MTBF Studies (“Mean Time Between Failures” of electronic equipment)
  • QC Sampling Plans
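
To give a flavor of the kind of rationale involved, here is a minimal sketch for the Confidence / Reliability case with attribute (pass/fail) data. It is an illustration added to this write-up, not the speaker’s material: the classic “success-run” formula says that n = ln(1 − C) / ln(R) passing units, with zero failures, demonstrate reliability R at confidence C.

    # A minimal sketch (not the speaker's material): sample size for an
    # attribute confidence/reliability claim with zero allowed failures,
    # via the classic "success-run" formula  n = ln(1 - C) / ln(R).
    import math

    def success_run_sample_size(confidence: float, reliability: float) -> int:
        """Smallest n such that n consecutive passes demonstrate the stated
        reliability at the stated confidence (zero failures allowed)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

    # 95% confidence that reliability is at least 90% -> 29 units must pass
    print(success_run_sample_size(0.95, 0.90))

For 95% confidence of at least 90% reliability the formula gives 29 units, a commonly cited figure; the corresponding rationale statement ties the choice of n directly to the claimed confidence and reliability.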

John Zorich will cover the following areas at this session:

  • Introduction
  • Examples of regulatory requirements related to sample size rationale
  • Sample versus Population
  • Statistic versus Parameter
  • Rationales for sample size choices when using:
      • Confidence Intervals
          • Attribute data
          • Variables data
      • Statistical Process Control Charts (e.g., XbarR)
      • Process Capability Indices (e.g., Cpk)
      • Confidence/Reliability Calculations
          • Attribute data
          • Variables data (e.g., K-tables)
      • Significance Tests (using t-Tests as an example)
          • When “significance” is the desired outcome
          • When “non-significance” is the desired outcome (i.e., “Power” analysis)
      • AQL sampling plans (a brief illustrative sketch follows this list)
  • Examples of statistically valid “Sample-Size Rationale” statements
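
Finally, as another illustrative sketch (an assumption of this write-up, not webinar content), the behavior of an AQL-style single-sampling attribute plan can be examined through its operating-characteristic curve, i.e., the probability of accepting a lot as a function of the lot’s true defect rate; the plan parameters below are hypothetical.

    # A minimal sketch (hypothetical plan parameters): probability that a
    # single-sampling attribute plan with sample size n and acceptance
    # number c accepts a lot with a given true defect rate, computed from
    # the binomial distribution.
    from scipy.stats import binom

    def prob_accept(n: int, c: int, defect_rate: float) -> float:
        """P(accept) = P(defectives found in the sample <= c)."""
        return binom.cdf(c, n, defect_rate)

    # With n = 80 and c = 2, a 1%-defective lot is accepted ~95% of the
    # time, while a 5%-defective lot is accepted only ~23% of the time.
    print(prob_accept(80, 2, 0.01))
    print(prob_accept(80, 2, 0.05))

A sample-size rationale for such a plan would typically cite the defect rates at which lots are accepted with high and with low probability.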