
SUPPLIER PARTNERSHIP

- A supplier partnership goes beyond a typical buyer-seller relationship. The partners communicate openly, solve problems hand in hand, and aim for shared success, not just the cheapest price. By collaborating and focusing on long-term goals, both parties benefit and create greater value.

a. IMPORTANCE OF SUPPLIERS
 Suppliers are the backbone of any business, providing critical materials, components, and services.
 They play a vital role in several ways:
1. They provide essential resources: Businesses rely on suppliers for the raw materials, products, or
services needed to function. Without them, the company wouldn't be able to operate or deliver its
own products or services.
2. They impact product quality and cost: The quality of the materials and services provided by
suppliers directly affects the final product offered by a company. Additionally, suppliers play a
big role in determining the production cost, ultimately impacting the company's pricing and
profitability.
3. They contribute to efficiency and innovation: Strong supplier partnerships can lead to improved
communication and collaboration, allowing for smoother operations and quicker problem-solving
within the supply chain. Additionally, collaboration can foster innovation, as both parties can
share knowledge and expertise to develop new products or processes.
4. They influence brand reputation: Any issues with suppliers, like product recalls or ethical
controversies, can tarnish a company's reputation. Therefore, careful selection and responsible
sourcing are crucial.
b. SUPPLIER SELECTION
- Supplier selection is a critical component of any successful business. It's the process of
identifying and choosing the right partners who can consistently and reliably provide the goods or
services you need to achieve your business goals. Choosing the wrong supplier can have a
significant negative impact on your operations, leading to delays, quality issues, and even
financial losses.
1. Quality: This is the cornerstone of any supplier relationship. You need a supplier who can
consistently deliver products or services that meet your defined quality standards. This involves
assessing their quality management systems, inspecting facilities, and reviewing past
performance data.
2. Reliability: A reliable supplier is one you can depend on to deliver on time, every time. This
includes factors like on-time delivery, consistent performance, and responsiveness to your needs.
Consider their past performance history, their ability to handle fluctuations in demand, and their
communication practices.
3. Cost: While price is always a consideration, it shouldn't be the sole deciding factor. Look for
competitive pricing that aligns with your budget constraints. However, remember that the lowest
price doesn't always guarantee the best value. Consider the total cost of ownership, including
factors like hidden fees, quality-related costs, and potential disruptions due to unreliable
deliveries.
4. Financial Stability: Partnering with a financially stable supplier minimizes risks and ensures long-
term viability. Assess the supplier's financial health, including their creditworthiness, debt levels,
and track record of profitability. This ensures they can continue to meet your needs and fulfill
their contractual obligations.
5. Innovation: Look for suppliers who are committed to continuous improvement and possess the
ability to adapt to changing market demands. This could include developing new products or
services, adopting innovative technologies, or implementing efficient production processes.
Partnering with such suppliers can help your business stay ahead of the curve.
6. Culture and Values: Choosing suppliers who align with your company's culture and values fosters
a strong and mutually beneficial relationship. This includes considerations like ethical sourcing
practices, environmental responsibility, and commitment to social good. Partnering with suppliers
who share your values can enhance your brand reputation and contribute to a positive corporate
image.

c. SUPPLIER STANDARDS
1. Quality
2. Reliability
3. Compliance
4. Ethical Conduct

d. SUPPLIER RATING
- Suppliers are to be continuously assessed and rated based on the performance of their supplies with respect to some or all of the following parameters (a weighted-scoring sketch follows the list):
• QUALITY
• PRICE
• DELIVERY
• SERVICE
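To make ratings comparable across suppliers, the parameters are often combined into a single weighted score. A minimal Python sketch, assuming hypothetical weights and 0-100 scores (both are illustrative, not a prescribed standard):

# Hypothetical weights; a real rating scheme would set these per company policy.
WEIGHTS = {"quality": 0.40, "price": 0.20, "delivery": 0.25, "service": 0.15}

def supplier_rating(scores: dict) -> float:
    """Weighted average of 0-100 scores for each rating parameter."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Example: one supplier's quarterly scores.
acme = {"quality": 92, "price": 78, "delivery": 85, "service": 90}
print(f"Rating: {supplier_rating(acme):.1f}")  # weighted score of about 87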

SUPPLIER SEMINARS
- Manufacturers must adopt a dynamic supplier acceptance approach, involving assessment,
decision-making, and planning to reduce or eliminate risk in supply chain activities to meet
industry standards.

PERFORMANCE MEASURES
CRITERIA SELECTION OF MEASURES
- All business organizations have some measurements in place that can be adopted for TQM. In order to evaluate the existing measures or add new ones, there are seven criteria to be followed:
• Simple
• Few in number
• Developed by users
• Relevance to customer
• Improvement
• Cost
• Visible
QUALITY FUNCTION DEPLOYMENT
- A process and set of tools used to effectively define customer requirements and convert them into detailed specifications and plans for producing products or services that fulfill those requirements.

MEASUREMENT STRATEGY

 refers to a systematic plan or approach for collecting, analyzing, and interpreting data to assess the
performance, quality, or effectiveness of a process, product, or service.

PERFORMANCE MEASURE DESIGN

 is a process of quantifying and assessing the effectiveness and efficiency of an organization or individual in achieving their objectives or goals.

BALANCED SCORECARD (BSC)

 refers to a strategic management performance metric used to identify and improve various internal
business functions and their resulting external outcomes.
Statistical Process Control

 the use of statistical techniques to control a process or production method. SPC tools and procedures can help you monitor process behavior, discover issues in internal systems, and find solutions for production issues. This helps to ensure that the process operates efficiently, producing more specification-conforming products with less waste and scrap.
 An example of statistical process control would be a manufacturing process that produces products with a weight that is within an acceptable range of variation.
 Statistical process control is useful within the oil, gas, and petrochemical industry, helping to increase product quality and reduce costs related to waste/scrap.
 The three main components of an SPC chart are a central line (CL) for the average, a lower control limit (LCL), and an upper control limit (UCL); a minimal computation is sketched below. SPC charts were initially developed by Dr. Walter A. Shewhart of Bell Laboratories in the 1920s.
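A minimal sketch of how the CL, UCL, and LCL are computed for an individuals chart, using made-up weight measurements. (In practice the standard deviation for an individuals chart is usually estimated from the average moving range divided by the constant d2 ≈ 1.128; the sample standard deviation is used here for brevity.)

import statistics

# Made-up product weights sampled from the process, in grams.
weights = [10.2, 9.9, 10.1, 10.4, 9.8, 10.0, 10.3, 9.7, 10.1, 10.0]

cl = statistics.mean(weights)        # central line: the process average
sigma = statistics.stdev(weights)    # simple estimate of process variation
ucl = cl + 3 * sigma                 # upper control limit
lcl = cl - 3 * sigma                 # lower control limit

print(f"CL={cl:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
for i, w in enumerate(weights, 1):
    if not lcl <= w <= ucl:
        print(f"Sample {i} ({w} g) signals an out-of-control condition")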

a. Process Control:

 Process control is a key component of operations management: it ensures that processes always operate within intended parameters by monitoring and adjusting them to uphold quality standards and maximize efficiency.
The Seven Quality Control Tools are a set of techniques used in process control to identify and solve
quality-related issues. These tools include:
1. Check Sheets: Used for collecting and organizing data to identify patterns or trends in a process.
2. Histograms: Visual representations of data distribution to understand variations within a process.
3. Pareto Charts: Graphical tools that prioritize issues based on their frequency or impact, helping to focus improvement efforts on critical areas (a small example follows this list).
4. Cause-and-Effect Diagrams (Fishbone Diagrams): Used to identify and analyze potential causes
of problems within a process.
5. Scatter Diagrams: Graphical tools to analyze the relationship between two variables, helping to
identify correlations.
6. Control Charts: Graphical representations of process performance over time, enabling the
detection of variations and trends.
7. Flowcharts: Visual maps of the sequence of steps in a process, used to spot redundancies, bottlenecks, and opportunities for simplification (stratification is sometimes listed as the seventh tool instead).
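As a small illustration of the Pareto chart mentioned in item 3, the following sketch sorts hypothetical defect categories by frequency and accumulates percentages, which is exactly the data a Pareto chart plots (the category names and counts are invented):

# Invented defect counts for one production run.
defects = {"scratches": 42, "dents": 23, "misalignment": 17, "other": 10, "discoloration": 8}

total = sum(defects.values())
cumulative = 0.0
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100 * count / total
    print(f"{cause:15s} {count:3d}  cumulative {cumulative:5.1f}%")
# The first two categories account for 65% of all defects, so improvement
# efforts would focus there.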
b. Fundamentals of Statistics:
 Statistics are essential to operations management because they offer tools and methods for evaluating data, drawing defensible conclusions, and streamlining processes.
Some fundamental concepts of statistics relevant to operations management include:
1. Descriptive Statistics: Techniques for summarizing and describing data, such as measures of central tendency (mean, median, mode) and measures of variability (range, variance, standard deviation); see the short example after this list.
2. Inferential Statistics: Methods for making predictions or inferences about a population based on a
sample, such as hypothesis testing and confidence intervals.
3. Probability: The likelihood of events occurring, essential for understanding uncertainty in
processes and decision-making.
4. Sampling Techniques: Methods for selecting a representative subset of data from a larger
population, crucial for collecting reliable data without examining the entire population.
5. Regression Analysis: Techniques for modeling the relationship between variables, helping to
identify factors influencing process performance.
6. Statistical Distributions: Probability distributions such as the normal distribution, which describe
the likelihood of different outcomes in a process.
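A short example of the descriptive statistics in item 1, computed with Python's standard library on an invented sample of task cycle times (minutes):

import statistics

times = [4.2, 4.5, 4.2, 5.1, 4.8, 4.2, 4.9]  # invented cycle times

print("mean  :", statistics.mean(times))             # central tendency
print("median:", statistics.median(times))
print("mode  :", statistics.mode(times))             # most frequent value: 4.2
print("range :", round(max(times) - min(times), 2))  # variability
print("var   :", statistics.variance(times))         # sample variance
print("stdev :", statistics.stdev(times))            # square root of the variance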
c. Characteristics of Normal Distribution Curve:
 The normal distribution curve, also known as the bell curve or Gaussian distribution, is a
fundamental concept in statistics and operations management. It is characterized by several key
features:
1. Symmetry: The curve is symmetrical, with the mean, median, and mode all located at the center.
This symmetry implies that half of the data falls on either side of the mean.
2. Bell Shape: The curve is bell-shaped, with the highest point (peak) at the mean and gradually
decreasing towards the tails.
3. Mean and Standard Deviation: The mean (average) and standard deviation (measure of
variability) fully describe the distribution. Approximately 68% of the data falls within one
standard deviation of the mean, 95% within two standard deviations, and 99.7% within three
standard deviations.
4. Empirical Rule: Also known as the 68-95-99.7 rule, it states the percentage of data falling within one, two, or three standard deviations of the mean in a normal distribution (verified numerically in the sketch below).
5. Probability Density Function: The equation describing the normal distribution curve, which
allows for calculating probabilities of events occurring within the distribution.
 A thorough understanding of the normal distribution curve is necessary for many operations
management applications, such as decision-making, process control, and quality assurance. It
gives insights into the distribution and variability of data, empowering organizations to
successfully enhance processes and make well-informed decisions.
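The 68-95-99.7 percentages above are not arbitrary; they follow directly from the normal distribution's cumulative probabilities and can be verified with a few lines of Python:

import math

# For a normal distribution, P(|X - mean| <= k * stdev) = erf(k / sqrt(2)).
for k in (1, 2, 3):
    p = math.erf(k / math.sqrt(2))
    print(f"within {k} standard deviation(s) of the mean: {100 * p:.1f}%")
# Prints approximately 68.3%, 95.4%, and 99.7%.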
d. VARIABILITY
o is an inherent aspect of almost all processes, be they manufacturing, service-based, or biological
in nature. Understanding and managing this variability is essential for maintaining consistent
quality, improving process efficiency, and achieving desired outcomes.

PROCESS VARIABILITY
- refers to the inherent fluctuations, inconsistencies, or differences in a process's outputs. For a
manufacturing line, this might manifest as slight differences in the dimensions of produced items.
In a service environment, it might refer to the varied time it takes to complete a task.
SOURCES
There are typically two sources of process variability:
1) Common Causes (or Inherent Variability): These are the natural or expected variations that occur
within a process. They are part of the system and arise even when the process is under statistical
control. Examples include minor fluctuations in raw material quality or ambient temperature
changes.
2) Special Causes: These are unexpected sources of variation, arising due to unforeseen disturbances
or issues. Examples might include a malfunctioning machine, an untrained worker, or an
unexpected external event affecting the process.
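A small simulation can make the distinction concrete: common causes appear as random scatter around the target, while a special cause shows up as a sudden shift that control limits catch. All numbers below are invented for illustration:

import random

random.seed(1)
target, sigma = 50.0, 1.0
data = [random.gauss(target, sigma) for _ in range(20)]               # common causes only
data += [random.gauss(target + 4 * sigma, sigma) for _ in range(5)]   # special cause: mean shift

ucl, lcl = target + 3 * sigma, target - 3 * sigma
for i, x in enumerate(data, 1):
    if not lcl <= x <= ucl:
        print(f"Observation {i} ({x:.2f}) is outside the limits: likely a special cause")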

e. ACCURACY AND PRECISION


o Accuracy measures how close the results are to the true or known value.
o Precision measures how close the results are to one another.
IMPORTANCE TO STATISTICS
In order to get the most reliable results in a scientific inquiry, it is important to minimize bias and error, as
well as to be precise and accurate in the collection of data. Both accuracy and precision have to do with
how close a measurement is to its actual or true value.
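The two concepts can be separated numerically: bias (accuracy) is the distance of the average measurement from the true value, while spread (precision) is the standard deviation of repeated measurements. A sketch with invented readings:

import statistics

true_value = 100.0
readings = [98.9, 99.1, 99.0, 98.8, 99.2]  # tightly grouped but offset

bias = statistics.mean(readings) - true_value   # accuracy: -1.0, a systematic offset
spread = statistics.stdev(readings)             # precision: about 0.16, very consistent
print(f"bias={bias:+.2f}  spread={spread:.2f}  -> precise but not accurate")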
f. STANDARD NORMAL DISTRIBUTION
- The standard normal distribution is a normal distribution with a mean of zero and standard
deviation of 1. The standard normal distribution is centered at zero and the degree to which a
given measurement deviates from the mean is given by the standard deviation.
Mean is another word for average. To find the mean, add all the numbers together and divide by how many numbers there are.
Standard deviation is a statistic that measures the dispersion of a data set relative to its mean and is calculated as the square root of the variance.
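Converting a raw measurement to the standard normal scale gives its z-score: the number of standard deviations it lies from the mean. A minimal sketch with illustrative numbers:

def z_score(x: float, mean: float, stdev: float) -> float:
    """How many standard deviations x lies from the mean."""
    return (x - mean) / stdev

# A part weighing 10.6 g from a process with mean 10.0 g and stdev 0.2 g:
print(z_score(10.6, mean=10.0, stdev=0.2))  # about 3.0 -> an unusually heavy part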
g. ACCEPTABLE QUALITY LEVEL (AQL)
- The acceptable quality level (AQL) is a measure applied to products, defined in ISO 2859-1 as the “quality level that is the worst tolerable.” The AQL tells you how many defective components are considered acceptable during random sampling quality inspections. It is usually expressed as a percentage or ratio of the number of defects compared to the total quantity.
• The acceptable quality level (AQL) is the worst quality level that is tolerable for a product.
• The AQL differs from product to product. Products that might cause more of a health risk will
have a lower AQL.
• Batches of products that do not meet the AQL, typically based on a percentage measurement, are
rejected when tested during pre-shipment inspections.
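The connection between an AQL and an inspection decision can be sketched with a simple single-sampling plan: inspect n random units and accept the batch if no more than c defectives are found. The plan below (n = 80, c = 3) is invented for illustration; real plans are taken from tables in standards such as ISO 2859-1.

from math import comb

def prob_accept(n: int, c: int, p_defective: float) -> float:
    """Probability of accepting a batch, under a binomial model."""
    return sum(comb(n, k) * p_defective**k * (1 - p_defective)**(n - k)
               for k in range(c + 1))

print(f"{prob_accept(80, 3, 0.015):.3f}")  # ~0.97: batches at a 1.5% defect rate usually pass
print(f"{prob_accept(80, 3, 0.080):.3f}")  # ~0.12: much worse batches are usually rejected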

CONTROL CHARTS FOR IMPROVING PROCESS CAPABILITY


Process Capacity

 refers to the maximum output that a process can achieve under normal operating conditions over a
specific period. It's essentially the capability of a process to meet production demands efficiently and
effectively. Now, let's relate process capacity to the concepts mentioned:
a. Process Stability: Ensuring that a process is stable is crucial for maintaining its capacity. A
stable process exhibits consistent performance over time, which is essential for meeting
production targets reliably.
b. Central Limit Theorem: Understanding the statistical principles behind control charts, which
are used to monitor process performance, ensures that the process remains within its capacity
limits. The central limit theorem provides the basis for constructing these control charts, which
are essential tools for maintaining process capacity.
c. Control Charts: These are used to monitor process performance over time and detect any
deviations from the expected behavior. By continuously monitoring the process with control
charts, any issues affecting process capacity can be quickly identified and addressed.
d. Control Charts for Variable Data: Monitoring continuous processes with control charts for
variable data helps ensure that the process operates within its capacity limits. By tracking
variables such as temperature, pressure, or chemical concentrations, deviations that could
impact process capacity can be detected early.
e. Process Capability Indices: These indices provide a quantitative measure of how well a process can meet its specifications. Assessing process capability is essential for understanding whether the process is operating within its capacity limits and where improvements may be needed to enhance capacity (a calculation is sketched after the summary below).
 In summary, process capacity is the overarching goal, and concepts like process stability, control
charts, and process capability indices are all tools and methodologies used to ensure that
processes operate within their capacity limits and meet production demands efficiently.
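As a concrete illustration of item e, the two most common capability indices can be computed in a few lines. Specification limits and process statistics below are invented:

def cp(usl: float, lsl: float, sigma: float) -> float:
    """Potential capability: how the spec width compares to the process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl: float, lsl: float, mean: float, sigma: float) -> float:
    """Actual capability: like Cp, but penalizes an off-center process."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

usl, lsl, mean, sigma = 10.6, 9.4, 10.1, 0.15
print(f"Cp  = {cp(usl, lsl, sigma):.2f}")         # 1.33: spread fits the specs
print(f"Cpk = {cpk(usl, lsl, mean, sigma):.2f}")  # 1.11: centering erodes capability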

a. Defects and Six Sigma:


 Six Sigma works to minimize imperfections in processes through statistical methods and quality management techniques. Imperfections (defects) are any deviations from the intended result or specifications in a process or product.
b. Definition of Six Sigma:
 Six Sigma is a data-driven methodology for enhancing processes by identifying and eliminating defects or variations that lead to errors or inefficiencies. Its goal is performance that allows only 3.4 defects per million opportunities.
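The 3.4-per-million figure is expressed through the defects-per-million-opportunities (DPMO) metric. A minimal calculation with invented counts:

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# 57 defects found across 10,000 invoices, each with 12 error opportunities:
print(f"{dpmo(57, 10_000, 12):.1f} DPMO")  # 475.0, far from Six Sigma's 3.4 target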
c. The Origin of Six Sigma:
 Six Sigma originated at Motorola in the 1980s, where engineer Bill Smith conceived the idea to improve manufacturing processes. It later gained recognition at General Electric under Jack Welch's leadership, becoming a widely embraced approach for enhancing processes across sectors.
d. Essence of Six Sigma:
 The core principle of Six Sigma is reducing variation and imperfections in processes to attain high-quality results. It underscores the significance of data-driven decision-making, thorough analysis, and ongoing enhancement to meet customer needs and boost performance.
e. Six Sigma Process Models:
 Six Sigma provides process models such as DMAIC (Define, Measure, Analyze, Improve, Control) for enhancing existing processes and DMADV (Define, Measure, Analyze, Design, Verify) for creating new ones. These models offer organized methods for addressing problems and ensuring lasting outcomes.

SEVEN NEW MANAGEMENT TOOLS


1. Affinity Diagram
o Affinity diagrams, also known as affinity mapping, K-J method, or cluster analysis, are a key tool
in Total Quality Management (TQM). They help tackle complex problems by categorizing a large
number of ideas into related groups. The process involves defining a problem, generating ideas on
sticky notes, and then collaboratively grouping these ideas. This fosters communication and helps
in better understanding and analyzing the problem. The final step involves documenting and
visualizing the grouped ideas for improved insight and problem-solving.
2. Interrelationship Diagram
o Interrelationship diagrams visually represent cause-and-effect relationships, aiding in
comprehensive system analysis. They are often used in problem-solving, decision-making, and
strategic planning processes. At first glance, interrelationship diagrams may look like a web of
lines and arrows, but they hold significant value in uncovering a system’s underlying connections
and dependencies.
3. Tree Diagram
o The tree diagram is a visual tool from the Seven new management and planning tools, ideal for
breaking down a main idea or objective into smaller components. To create one, start with the
main idea at the top, like the trunk of a tree. Then, add major branches representing key
components or sub-objectives. Further, expand these branches with smaller branches for specific
tasks or factors. This hierarchical structure visually organizes complex information, showing how
different elements interconnect and contribute to the overall goal. It aids in task prioritization,
resource allocation, and strategy development, offering a clear overview for better decision-
making.

4. Matrix Diagram
o Matrix diagrams are a powerful visual tool for analyzing complex relationships and aiding decision-making for individuals and organizations. At their core, matrix diagrams organize data or information into a grid or matrix format, allowing for a clear and structured representation of relationships between different variables.
o To understand the basics of matrix diagrams, it is essential to grasp the fundamental components.
The matrix consists of rows and columns, each intersecting cell representing a specific
relationship or interaction. The rows typically represent one set of variables or factors, while the
columns represent another set. This arrangement enables the identification of correlations,
dependencies, or patterns between the two sets of variables.
5. Matrix Data Analysis
o A prioritization matrix is a sophisticated yet practical tool used in place of more complex
mathematical methods for analyzing matrices. This L-shaped matrix simplifies decision-making
by facilitating direct comparisons between a range of options and a set of defined criteria. It
works by systematically evaluating each option against these criteria, usually through pairwise
comparisons. This process helps in effectively ranking or prioritizing the options based on how
well they meet the criteria. By providing a structured approach to assessing multiple factors, the prioritization matrix becomes an invaluable tool for choosing the most suitable option or options from a list, ensuring decisions are made thoughtfully and based on a comprehensive evaluation (a weighted-scoring sketch follows this list).
6. Arrow Diagram
o An arrow diagram, also called an activity network diagram, arrow network, or activity-on-arrow diagram, represents the sequence of activities within a process, highlighting dependencies and facilitating efficient resource allocation. Arrow diagrams are constructed by assigning each activity or task a specific arrow, with the direction indicating the flow of work. The arrows are then connected to illustrate the logical relationships and dependencies between activities. This visual representation allows project teams to identify critical paths, bottlenecks, and areas for improvement (a small critical-path computation follows this list).
7. Process Decision Program Chart
o The Process Decision Program Chart (PDPC) is a tool for structured decision-making and
planning in businesses and individual projects. It maps out decision points and actions in
interconnected branches, highlighting potential risks and challenges. Starting with the main goal
at the top, each branch details options and associated risks, aiding in proactive problem-solving
and contingency planning.
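The prioritization matrix described in item 5 boils down to scoring options against weighted criteria and ranking the totals. A minimal sketch, with invented options, criteria, and weights:

# Invented criteria weights (must sum to 1) and 1-5 option scores.
criteria_weights = {"cost": 0.3, "quality": 0.5, "lead_time": 0.2}

options = {
    "Option A": {"cost": 4, "quality": 5, "lead_time": 3},
    "Option B": {"cost": 5, "quality": 3, "lead_time": 4},
    "Option C": {"cost": 3, "quality": 4, "lead_time": 5},
}

totals = {name: sum(criteria_weights[c] * s for c, s in scores.items())
          for name, scores in options.items()}

for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {total:.2f}")
# Option A ranks first: 4*0.3 + 5*0.5 + 3*0.2 = 4.30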
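The critical-path idea behind the arrow diagram in item 6 can also be computed directly: the longest chain of dependent activities sets the minimum project duration. Activities, durations, and dependencies below are hypothetical:

# Hypothetical activities with durations (days) and prerequisites.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
depends_on = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

earliest_finish: dict[str, int] = {}

def finish(task: str) -> int:
    """Earliest finish = latest finish among prerequisites + own duration."""
    if task not in earliest_finish:
        start = max((finish(p) for p in depends_on[task]), default=0)
        earliest_finish[task] = start + durations[task]
    return earliest_finish[task]

print(max(finish(t) for t in durations))  # 9 days, via the critical path A -> C -> D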
Conclusion
The seven new management and planning tools developed by the Union of Japanese Scientists and Engineers (JUSE) provide innovative methods for addressing evolving management and planning needs.
From the Affinity Diagram for collaborative problem-solving to the Process Decision Program Chart for
structured decision-making, these tools offer a well-rounded approach to tackling complex business
challenges. Each tool serves a specific purpose, from analyzing data and identifying trends to mapping
out processes and evaluating risks. These tools contribute to more effective and thoughtful planning in
diverse organizational contexts by fostering communication, visualizing relationships, and aiding
decision-making.

Business Process Benchmarking (BPB)

 is a systematic process used by organizations to evaluate their business processes and compare
them against industry best practices or benchmarks. The goal of BPB is to identify areas where a
company can improve its efficiency, productivity, quality, and overall performance by adopting
practices that have proven successful in other organizations.
Here's a breakdown of the typical steps involved in Business Process Benchmarking:

 Identify Processes
 Select Benchmark Partners
 Collect Data
 Collect Data from Benchmark Partners
 Compare Performance
 Identify Best Practices
 Implement Improvements
 Monitor and Measure
 Business Process Benchmarking can be conducted internally within an organization (comparing
different departments or units) or externally (comparing with other organizations). It provides
valuable insights into how well an organization is performing relative to industry standards and
helps identify opportunities for improvement.
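A sketch of the “Compare Performance” step: line up your own metrics against a benchmark partner's and surface the gaps. All figures below are invented:

ours = {"order_cycle_days": 9.0, "defect_rate_pct": 2.1, "on_time_pct": 88.0}
benchmark = {"order_cycle_days": 5.0, "defect_rate_pct": 0.8, "on_time_pct": 97.0}

for metric in ours:
    gap = ours[metric] - benchmark[metric]
    print(f"{metric:18s} ours={ours[metric]:6.1f} best={benchmark[metric]:6.1f} gap={gap:+6.1f}")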

Quality Function Deployment (QFD)

 is a systematic methodology used in product development and process improvement to ensure that customer needs and expectations are translated into specific product or service features. It originated in Japan in the 1960s and gained popularity in various industries around the world.
Here's an overview of how QFD works:

 Voice of the Customer (VOC) Analysis: The process begins with gathering and analyzing the
Voice of the Customer (VOC). This involves understanding and prioritizing customer
requirements, preferences, and expectations regarding the product or service.
 House of Quality (HOQ): The central tool in QFD is the House of Quality, which is a matrix
that correlates customer requirements (WHATs) with engineering characteristics (HOWs). It
typically consists of a grid where customer requirements are listed on one side and engineering
characteristics or product features on the other. The intersection of these elements shows the relationship between customer needs and how they will be fulfilled by specific product attributes (the underlying arithmetic is sketched at the end of this section).
 Inter-relationship Matrix: Besides the House of Quality, QFD may involve other matrices or
diagrams to analyze the relationships between various factors, such as technical requirements,
design parameters, and manufacturing processes. These matrices help identify dependencies and
interactions between different aspects of the product or service.
 Prioritization and Planning: Once the relationships between customer needs and product
features are established, priorities are assigned to each requirement based on customer importance
and technical feasibility. This helps in making decisions about which features to focus on during
the design and development process.
 Implementation and Continuous Improvement: QFD is not a one-time activity but rather a
continuous process that guides product development from concept to launch and beyond. As the
product evolves, feedback from customers and performance data are collected and used to refine
the design and improve future iterations.
QFD aims to align the entire organization with customer needs and preferences, leading to products and
services that are better suited to meet customer expectations. By systematically translating customer
requirements into design specifications, QFD helps in reducing development time, minimizing costs, and
increasing customer satisfaction and loyalty.
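The House of Quality arithmetic mentioned above can be sketched in a few lines: weight each customer requirement (WHAT), relate it to engineering characteristics (HOWs) with the conventional 9/3/1 relationship strengths, and sum to get a technical-importance ranking. All requirements, characteristics, and scores here are invented:

# Customer importance ratings for the WHATs (1-5 scale).
whats = {"easy to grip": 5, "lightweight": 3, "durable": 4}

# WHAT -> {HOW: relationship strength}, using 9=strong, 3=moderate, 1=weak.
relationships = {
    "easy to grip": {"handle diameter": 9, "material hardness": 3},
    "lightweight":  {"total mass": 9, "material hardness": 1},
    "durable":      {"material hardness": 9, "wall thickness": 9},
}

importance: dict[str, int] = {}
for what, hows in relationships.items():
    for how, strength in hows.items():
        importance[how] = importance.get(how, 0) + whats[what] * strength

for how, score in sorted(importance.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{how:18s} {score}")
# material hardness scores highest (5*3 + 3*1 + 4*9 = 54), so it gets design priority.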
