KEYWORDS: Signal detection, Overlay metrology, Metrology, Logic, Scanning electron microscopy, Etching, Yield improvement, Optical metrology, Inspection, Front end of line
In advanced logic nodes, the edge placement error (EPE) budget becomes tighter. This budget must account for contributors that were nearly negligible before the FinFET era, such as rule-based etch bias error or overlay metrology-to-device (MTD) bias. Among the new challenges are overlay metrology error due to process-induced mark asymmetry, after-etch inspection (AEI) pattern shift, and aberration-induced overlay differences between mark and device, collectively summarized as metrology-to-device bias. YieldStar In-Device Metrology (YS IDM) addresses device-like metrology and real AEI overlay, but in principle might still suffer from process asymmetry. In this work we measure ASML Self Reference (ASR) targets with IDM. We use the detected IDM signal to quantify and address, for the first time, the asymmetry of printed marks containing device-like structures on FEOL with respect to a reference tool. Two main findings characterize this work:
- IDM has the capability to separate an overlay signal and a tilt signal from a multi-wavelength measurement. Scanning electron microscopy (SEM), to the best of our knowledge, instead detects the two signals as one, without separating them. The overlay and tilt signals identified by IDM can be combined in order to match SEM.
- The relative amount of overlay and tilt carried by the IDM signal shows a monotonic and continuous wavelength dependency.
These findings increase the understanding of the delta IDM-to-SEM method, improving the matching between the two. The separation of overlay and tilt makes it possible to distinguish which part of the process is causing a given fingerprint, as tilt is driven purely by non-litho processes. In addition, the combination of overlay and tilt metrology allows improved correlation of the detected AEI signal to yield, and the definition of KPIs for a smaller MTD fingerprint. Finally, IDM offers the possibility of keeping the throughput benefits of optical metrology while overcoming the robustness challenges.
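To illustrate the idea of separating overlay and tilt from a multi-wavelength signal, the hedged sketch below splits the two components by least squares, assuming (hypothetically) that each wavelength responds to overlay and tilt with known, pre-calibrated sensitivities. The sensitivity values, function names, and data are illustrative only and not taken from the paper.

```python
import numpy as np

def separate_overlay_and_tilt(signal, ov_sens, tilt_sens):
    """Least-squares split of a multi-wavelength asymmetry signal into
    an overlay part and a tilt part (illustrative sketch, not the
    actual IDM algorithm).

    signal    : measured asymmetry per wavelength, shape (n_wl,)
    ov_sens   : calibrated overlay sensitivity per wavelength, shape (n_wl,)
    tilt_sens : calibrated tilt sensitivity per wavelength, shape (n_wl,)
    """
    A = np.column_stack([ov_sens, tilt_sens])          # design matrix
    (overlay, tilt), *_ = np.linalg.lstsq(A, signal, rcond=None)
    return overlay, tilt

# Hypothetical example: five wavelengths with distinct sensitivities
ov_sens = np.array([1.0, 0.9, 0.7, 0.5, 0.3])
tilt_sens = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
true_ov, true_tilt = 1.5, -0.8                          # nm, a.u.
signal = ov_sens * true_ov + tilt_sens * true_tilt
print(separate_overlay_and_tilt(signal, ov_sens, tilt_sens))
```

A combined overlay-plus-tilt number, comparable to what SEM reports, could then be formed by re-summing the two separated components.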
Multilayer stack height in 3DNAND has reached the limit of the aspect ratio that etch technologies can cost-effectively achieve. The solution to achieve further bit density scaling is to build the stack in two tiers, each etched separately. While lowering the requirements on etch aspect ratio, stacking two tiers introduces a critical overlay at the interface between the stacks. Due to the height of each stack, stress- or etch-induced tilt in the channel holes is translated into overlay. Characterizing and controlling the resulting complex overlay fingerprints requires dense and frequent overlay metrology. The familiar electron beam metrology after etch-back (DECAP) is destructive and therefore too slow and expensive for frequent measurements. This paper will introduce a fast, accurate & robust data-driven method for In Device Overlay Metrology (IDM) on etched 3DNAND devices by making use of specially designed recipe setup targets. Also, potential applications for process control improvement will be demonstrated.
In this work a novel machine learning algorithm is used to calculate the after-etch overlay of the memory holes in a 3DNAND device based on OCD metrology with a YieldStar S1375. It is shown that the method can distinguish the overlay signals from the process-induced signals in the acquired pupil image and therefore enables an overlay metrology approach that is highly robust to process variations. This metrology data is used to characterize and correct the process-induced intra-die stress and the DUV scanner application fingerprint.
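The abstract does not disclose the algorithm itself; as a hedged sketch of the general idea of data-driven overlay extraction from pupil images, the snippet below trains a simple regression from pupil pixel intensities to reference overlay values on setup targets. The model choice (PLS), variable names, and data are assumptions for illustration, not the actual algorithm.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.cross_decomposition import PLSRegression

# Hypothetical training data: flattened pupil images from recipe-setup
# targets with known (reference) overlay values.
rng = np.random.default_rng(0)
n_targets, n_pixels = 200, 1024
pupils = rng.normal(size=(n_targets, n_pixels))        # placeholder pupils
overlay_ref = rng.normal(scale=2.0, size=n_targets)    # nm, placeholder

# A low-rank linear model (PLS) is one simple way to pick out the
# overlay-correlated directions in the pupil while ignoring process
# variation that does not correlate with overlay.
model = make_pipeline(StandardScaler(), PLSRegression(n_components=3))
model.fit(pupils, overlay_ref)

new_pupil = rng.normal(size=(1, n_pixels))
print(model.predict(new_pupil))                         # estimated overlay (nm)
```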
The ever increasing need for tighter on-product overlay (OPO), as well as enhanced accuracy in overlay metrology and methodology, is driving the semiconductor industry's technologists to innovate new approaches to OPO measurements. In High Volume Manufacturing (HVM) fabs, it is often critical to strive for both accuracy and robustness. Robustness, in particular, can be challenging in metrology since overlay targets can be impacted by the proximity of other structures next to the overlay target (asymmetric effects), as well as by symmetric stack changes such as photoresist height variations. Both symmetric and asymmetric contributors affect robustness. Furthermore, tweaking or optimizing wafer processing parameters for maximum yield may have an adverse effect on physical target integrity. As a result, measuring and monitoring physical changes or process abnormalities/artefacts in terms of new Key Performance Indicators (KPIs) is crucial for the end goal of minimizing true in-die overlay of the integrated circuits (ICs). IC manufacturing fabs have often relied on CD-SEM in the past to capture true in-die overlay. Due to the destructive and intrusive nature of CD-SEM on certain materials, it is desirable to characterize asymmetry effects for overlay targets via inline KPIs utilizing YieldStar (YS) metrology tools. These KPIs can also be integrated as part of micro-diffraction-based overlay (μDBO) target evaluation and selection for the final recipe flow. In this publication, the Holistic Metrology Qualification (HMQ) flow was extended to account for process-induced (asymmetric) effects such as Grating Imbalance (GI) and Bottom Grating Asymmetry (BGA). Local GI typically contributes to the intrafield OPO, whereas BGA typically impacts the interfield OPO, predominantly at the wafer edge. Stack height variations strongly impact overlay metrology accuracy, in particular in a multi-layer Litho-Etch Litho-Etch (LELE) overlay control scheme. Introducing a GI-impact-on-overlay (in nm) KPI check quantifies the grating imbalance impact on overlay, whereas optimizing for accuracy using self-reference captures the bottom grating asymmetry effect. Measuring BGA after each process step before exposure of the top grating helps to identify which specific step introduces the asymmetry in the bottom grating. By applying this set of KPIs to a BEOL LELE overlay scheme, we can enhance the robustness of recipe and target selection. Furthermore, these KPIs can be utilized to highlight process and equipment abnormalities. In this work, we also quantified OPO results with a self-contained methodology called the Triangle Method. This method can be utilized for LELE layers with a common target and reference. This allows validating general μDBO accuracy, hence reducing the need for CD-SEM verification.
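One plausible reading of the Triangle Method, offered here purely as an assumption since the abstract does not spell out the formula, is a closure check over three overlay measurements that share a common reference: if LE1-to-reference, LE2-to-reference, and LE2-to-LE1 overlay are all measured, their closure should be zero for accurate metrology, and the residual bounds the combined measurement inaccuracy without needing CD-SEM.

```python
import numpy as np

def triangle_closure(ov_le1_ref, ov_le2_ref, ov_le2_le1):
    """Closure error of three overlay measurements sharing a common
    reference grating: ideally OV(LE2->LE1) = OV(LE2->ref) - OV(LE1->ref).
    A nonzero result flags measurement inaccuracy. Illustrative only."""
    return ov_le2_le1 - (ov_le2_ref - ov_le1_ref)

# Hypothetical per-site values in nm
ov_le1_ref = np.array([1.0, -0.4, 0.2])
ov_le2_ref = np.array([2.5,  0.1, 1.0])
ov_le2_le1 = np.array([1.6,  0.6, 0.7])
print(triangle_closure(ov_le1_ref, ov_le2_ref, ov_le2_le1))
```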
Overlay metrology setup today faces a continuously changing landscape of process steps. During Diffraction Based Overlay (DBO) metrology setup, many different metrology target designs are evaluated in order to cover the full process window. The standard method for overlay metrology setup consists of single-wafer optimization in which the performance of all available metrology targets is evaluated. Without external reference data or multi-wafer measurements, it is hard to predict the metrology accuracy and robustness against the process variations which naturally occur from wafer to wafer and lot to lot. In this paper, the capabilities of the Holistic Metrology Qualification (HMQ) setup flow are outlined, in particular with respect to overlay metrology accuracy and process robustness. The significance of robustness and its impact on overlay measurements is discussed using multiple examples. Measurement differences caused by slight stack variations across the target area, called grating imbalance, are shown to cause significant errors in the overlay calculation if the recipe and target have not been selected properly. To this end, an overlay sensitivity check on perturbations of the measurement stack is presented as an improvement of the overlay metrology setup flow. An extensive analysis of Key Performance Indicators (KPIs) from HMQ recipe optimization is performed on μDBO measurements of product wafers. The key parameters describing the sensitivity to perturbations of the measurement stack are based on an intra-target analysis. Using advanced image analysis, which is only possible for the image-plane detection of μDBO as opposed to the pupil-plane detection of DBO, the process robustness performance of a recipe can be determined. Intra-target analysis can be applied to a wide range of applications, independent of layers and devices.
On-product overlay requirements are becoming more challenging with each new technology node due to the continued decrease of device dimensions and process tolerances. Therefore, current and future technology nodes require demanding metrology capabilities such as target designs that are robust to process variations and high overlay measurement density (e.g. for higher order process corrections) to enable advanced process control solutions. The impact of advanced control solutions based on YieldStar overlay data is presented in this paper. Multi-patterning techniques are applied to critical layers, leading to additional overlay measurement demands. The use of 1D process steps results in the need for overlay measurements relative to more than one layer. Dealing with the increased number of overlay measurements while maintaining high measurement density and metrology accuracy presents a challenge for high volume manufacturing (HVM). These challenges are addressed by the capability to measure multi-layer targets with the recently introduced YieldStar metrology tool, the YS350. On-product overlay results for such multi-layer and standard targets are presented, including measurement stability performance.
Overlay control is one of the most critical areas in advanced semiconductor processing. Maintaining optimal product disposition and control requires high quality data as an input. Outliers can contaminate lot statistics and negatively impact lot disposition and feedback control. Advanced outlier removal methods have been developed to minimize their impact on overlay data processing. Rejection methods in use today are generally based on metrology quality metrics, raw data statistics and/or residual data statistics. Shortcomings of typical methods include the inability to detect multiple outliers as well as the unnecessary rejection of valid data. As the semiconductor industry adopts high-order overlay modeling techniques, outlier rejection becomes even more important than it is for linear modeling. In this paper we discuss the use of robust regression methods to eliminate outliers more accurately. We show the results of an extensive simulation study, as well as a case study with data from a semiconductor manufacturer.
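As a hedged illustration of the kind of robust regression referred to here (the specific estimator used in the paper is not stated in this abstract), the sketch below fits a simple linear overlay model with a Huber M-estimator so that a few gross outliers barely influence the fitted correctables. All names and numbers are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(1)

# Hypothetical overlay data: x, y site positions and measured x-overlay,
# generated from a simple linear (translation/scale/rotation) model.
x = rng.uniform(-100, 100, 80)          # mm
y = rng.uniform(-100, 100, 80)          # mm
dx = 2.0 + 0.01 * x - 0.005 * y + rng.normal(scale=0.3, size=80)  # nm
dx[:3] += 25.0                          # inject three gross outliers

X = np.column_stack([x, y])             # linear model terms
ols = LinearRegression().fit(X, dx)     # ordinary least squares
rob = HuberRegressor().fit(X, dx)       # robust M-estimation (Huber loss)

print("OLS   coefficients:", ols.intercept_, ols.coef_)
print("Huber coefficients:", rob.intercept_, rob.coef_)
```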
A primary concern when selecting an overlay sampling plan is the balance between accuracy and throughput. Two significant inflections in the semiconductor industry require even more careful sampling consideration: the transition from linear to high order overlay control, and the transition to dual patterning lithography (DPL) processes. To address the sampling challenges, an analysis tool in KT-Analyzer has been developed to enable quantitative evaluation of sampling schemes for both stage-grid and within-field analysis. Our previous studies indicated (1) the need for fully automated solutions that take individual interpretation out of the optimization process, and (2) the need for improved algorithms for this automation; both of these are described here.
Overlay continues to be one of the key challenges for lithography in advanced semiconductor manufacturing. It becomes even more challenging with the continued shrinking of the device node. Some low-k1 techniques, such as Double Exposure and Double Patterning, also introduce additional loss of overlay margin because the single-layer pattern is created from more than one exposure. Therefore, the overlay between two exposures requires a very tight specification.
Mask registration is one of the major contributors to wafer overlay, especially field-related overlay. We investigated mask registration and wafer overlay by co-analyzing the mask data and the wafer overlay data. To achieve accurate, cohesive results, we introduced a combined metrology mark which can be used for both mask registration measurement and wafer overlay measurement. The coincidence of the two metrology marks makes it possible to subtract the mask signature from wafer overlay without the loss of accuracy, caused by the physical distance between measurement marks, that arises when two different marks are used for the two metrologies. Therefore, it is possible to extract pure scanner-related signatures and to analyze them in detail in order to enable root cause analysis and ultimately drive higher wafer yield. We determined the exact mask registration error in order to decompose wafer overlay into mask, scanner, process and metrology components. We also studied the impact of pellicle mounting by comparing mask registration measurements pre- and post-pellicle mounting.
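As a hedged sketch of the decomposition step: a mask registration error measured at reticle scale contributes to wafer overlay reduced by the scanner demagnification (4x on a typical scanner), so subtracting that contribution at matched mark positions leaves the scanner/process/metrology remainder. The 4x factor and all variable names below are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

REDUCTION = 4.0  # typical scanner demagnification (assumption)

def remove_mask_signature(wafer_overlay_nm, mask_registration_nm):
    """Subtract the mask-registration contribution (reticle scale / 4)
    from wafer overlay measured on the combined marks, leaving the
    scanner + process + metrology remainder. Illustrative only."""
    return wafer_overlay_nm - mask_registration_nm / REDUCTION

# Hypothetical values at one combined mark (current minus prior layer):
wafer_ov = np.array([3.2, -1.1])      # nm, (dx, dy) wafer overlay
mask_reg = np.array([6.0, -2.0])      # nm at reticle scale, delta of the two masks
print(remove_mask_signature(wafer_ov, mask_reg))   # remainder in nm
```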
We have developed a new scheme of process control combining a CD metrology system and an exposure tool. A new model based on neural networks has been created in KLA-Tencor's "KT Analyzer" which calculates the dose and focus errors simultaneously from CD parameters, such as mid-CD and height, measured by a scatterometry (OCD) tool. The accuracy of this new model was confirmed by experiment. Nikon's "CDU Master" then calculated the dose and focus control parameters for each field from the dose and focus error data of a reference wafer provided by KT Analyzer. Using the corrected dose and focus parameters from CDU Master, we exposed wafers on an NSR-S610C (ArF immersion scanner) and measured the CDU on a KLA SCD100 (OCD tool). As a result, we confirmed that the CDU across the entire wafer can be improved by more than 60% (from 3.36nm (3σ) to 1.28nm (3σ)).
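The exact network and inputs are not reproduced in this abstract; the sketch below is an illustrative stand-in only: a small multi-output neural network mapping scatterometry profile parameters (mid-CD, resist height, sidewall angle) to dose and focus offsets, trained on synthetic FEM-like data. All numbers, feature choices, and hyperparameters are invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

# Hypothetical FEM-style training data: scatterometry profile parameters
# versus known dose/focus offsets (synthetic response surfaces).
n = 300
dose = rng.uniform(-1.5, 1.5, n)          # mJ/cm^2 offset
focus = rng.uniform(-60, 60, n)           # nm offset
mid_cd = 45 - 3.0 * dose - 0.002 * focus**2 + rng.normal(0, 0.1, n)
height = 90 - 0.005 * focus**2 + rng.normal(0, 0.2, n)
swa = 88 + 0.02 * focus + 0.5 * dose + rng.normal(0, 0.05, n)

X = np.column_stack([mid_cd, height, swa])
Y = np.column_stack([dose, focus])

# Small multi-output neural network mapping CD parameters -> (dose, focus)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=5000, random_state=0))
model.fit(X, Y)
print(model.predict(X[:3]))               # estimated (dose, focus) per site
```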
KEYWORDS: Semiconducting wafers, Overlay metrology, Process control, Time metrology, Monte Carlo methods, Optical lithography, Statistical methods, Control systems, Immersion lithography, Lithography
Overlay metrology and control have been critical to successful advanced microlithography for many years, and are taking on an even more important role as time goes on. Due to throughput constraints it is necessary to sample only a small subset of overlay metrology marks, and typical sample plans are static over time. Standard production monitoring and control involves measuring sufficient samples to calculate up to 6 linear correctables. As design rules shrink and processing becomes more complex, however, it is necessary to consider higher order models with additional degrees of freedom for control, fault detection, and disposition. This, in turn, requires a higher level of sampling and careful consideration of flyer removal. Due to throughput concerns, however, careful consideration is needed to establish a baseline sampling plan using rigorous statistical methods. This study focuses on establishing a production-worthy sampling plan for 3rd order modeling at the 3x nm node with immersion lithography, verification of its accuracy, and proof of the robustness of the sampling. In addition, we discuss the motivation for dynamic sampling as applied to higher order modeling.
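For context, the six linear correctables mentioned above are typically fitted per axis by least squares from the sampled marks, and a higher order model simply adds polynomial terms; the residual after the fit is what is used for disposition. The sketch below is an illustrative fit of a linear and a 3rd-order model with invented data and term choices, not the paper's specific model set.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-140, 140, 60)                      # site x (mm)
y = rng.uniform(-140, 140, 60)                      # site y (mm)
dx = rng.normal(0, 0.5, 60)                         # measured x-overlay (nm)

# Six-parameter linear model for (dx, dy) uses [1, x, y] per axis;
# only the x-direction is shown here.
A_lin = np.column_stack([np.ones_like(x), x, y])
c_lin, *_ = np.linalg.lstsq(A_lin, dx, rcond=None)

# Third-order polynomial model: add higher powers and cross terms.
A_ho = np.column_stack([np.ones_like(x), x, y,
                        x**2, x*y, y**2,
                        x**3, x**2*y, x*y**2, y**3])
c_ho, *_ = np.linalg.lstsq(A_ho, dx, rcond=None)

print("linear residual 3sigma   :", 3 * np.std(dx - A_lin @ c_lin))
print("3rd-order residual 3sigma:", 3 * np.std(dx - A_ho @ c_ho))
```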
Overlay metrology and control have been critical to successful advanced microlithography for many years, and are taking on an even more important role as time goes on. Due to throughput constraints it is necessary to sample only a small subset of overlay metrology marks, and typical sample plans are static over time. Standard production monitoring and control involves measuring sufficient samples to calculate up to 6 linear correctables. As design rules shrink and processing becomes more complex, however, it is necessary to consider higher order modeled terms for control, fault detection, and disposition. This, in turn, requires a higher level of sampling. Due to throughput concerns, however, careful consideration is needed to establish a baseline sampling level, and higher levels of sampling can be considered on an exception basis driven by automated trigger mechanisms. The goal is improved scanner control and lithographic cost of ownership. This study addresses tools for establishing baseline sampling as well as the motivation and initial results for dynamic sampling as applied to higher order modeling.
KEYWORDS: Semiconducting wafers, Optical alignment, Overlay metrology, Data modeling, Control systems, Critical dimension metrology, Semiconductors, Process control, Lithography, Nonlinear control
Overlay requirements for semiconductor devices are tightening faster than anticipated. Overlay is becoming much harder to control with current methods, and therefore novel techniques are needed. In this paper, we present our investigation methods for high order control and the candidates for improvement. The paper presents a study of each component of high order control. High order correction is one component, and several correction methods were compared in this study. High order alignment, as opposed to the conventional linear alignment model, is another important component of high order control. Alignment and overlay measurement sampling decisions become a more critical issue for sampling efficiency and accuracy, so optimal sampling for high order control was also studied. Building on these studies, various applications of optimal high order control have been investigated as well. This study presents the general approach for high order control with theory and actual experimental data.
The merits of hyper-NA imaging using 193nm exposure wavelength with water immersion for the 45nm node are clear. Scanner focus and dose control keeps improving to allow manufacturing with small DOF in immersion lithography. However, other process parameters can affect focus and dose control, and a real-time monitoring capability to detect local focus and exposure conditions on production wafers is required. In this paper we evaluated a focus-exposure monitor technique based on Spectroscopic Critical Dimension (SCD) metrology, following the promising results obtained by Kelvin Hung et al. [1]. The key attributes of this technique are its implementation on standard production wafers, its high sensitivity to pattern profile modifications, and the unique capability of spectroscopic ellipsometry to provide all the information needed to decouple the effects on pattern formation coming from process variations of Advanced Patterning Films (APF) [2], widely adopted for 65/45nm patterning, from coating variations and, finally, from the pure scanner imaging contributors such as focus and exposure. We present the characterization of this technique for two critical layers, active and contacts, of a 45nm-technology non-volatile memory device.
The overlay control budget for the 32nm technology node will be 5.7nm according to the ITRS. The overlay metrology budget is typically 1/10 of the overlay control budget, resulting in an overlay metrology total measurement uncertainty (TMU) requirement of 0.57nm for the most challenging use cases of the 32nm node. The current state-of-the-art imaging overlay metrology technology does not meet this strict requirement, and further technology development is required to bring it to this level. In this work we present the results of a study of an alternative technology for overlay metrology: differential signal scatterometry overlay (SCOL). Theoretical considerations show that overlay technology based on differential signal scatterometry has inherent advantages which will allow it to achieve the 32nm technology node requirements and go beyond them. We present results of simulations of the expected accuracy associated with a variety of scatterometry overlay target designs. We also present our first experimental results of scatterometry overlay measurements, comparing this technology with the standard imaging overlay metrology technology. In particular, we present performance results (precision and tool induced shift) and address the issue of accuracy of scatterometry overlay. We show that with the appropriate target design and algorithms, scatterometry overlay achieves the accuracy required for future technology nodes.
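As background to the differential signal idea (a generic first-order picture, not necessarily the exact algorithm studied in the paper): two grating-over-grating pads are printed with programmed offsets +d and -d, the measured asymmetry of each pad is assumed proportional to its total shift, and the overlay follows from the ratio, so the unknown proportionality constant cancels.

```python
def scol_overlay(asym_plus, asym_minus, d):
    """First-order differential scatterometry overlay estimate.

    asym_plus  : measured signal asymmetry of the pad with bias +d
    asym_minus : measured signal asymmetry of the pad with bias -d
    d          : programmed bias (nm)

    Assumes asymmetry = K * (overlay + bias); then
    overlay = d * (A+ + A-) / (A+ - A-), independent of K.
    """
    return d * (asym_plus + asym_minus) / (asym_plus - asym_minus)

# Hypothetical check: K = 0.07 per nm, true overlay 1.8 nm, bias 20 nm
K, ov, d = 0.07, 1.8, 20.0
print(scol_overlay(K * (ov + d), K * (ov - d), d))   # recovers 1.8
```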
Layer-to-layer alignment in optical lithography is controlled by feedback of scanner correctables provided by analysis of in-line overlay metrology data from product wafers. There is mounting evidence that the "high order" field dependence, i.e. the components which contribute to residuals in a linear model of the overlay across the scanner field, will likely need to be measured in production scenarios at the 45 and 32 nm half-pitch nodes. This is particularly true in immersion lithography, where thermal issues are likely to impact intrafield overlay, and in double-patterning scenarios, where the high order reticle feature placement error contribution to the in-die overlay is doubled. Production monitoring of in-field overlay must be achieved without compromising metrology performance in order to enable sample plans with viable cost of ownership models. In this publication we show new results of in-die metrology which indicate that metrology performance comparable with the standard scribe-line metrology required for the 45 nm node is achievable with significantly reduced target size. Results from dry versus immersion lithography on poly-to-active 45 nm design rule process layers indicate that a significant reduction in model residuals can be achieved when high order intrafield overlay models are enabled.
As Moore's Law drives CDs smaller and smaller, the overlay budget is shrinking rapidly. Furthermore, the cost of advanced lithography tools prohibits using the latest and greatest scanners on non-critical layers, resulting in different layers being exposed with different tools, a practice commonly known as 'mix and match.' Since each tool has its unique signature, mix and match becomes a source of high order overlay errors. Scanner alignment performance can be degraded by a factor of 2 in mix and match compared to single-tool overlay operation. In a production environment where scanners from different vendors are mixed, errors will be even more significant. Mix and match may also apply to a single scanner when multiple illumination modes are used to expose critical levels, because different illuminations have a different impact on the scanner aberration fingerprint. The semiconductor technology roadmap has reached a point where such errors are no longer negligible.
Mix and match overlay errors consist of a scanner stage grid component, a scanner field distortion component, and process-induced wafer distortion. The scanner components are largely systematic, so they can be characterized on non-product wafers using a dedicated reticle. Since these components are known to drift over time, it becomes necessary to monitor them periodically, per scanner and per illumination.
In this paper, we outline a methodology for automating the characterization of mix and match errors, and a control system for real-time correction.
In this publication, the contributors to in-field overlay metrology uncertainty have been parsed and quantified on a back end process and compared with results from a previous front end study [1]. Particular focus is placed on the unmodeled systematics, i.e. the components which contribute to residuals in a linear model after removal of random errors. These are the contributors which are often the most challenging to quantify and are suspected to be significant in the model residuals. The results show that in both back end and front end processes, the unmodeled systematics are the dominant residual contributor, accounting for 60 to 70% of the variance, even when subsequent exposures are on the same scanner. A higher order overlay model analysis demonstrates that this element of the residuals can be further dissected into correctable and non-correctable high order systematics. A preliminary sampling analysis demonstrates a major opportunity to improve the accuracy of lot dispositioning parameters by transitioning to denser sample plans compared with standard practices. Field stability is defined as a metric to quantify the field-to-field variability of the intrafield correctables.
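A hedged sketch of one way such a field stability metric could be computed (the paper's exact definition may differ): fit the intrafield correctables separately for each field and report the field-to-field spread of each fitted term. The term set, data, and the choice of a standard deviation as the spread measure are assumptions for illustration.

```python
import numpy as np

def field_stability(field_data):
    """For each field, fit intrafield linear correctables (translation,
    magnification, rotation proxy) to the x-overlay, then return the
    field-to-field standard deviation of each fitted term.
    field_data: list of (xf, yf, dx) arrays with intrafield coordinates
    in mm and overlay in nm. Illustrative sketch only."""
    coeffs = []
    for xf, yf, dx in field_data:
        A = np.column_stack([np.ones_like(xf), xf, yf])
        c, *_ = np.linalg.lstsq(A, dx, rcond=None)
        coeffs.append(c)
    return np.std(np.array(coeffs), axis=0)   # spread of [Tx, Mag, Rot]

rng = np.random.default_rng(4)
fields = [(rng.uniform(-13, 13, 9), rng.uniform(-16, 16, 9),
           rng.normal(0, 0.4, 9)) for _ in range(20)]
print(field_stability(fields))
```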
KEYWORDS: Overlay metrology, Metrology, Image segmentation, Front end of line, Semiconducting wafers, Chemical mechanical planarization, Metals, Lithography, Scanning electron microscopy, Scanners
Accurate and precise overlay metrology is a critical requirement for achieving high product yield in microelectronic manufacturing. Meeting the tighter overlay measurement error requirements for 90nm technology and beyond is a dramatic challenge for optical metrology techniques using only conventional overlay marks like Bar-in-Bar (BiB) or Frame-in-Frame (FiF). New deficiencies affecting traditional overlay marks become evident as microlithography processes are developed for each new design rule node. The most serious problems are total measurement uncertainty, CMP process robustness, and device correlation. In this paper we review the superior performance of grating-based AIM marks in providing a complete solution for controlling lithography overlay errors in new-generation devices. Examples of successful application of AIM technology to FEOL and Cu-BEOL process steps in the manufacturing of advanced non-volatile memory devices are illustrated. An additional advantage of adopting AIM marks is that the significant reduction of target noise versus conventional marks revealed systematic differences within the lithography cluster which were previously obscured, offering a new tool to optimize litho cells. We also demonstrate that the AIM target architecture enables high performance metrology with design-rule-segmented targets, a prerequisite for overlay marks that are fully compatible with design-rule-sensitive process steps.
In this publication, the contributors to in-field overlay metrology uncertainty have been parsed and quantified in a specific case study. Particular focus is placed on the unmodeled systematics, i.e. the components which contribute to residuals in a linear model after removal of random errors. These are the contributors which are often the most challenging to quantify and are suspected to be significant in the model residuals. The results show that even in a relatively "clean" front end process, the unmodeled systematics are the dominant residual contributor, accounting for 60 to 70% of the variance. Given the above results, new sampling and modeling methods are proposed which have the potential to improve the accuracy of modeled correctibles and lot dispositioning parameters.
Overlay metrology for production line monitoring and advanced process control (APC) has been dominated by 4-corner box-in-box (BiB) methods for many years. As we proceed along the ITRS roadmap with the development of 65 nm technologies and beyond, it becomes apparent that current overlay methodologies are becoming inadequate for the stringent requirements that lie ahead. It is already apparent that kerf metrology of large-scale BiB structures does not correlate well with in-chip design-rule features. The recent introduction of the Advanced Imaging Metrology (AIM) target, utilizing increased information content and advanced design and process compatibility, has demonstrated significant improvements in precision and overlay mark fidelity (OMF) in advanced processes. This paper compares methodologies and strategies for addressing cross-field variation of overlay and pattern placement issues. We compare the trade-offs of run-time intra-field sampling plans versus the use of off-line lithography characterization and advanced modeling analysis, and propose new methodologies to address advanced overlay metrology and control.
We have developed a method for calculating the statistical effects of spatial noise on the overlay measurement extracted from a given overlay target. The method has been applied to two kinds of overlay targets on three process layers, and the new metric, Target Noise, has been shown to correlate well to the random component of Overlay Mark Fidelity. A significant difference in terms of robustness has been observed between AIM targets and conventional Frame-in-Frame targets. The results fit well into the spatial noise hierarchy presented in this paper.
An improved overlay mark design was applied in high end semiconductor manufacturing to increase the total overlay measurement accuracy with respect to the standard box-in-box target. A comprehensive study has been conducted on the basis of selected front-end and back-end DRAM layers (short loop) to characterize contributors to overlay error. This analysis is necessary to keep within shrinking overlay budget requirements.
In this publication we introduce a new metric for the process robustness of overlay metrology in microelectronic manufacturing. By straightforward statistical analysis of overlay metrology measurements on an array of adjacent, nominally identical overlay targets, the Overlay Mark Fidelity (OMF) can be estimated. We present the results of such measurements and analysis on various marks, which were patterned using a DUV scanner. The same reticle set was used to pattern wafers on different process layers and process conditions. Appropriate statistical analysis facilitated the breakdown of the total OMF into a reticle-induced OMF component and a process-induced OMF component. We compare the OMF of traditional box-in-box overlay marks with that of new grating-based overlay marks and show that in all cases the grating marks are superior. The reticle-related OMF showed an improvement of 30% when using the new grating-based overlay mark. Furthermore, in a series of wafers run through an STI process with different Chemical Mechanical Polish (CMP) times, the random component of the OMF of the new grating-based overlay mark was observed to be 40% lower and 50% less sensitive to process variation compared with box-in-box marks. These two observations are interpreted as improved process robustness of the grating mark over box-in-box, specifically in terms of reduced site-by-site variations and reduced wafer-to-wafer variations as process conditions change over time. Overlay Mark Fidelity, as defined in this publication, is a source of overlay metrology uncertainty which is statistically independent of the standard error contributors, i.e. precision, TIS variability, and tool-to-tool matching. Current overlay metrology budgeting practices do not take this into consideration when calculating total measurement uncertainty (TMU). It is proposed that this be reconsidered, given the tightness of overlay and overlay metrology budgets at the 70 nm design rule node and below.
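A hedged sketch of one plausible way to split OMF into reticle-induced and process-induced parts (the paper's exact statistics are not reproduced here): since the same reticle set is used on every wafer, the per-site mean across wafers captures the reticle-induced (systematic) part, and the remaining per-wafer scatter captures the process-induced (random) part. The 3-sigma summary and all data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_wafers, n_sites = 12, 25            # array of nominally identical targets

# Hypothetical overlay readings: a fixed reticle signature per site plus
# wafer-dependent process noise.
reticle_sig = rng.normal(0, 0.6, n_sites)                 # nm, systematic
data = reticle_sig + rng.normal(0, 0.4, (n_wafers, n_sites))

site_mean = data.mean(axis=0)         # reticle-induced OMF estimate per site
residual = data - site_mean           # process-induced (random) part

omf_reticle = 3 * np.std(site_mean)
omf_process = 3 * np.std(residual)
print(f"reticle-induced OMF ~ {omf_reticle:.2f} nm, "
      f"process-induced OMF ~ {omf_process:.2f} nm")
```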
As overlay budgets shrink with design rules, the importance of overlay metrology accuracy increases. We have investigated the overlay accuracy of a 0.18µm design rule copper dual-damascene process by comparing the overlay metrology results at the after-develop (DI) and after-etch (FI) stages. The comparisons were done on five process layers on production wafers, while ensuring that the DI and FI measurements were always done on the same wafer. In addition, we measured the in-die overlay on one of the process layers (poly gate) using a CD-SEM, and compared the results to the optical overlay metrology in the scribe line. We found that a serious limitation to in-die overlay calibration was the lack of suitable structures measurable by CD-SEM. We present quantitative results from our comparisons, as well as a recommendation for incorporating CD-SEM-measurable structures in the chip area in future reticle designs.
While overlay precision has received much focus in the past, overlay accuracy has become more significant with shrinking process budgets. One component of accuracy is the difference between pre-etch (DI) and post-etch (FI) overlay, which is a function of wafer processing parameters. We investigated a specific case of overlay between the metal and contact layers of a 0.16 µm SRAM process. This layer was chosen because a significant amount of wafer contraction was observed between DI and FI, resulting in as much as 30nm of DI-FI overlay difference. The purpose of the study was to characterize the systematic DI-FI differences and gain understanding of the wafer processing parameters that affect them. A designed experiment showed that certain overlay mark widths were less sensitive to processing parameters. AFM profiles of the prior-level overlay marks identified issues with mark widths of 1.0 µm or smaller. By performing localized etches on the inner versus outer marks of the overlay targets, it was noted that the majority of the wafer contraction was induced by etching the outer (prior-level) mark. Production measurements at photo and etch showed the wafer contraction to be fairly stable over a one-month timeframe and independent of device and exposure tool, though large shifts in wafer contraction were noted over a nine-month period. The methods used in this study can be helpful in understanding other DI-FI processing issues.