LiteBIRD, the next-generation cosmic microwave background (CMB) experiment, aims for a launch in Japan’s fiscal year 2032, marking a major advancement in the exploration of primordial cosmology and fundamental physics. Orbiting the Sun-Earth Lagrangian point L2, this JAXA-led strategic L-class mission will conduct a comprehensive mapping of the CMB polarization across the entire sky. During its 3-year mission, LiteBIRD will employ three telescopes covering 15 frequency bands from 34 to 448 GHz, targeting a sensitivity of 2.2 μK-arcmin and a resolution of 0.5° at 100 GHz. Its primary goal is to measure the tensor-to-scalar ratio r with an uncertainty δr = 0.001, including systematic errors and margin. If r ≥ 0.01, LiteBIRD expects to achieve a > 5σ detection in the ℓ = 2–10 and ℓ = 11–200 ranges separately, providing crucial insight into the early Universe. We describe LiteBIRD’s scientific objectives, the application of systems engineering to mission requirements, the anticipated scientific impact, and the operations and scanning strategies vital to minimizing systematic effects. We also highlight LiteBIRD’s synergies with concurrent CMB projects.
Euclid is a European Space Agency (ESA) wide-field space mission dedicated to the high-precision study of dark energy and dark matter. In July 2023 a SpaceX Falcon 9 launch vehicle put the spacecraft into its target orbit, located 1.5 million kilometers from Earth, for a nominal lifetime of 6.5 years. The survey will be realized through a wide-field telescope and two instruments: a visible imager (VIS) and a Near Infrared Spectrometer and Photometer (NISP). NISP is a state-of-the-art instrument composed of many subsystems, including an optomechanical assembly, cryogenic mechanisms, and active thermal control. The Instrument Control Unit (ICU) is interfaced with the spacecraft and manages the commanding and housekeeping production, while the high-performance Data Processing Unit manages more than 200 Gbit of compressed data acquired daily during the nominal survey. To achieve the demanding performance necessary to meet the mission’s scientific goals, NISP requires periodic in-flight calibrations, instrument parameter monitoring, and careful control of systematic effects. The high stability required implies that operations are coordinated and synchronized with high precision between the two instruments and the platform. Careful planning of commanding sequences, look-ahead scheduling, and forecasting of instrument monitoring is needed, with greater complexity than in previous survey missions. Furthermore, NISP is operated in different environments and configurations during development, verification, commissioning, and nominal operations. This paper presents an overview of the NISP instrument operations at the beginning of routine observations. The necessary tools, workflows, and organizational structures are described. Finally, we show examples of how instrument monitoring was implemented in flight during the crucial commissioning phase, the effect of intense Solar activity on the transmission of onboard data, and how the Instrument Operations Team (IOT) successfully addressed this issue.
R. Laureijs, R. Vavrek, G. Racca, R. Kohley, P. Ferruit, V. Pettorino, T. Bönke, A. Calvi, L. Gaspar Venancio, L. Campos, E. Maiorano, O. Piersanti, S. Prezelus, U. Ragnit, P. Rosato, C. Rosso, H. Rozemeijer, A. Short, P. Strada, D. Stramaccioni, M. Szafraniec, B. Altieri, G. Buenadicha, X. Dupac, P. Gómez Cambronero, K. Henares Vilaboa, C. Hernandez de la Torre, J. Hoar, M. Lopez-Caniego Alcarria, P. Marcos Arenal, J. Martin Fleitas, M. Miluzio, A. Mora, S. Nieto, R. Perez Bonilla, P. Teodoro Idiago, F. Cordero, J. Mendes, F. Renk, A. Rudolph, M. Schmidt, J. Schwartz, Y. Mellier, H. Aussel, M. Berthé, P. Casenove, M. Cropper, J. Cuillandre, J. Dinis, A. Gregorio, K. Kuijken, T. Maciaszek, L. Miller, R. Scaramella, M. Schirmer, I. Tereno, A. Zacchei, S. Awan, G. Candini, P. Liebing, R. Nakajima, S. Dusini, P. Battaglia, E. Medinaceli, C. Sirignano, I. Baldry, C. Baugh, F. Bernardeau, F. Castander, A. Cimatti, W. Gillard, L. Guzzo, H. Hoekstra, K. Jahnke, T. Kitching, E. Martin, J. Mohr, W. Percival, J. Rhodes
During its 6-year nominal mission, Euclid shall survey one third of the sky, enabling us to examine the spatial distributions of dark and luminous matter during the past 10 Gyr of cosmic history. The Euclid satellite was successfully launched on a SpaceX Falcon 9 launcher from Cape Canaveral on 1 July 2023 and is fully operational in a halo orbit around the second Sun-Earth Lagrange point. We present an overview of the expected and unexpected findings during the early phases of the mission, in the context of technological heritage and lessons learnt. The first months of the mission were dedicated to the commissioning of the spacecraft, telescope and instruments, followed by a phase to verify the scientific performance and to carry out the in-orbit calibrations. We report that the key enabling scientific elements, the 1.2-meter telescope and the two scientific instruments, a visual imager (VIS) and a near-infrared spectrometer and photometer (NISP), show an in-orbit performance in line with the expectations from ground tests. The scientific analysis of the observations from the Early Release Observations (ERO) program, carried out before the start of the nominal mission, showed sensitivities better than the pre-launch requirements. The nominal mission started in December 2023, and we allocated a 6-month early survey operations phase to closely monitor the performance of the sky survey. We conclude with an outlook of the activities for the remaining mission in light of the in-orbit performance.
Euclid, an ESA mission designed to characterise dark energy and dark matter, passed its Mission Critical Design Review (M-CDR) in November 2018. It was demonstrated that the project is ready to start integration and test of the main systems, and that it has the ability to fulfil its top-level mission requirements. In addition, based on the performances at M-CDR, the scientific community has verified that the science requirements can be achieved for the Weak Lensing and Galaxy Clustering dark energy probes, namely a dark energy Figure of Merit of 400 and a 2% accuracy in the growth factor exponent gamma. We present the status of the main elements of the Euclid mission in the light of the demanding optical performance that is the essential design driver to meet the scientific requirements. We include the space segment, comprising a service module and a payload module hosting the telescope and its two scientific instruments, and the ground segment, which encompasses the operational and science ground segments. The elements for the scientific success of the mission and for a timely release of the data are briefly presented: the processing and calibration of the data, and the design of the sky survey. Euclid is presently on schedule for a launch in September 2022.
Euclid is an ESA M-class mission to study the geometry and nature of the dark universe, slated for launch in mid-2022. NASA is participating in the mission through the contribution of the near-infrared detectors and associated electronics, the nomination of scientists for membership in the Euclid Consortium, and by establishing the Euclid NASA Science Center at IPAC (ENSCI) to support the US community. As part of ENSCI’s work, we will participate in the Euclid Science Ground Segment (SGS) and build and operate the US Science Data Center (SDC-US), which will be a node in the distributed data processing system for the mission. SDC-US is one of 10 data centers, and will contribute about 5% of the computing and data storage for the distributed system. We discuss lessons learned in developing a node in a distributed system. For example, there is a significant advantage to SDC-US development in sharing of knowledge, problem solving, and resource burden with other parts of the system. On the other hand, fitting into a system that is distributed geographically and relies on diverse computing environments results in added complexity in constructing SDC-US.
In this paper we discuss the latest developments of the STRIP instrument of the “Large Scale Polarization Explorer” (LSPE) experiment. LSPE is a novel project that combines ground-based (STRIP) and balloon-borne (SWIPE) polarization measurements of the microwave sky on large angular scales to attempt a detection of the “B-modes” of the Cosmic Microwave Background polarization. STRIP will observe approximately 25% of the Northern sky from the “Observatorio del Teide” in Tenerife, using an array of forty-nine coherent polarimeters at 43 GHz coupled to a 1.5 m fully rotating crossed-Dragone telescope. A second frequency channel with six elements at 95 GHz will be exploited as an atmospheric monitor. At present, most of the hardware of the STRIP instrument has been developed and tested at sub-system level. System-level characterization, starting in July 2018, will lead to STRIP being shipped and installed at the observation site by the end of the year. The on-site verification and calibration of the whole instrument will prepare STRIP for a 2-year campaign of observation of the CMB polarization.
KEYWORDS: Space operations, Galactic astronomy, Spectroscopy, Systems modeling, Databases, Point spread functions, Seaborgium, Data processing, Calibration, Telescopes
ESA's Dark Energy Mission Euclid will map the 3D matter distribution in our Universe using two Dark Energy probes: Weak Lensing (WL) and Galaxy Clustering (GC). The extreme accuracy required for both probes can only be achieved by observing from space in order to limit all observational biases in the measurements of the tracer galaxies. Weak Lensing requires an extremely high precision measurement of galaxy shapes realised with the Visual Imager (VIS) as well as photometric redshift measurements using near-infrared photometry provided by the Near Infrared Spectrometer Photometer (NISP). Galaxy Clustering requires accurate redshifts (Δz/(z+1)<0.1%) of galaxies to be obtained by the NISP Spectrometer.
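As an indicative reading of the Δz/(z+1) < 0.1% requirement (our derivation, not taken from the abstract), the redshift accuracy translates directly into a relative wavelength accuracy for the NISP spectra:

```latex
% Hedged sketch: relation between redshift and wavelength accuracy.
% A redshift is measured from an emission line observed at \lambda_{\rm obs}
% with rest-frame wavelength \lambda_{\rm em}.
\begin{align}
  1 + z &= \frac{\lambda_{\rm obs}}{\lambda_{\rm em}}, \\
  \frac{\sigma_z}{1+z} &= \frac{\sigma_\lambda}{\lambda_{\rm obs}}
  \;\lesssim\; 10^{-3},
\end{align}
% i.e. the quoted \Delta z/(z+1) < 0.1\% requirement is equivalent to a
% per-mille relative wavelength accuracy on the measured line position.
```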
Performance requirements on the spacecraft, telescope assembly, scientific instruments and the ground data processing have been carefully budgeted to meet the demanding top-level science requirements. As part of the mission development, the verification of scientific performance requires mission-level end-to-end analyses in which the Euclid systems are modelled from the as-designed to the final as-built flight configurations. We present the plan to carry out end-to-end analyses coordinated by the ESA project team with the collaboration of the Euclid Consortium. The plan includes the definition of key performance parameters and their verification process, the identification of inputs and outputs, and the management of the applicable mission configurations in the parameter database.
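As a purely illustrative sketch of the kind of budget roll-up implied here (subsystem names and numbers are hypothetical, not values from the Euclid parameter database), independent error contributions can be combined and checked against a top-level allocation:

```python
# Illustrative sketch only: a toy performance-budget roll-up.
# Parameter names and numbers are hypothetical, not Euclid values.
import math

# Hypothetical contributions to an image-quality budget, in arbitrary units,
# keyed by the subsystem that owns the allocation.
contributions = {
    "telescope_wfe": 0.8,
    "aocs_pointing_jitter": 0.5,
    "detector_effects": 0.4,
    "ground_processing_residuals": 0.3,
}

requirement = 1.2    # hypothetical top-level allocation
margin_factor = 1.1  # hypothetical 10% system margin

def rss(values):
    """Root-sum-square combination of independent error contributions."""
    return math.sqrt(sum(v * v for v in values))

total = rss(contributions.values()) * margin_factor
print(f"budgeted total = {total:.3f}, requirement = {requirement}")
print("compliant" if total <= requirement else "NON-COMPLIANT")
```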
KEYWORDS: Data processing, Data storage, Software development, Algorithm development, System on a chip, Seaborgium, Prototyping, Space operations, Information technology, Lead
Euclid is an ESA mission aimed at understanding the nature of dark energy and dark matter by using simultaneously two probes (weak lensing and baryon acoustic oscillations). The mission will observe galaxies and clusters of galaxies out to z~2, in a wide extra-galactic survey covering 15,000 deg², plus a deep survey covering an area of 40 deg². The payload is composed of two instruments, an imager in the visible domain (VIS) and an imager-spectrometer (NISP) covering the near-infrared. The launch is planned in Q4 of 2020. The elements of the Euclid Science Ground Segment (SGS) are the Science Operations Centre (SOC), operated by ESA, and nine Science Data Centres (SDCs) in charge of data processing, provided by the Euclid Consortium (EC), formed by over 110 institutes spread over 15 countries. The SOC and the EC started a tight collaboration several years ago in order to design and develop a single, cost-efficient and truly integrated SGS. The distributed nature, the size of the data set, and the required accuracy of the results are the main challenges expected in the design and implementation of the SGS. In particular, the huge volume of data (not only Euclid data but also ground-based data) to be processed in the SDCs will require distributed storage to avoid data migration across SDCs. This paper describes the management challenges that the Euclid SGS is facing while dealing with such complexity. The main aspect is related to the organisation of a geographically distributed software development team: algorithms and code are developed in a large number of institutes, while data are actually processed at fewer centres (the national SDCs) where the operational computational infrastructures are maintained. The software produced for data handling, processing and analysis is built within a common development environment defined by the SGS System Team, common to the SOC and the EC SGS, which has already been active for several years. The code is built incrementally through different levels of maturity, going from prototypes (developed mainly by scientists) to production code (engineered and tested at the SDCs). A number of incremental challenges (infrastructure, data processing and integrated) have been included in the Euclid SGS test plan to verify the correctness and accuracy of the developed systems.
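A minimal sketch of the data-locality principle mentioned above, i.e. running each job at the SDC that already stores its input so that bulk data do not migrate between centres (the SDC names and catalogue layout are illustrative assumptions, not the actual SGS implementation):

```python
# Minimal sketch (not the actual Euclid SGS software) of data-locality
# dispatching: a job runs at the SDC that hosts its input data.
# The tile identifiers and SDC names below are illustrative assumptions.

data_location = {          # hypothetical catalogue: tile id -> hosting SDC
    "tile_0001": "SDC-IT",
    "tile_0002": "SDC-FR",
    "tile_0003": "SDC-US",
}

def dispatch(job_tile: str) -> str:
    """Return the SDC where the job should run (where its data live)."""
    try:
        return data_location[job_tile]
    except KeyError:
        raise ValueError(f"no SDC hosts {job_tile}; ingest the data first")

for tile in ("tile_0001", "tile_0003"):
    print(tile, "->", dispatch(tile))
```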
KEYWORDS: Data processing, Galactic astronomy, Space operations, Telescopes, Point spread functions, K band, Sensors, Image quality, Data archive systems, Calibration
Euclid is a space-based optical/near-infrared survey mission of the European Space Agency (ESA) to investigate the nature of dark energy, dark matter and gravity by observing the geometry of the Universe and the formation of structures over cosmological timescales. Euclid will use two probes of the signature of dark matter and energy: Weak gravitational Lensing, which requires the measurement of the shapes and photometric redshifts of distant galaxies, and Galaxy Clustering, based on the measurement of the 3-dimensional distribution of galaxies through their spectroscopic redshifts. The mission is scheduled for launch in 2020 and is designed for 6 years of nominal survey operations. The Euclid spacecraft is composed of a Service Module and a Payload Module. The Service Module comprises all the conventional spacecraft subsystems, the instruments' warm electronics units, the sun shield and the solar arrays. In particular, the Service Module provides the extremely challenging pointing accuracy required by the scientific objectives. The Payload Module consists of a 1.2 m three-mirror Korsch-type telescope and two instruments, the visible imager and the near-infrared spectro-photometer, both covering a large common field of view and enabling a survey of more than 35% of the entire sky. All sensor data are downlinked using K-band transmission and processed by a dedicated ground segment for science data processing. The Euclid data and catalogues will be made available to the public at the ESA Science Data Centre.
KEYWORDS: Space operations, Calibration, System on a chip, Sensors, Control systems, Satellites, Mathematical modeling, Visible radiation, Seaborgium, Data processing
Euclid is a future ESA mission mainly devoted to cosmology. Like WMAP and Planck, it is a survey mission, to be launched in 2019 and injected into an orbit far from the Earth, for a nominal lifetime of 7 years. Euclid has two instruments on board, the Visible Imager (VIS) and the Near-Infrared Spectro-Photometer (NISP). The NISP instrument includes cryogenic mechanisms, active thermal control and a high-performance Data Processing Unit, and requires periodic in-flight calibrations and instrument parameter monitoring. To fully exploit the capability of the NISP, careful control of systematic effects is required. From previous experiments, we have built the concept of an integrated instrument development and verification approach, in which the scientific, instrument and ground-segment expertise interact strongly from the early phases of the project. In particular, we discuss the tight integration of test and calibration activities with the Ground Segment, starting from early pre-launch verification activities. We report here the expertise acquired by the Euclid team in previous missions, citing the literature only for detailed reference, and indicate how it is applied in the Euclid mission framework.
S. Aiola, G. Amico, P. Battaglia, E. Battistelli, A. Baù, P. de Bernardis, M. Bersanelli, A. Boscaleri, F. Cavaliere, A. Coppolecchia, A. Cruciani, F. Cuttaia, A. D'Addabbo, G. D'Alessandro, S. De Gregori, F. Del Torto, M. De Petris, L. Fiorineschi, C. Franceschet, E. Franceschi, M. Gervasi, D. Goldie, A. Gregorio, V. Haynes, N. Krachmalnicoff, L. Lamagna, B. Maffei, D. Maino, S. Masi, A. Mennella, G. Morgante, F. Nati, M. W. Ng, L. Pagano, A. Passerini, O. Peverini, F. Piacentini, L. Piccirillo, G. Pisano, S. Ricciardi, P. Rissone, G. Romeo, M. Salatino, M. Sandri, A. Schillaci, L. Stringhetti, A. Tartari, R. Tascone, L. Terenzi, M. Tomasi, E. Tommasi, F. Villa, G. Virone, S. Withington, A. Zacchei, M. Zannoni
The LSPE is a balloon-borne mission aimed at measuring the polarization of the Cosmic Microwave Background (CMB) at large angular scales, and in particular at constraining the curl component of CMB polarization (B-modes) produced by tensor perturbations generated during cosmic inflation in the very early Universe. Its primary target is to improve the limit on the ratio of tensor to scalar perturbation amplitudes down to r = 0.03 at 99.7% confidence. A second target is to produce wide maps of the foreground polarization generated in our Galaxy by synchrotron emission and interstellar dust emission. These will be important to map Galactic magnetic fields and to study the properties of ionized gas and of diffuse interstellar dust in our Galaxy. The mission is optimized for large angular scales, with coarse angular resolution (around 1.5 degrees FWHM) and wide sky coverage (25% of the sky). The payload will fly in a circumpolar long-duration balloon mission during the polar night. Using the Earth as a giant solar shield, the instrument will spin in azimuth, observing a large fraction of the northern sky. The payload will host two instruments. An array of coherent polarimeters using cryogenic HEMT amplifiers will survey the sky at 43 and 90 GHz. An array of bolometric polarimeters, using large-throughput multi-mode bolometers and rotating Half Wave Plates (HWP), will survey the same sky region in three bands at 95, 145 and 245 GHz. The wide frequency coverage will allow optimal control of the polarized foregrounds, with comparable angular resolution at all frequencies.
M. Bersanelli, A. Mennella, G. Morgante, M. Zannoni, G. Addamo, A. Baschirotto, P. Battaglia, A. Baù, B. Cappellini, F. Cavaliere, F. Cuttaia, F. Del Torto, S. Donzelli, Z. Farooqui, M. Frailis, C. Franceschet, E. Franceschi, T. Gaier, S. Galeotta, M. Gervasi, A. Gregorio, P. Kangaslahti, N. Krachmalnicoff, C. Lawrence, G. Maggio, R. Mainini, D. Maino, N. Mandolesi, B. Paroli, A. Passerini, O. Peverini, S. Poli, S. Ricciardi, M. Rossetti, M. Sandri, M. Seiffert, L. Stringhetti, A. Tartari, R. Tascone, D. Tavagnacco, L. Terenzi, M. Tomasi, E. Tommasi, F. Villa, Gi. Virone, A. Zacchei
We discuss the design and expected performance of STRIP (STRatospheric Italian Polarimeter), an array of coherent receivers designed to fly on board the LSPE (Large Scale Polarization Explorer) balloon experiment. The STRIP focal plane array comprises 49 elements in Q band and 7 elements in W band, using cryogenic HEMT low-noise amplifiers and high-performance waveguide components. In operation, the array will be cooled to 20 K and placed in the focal plane of a ~0.6 meter telescope providing an angular resolution of ~1.5 degrees. The LSPE experiment aims at large-scale, high-sensitivity measurements of CMB polarization, with multi-frequency deep measurements to optimize component separation. The STRIP Q-band channel is crucial to accurately measure and remove the synchrotron polarized component, while the W-band channel, together with a bolometric channel at the same frequency, provides a crucial cross-check for systematic effects.
A. Mennella, B. Aja, E. Artal, M. Balasini, G. Baldan, P. Battaglia, T. Bernardino, M. Bersanelli, E. Blackhurst, L. Boschini, C. Burigana, R. Butler, B. Cappellini, F. Colombo, F. Cuttaia, O. D'Arcangelo, S. Donzelli, R. Davis, L. De La Fuente, F. Ferrari, L. Figini, S. Fogliani, C. Franceschet, E. Franceschi, T. Gaier, S. Galeotta, S. Garavaglia, A. Gregorio, M. Guerrini, R. Hoyland, N. Hughes, P. Jukkala, D. Kettle, M. Laaninen, P. Lapolla, D. Lawson, R. Leonardi, P. Leutenegger, G. Mari, P. Meinhold, M. Miccolis, D. Maino, M. Malaspina, N. Mandolesi, M. Maris, E. Martinez-Gonzalez, G. Morgante, L. Pagan, F. Pasian, P. Platania, M. Pecora, S. Pezzati, L. Popa, T. Poutanen, M. Pospieszalski, N. Roddis, M. Salmon, M. Sandri, R. Silvestri, A. Simonetto, C. Sozzi, L. Stringhetti, L. Terenzi, M. Tomasi, J. Tuovinen, L. Valenziano, J. Varis, F. Villa, A. Wilkinson, F. Winder, A. Zacchei
In this paper we present the test results of the qualification model (QM) of the LFI instrument, which is being developed as part of the ESA Planck satellite. In particular, we discuss the calibration plan, which has defined the main requirements of the radiometric tests and of the experimental setups. We then describe how these requirements have been implemented in the custom-developed cryo-facilities and present the main results. We conclude with a discussion of the lessons learned for the testing of the LFI Flight Model (FM).
X-Shooter is the first second-generation instrument to be installed at Paranal, in early 2008. It is a single-target spectrograph covering in a single exposure a wide spectral range from the UV to the K' band with maximum sensitivity. Another key feature of the instrument is its fast response, obtained by making it simple and easy to operate. Compared to other large VLT instruments, X-Shooter has a relatively small number of moving functions, but nevertheless the requirements on the whole instrument software are quite demanding. In order to cover the wide spectral range with high efficiency, the instrument is split into three different arms, one of which is cryogenically cooled. The high-level coordinating software architecture provides all the facilities for parallel operation with the maximum achievable level of synchronicity. Low-level X-Shooter requirements are also quite stringent, since an active piezoelectric actuator system is envisaged to compensate for slit misalignments among the three arms. The low-level architecture, besides the typical control of single devices (such as motors, sensors and lamps), handles the required real-time operations. Software integration and testing are also an issue, since X-Shooter is a collaborative effort among several institutes spread across Europe. The whole instrument software architecture is presented here, with details of its main modules, such as the instrument control software, the observation software and the observing template structure, and their integration into the VLT software environment.
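As a hedged illustration of the synchronized parallel operation described above (not the actual VLT/X-Shooter control software; arm names and timings are assumptions), three arm threads can prepare independently and then be released together by a barrier:

```python
# Illustrative sketch only: three "arm" threads prepare their exposures
# independently, then wait on a barrier so the exposures start with the
# best achievable synchronicity. Arm names and timings are assumptions.
import threading
import time

start_barrier = threading.Barrier(3)  # UVB, VIS, NIR arms

def run_arm(name: str, setup_time: float) -> None:
    time.sleep(setup_time)   # stand-in for arm-specific setup work
    start_barrier.wait()     # synchronise the exposure start across arms
    print(f"{name}: exposure started at {time.monotonic():.3f}")

threads = [
    threading.Thread(target=run_arm, args=("UVB", 0.2)),
    threading.Thread(target=run_arm, args=("VIS", 0.1)),
    threading.Thread(target=run_arm, args=("NIR", 0.3)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```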
KEYWORDS: Software development, Prototyping, Standards development, Data processing, Calibration, Data modeling, Interfaces, Data archive systems, Data communications, Device simulation
A geographically distributed software project needs a well-defined software integration and development plan to avoid extra work in the pipeline creation phase. Here we describe the rationale in the case of the Planck/LFI DPC project and what was designed and developed to build the integration and testing environment.
The Workstation Software System (WSS) is the high-level control software of the Italian Galileo Galilei Telescope, located on La Palma in the Canary Islands, developed at the beginning of the 1990s for HP-UX workstations. WSS may be seen as a middle-layer software system that manages the communications between the real-time systems (VME), different workstations and high-level applications, providing a uniform distributed environment. The project to port the control software from the HP workstations to the Linux environment started at the end of 2001. It aims to refurbish the control software by introducing some of the new software technologies and languages available for free in the Linux operating system. The project was realized by gradually substituting each HP workstation with a Linux PC, with the goal of avoiding major changes in the original software running under HP-UX. Three main phases characterized the project: creation of a simulated control room with several Linux PCs running WSS (to check all the functionality); insertion in the simulated control room of some HPs (to check the mixed environment); substitution of the HP workstations in the real control room. From a software point of view, the project introduces some new technologies, like multi-threading, and the possibility to develop high-level WSS applications with almost every programming language that implements Berkeley sockets. A library to develop Java applications has also been created and tested.
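A minimal sketch of the kind of socket-based client this interface enables (the host, port and command syntax are hypothetical placeholders, not the real WSS protocol):

```python
# Minimal sketch: a text command sent to a WSS-style server over TCP.
# Host, port and command format are hypothetical placeholders.
import socket

def send_wss_command(host: str, port: int, command: str) -> str:
    """Send a single text command over TCP and return the reply."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        return sock.recv(4096).decode("ascii").strip()

# Example usage (placeholder endpoint and command):
# reply = send_wss_command("wss-host.example", 5000, "STATUS TELESCOPE")
# print(reply)
```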
In alt-azimuth telescopes, Nasmyth foci are convenient focal planes that reduce mechanical instrument complexity and costs. However, they present some disadvantages, mainly due to field rotation. This is a particularly crucial point in polarimetric observations. Because of the folding mirror, the polarization state of the radiation is modified in such a way that, to avoid systematic errors, the instrumental polarization has to be removed as a function of the telescope position. A model of the polarization introduced by the Telescopio Nazionale Galileo (TNG) at its focal plane is presented. The model takes into account the physical and geometrical properties of the optical system, the complex refractive index of the mirrors and their relative position, deriving the instrumental polarization as a function of the pointing coordinates of the telescope. This model has been developed by means of Mueller matrix calculations. The telescope instrumental polarization has been measured by following standard polarization stars at different telescope positions. The mathematical model discussed here was confirmed by comparing the theoretical results with the experimental measurements at the TNG instruments.
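An indicative sketch of the formalism, in our own notation rather than necessarily that of the paper: the Stokes vector at the Nasmyth focus can be written as the input Stokes vector propagated through pointing-dependent rotation matrices and the Mueller matrix of the fold mirror:

```latex
% Indicative sketch of a Nasmyth instrumental-polarization model in the
% Mueller formalism; notation is ours, not necessarily the paper's.
% S_in, S_out are Stokes vectors (I, Q, U, V)^T; R is a rotation of the
% polarization reference frame; M_mirror is the Mueller matrix of the
% metallic fold mirror, built from its complex refractive index.
\begin{equation}
  S_{\rm out}(\mathrm{az}, \mathrm{el}) \;=\;
  R(\varphi_{\rm field})\, M_{\rm mirror}(\tilde n, \theta_{\rm inc})\,
  R(\mathrm{el})\, S_{\rm in},
\end{equation}
% where the rotation angles follow from the alt-azimuth pointing (field
% rotation at the Nasmyth focus) and \theta_{\rm inc} is the incidence
% angle on the fold mirror (nominally 45 degrees for a Nasmyth flat).
```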
The Italian National Galileo Telescope (Telescopio Nazionale Galileo, TNG) is a 3.5 m telescope located at La Palma, in the Canary Islands, which saw first light in 1998. Available TNG subsystems include four first-generation instruments, plus adaptive optics and meteo and seeing towers; the control and data handling systems are tightly coupled, allowing a smooth data flow while preserving integrity. As part of the data handling systems, the production of a local "Archive at the Telescope" (AaT) is included, and the production of database tables and hard media for the TNG Long-Term Archive (LTA) is supported. The implementation of an LTA prototype has recently been completed, and the implementation of its operational version is being planned by the Italian National Institute for Astrophysics (INAF).
A description of the AaT and prototype LTA systems is given, including their data handling/archiving and data retrieval capabilities. A discussion of system features and lessons learned is also included, with particular reference to the issues of completeness and data quality. These issues are of particular importance in the perspective of the preparation of a national facility for the archives of data from ground-based telescopes, and its possible inclusion as a data provider in the Virtual Observatory framework.
The Telescopio Nazionale Galileo (TNG) telescope is now operational. One of its main goals is to provide high-quality images over a wide range of operating conditions and for several observing modes. Telescope pointing and tracking performance can heavily affect the achievement of this requirement, and particular care must be taken in order to reach the highest possible accuracy. Control of the three axes is implemented in one VME controller (minimizing data exchange through the TNG LAN), and telescope mount positions are computed from object coordinates taking into account physical and environmental aspects that alter the object's apparent position. Improvement of the telescope pointing and tracking performance is obtained by two means. Systematic errors are mostly corrected using a model compensation that can be introduced in the coordinate transformation flow. The telescope model is derived by off-line analysis of the pointing errors on a specific set of data. In this way we have so far improved the pointing and tracking performance by a factor of about 30. Tracking drift caused by non-systematic errors and by residuals of systematic errors is corrected using a guide camera, which mounts an 800 × 576 pixel CCD (0.35 arcsec/pixel scale). Guide stars are selected from an online star catalogue. Light from the selected star is focused on the guide camera by moving a probe housed in the rotator-adapter module. Tracking drift computation is performed on a two-axis scheme (Right Ascension and Declination), with sub-pixel accuracy. Here the first results on the telescope pointing and tracking accuracy are presented.
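As an illustrative sketch of the sub-pixel drift measurement (not the TNG autoguider code; the background handling and interfaces are assumptions), an intensity-weighted centroid on the guide-camera image yields star positions at sub-pixel accuracy:

```python
# Illustrative sketch only: sub-pixel guide-star centroid and drift in
# arcsec on a 0.35 arcsec/pixel guide camera (scale from the abstract).
import numpy as np

PIXEL_SCALE = 0.35  # arcsec/pixel

def centroid(image: np.ndarray) -> tuple[float, float]:
    """Intensity-weighted centroid (x, y) in pixel units."""
    img = image - np.median(image)   # crude background removal (assumption)
    img[img < 0] = 0.0
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

def drift_arcsec(ref_xy, new_xy):
    """Guide-star drift between two centroids, converted to arcsec."""
    dx = (new_xy[0] - ref_xy[0]) * PIXEL_SCALE
    dy = (new_xy[1] - ref_xy[1]) * PIXEL_SCALE
    return dx, dy
```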