Images#
ImageHDU#
- class astropy.io.fits.ImageHDU(data=None, header=None, name=None, do_not_scale_image_data=False, uint=True, scale_back=None, ver=None)[source]#
Bases: _ImageBaseHDU, ExtensionHDU
FITS image extension HDU class.
Construct an image HDU.
- Parameters:
  - data : array
    The data in the HDU.
  - header : Header
    The header to be used (as a template). If header is None, a minimal header will be provided.
  - name : str, optional
    The name of the HDU; will be the value of the keyword EXTNAME.
  - do_not_scale_image_data : bool, optional
    If True, image data is not scaled using BSCALE/BZERO values when read. (default: False)
  - uint : bool, optional
    Interpret signed integer data where BZERO is the central value and BSCALE == 1 as unsigned integer data. For example, int16 data with BZERO = 32768 and BSCALE = 1 would be treated as uint16 data. (default: True)
  - scale_back : bool, optional
    If True, when saving changes to a file that contained scaled image data, restore the data to the original type and reapply the original BSCALE/BZERO values. This could lead to loss of accuracy if scaling back to integer values after performing floating point operations on the data. Pseudo-unsigned integers are automatically rescaled unless scale_back is explicitly set to False. (default: None)
  - ver : int > 0 or None, optional
    The ver of the HDU; will be the value of the keyword EXTVER. If not given or None, it defaults to the value of the EXTVER card of the header, or 1. (default: None)
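For example, a minimal construction sketch (the file name and array values here are illustrative):

```python
import numpy as np
from astropy.io import fits

# Build an image extension; EXTNAME will be set to "SCI" and EXTVER to 1.
data = np.arange(100, dtype=np.float32).reshape(10, 10)
hdu = fits.ImageHDU(data=data, name="SCI", ver=1)

# Extension HDUs cannot stand on their own, so pair with a PrimaryHDU.
fits.HDUList([fits.PrimaryHDU(), hdu]).writeto("example.fits", overwrite=True)
```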
- add_checksum(when=None, override_datasum=False, checksum_keyword='CHECKSUM', datasum_keyword='DATASUM')#
Add the CHECKSUM and DATASUM cards to this HDU, with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.
- Parameters:
  - when : str, optional
    Comment string for the cards; by default the comments will represent the time when the checksum was calculated.
  - override_datasum : bool, optional
    Add the CHECKSUM card only.
  - checksum_keyword : str, optional
    The name of the header keyword in which to store the checksum value; this is typically 'CHECKSUM' per convention, but there exist use cases in which a different keyword should be used.
  - datasum_keyword : str, optional
    See checksum_keyword.
Notes
For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.
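A sketch of the pattern described in the notes, with an arbitrary comment string standing in for a real timestamp:

```python
import numpy as np
from astropy.io import fits

hdu = fits.ImageHDU(data=np.zeros((4, 4), dtype=np.int16))

# Use the same `when` comment for both cards so the CHECKSUM value
# is reproducible across runs (the comment text itself is arbitrary).
when = "checksums for regression test"
hdu.add_datasum(when=when)
hdu.add_checksum(when=when, override_datasum=True)
print(hdu.header["CHECKSUM"], hdu.header["DATASUM"])
```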
- add_datasum(when=None, datasum_keyword='DATASUM')#
Add the DATASUM card to this HDU, with the value set to the checksum calculated for the data.
- Parameters:
  - when : str, optional
    Comment string for the card; by default it represents the time when the checksum was calculated.
  - datasum_keyword : str, optional
    The name of the header keyword in which to store the datasum value; this is typically 'DATASUM' per convention, but there exist use cases in which a different keyword should be used.
- Returns:
  - checksum : int
    The calculated datasum.
Notes
For testing purposes, provide a when argument so that the comment value in the card remains consistent. This enables the generation of a CHECKSUM card with a consistent value.
- copy()#
Make a copy of the HDU; both header and data are copied.
- property data#
Image/array data as a ndarray.
Please remember that the order of axes on a Numpy array is the opposite of the order specified in the FITS file. For example, for a 2D image the "rows" or y-axis are the first dimension and the "columns" or x-axis are the second dimension.
If the data is scaled using the BZERO and BSCALE parameters, this attribute returns the data scaled to its physical values unless the file was opened with do_not_scale_image_data=True.
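A short sketch illustrating the axis-order reversal (shape values are arbitrary):

```python
import numpy as np
from astropy.io import fits

# A NumPy array of shape (200, 300) maps to NAXIS1=300 (x) and
# NAXIS2=200 (y) in the FITS header: the axis order is reversed.
hdu = fits.ImageHDU(data=np.zeros((200, 300), dtype=np.float32))
hdu.update_header()  # make sure the NAXIS keywords agree with the data
print(hdu.header["NAXIS1"], hdu.header["NAXIS2"])  # 300 200
print(hdu.data.shape)                              # (200, 300)
```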
- filebytes()#
Calculates and returns the number of bytes that this HDU will write to a file.
- fileinfo()#
Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.
- Returns:
  - dict or None
    The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.
    Dictionary contents:
    | Key | Value |
    | --- | --- |
    | file | File object associated with the HDU |
    | filemode | Mode in which the file was opened (readonly, copyonwrite, update, append, ostream) |
    | hdrLoc | Starting byte location of header in file |
    | datLoc | Starting byte location of data block in file |
    | datSpan | Data size including padding |
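A usage sketch, writing a small illustrative file first so the HDU is file-backed:

```python
import numpy as np
from astropy.io import fits

fits.HDUList(
    [fits.PrimaryHDU(), fits.ImageHDU(data=np.zeros((10, 10), dtype=np.float32))]
).writeto("example.fits", overwrite=True)

# fileinfo() is only meaningful for HDUs backed by a file.
with fits.open("example.fits") as hdul:
    info = hdul[1].fileinfo()
    if info is not None:
        print(info["hdrLoc"], info["datLoc"], info["datSpan"])
```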
- classmethod fromstring(data, checksum=False, ignore_missing_end=False, **kwargs)#
Creates a new HDU object of the appropriate type from a string containing the HDU's entire header and, optionally, its data.
Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object or some kind of read-write memory buffer such as a memoryview.
- Parameters:
  - data : str, bytes, memoryview, ndarray
    A byte string containing the HDU's header and data.
  - checksum : bool, optional
    Check the HDU's checksum and/or datasum.
  - ignore_missing_end : bool, optional
    Ignore a missing end card in the header data. Note that without the end card the end of the header may be ambiguous and result in a corrupt HDU. In this case the assumption is that the first 2880-byte block that does not begin with valid FITS header data is the beginning of the data.
  - **kwargs : optional
    May consist of additional keyword arguments specific to an HDU type; these correspond to keywords recognized by the constructors of different HDU classes such as PrimaryHDU, ImageHDU, or BinTableHDU. Any unrecognized keyword arguments are simply ignored.
- classmethod match_header(header)[source]#
_ImageBaseHDU is sort of an abstract class for HDUs containing image data (as opposed to table data) and should never be used directly.
- classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)#
Read the HDU from a file. Normally an HDU should be opened with open(), which reads the entire HDU list in a FITS file, but this method is still provided for symmetry with writeto().
- Parameters:
  - fileobj : file-like object
    Input FITS file. The file's seek pointer is assumed to be at the beginning of the HDU.
  - checksum : bool
    If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.
  - ignore_missing_end : bool
    Do not issue an exception when opening a file that is missing an END card in the last header.
- req_cards(keyword, pos, test, fix_value, option, errlist)#
Check the existence, location, and value of a required Card.
- Parameters:
  - keyword : str
    The keyword to validate.
  - pos : int, callable
    If an int, this specifies the exact location this card should have in the header. Remember that Python is zero-indexed, so pos=0 requires the card to be the first card in the header. If given a callable, it should take one argument (the actual position of the keyword) and return True or False. This can be used for custom evaluation; for example, pos=lambda idx: idx > 10 will check that the keyword's index is greater than 10.
  - test : callable
    This should be a callable (generally a function) that is passed the value of the given keyword and returns True or False. This can be used to validate the value associated with the given keyword.
  - fix_value : str, int, float, complex, bool, None
    A valid value for a FITS keyword, used to replace an invalid value when the given test fails. In other words, this provides a default value to use as a replacement if the keyword's current value is invalid. If None, there is no replacement value and the keyword is unfixable.
  - option : str
    Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification Options for more info.
  - errlist : list
    A list of validation errors already found in the FITS file; this is used primarily by the validation system to collect errors across multiple HDUs and multiple calls to req_cards.
Notes
If pos=None, the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card's value can also be checked by using the test argument.
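This method is mainly exercised by the verification machinery, but a direct call can be sketched as follows (the keyword, position, and test are illustrative):

```python
import numpy as np
from astropy.io import fits

hdu = fits.ImageHDU(data=np.zeros((4, 4), dtype=np.int16))

# Require XTENSION to be the very first card (index 0) and to equal
# 'IMAGE'; collect any problems in `errs` rather than raising.
errs = []
hdu.req_cards("XTENSION", 0, lambda v: v == "IMAGE", "IMAGE", "warn", errs)
```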
- run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)#
Execute the verification with the selected option.
- scale(type=None, option='old', bscale=None, bzero=None)#
Scale image data by using BSCALE/BZERO.
Calling this method will scale data and update the BSCALE and BZERO keywords in the HDU's header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.
- Parameters:
  - type : str, optional
    Destination data type; use a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32', etc.). If None, use the current data type.
  - option : str, optional
    How to scale the data: "old" uses the original BSCALE and BZERO values from when the data was read/created (defaulting to 1 and 0 if they don't exist). For integer data only, "minmax" uses the minimum and maximum of the data to scale. User-specified bscale/bzero values always take precedence.
  - bscale, bzero : int, optional
    User-specified BSCALE and BZERO values.
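A sketch of the intended workflow, quantizing illustrative float data to int16 just before writing:

```python
import numpy as np
from astropy.io import fits

hdu = fits.ImageHDU(data=np.random.normal(500.0, 10.0, (64, 64)).astype(np.float32))

# Quantize the float data to int16 using the data's min/max range;
# BSCALE/BZERO are updated in the header. Do this right before writing.
hdu.scale("int16", option="minmax")
print(hdu.header.get("BSCALE"), hdu.header.get("BZERO"))
hdu.writeto("scaled.fits", overwrite=True)
```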
- property section#
Access a section of the image array without loading the entire array into memory. The Section object returned by this attribute is not meant to be used directly by itself. Rather, slices of the section return the appropriate slice of the data, and load only that section into memory.
Sections are useful for retrieving a small subset of data from a remote file that has been opened with the use_fsspec=True parameter. For example, you can use this feature to download a small cutout from a large FITS image hosted in the Amazon S3 cloud (see the Working with remote and cloud-hosted files section of the Astropy documentation for more details).
For local files, sections are mostly obsoleted by memmap support, but should still be used to deal with very large scaled images.
Note that sections cannot currently be written to. Moreover, any in-memory updates to the image's .data property may not be reflected in the slices obtained via .section. See the Data Sections section of the documentation for more details.
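A usage sketch; large_image.fits is a hypothetical file standing in for any large local (or fsspec-opened remote) image:

```python
from astropy.io import fits

# Read only a 10 x 20 cutout; the rest of the (possibly huge) image
# is never loaded into memory.
with fits.open("large_image.fits") as hdul:
    cutout = hdul[1].section[1000:1010, 2000:2020]
```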
- property shape#
Shape of the image array; should be equivalent to self.data.shape.
- property size#
Size (in bytes) of the data portion of the HDU.
- update_header()#
Update the header keywords to agree with the data.
- verify(option='warn')#
Verify all values in the instance.
- Parameters:
- option : str
  Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification Options for more info.
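For example (the option string is one of the combinations listed above):

```python
import numpy as np
from astropy.io import fits

hdu = fits.ImageHDU(data=np.zeros((4, 4), dtype=np.int16))
# Attempt in-place fixes, emitting a warning for each fix applied;
# "exception" would raise on the first problem instead.
hdu.verify("fix+warn")
```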
- verify_checksum()#
Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.
- Returns:
  - valid : int
    0 - failure
    1 - success
    2 - no CHECKSUM keyword present
- verify_datasum()#
Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.
- Returns:
  - valid : int
    0 - failure
    1 - success
    2 - no DATASUM keyword present
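A sketch in which the checksum cards are written first so both calls return 1 (the file name is illustrative):

```python
import numpy as np
from astropy.io import fits

fits.HDUList(
    [fits.PrimaryHDU(), fits.ImageHDU(data=np.arange(16, dtype=np.int16))]
).writeto("checked.fits", overwrite=True, checksum=True)

# Status codes: 0 = mismatch, 1 = match, 2 = keyword not present.
with fits.open("checked.fits") as hdul:
    print(hdul[1].verify_datasum())   # 1
    print(hdul[1].verify_checksum())  # 1
```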
- writeto(name, output_verify='exception', overwrite=False, checksum=False)#
Works similarly to the normal writeto(), but prepends a default PrimaryHDU, as required by extension HDUs (which cannot stand on their own).
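For example, writing this extension by itself (file name illustrative):

```python
import numpy as np
from astropy.io import fits

hdu = fits.ImageHDU(data=np.zeros((4, 4), dtype=np.int16), name="SCI")
# A minimal PrimaryHDU is prepended automatically, producing a
# complete two-HDU FITS file.
hdu.writeto("single_extension.fits", overwrite=True)
```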
CompImageHDU#
- class astropy.io.fits.CompImageHDU(data=None, header=None, name=None, compression_type='RICE_1', tile_shape=None, hcomp_scale=0, hcomp_smooth=0, quantize_level=16.0, quantize_method=-1, dither_seed=0, do_not_scale_image_data=False, uint=False, scale_back=False, tile_size=None)[source]#
Bases: BinTableHDU
Compressed Image HDU class.
- Parameters:
  - data : array, optional
    Uncompressed image data.
  - header : Header, optional
    Header to be associated with the image; when reading the HDU from a file (data=DELAYED), the header read from the file.
  - name : str, optional
    The EXTNAME value; if this value is None, then the name from the input image header will be used; if there is no name in the input image header then the default name COMPRESSED_IMAGE is used.
  - compression_type : str, optional
    Compression algorithm: one of 'RICE_1', 'RICE_ONE', 'PLIO_1', 'GZIP_1', 'GZIP_2', 'HCOMPRESS_1', 'NOCOMPRESS'.
  - tile_shape : tuple, optional
    Compression tile shape, which should be specified using the default Numpy convention for array shapes (C order). The default is to treat each row of the image as a tile.
  - hcomp_scale : float, optional
    HCOMPRESS scale parameter.
  - hcomp_smooth : float, optional
    HCOMPRESS smooth parameter.
  - quantize_level : float, optional
    Floating point quantization level; see note below.
  - quantize_method : int, optional
    Floating point quantization dithering method; can be either NO_DITHER (-1; default), SUBTRACTIVE_DITHER_1 (1), or SUBTRACTIVE_DITHER_2 (2); see note below.
  - dither_seed : int, optional
    Random seed to use for dithering; can be either an integer in the range 1 to 1000 (inclusive), DITHER_SEED_CLOCK (0; default), or DITHER_SEED_CHECKSUM (-1); see note below.
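A construction sketch using illustrative data and tiling choices:

```python
import numpy as np
from astropy.io import fits

img = np.random.normal(100.0, 5.0, (512, 512)).astype(np.float32)

# Rice-compress in 64 x 64 tiles; the float data is quantized first
# (quantize_level=16 keeps steps of ~1/16 of the background noise).
chdu = fits.CompImageHDU(
    data=img,
    compression_type="RICE_1",
    tile_shape=(64, 64),
    quantize_level=16.0,
)
chdu.writeto("compressed.fits", overwrite=True)
```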
Notes
The astropy.io.fits package supports two methods of image compression:
- The entire FITS file may be externally compressed with the gzip or pkzip utility programs, producing a *.gz or *.zip file, respectively. When reading compressed files of this type, Astropy first uncompresses the entire file into a temporary file before performing the requested read operations. The astropy.io.fits package does not support writing to these types of compressed files. This type of compression is supported in the _File class, not in the CompImageHDU class. The file compression type is recognized by the .gz or .zip file name extension.
- The CompImageHDU class supports the FITS tiled image compression convention in which the image is subdivided into a grid of rectangular tiles, and each tile of pixels is individually compressed. The details of this FITS compression convention are described at the FITS Support Office web site. Basically, the compressed image tiles are stored in rows of a variable length array column in a FITS binary table. astropy.io.fits recognizes that this binary table extension contains an image and treats it as if it were an image extension. Under this tile-compression format, FITS header keywords remain uncompressed. At this time, Astropy does not support the ability to extract and uncompress sections of the image without having to uncompress the entire image.
The astropy.io.fits package supports 3 general-purpose compression algorithms plus one other special-purpose compression technique that is designed for data masks with positive integer pixel values. The 3 general-purpose algorithms are GZIP, Rice, and HCOMPRESS, and the special-purpose technique is the IRAF pixel list compression technique (PLIO). The compression_type parameter defines the compression algorithm to be used.
The FITS image can be subdivided into any desired rectangular grid of compression tiles. With the GZIP, Rice, and PLIO algorithms, the default is to take each row of the image as a tile. The HCOMPRESS algorithm is inherently 2-dimensional in nature, so the default in this case is to take 16 rows of the image per tile. In most cases, it makes little difference what tiling pattern is used, so the default tiles are usually adequate. In the case of very small images, it could be more efficient to compress the whole image as a single tile. Note that the image dimensions are not required to be an integer multiple of the tile dimensions; if not, then the tiles at the edges of the image will be smaller than the other tiles. The tile_shape parameter may be provided as a list of tile sizes, one for each dimension in the image. For example, a tile_shape value of (100, 100) would divide a 300 x 300 image into 9 100 x 100 tiles.
The 4 supported image compression algorithms are all 'lossless' when applied to integer FITS images; the pixel values are preserved exactly with no loss of information during the compression and uncompression process. In addition, the HCOMPRESS algorithm supports a 'lossy' compression mode that will produce a larger amount of image compression. This is achieved by specifying a non-zero value for the hcomp_scale parameter. Since the amount of compression that is achieved depends directly on the RMS noise in the image, it is usually more convenient to specify the hcomp_scale factor relative to the RMS noise. Setting hcomp_scale = 2.5 means use a scale factor that is 2.5 times the calculated RMS noise in the image tile. In some cases it may be desirable to specify the exact scaling to be used, instead of specifying it relative to the calculated noise value. This may be done by specifying the negative of the desired scale value (typically in the range -2 to -100).
Very high compression factors (of 100 or more) can be achieved by using large hcomp_scale values; however, this can produce undesirable 'blocky' artifacts in the compressed image. A variation of the HCOMPRESS algorithm (called HSCOMPRESS) can be used in this case to apply a small amount of smoothing to the image when it is uncompressed to help cover up these artifacts. This smoothing is purely cosmetic and does not cause any significant change to the image pixel values. Setting the hcomp_smooth parameter to 1 will engage the smoothing algorithm.
Floating point FITS images (which have BITPIX = -32 or -64) usually contain too much 'noise' in the least significant bits of the mantissa of the pixel values to be effectively compressed with any lossless algorithm. Consequently, floating point images are first quantized into scaled integer pixel values (thus throwing away much of the noise) before being compressed with the specified algorithm (either GZIP, RICE, or HCOMPRESS). This technique produces much higher compression factors than simply using the GZIP utility to externally compress the whole FITS file, but it also means that the original floating point pixel values are not exactly preserved. When done properly, this integer scaling technique will only discard the insignificant noise while still preserving all the real information in the image. The amount of precision that is retained in the pixel values is controlled by the quantize_level parameter. Larger values will result in compressed images whose pixels more closely match the floating point pixel values, but at the same time the amount of compression that is achieved will be reduced. Users should experiment with different values for this parameter to determine the optimal value that preserves all the useful information in the image, without needlessly preserving all the 'noise' which will hurt the compression efficiency.
The default value for the quantize_level scale factor is 16, which means that scaled integer pixel values will be quantized such that the difference between adjacent integer values will be 1/16th of the noise level in the image background. An optimized algorithm is used to accurately estimate the noise in the image. As an example, if the RMS noise in the background pixels of an image = 32.0, then the spacing between adjacent scaled integer pixel values will equal 2.0 by default. Note that the RMS noise is independently calculated for each tile of the image, so the resulting integer scaling factor may fluctuate slightly for each tile. In some cases, it may be desirable to specify the exact quantization level to be used, instead of specifying it relative to the calculated noise value. This may be done by specifying the negative of the desired quantization level for the value of quantize_level. In the previous example, one could specify quantize_level = -2.0 so that the quantized integer levels differ by 2.0. Larger negative values for quantize_level mean that the levels are more coarsely spaced and will produce higher compression factors.
The quantization algorithm can also apply one of two random dithering methods in order to reduce bias in the measured intensity of background regions. The default method, specified with the constant SUBTRACTIVE_DITHER_1, adds dithering to the zero-point of the quantization array itself rather than adding noise to the actual image. The random noise is added on a pixel-by-pixel basis, so in order to restore each pixel from its integer value to its floating point value it is necessary to replay the same sequence of random numbers for each pixel (see below). The other method, SUBTRACTIVE_DITHER_2, is exactly like the first except that before dithering any pixel with a floating point value of 0.0 is replaced with the special integer value -2147483647. When the image is uncompressed, pixels with this value are restored back to 0.0 exactly. Finally, a value of NO_DITHER disables dithering entirely.
As mentioned above, when using the subtractive dithering algorithm it is necessary to be able to generate a (pseudo-)random sequence of noise for each pixel, and replay that same sequence upon decompressing. To facilitate this, a random seed between 1 and 10000 (inclusive) is used to seed a random number generator, and that seed is stored in the ZDITHER0 keyword in the header of the compressed HDU. In order to use that seed to generate the same sequence of random numbers, the same random number generator must be used at compression and decompression time; for that reason the tiled image convention provides an implementation of a very simple pseudo-random number generator. The seed itself can be provided in one of three ways, controllable by the dither_seed argument: it may be specified manually, or it may be generated arbitrarily based on the system's clock (DITHER_SEED_CLOCK) or based on a checksum of the pixels in the image's first tile (DITHER_SEED_CHECKSUM). The clock-based method is the default, and is sufficient to ensure that the value is reasonably "arbitrary" and that the same seed is unlikely to be generated sequentially. The checksum method, on the other hand, ensures that the same seed is used every time for a specific image. This is particularly useful for software testing as it ensures that the same image will always use the same seed.
- add_checksum(when=None, override_datasum=False, checksum_keyword='CHECKSUM', datasum_keyword='DATASUM')#
Add the CHECKSUM and DATASUM cards to this HDU, with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.
- Parameters:
  - when : str, optional
    Comment string for the cards; by default the comments will represent the time when the checksum was calculated.
  - override_datasum : bool, optional
    Add the CHECKSUM card only.
  - checksum_keyword : str, optional
    The name of the header keyword in which to store the checksum value; this is typically 'CHECKSUM' per convention, but there exist use cases in which a different keyword should be used.
  - datasum_keyword : str, optional
    See checksum_keyword.
Notes
For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.
- add_datasum(when=None, datasum_keyword='DATASUM')#
Add the DATASUM card to this HDU, with the value set to the checksum calculated for the data.
- Parameters:
  - when : str, optional
    Comment string for the card; by default it represents the time when the checksum was calculated.
  - datasum_keyword : str, optional
    The name of the header keyword in which to store the datasum value; this is typically 'DATASUM' per convention, but there exist use cases in which a different keyword should be used.
- Returns:
  - checksum : int
    The calculated datasum.
Notes
For testing purposes, provide a when argument so that the comment value in the card remains consistent. This enables the generation of a CHECKSUM card with a consistent value.
- property compression_type#
The name of the compression algorithm.
- copy()#
Make a copy of the table HDU; both header and data are copied.
- property data#
The decompressed data array.
Note that accessing this will cause all the tiles to be loaded, decompressed, and combined into a single data array. If you do not need to access the whole array, consider instead using the section property.
- dump(datafile=None, cdfile=None, hfile=None, overwrite=False)#
Dump the table HDU to a file in ASCII format. The table may be dumped in three separate files, one containing column definitions, one containing header parameters, and one for table data.
- Parameters:
  - datafile : path-like object or file-like object, optional
    Output data file. The default is the root name of the FITS file associated with this HDU appended with the extension .txt.
  - cdfile : path-like object or file-like object, optional
    Output column definitions file. The default is None; no column definitions output is produced.
  - hfile : path-like object or file-like object, optional
    Output header parameters file. The default is None; no header parameters output is produced.
  - overwrite : bool, optional
    If True, overwrite the output file if it exists. Raises an OSError if False and the output file exists. Default is False.
Notes
The primary use for the dump method is to allow viewing and editing the table data and parameters in a standard text editor. The load method can be used to create a new table from the three plain text (ASCII) files.
datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.
Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right-justified using 'g' format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a new line character.
For column data containing variable length arrays ('P' format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.
Note: This format does not support variable length arrays in 'Q' format, due to difficult-to-overcome ambiguities. What this means is that this file format cannot support VLA columns in tables stored in files that are over 2 GB in size.
For column data representing a bit field ('X' format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).
cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into 8 sixteen-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.
hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.
- filebytes()#
Calculates and returns the number of bytes that this HDU will write to a file.
- fileinfo()#
Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.
- Returns:
  - dict or None
    The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.
    Dictionary contents:
    | Key | Value |
    | --- | --- |
    | file | File object associated with the HDU |
    | filemode | Mode in which the file was opened (readonly, copyonwrite, update, append, ostream) |
    | hdrLoc | Starting byte location of header in file |
    | datLoc | Starting byte location of data block in file |
    | datSpan | Data size including padding |
- classmethod from_columns(columns, header=None, nrows=0, fill=False, character_as_bytes=False, **kwargs)#
Given either a ColDefs object, a sequence of Column objects, or another table HDU or table data (a FITS_rec or multi-field numpy.ndarray or numpy.recarray object), return a new table HDU of the class this method was called on, using the column definitions from the input.
See also FITS_rec.from_columns.
- Parameters:
  - columns : sequence of Column or ColDefs-like
    The columns from which to create the table data, or an object with a column-like structure from which a ColDefs can be instantiated. This includes an existing BinTableHDU or TableHDU, or a numpy.recarray, to give some examples.
    If these columns have data arrays attached, that data may be used in initializing the new table. Otherwise the input columns will be used as a template for a new table with the requested number of rows.
  - header : Header
    An optional Header object to instantiate the new HDU with. Header keywords specifically related to defining the table structure (such as the "TXXXn" keywords like TTYPEn) will be overridden by the supplied column definitions, but all other informational and data model-specific keywords are kept.
  - nrows : int
    Number of rows in the new table. If the input columns have data associated with them, the size of the largest input column is used. Otherwise the default is 0.
  - fill : bool
    If True, will fill all cells with zeros or blanks. If False, copy the data from input; undefined cells will still be filled with zeros/blanks.
  - character_as_bytes : bool
    Whether to return bytes for string columns when accessed from the HDU. By default this is False and (unicode) strings are returned, but for large tables this may use up a lot of memory.
Notes
Any additional keyword arguments accepted by the HDU class's __init__ may also be passed in as keyword arguments.
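For example, building a small binary table from two columns with attached data:

```python
import numpy as np
from astropy.io import fits

# Two columns with data arrays attached; the table gets 3 rows.
c1 = fits.Column(name="id", format="J", array=np.array([1, 2, 3]))
c2 = fits.Column(name="flux", format="E", array=np.array([1.0, 2.5, 0.3]))
hdu = fits.BinTableHDU.from_columns([c1, c2])
print(hdu.data["flux"])
```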
- classmethod fromstring(data, checksum=False, ignore_missing_end=False, **kwargs)#
Creates a new HDU object of the appropriate type from a string containing the HDU's entire header and, optionally, its data.
Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object or some kind of read-write memory buffer such as a memoryview.
- Parameters:
  - data : str, bytes, memoryview, ndarray
    A byte string containing the HDU's header and data.
  - checksum : bool, optional
    Check the HDU's checksum and/or datasum.
  - ignore_missing_end : bool, optional
    Ignore a missing end card in the header data. Note that without the end card the end of the header may be ambiguous and result in a corrupt HDU. In this case the assumption is that the first 2880-byte block that does not begin with valid FITS header data is the beginning of the data.
  - **kwargs : optional
    May consist of additional keyword arguments specific to an HDU type; these correspond to keywords recognized by the constructors of different HDU classes such as PrimaryHDU, ImageHDU, or BinTableHDU. Any unrecognized keyword arguments are simply ignored.
- classmethod load(datafile, cdfile=None, hfile=None, replace=False, header=None)#
Create a table from the input ASCII files. The input is from up to three separate files, one containing column definitions, one containing header parameters, and one containing column data.
The column definition and header parameters files are not required. When absent the column definitions and/or header parameters are taken from the header object given in the header argument; otherwise sensible defaults are inferred (though this mode is not recommended).
- Parameters:
  - datafile : path-like object or file-like object
    Input data file containing the table data in ASCII format.
  - cdfile : path-like object or file-like object, optional
    Input column definition file containing the names, formats, display formats, physical units, multidimensional array dimensions, undefined values, scale factors, and offsets associated with the columns in the table. If None, the column definitions are taken from the current values in this object.
  - hfile : path-like object or file-like object, optional
    Input parameter definition file containing the header parameter definitions to be associated with the table. If None, the header parameter definitions are taken from the current values in this object's header.
  - replace : bool, optional
    When True, indicates that the entire header should be replaced with the contents of the ASCII file instead of just updating the current header.
  - header : Header, optional
    When the cdfile and hfile are missing, use this Header object in the creation of the new table and HDU. Otherwise this Header supersedes the keywords from hfile, which is only used to update values not present in this Header, unless replace=True, in which case this Header's values are completely replaced with the values from hfile.
Notes
The primary use for the load method is to allow the input of table data and parameters that were edited as ASCII in a standard text editor. The dump method can be used to create the initial ASCII files.
datafile: Each line of the data file represents one row of table data. The data is output one column at a time in column order. If a column contains an array, each element of the column array in the current row is output before moving on to the next column. Each row ends with a new line.
Integer data is output right-justified in a 21-character field followed by a blank. Floating point data is output right-justified using 'g' format in a 21-character field with 15 digits of precision, followed by a blank. String data that does not contain whitespace is output left-justified in a field whose width matches the width specified in the TFORM header parameter for the column, followed by a blank. When the string data contains whitespace characters, the string is enclosed in quotation marks (""). For the last data element in a row, the trailing blank in the field is replaced by a new line character.
For column data containing variable length arrays ('P' format), the array data is preceded by the string 'VLA_Length= ' and the integer length of the array for that row, left-justified in a 21-character field, followed by a blank.
Note: This format does not support variable length arrays in 'Q' format, due to difficult-to-overcome ambiguities. What this means is that this file format cannot support VLA columns in tables stored in files that are over 2 GB in size.
For column data representing a bit field ('X' format), each bit value in the field is output right-justified in a 21-character field as 1 (for true) or 0 (for false).
cdfile: Each line of the column definitions file provides the definitions for one column in the table. The line is broken up into 8 sixteen-character fields. The first field provides the column name (TTYPEn). The second field provides the column format (TFORMn). The third field provides the display format (TDISPn). The fourth field provides the physical units (TUNITn). The fifth field provides the dimensions for a multidimensional array (TDIMn). The sixth field provides the value that signifies an undefined value (TNULLn). The seventh field provides the scale factor (TSCALn). The eighth field provides the offset value (TZEROn). A field value of "" is used to represent the case where no value is provided.
hfile: Each line of the header parameters file provides the definition of a single HDU header card as represented by the card image.
- classmethod match_header(header)[source]#
This is an abstract type that implements the shared functionality of the ASCII and Binary Table HDU types, which should be used instead of this.
- classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)#
Read the HDU from a file. Normally an HDU should be opened with open(), which reads the entire HDU list in a FITS file, but this method is still provided for symmetry with writeto().
- Parameters:
  - fileobj : file-like object
    Input FITS file. The file's seek pointer is assumed to be at the beginning of the HDU.
  - checksum : bool
    If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.
  - ignore_missing_end : bool
    Do not issue an exception when opening a file that is missing an END card in the last header.
- req_cards(keyword, pos, test, fix_value, option, errlist)#
Check the existence, location, and value of a required Card.
- Parameters:
  - keyword : str
    The keyword to validate.
  - pos : int, callable
    If an int, this specifies the exact location this card should have in the header. Remember that Python is zero-indexed, so pos=0 requires the card to be the first card in the header. If given a callable, it should take one argument (the actual position of the keyword) and return True or False. This can be used for custom evaluation; for example, pos=lambda idx: idx > 10 will check that the keyword's index is greater than 10.
  - test : callable
    This should be a callable (generally a function) that is passed the value of the given keyword and returns True or False. This can be used to validate the value associated with the given keyword.
  - fix_value : str, int, float, complex, bool, None
    A valid value for a FITS keyword, used to replace an invalid value when the given test fails. In other words, this provides a default value to use as a replacement if the keyword's current value is invalid. If None, there is no replacement value and the keyword is unfixable.
  - option : str
    Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification Options for more info.
  - errlist : list
    A list of validation errors already found in the FITS file; this is used primarily by the validation system to collect errors across multiple HDUs and multiple calls to req_cards.
Notes
If pos=None, the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card's value can also be checked by using the test argument.
- run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)#
Execute the verification with the selected option.
- scale(type=None, option='old', bscale=1, bzero=0)[source]#
Scale image data by using BSCALE and BZERO.
Calling this method will scale self.data and update the BSCALE and BZERO keywords in self._header and self._image_header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.
- Parameters:
  - type : str, optional
    Destination data type; use a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32', etc.). If None, use the current data type.
  - option : str, optional
    How to scale the data: if "old", use the original BSCALE and BZERO values from when the data was read/created. If "minmax", use the minimum and maximum of the data to scale. This option is overridden by any user-specified bscale/bzero values.
  - bscale, bzero : int, optional
    User-specified BSCALE and BZERO values.
- property section#
Efficiently access a section of the image array.
This property can be used to access a section of the data without loading and decompressing the entire array into memory.
The CompImageSection object returned by this attribute is not meant to be used directly by itself. Rather, slices of the section return the appropriate slice of the data, and load only that section into memory. Any valid basic Numpy index can be used to slice CompImageSection.
Note that accessing data using CompImageHDU.section will always load tiles one at a time from disk, and therefore when accessing a large fraction of the data (or slicing it in a way that would cause most tiles to be loaded) you may obtain better performance by using CompImageHDU.data.
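A usage sketch, reading a cutout from the compressed.fits file written in the earlier sketch:

```python
from astropy.io import fits

# Only the tiles overlapping the requested slice are read and
# decompressed; the rest of the image stays on disk.
with fits.open("compressed.fits") as hdul:
    cutout = hdul[1].section[0:64, 0:64]
```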
- property shape#
Shape of the image array; should be equivalent to self.data.shape.
- property size#
Size (in bytes) of the data portion of the HDU.
- property tile_shape#
The tile shape used for the tiled compression.
This shape is given in Numpy/C order.
- update()#
Deprecated since version v6.0: The update function is deprecated and may be removed in a future version. Use update_header instead.
- update_header()#
Update header keywords to reflect recent changes of columns.
- verify(option='warn')#
Verify all values in the instance.
- Parameters:
- option : str
  Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification Options for more info.
- verify_checksum()#
Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.
- Returns:
  - valid : int
    0 - failure
    1 - success
    2 - no CHECKSUM keyword present
- verify_datasum()#
Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.
- Returns:
  - valid : int
    0 - failure
    1 - success
    2 - no DATASUM keyword present
- writeto(name, output_verify='exception', overwrite=False, checksum=False)#
Works similarly to the normal writeto(), but prepends a default PrimaryHDU, as required by extension HDUs (which cannot stand on their own).
Section#
- class astropy.io.fits.Section(hdu)[source]#
Bases: object
Class enabling subsets of ImageHDU data to be loaded lazily via slicing.
Slices of this object load the corresponding section of an image array from the underlying FITS file, and apply any BSCALE/BZERO factors.
Section slices cannot be assigned to, and modifications to a section are not saved back to the underlying file.
See the Data Sections section of the Astropy documentation for more details.
CompImageSection#
- class astropy.io.fits.CompImageSection(hdu)[source]#
Bases: object
Class enabling subsets of CompImageHDU data to be loaded lazily via slicing.
Slices of this object load the corresponding section of an image array from the underlying FITS file, and apply any BSCALE/BZERO factors.
Section slices cannot be assigned to, and modifications to a section are not saved back to the underlying file.
See the Data Sections section of the Astropy documentation for more details.