pystellibs package

Subpackages

Submodules

pystellibs.basel module

BaSeL 2.2 library

class pystellibs.basel.BaSeL(*args, **kwargs)[source]

Bases: Stellib

BaSeL 2.2 library derived class

This library + Rauch is used in Pegase.2

The BaSeL stellar spectral energy distribution (SED) libraries are libraries of theoretical stellar SEDs recalibrated using empirical photometric data. Therefore, we call them semi-empirical libraries.

The BaSeL 2.2 library was calibrated using photometric data from solar metallicity stars.
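A minimal usage sketch based on the methods documented below (the import path and the parameter values are illustrative assumptions, not taken from this page):

from pystellibs import BaSeL

osl = BaSeL()
# single spectrum from atmospheric parameters (logT, logg, logL, Z)
spec = osl.generate_stellar_spectrum(3.75, 4.4, 0.0, 0.02)
wave = osl.wavelength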

References

Attributes:
NHI
NHeI
NHeII
Teff
Z
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of Basel 2.2 library

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual values for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates an individual spectrum for the given star's APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation

plot_boundary([ax, dlogT, dlogg])

Plot the boundary of the library

points_inside(xypoints[, dlogT, dlogg])

Returns whether each point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property NHI
property NHeI
property NHeII
property Teff
property Z
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of Basel 2.2 library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon
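A hedged sketch of how the boundary-related methods might be used to check whether (logT, logg) points fall inside the library, based on the signatures listed above (input values are illustrative):

import numpy as np
from pystellibs import BaSeL

lib = BaSeL()
pts = np.array([[3.7, 4.5], [4.8, 2.0]])                 # (logT, logg) pairs
inside = lib.points_inside(pts, dlogT=0.05, dlogg=0.25)  # boolean per point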

property logT
property logZ
property logg

pystellibs.btsettl module

class pystellibs.btsettl.BTSettl(medres=True, *args, **kwargs)[source]

Bases: AtmosphereLib

BT-Settl Library

References

Paper: Few refereed publications

Older Ref = http://adsabs.harvard.edu/abs/2000ApJ...539..366A

Conference Proceedings:

http://adsabs.harvard.edu/abs/2016sf2a.conf..223A http://adsabs.harvard.edu/abs/2012RSPTA.370.2765A

Files available at: https://phoenix.ens-lyon.fr/Grids/BT-Settl/

Current library: AGSS2009 abundances (due to grid availability). Spectra are rebinned to match Kurucz, plus a custom 2 Ang medium-resolution version.

Attributes:
Teff
Z
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of BT-Settl library

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual values for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates an individual spectrum for the given star's APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

interpolation needs alpha

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation. Stellar atmospheres are normalized to Radius = 1

plot_boundary([ax, dlogT, dlogg])

Plot the boundary of the library

points_inside(xypoints[, dlogT, dlogg])

Returns whether each point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property Teff
property Z
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of BT-Settl library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon

get_interpolation_data()[source]

interpolation needs alpha

property logT
property logZ
property logg

pystellibs.config module

pystellibs.elodie module

Elodie 3.1

class pystellibs.elodie.Elodie(*args, **kwargs)[source]

Bases: Stellib

Elodie 3.1 stellar library derived class

Attributes:
NHI
NHeI
NHeII
Teff
Z
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of Elodie library

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual values for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates an individual spectrum for the given star's APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation

plot_boundary([ax, dlogT, dlogg])

Plot the boundary of the library

points_inside(xypoints[, dlogT, dlogg])

Returns whether each point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property NHI
property NHeI
property NHeII
property Teff
property Z
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of Elodie library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon

property logT
property logZ
property logg

pystellibs.ezmap module

pystellibs.helpers module

This is a first collection of tools making the design easier

class pystellibs.helpers.NameSpace(name, *args, **kwargs)[source]

Bases: dict

A dict subclass that exposes its items as attributes.

Methods

clear()

copy()

fromkeys(iterable[, value])

Create a new dictionary with keys from iterable and values set to value.

get(key[, default])

Return the value for key if key is in the dictionary, else default.

items()

keys()

pop(k[,d])

If key is not found, d is returned if given, otherwise KeyError is raised

popitem(/)

Remove and return a (key, value) pair as a 2-tuple.

setdefault(key[, default])

Insert key with a value of default if key is not in the dictionary.

update([E, ]**F)

If E is present and has a .keys() method, then does: for k in E: D[k] = E[k] If E is present and lacks a .keys() method, then does: for k, v in E: D[k] = v In either case, this is followed by: for k in F: D[k] = F[k]

values()

class pystellibs.helpers.Pipe(func, *args, **kwargs)[source]

Bases: object

Decorator overloading the | operator (__ror__) such that you can pipe functions where the first argument is the variable on the left side of the | operator. The difference with Pipeable is that you cannot use the decorated function outside of pipes, but you gain the ability to update the calling parameters.

Used with keywords_first, this makes a powerful Task.
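A hedged sketch of the piping behaviour described above; the wrapped function and the expected output are illustrative assumptions:

from pystellibs.helpers import Pipe

# the left-hand side of | is passed as the first argument of the call
scale = Pipe(lambda values, factor=1: [v * factor for v in values], factor=2)
result = [1, 2, 3] | scale   # expected: [2, 4, 6], if __ror__ forwards as described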

Methods

__call__(*args, **kwargs)

Call self as a function.

class pystellibs.helpers.Pipeable(func, *args, **kwargs)[source]

Bases: object

Decorator overloading the | operator (__ror__) such that you can pipe functions where the first argument is the variable on the left side of the | operator. This decorator allows you to use the decorated function normally and uses the provided values when used in pipes.

>>> import pylab as plt
>>> _p = Pipeable(plt.plot, color='red', linestyle='--')
>>> _p(range(10), 'o-')  #  works
>>> range(10) | _p      #  will plot a red dashed line

Methods

__call__(*args, **kwargs)

Call self as a function.

class pystellibs.helpers.Pipegroup(pipes, mode='sequential')[source]

Bases: object

Methods

__call__(val, *args, **kwargs)

Call self as a function.

append

multi_call

seq_call

append(other)[source]
multi_call(vals, iter=True)[source]
seq_call(val, *args, **kwargs)[source]
pystellibs.helpers.chunks(l, n)[source]

Yield successive n-sized chunks from l.

Parameters:
l: iterable

object to iter over

n: int

number of elements per slice

Returns:
chunk: tuple

n values from l
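For instance, a short usage sketch (the comment describes the expected behaviour from the description above):

from pystellibs.helpers import chunks

for piece in chunks(list(range(7)), 3):
    print(piece)
# successive slices of up to 3 elements; the last chunk may be shorter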

pystellibs.helpers.deprecated(func)[source]

A dummy decorator that warns against using a deprecated function

pystellibs.helpers.generator(func)[source]

A dummy decorator that only makes code more readable. It allows explicitly marking a function as a generator (yielding values) and does nothing more than call the initial function

pystellibs.helpers.isNestedInstance(obj, cl)[source]

Test for sub-class types; I could not find a universal test

Parameters:
obj: object instance

object to test

cl: Class

top level class to test

Returns:
r: bool

True if obj is indeed an instance or subclass instance of cl

pystellibs.helpers.keywords_first(f)[source]

Decorator that enables access to any argument or keyword as a keyword

pystellibs.helpers.kfpartial(fun, *args, **kwargs)[source]

Allows creating partial functions with arbitrary arguments/keywords

pystellibs.helpers.merge_records(lst)[source]

generates a stack of records even with slightly different but compatible dtypes

Parameters:
lst: sequence of np.recarray

sequence of individual records

Returns:
val: np.recarray

array of stacked records. Note: if lst is empty, returns an empty list

pystellibs.helpers.missing_units_warning(name, defaultunit)[source]

Warn if any unit is missing

Parameters:
name: str

name of the variable

defaultunit: str

default unit definition

Raises:
warning: warnings.warn

warn if units are assumed

pystellibs.helpers.nbytes(obj, pprint=False)[source]

return the number of bytes of the object, which includes size of nested structures

Parameters:
obj: object

object to find the size of

pprint: bool, optional (default=False)

if set, returns the result after calling pretty_size_print

Returns:
num_bytes: int or str

total number of bytes or human readable corresponding string
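A short usage sketch combining nbytes with pretty_size_print (documented below); the example data are arbitrary:

import numpy as np
from pystellibs.helpers import nbytes

data = {"wave": np.zeros(1000), "flux": np.zeros((10, 1000))}
print(nbytes(data))               # total bytes, including nested arrays
print(nbytes(data, pprint=True))  # human readable string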

pystellibs.helpers.path_of_module(mod=None)[source]

returns the definition code path of a given module, object or function. If nothing is provided, the current frame will be queried, i.e., the current working directory of the calling function.

Parameters:
mod: module, class, function

object to find the definition of; if None, inspect.currentframe is used

Returns:
path: str

path of the definition

pystellibs.helpers.pretty_size_print(num_bytes)[source]

Output number of bytes in a human readable format

Parameters:
num_bytes: int

number of bytes to convert

Returns:
output: str

string representation of the size with appropriate unit scale

pystellibs.helpers.type_checker(name, obj, tp)[source]

Check a given type and raise a type error if not correct

Parameters:
name: str

name of the variable to show in the exception text

obj: object

object to check

tp: type

expected type of obj

Raises:
exc:TypeError:

raises a TypeError if the object is not of the correct type or a subclass of it

pystellibs.helpers.val_in_unit(varname, value, defaultunit)[source]

check units and convert to defaultunit or create the unit information

Parameters:
varname: str

name of the variable

value: value

value of the variable, which may be unitless

defaultunit: str

default unit to assume when the value is unitless

Returns:
quantity: ezunits.Quantity

value with units
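A hedged usage sketch; the variable name and unit are illustrative:

from pystellibs.helpers import val_in_unit

# if the value carries no unit, a warning is emitted and 'kpc' is assumed
distance = val_in_unit('distance', 10., 'kpc')   # -> ezunits.Quantity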

pystellibs.kurucz module

class pystellibs.kurucz.Kurucz(*args, **kwargs)[source]

Bases: AtmosphereLib

The stellar atmosphere models by Castelli and Kurucz 2004 or ATLAS9

  • LTE

  • PP

  • line blanketing

Attributes:
Teff
Z
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of Kurucz 2004 library

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual values for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates an individual spectrum for the given star's APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation. Stellar atmospheres are normalized to Radius = 1

plot_boundary([ax, dlogT, dlogg])

Plot the boundary of the library

points_inside(xypoints[, dlogT, dlogg])

Returns whether each point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property Teff
property Z
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of Kurucz 2004 library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon

property logT
property logZ
property logg

pystellibs.marcs module

class pystellibs.marcs.Marcs(*args, **kwargs)[source]

Bases: AtmosphereLib

MARCS stellar atmosphere models

Gustafsson et al 2008.

http://marcs.astro.uu.se/

Attributes:
Teff
Z
alpha
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of MARCS library

generate_individual_spectra(stars[, nthreads])

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual values for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates an individual spectrum for the given star's APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

interpolation needs alpha

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation. Stellar atmospheres are normalized to Radius = 1

plot_boundary([ax, dlogT, dlogg])

Plot the boundary of the library

points_inside(xypoints[, dlogT, dlogg])

Returns whether each point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property Teff
property Z
property alpha
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of MARCS library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon

generate_individual_spectra(stars, nthreads=0, **kwargs)[source]

Generates individual spectra for the given stars and stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
stars: Table

contains at least (logT, logg, logL, Z) of the considered stars

Returns:
l0: ndarray, ndim=1

wavelength definition of the spectra, in AA

s0: ndarray, shape=(len(stars), len(l0))

array of spectra, one per input star. Spectrum in ergs/s/AA or ergs/s/AA/Lsun

generate_stellar_spectrum(logT, logg, logL, Z, alpha=0.0, raise_extrapolation=True, **kwargs)[source]

Generates an individual spectrum for the given star's APs and the stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
logT: float

log-temperature

logg: float

log-gravity

logL: float

log-luminosity

Z: float

metallicity

alpha: float

alpha element

raise_extrapolation: bool

if set, throw an error on extrapolation

null: value

value of the flux when extrapolating and raise_extrapolation is not set

Returns:
s0: ndarray, shape=(len(stars), len(l0))

array of spectra, one per input star. Spectrum in ergs/s/AA or ergs/s/AA/Lsun
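A hedged sketch of a single-spectrum call using the parameters documented above (the import path and numerical values are illustrative assumptions):

from pystellibs import Marcs

lib = Marcs()
# outside the grid this raises unless raise_extrapolation=False,
# in which case the null flux value is returned instead
spec = lib.generate_stellar_spectrum(3.76, 4.44, 0.0, 0.02, alpha=0.0,
                                     raise_extrapolation=False)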

get_interpolation_data()[source]

interpolation needs alpha

property logT
property logZ
property logg

pystellibs.munari module

class pystellibs.munari.Munari(*args, **kwargs)[source]

Bases: AtmosphereLib

ATLAS9 stellar atmospheres providing higher resolution than Kurucz: medium resolution (1 Ang/pix) in the optical (2500-10500 Ang)

References

Paper: Munari et al. 2005 A&A 442 1127 http://adsabs.harvard.edu/abs/2005A%26A...442.1127M

Files available at: http://archives.pd.astro.it/2500-10500/

Attributes:
Teff
Z
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of Munari library

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual values for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates an individual spectrum for the given star's APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

interpolation needs alpha

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation. Stellar atmospheres are normalized to Radius = 1

plot_boundary([ax, dlogT, dlogg])

Plot the boundary of the library

points_inside(xypoints[, dlogT, dlogg])

Returns whether each point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property Teff
property Z
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of Munari library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon

get_interpolation_data()[source]

interpolation needs alpha

property logT
property logZ
property logg

pystellibs.pbar module

Simple progressbar

This package implements a single progress bar class that can be used to decorate an iterator, a function, or even be used standalone.

The format of the meter is flexible and can display, along with the progress meter, the running time, an ETA, and the rate of the iterations.

An example is:

description [----------] k/n 10% [time: 00:00:00, eta: 00:00:00, 2.7 iters/sec]
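A minimal usage sketch based on the iterover method documented below:

import time
from pystellibs.pbar import Pbar

for _ in Pbar(desc='processing').iterover(range(100)):
    time.sleep(0.01)   # the meter updates as values are consumed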

class pystellibs.pbar.Pbar(maxval=None, desc=None, time=True, eta=True, rate=True, length=None, file=None, keep=True, mininterval=0.5, miniters=1, units='iters', **kwargs)[source]

Bases: object

make a progress string in a shape of:

[----------] k/n  10% [time: 00:00:00, eta: 00:00:00, 2.7 iters/sec]
Attributes:
time: bool, optional (default: True)

if set, add the runtime information

eta: bool, optional (default: True)

if set, add an estimated time to completion

rate: bool, optional (default: True)

if set, add the rate information

length: int, optional (default: None)

number of characters showing the progress meter itself; if None, the meter will adapt to the buffer width

TODO: make it variable with the buffer length

keep: bool, optional (default: True)

If not set, deletes its traces from screen after completion

file: buffer

the buffer to write into

mininterval: float (default: 0.5)

minimum time in seconds between two updates of the meter

miniters: int, optional (default: 1)

minimum iteration number between two updates of the meter

units: str, optional (default: ‘iters’)

unit of the iteration

Methods

build_str_meter(n, total, elapsed)

make a progress string in a shape of.

decorator(func)

Provide a function decorator allowing for counting calls and rates

format_interval(t)

make a human readable time interval decomposed into days, hours, minutes and seconds

iterover(iterable[, total])

Get an iterable object, and return an iterator which acts exactly like the iterable, but prints a progress meter and updates it every time a value is requested.

print_status(s)

print a status s on the last file line and clean the rest of the line

update(n[, desc, total])

Kept for backward compatibility and the decorator feature

handle_resize

build_str_meter(n, total, elapsed)[source]

make a progress string in a shape of:

[----------] k/n  10% [time: 00:00:00, eta: 00:00:00, 2.7 iters/sec]
Parameters:
n: int

number of finished iterations

total: int

total number of iterations, or None

elapsed: int

number of seconds passed since start

Returns:
txt: str

string representing the meter

decorator(func)[source]

Provide a function decorator allowing for counting calls and rates

static format_interval(t)[source]

make a human readable time interval decomposed into days, hours, minutes and seconds

Parameters:
t: int

interval in seconds

Returns:
txt: str

string representing the interval (format: <days>d <hrs>:<min>:<sec>)

handle_resize(signum, frame)[source]
iterover(iterable, total=None)[source]

Get an iterable object, and return an iterator which acts exactly like the iterable, but prints a progress meter and updates it every time a value is requested.

Parameters:
iterable: generator or iterable object

object to iter over.

total: int, optional

the number of iterations is assumed to be the length of the iterator. But sometimes the iterable has no associated length or its length is not the actual number of future iterations. In this case, total can be set to define the number of iterations.

Returns:
gen: generator

pass the values from the initial iterator

print_status(s)[source]

print a status s on the last file line and clean the rest of the line

Parameters:
s: str

message to write

update(n, desc=None, total=None)[source]

Kept for backward compatibility and the decorator feature

Parameters:
n: int

force iteration number n

desc: str

update description string

total: int

update the total number of iterations

pystellibs.rauch module

Rauch White Dwarfs stellar atmospheres

class pystellibs.rauch.Rauch(*args, **kwargs)[source]

Bases: Stellib

Rauch White Dwarfs stellar atmospheres

References

Rauch, T.; Werner, K.; Bohlin, R.; Kruk, J. W., “The virtual observatory service TheoSSA: Establishing a database of synthetic stellar flux standards. I. NLTE spectral analysis of the DA-type white dwarf G191-B2B”

Attributes:
NHI
NHeI
NHeII
Teff
Z
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of Rauch library

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual values for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates an individual spectrum for the given star's APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation

plot_boundary([ax, dlogT, dlogg])

Plot the boundary of the library

points_inside(xypoints[, dlogT, dlogg])

Returns whether each point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property NHI
property NHeI
property NHeII
property Teff
property Z
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of Rauch library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon

property logT
property logZ
property logg

pystellibs.simpletable module

This file implements a Table class that is designed to be the basis of any format

Requirements

  • FIT format:
    • astropy:

provides a replacement to pyfits; pyfits can still be used instead, but astropy is now the default

  • HDF5 format:
    • pytables

RuntimeError will be raised when writing to a format associated with a missing package.

class pystellibs.simpletable.AstroHelpers[source]

Bases: object

Helpers related to astronomy data

Methods

conesearch(ra0, dec0, ra, dec, r[, outtype])

Perform a cone search on a table

deg2dms(val[, delim])

Convert degrees into hex coordinates

deg2hms(val[, delim])

Convert degrees into hex coordinates

dms2deg(_str[, delim])

Convert hex coordinates into degrees

euler(ai_in, bi_in, select[, b1950, dtype])

Transform between Galactic, celestial, and ecliptic coordinates.

hms2deg(_str[, delim])

Convert hex coordinates into degrees

sphdist(ra1, dec1, ra2, dec2)

measures the spherical distance between 2 points

static conesearch(ra0, dec0, ra, dec, r, outtype=0)[source]

Perform a cone search on a table

Parameters:
ra0: ndarray[ndim=1, dtype=float]

RA coordinates of the catalogue sources, in degrees

dec0: ndarray[ndim=1, dtype=float]

DEC coordinates of the catalogue sources, in degrees

ra: float

ra to look for (in degree)

dec: float

dec to look for (in degree)

r: float

distance in degrees

outtype: int
type of outputs

0 – minimal, indices of matching coordinates
1 – indices and distances of matching coordinates
2 – full, boolean filter and distances

Returns:
t: tuple
if outtype is 0:

only return indices from ra0, dec0

elif outtype is 1:

return indices from ra0, dec0 and distances

elif outtype is 2:

return conditional vector and distance to all ra0, dec0

static deg2dms(val, delim=':')[source]

Convert degrees into hex coordinates

Parameters:
deg: float

angle in degrees

delimiter: str

character delimiting the fields

Returns:
str: string or sequence

string to convert

static deg2hms(val, delim=':')[source]

Convert degrees into hex coordinates

Parameters:
deg: float

angle in degrees

delimiter: str

character delimiting the fields

Returns:
str: string or sequence

string to convert

static dms2deg(_str, delim=':')[source]

Convert hex coordinates into degrees

Parameters:
str: string or sequence

string to convert

delimiter: str

character delimiting the fields

Returns:
deg: float

angle in degrees

static euler(ai_in, bi_in, select, b1950=False, dtype='f8')[source]

Transform between Galactic, celestial, and ecliptic coordinates. Celestial coordinates (RA, Dec) should be given in equinox J2000 unless b1950 is True.

Parameters:
long_in: float, or sequence

Input Longitude in DEGREES, scalar or vector.

lat_in: float, or sequence

Latitude in DEGREES

select: int

Integer from 1 to 6 specifying type of coordinate transformation.

b1950: bool

set equinox to 1950

Returns:
long_out: float, seq

Output Longitude in DEGREES

lat_out: float, seq

Output Latitude in DEGREES

REVISION HISTORY:
Written W. Landsman, February 1987
Adapted from Fortran by Daryl Yentis NRL
Converted to IDL V5.0 W. Landsman September 1997
Made J2000 the default, added /FK4 keyword W. Landsman December 1998
Add option to specify SELECT as a keyword W. Landsman March 2003
Converted from IDL to numerical Python: Erin Sheldon, NYU, 2008-07-02
static hms2deg(_str, delim=':')[source]

Convert hex coordinates into degrees

Parameters:
str: string or sequence

string to convert

delimiter: str

character delimiting the fields

Returns:
deg: float

angle in degrees

static sphdist(ra1, dec1, ra2, dec2)[source]

measures the spherical distance between 2 points

Parameters:
ra1: float or sequence

first right ascensions in degrees

dec1: float or sequence

first declination in degrees

ra2: float or sequence

second right ascensions in degrees

dec2: float or sequence

second declination in degrees

Returns:
Outputs: float or sequence

returns a distance in degrees
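A short usage sketch of the static helpers above (coordinate values are illustrative):

from pystellibs.simpletable import AstroHelpers

sep = AstroHelpers.sphdist(10.68, 41.27, 10.70, 41.26)   # degrees
ra_str = AstroHelpers.deg2hms(10.68458)                  # sexagesimal string
ra_deg = AstroHelpers.hms2deg(ra_str)                    # back to degrees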

class pystellibs.simpletable.AstroTable(*args, **kwargs)[source]

Bases: SimpleTable

Derived from the Table, this class adds implementations of common astro tools, especially conesearch

Attributes:
colnames

Sequence of column names

dtype

dtype of the data

empty_row

Return an empty row array respecting the table format

name

name of the table given by the Header[‘NAME’] attribute

nbytes

number of bytes of the object

ncols

number of columns

nrows

number of lines

shape

shape of the data

Methods

__call__(*args, **kwargs)

Call self as a function.

addCol(name, data[, dtype, unit, description])

Add one or multiple columns to the table

addLine(iterable)

Append one row in this table.

add_column(name, data[, dtype, unit, ...])

Add one or multiple columns to the table

append_row(iterable)

Append one row in this table.

boxplot(*args, **kwargs)

Draw a box and whisker plot.

coneSearch(ra, dec, r[, outtype])

Perform a cone search on a table

delCol(names)

Remove several columns from the table

evalexpr(expr[, exprvars, dtype])

evaluate expression based on the data and external variables

find_duplicate([index_only, values_only])

Find duplication in the table entries, return a list of duplicated elements. At this time this only works if 2 lines are the same entry, not if 2 lines have the same values

get(v[, full_match])

returns a table from columns given as v

get_DEC([degree])

Returns DEC, converted from hexa/sexa into degrees

get_RA([degree])

Returns RA, converted from hexa/sexa into degrees

groupby(key)

Create an iterator which returns (key, sub-table) grouped by each value of key(value)

hexbin(*args, **kwargs)

Make a 2D hexagonal binning plot of points x, y.

hist(*args, **kwargs)

Compute and plot a histogram.

hist2d(*args, **kwargs)

Make a 2D histogram plot.

iterkeys()

Iterator over the columns of the table

itervalues()

Iterator over the lines of the table

join_by(r2, key[, jointype, r1postfix, ...])

Join arrays r1 and r2 on key key.

keys([regexp, full_match])

Return the data column names or a subset of it

match(r2, key)

Returns the indices at which the tables match; matching uses 2 columns that are compared in values

plot(*args, **kwargs)

Plot y versus x as lines and/or markers.

plot_function(fn, *args, **kwargs)

Generate a plotting method of tab from a given function

pop_columns(names)

Pop several columns from the table

pprint([idx, fields, ret, all, full_match, ...])

Pretty print the table content

pprint_entry(num[, keys])

print one line with key and values properly to be readable

remove_column(names)

Remove several columns from the table

remove_columns(names)

Remove several columns from the table

resolve_alias(colname)

Return the name of an aliased column.

reverse_alias(colname)

Return aliases of a given column.

scatter(*args, **kwargs)

A scatter plot of y vs. x.

select(fields[, indices])

Select only a few fields in the table

selectWhere(fields[, condition, condvars, ...])

Read table data fulfilling the given condition.

setComment(colname, comment)

Set the comment of a column referenced by its name

setUnit(colname, unit)

Set the unit of a column referenced by its name

set_DEC(val)

Set the column that defines DEC coordinates

set_RA(val)

Set the column that defines RA coordinates

set_alias(alias, colname)

Define an alias to a column

set_comment(colname, comment)

Set the comment of a column referenced by its name

set_unit(colname, unit)

Set the unit of a column referenced by its name

sort(keys[, copy])

Sort the table inplace according to one or more keys.

stack(r, *args, **kwargs)

Superposes arrays fields by fields inplace

stats([fn, fields, fill])

Make statistics on columns of a table

violinplot(*args, **kwargs)

Make a violin plot.

where([condition, condvars, cone, zone])

Read table data fulfilling the given condition.

write(fname, **kwargs)

write table into file

zoneSearch(ramin, ramax, decmin, decmax[, ...])

Perform a zone search on a table, i.e., a rectangular selection

info

coneSearch(ra, dec, r, outtype=0)[source]

Perform a cone search on a table

Parameters:
ra0: ndarray[ndim=1, dtype=float]

column name to use as RA source in degrees

dec0: ndarray[ndim=1, dtype=float]

column name to use as DEC source in degrees

ra: float

ra to look for (in degree)

dec: float

dec to look for (in degree)

r: float

distance in degrees

outtype: int
type of outputs

0 – minimal, indices of matching coordinates
1 – indices and distances of matching coordinates
2 – full, boolean filter and distances

Returns:
t: tuple
if outtype is 0:

only return indices from ra0, dec0

elif outtype is 1:

return indices from ra0, dec0 and distances

elif outtype is 2:

return conditional vector and distance to all ra0, dec0
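A hedged sketch: select sources within 0.1 degree of a position, assuming the RA/DEC columns have first been declared with set_RA / set_DEC (documented below); the file name and column names are hypothetical:

from pystellibs.simpletable import AstroTable

t = AstroTable('catalog.fits')
t.set_RA('ra')
t.set_DEC('dec')
idx = t.coneSearch(10.68, 41.27, 0.1, outtype=0)   # indices of matching rows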

get_DEC(degree=True)[source]

Returns DEC, converted from hexa/sexa into degrees

get_RA(degree=True)[source]

Returns RA, converted from hexa/sexa into degrees

info()[source]
selectWhere(fields, condition=None, condvars=None, cone=None, zone=None, **kwargs)[source]

Read table data fulfilling the given condition. Only the rows fulfilling the condition are included in the result. A conesearch is also possible through the keyword cone, formatted as (ra, dec, r); a zonesearch is also possible through the keyword zone, formatted as (ramin, ramax, decmin, decmax).

Combination of multiple selections is also available.

set_DEC(val)[source]

Set the column that defines DEC coordinates

set_RA(val)[source]

Set the column that defines RA coordinates

where(condition=None, condvars=None, cone=None, zone=None, **kwargs)[source]

Read table data fulfilling the given condition. Only the rows fulfilling the condition are included in the result.

Parameters:
condition: str

expression to evaluate on the table; includes mathematical operations and attribute names

condvars: dictionary, optional

A dictionary that replaces the local operands in current frame.

Returns:
out: ndarray/ tuple of ndarrays
result equivalent to np.where()
zoneSearch(ramin, ramax, decmin, decmax, outtype=0)[source]

Perform a zone search on a table, i.e., a rectangular selection

Parameters:
ramin: float

minimal value of RA

ramax: float

maximal value of RA

decmin: float

minimal value of DEC

decmax: float

maximal value of DEC

outtype: int
type of outputs

0 or 1 – minimal, indices of matching coordinates
2 – full, boolean filter and distances

Returns:
r: sequence

indices or conditional sequence of matching values

class pystellibs.simpletable.SimpleTable(fname, *args, **kwargs)[source]

Bases: object

Table class that is designed to be the basis of any format wrapping around numpy recarrays

Attributes:
fname: str or object

if str, the file to read from. This may be limited to the formats currently handled automatically. If the format is not correctly handled, you can try by providing an object.

if object with a structure like dict, ndarray, or recarray-like

the data will be encapsulated into a Table

caseless: bool

if set, column names will be caseless during operations

aliases: dict

set of column aliases (can be defined later set_alias())

units: dict

set of column units (can be defined later set_unit())

desc: dict

set of column description or comments (can be defined later set_comment())

header: dict

key, value pair corresponding to the attributes of the table
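A hedged construction sketch using the dict-like input form described above; the column names and unit string are illustrative:

import numpy as np
from pystellibs.simpletable import SimpleTable

t = SimpleTable({'logT': np.array([3.7, 3.8]), 'logg': np.array([4.4, 4.3])})
t.set_unit('logT', 'K')               # attach a unit to a column
t.set_alias('temperature', 'logT')    # 'temperature' now resolves to 'logT'
print(t.nrows, t.colnames)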

Methods

__call__(*args, **kwargs)

Call self as a function.

addCol(name, data[, dtype, unit, description])

Add one or multiple columns to the table

addLine(iterable)

Append one row in this table.

add_column(name, data[, dtype, unit, ...])

Add one or multiple columns to the table

append_row(iterable)

Append one row in this table.

boxplot(*args, **kwargs)

Draw a box and whisker plot.

delCol(names)

Remove several columns from the table

evalexpr(expr[, exprvars, dtype])

evaluate expression based on the data and external variables

find_duplicate([index_only, values_only])

Find duplication in the table entries, return a list of duplicated elements. At this time this only works if 2 lines are the same entry, not if 2 lines have the same values

get(v[, full_match])

returns a table from columns given as v

groupby(key)

Create an iterator which returns (key, sub-table) grouped by each value of key(value)

hexbin(*args, **kwargs)

Make a 2D hexagonal binning plot of points x, y.

hist(*args, **kwargs)

Compute and plot a histogram.

hist2d(*args, **kwargs)

Make a 2D histogram plot.

iterkeys()

Iterator over the columns of the table

itervalues()

Iterator over the lines of the table

join_by(r2, key[, jointype, r1postfix, ...])

Join arrays r1 and r2 on key key.

keys([regexp, full_match])

Return the data column names or a subset of it

match(r2, key)

Returns the indices at which the tables match; matching uses 2 columns that are compared in values

plot(*args, **kwargs)

Plot y versus x as lines and/or markers.

plot_function(fn, *args, **kwargs)

Generate a plotting method of tab from a given function

pop_columns(names)

Pop several columns from the table

pprint([idx, fields, ret, all, full_match, ...])

Pretty print the table content

pprint_entry(num[, keys])

print one line with key and values properly to be readable

remove_column(names)

Remove several columns from the table

remove_columns(names)

Remove several columns from the table

resolve_alias(colname)

Return the name of an aliased column.

reverse_alias(colname)

Return aliases of a given column.

scatter(*args, **kwargs)

A scatter plot of y vs. x.

select(fields[, indices])

Select only a few fields in the table

selectWhere(fields, condition[, condvars])

Read table data fulfilling the given condition.

setComment(colname, comment)

Set the comment of a column referenced by its name

setUnit(colname, unit)

Set the unit of a column referenced by its name

set_alias(alias, colname)

Define an alias to a column

set_comment(colname, comment)

Set the comment of a column referenced by its name

set_unit(colname, unit)

Set the unit of a column referenced by its name

sort(keys[, copy])

Sort the table inplace according to one or more keys.

stack(r, *args, **kwargs)

Superposes arrays fields by fields inplace

stats([fn, fields, fill])

Make statistics on columns of a table

violinplot(*args, **kwargs)

Make a violin plot.

where(condition[, condvars])

Read table data fulfilling the given condition.

write(fname, **kwargs)

write table into file

info

addCol(name, data, dtype=None, unit=None, description=None)

Add one or multiple columns to the table

Parameters:
name: str or sequence(str)

The name(s) of the column(s) to add

data: ndarray, or sequence of ndarray

The column data, or sequence of columns

dtype: dtype

numpy dtype for the data to add

unit: str

The unit of the values in the column

description: str

A description of the content of the column

addLine(iterable)

Append one row in this table.

see also: stack()

Parameters:
iterable: iterable

line to add

add_column(name, data, dtype=None, unit=None, description=None)[source]

Add one or multiple columns to the table

Parameters:
name: str or sequence(str)

The name(s) of the column(s) to add

data: ndarray, or sequence of ndarray

The column data, or sequence of columns

dtype: dtype

numpy dtype for the data to add

unit: str

The unit of the values in the column

description: str

A description of the content of the column

append_row(iterable)[source]

Append one row in this table.

see also: stack()

Parameters:
iterable: iterable

line to add

boxplot(*args, **kwargs)

Draw a box and whisker plot.

The box extends from the first quartile (Q1) to the third quartile (Q3) of the data, with a line at the median. The whiskers extend from the box by 1.5x the inter-quartile range (IQR). Flier points are those past the end of the whiskers. See https://en.wikipedia.org/wiki/Box_plot for reference.

     Q1-1.5IQR   Q1   median  Q3   Q3+1.5IQR
                  |-----:-----|
  o      |--------|     :     |--------|    o  o
                  |-----:-----|
flier             <----------->            fliers
                       IQR
Parameters:
x: Array or a sequence of vectors.

The input data. If a 2D array, a boxplot is drawn for each column in x. If a sequence of 1D arrays, a boxplot is drawn for each array in x.

notch: bool, default: False

Whether to draw a notched boxplot (True), or a rectangular boxplot (False). The notches represent the confidence interval (CI) around the median. The documentation for bootstrap describes how the locations of the notches are computed by default, but their locations may also be overridden by setting the conf_intervals parameter.

Note

In cases where the values of the CI are less than the lower quartile or greater than the upper quartile, the notches will extend beyond the box, giving it a distinctive “flipped” appearance. This is expected behavior and consistent with other statistical visualization packages.

sym: str, optional

The default symbol for flier points. An empty string (‘’) hides the fliers. If None, then the fliers default to ‘b+’. More control is provided by the flierprops parameter.

vert: bool, default: True

If True, draws vertical boxes. If False, draw horizontal boxes.

whis: float or (float, float), default: 1.5

The position of the whiskers.

If a float, the lower whisker is at the lowest datum above Q1 - whis*(Q3-Q1), and the upper whisker at the highest datum below Q3 + whis*(Q3-Q1), where Q1 and Q3 are the first and third quartiles. The default value of whis = 1.5 corresponds to Tukey’s original definition of boxplots.

If a pair of floats, they indicate the percentiles at which to draw the whiskers (e.g., (5, 95)). In particular, setting this to (0, 100) results in whiskers covering the whole range of the data.

In the edge case where Q1 == Q3, whis is automatically set to (0, 100) (cover the whole range of the data) if autorange is True.

Beyond the whiskers, data are considered outliers and are plotted as individual points.

bootstrap: int, optional

Specifies whether to bootstrap the confidence intervals around the median for notched boxplots. If bootstrap is None, no bootstrapping is performed, and notches are calculated using a Gaussian-based asymptotic approximation (see McGill, R., Tukey, J.W., and Larsen, W.A., 1978, and Kendall and Stuart, 1967). Otherwise, bootstrap specifies the number of times to bootstrap the median to determine its 95% confidence intervals. Values between 1000 and 10000 are recommended.

usermedians: 1D array-like, optional

A 1D array-like of length len(x). Each entry that is not None forces the value of the median for the corresponding dataset. For entries that are None, the medians are computed by Matplotlib as normal.

conf_intervals: array-like, optional

A 2D array-like of shape (len(x), 2). Each entry that is not None forces the location of the corresponding notch (which is only drawn if notch is True). For entries that are None, the notches are computed by the method specified by the other parameters (e.g., bootstrap).

positions: array-like, optional

The positions of the boxes. The ticks and limits are automatically set to match the positions. Defaults to range(1, N+1) where N is the number of boxes to be drawn.

widths: float or array-like

The widths of the boxes. The default is 0.5, or 0.15*(distance between extreme positions), if that is smaller.

patch_artist: bool, default: False

If False produces boxes with the Line2D artist. Otherwise, boxes are drawn with Patch artists.

labels: sequence, optional

Labels for each dataset (one per dataset).

manage_ticks: bool, default: True

If True, the tick locations and labels will be adjusted to match the boxplot positions.

autorange: bool, default: False

When True and the data are distributed such that the 25th and 75th percentiles are equal, whis is set to (0, 100) such that the whisker ends are at the minimum and maximum of the data.

meanline: bool, default: False

If True (and showmeans is True), will try to render the mean as a line spanning the full width of the box according to meanprops (see below). Not recommended if shownotches is also True. Otherwise, means will be shown as points.

zorder: float, default: Line2D.zorder = 2

The zorder of the boxplot.

Returns:
dict

A dictionary mapping each component of the boxplot to a list of the .Line2D instances created. That dictionary has the following keys (assuming vertical boxplots):

  • boxes: the main body of the boxplot showing the quartiles and the median’s confidence intervals if enabled.

  • medians: horizontal lines at the median of each box.

  • whiskers: the vertical lines extending to the most extreme, non-outlier data points.

  • caps: the horizontal lines at the ends of the whiskers.

  • fliers: points representing data that extend beyond the whiskers (fliers).

  • means: points or lines representing the means.

Other Parameters:
showcaps: bool, default: True

Show the caps on the ends of whiskers.

showbox: bool, default: True

Show the central box.

showfliers: bool, default: True

Show the outliers beyond the caps.

showmeans: bool, default: False

Show the arithmetic means.

capprops: dict, default: None

The style of the caps.

capwidths: float or array, default: None

The widths of the caps.

boxprops: dict, default: None

The style of the box.

whiskerprops: dict, default: None

The style of the whiskers.

flierprops: dict, default: None

The style of the fliers.

medianprops: dict, default: None

The style of the median.

meanprops: dict, default: None

The style of the mean.

data: indexable object, optional

If given, all parameters also accept a string s, which is interpreted as data[s] (unless this raises an exception).

See also

violinplot

Draw an estimate of the probability density function.

property colnames

Sequence of column names

delCol(names)

Remove several columns from the table

Parameters:
names: sequence

A list containing the names of the columns to remove

property dtype

dtype of the data

property empty_row

Return an empty row array respecting the table format

evalexpr(expr, exprvars=None, dtype=<class 'float'>)[source]
evaluate expression based on the data and external variables

all numpy functions can be used (log, exp, pi, ...)

Parameters:
expr: str

expression to evaluate on the table; includes mathematical operations and attribute names

exprvars: dictionary, optional

A dictionary that replaces the local operands in current frame.

dtype: dtype definition

dtype of the output array

Returns:
out: NumPy array

array of the result
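A short hedged example, assuming a table t with hypothetical columns logT and logg:

teff = t.evalexpr('10 ** logT')                                  # column arithmetic
shifted = t.evalexpr('logg + offset', exprvars={'offset': 0.5})  # external variable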

find_duplicate(index_only=False, values_only=False)[source]

Find duplication in the table entries, return a list of duplicated elements. At this time this only works if 2 lines are the same entry, not if 2 lines have the same values

get(v, full_match=False)[source]

returns a table from columns given as v

this function is equivalent to __getitem__() but preserves the Table format and associated properties (units, description, header)

Parameters:
v: str

pattern to filter the keys with

full_match: bool

if set, use re.fullmatch() instead of re.match()

groupby(key)[source]

Create an iterator which returns (key, sub-table) grouped by each value of key(value)

Parameters:
key: str

expression or pattern to filter the keys with

Returns:
key: str or sequence

group key

tab: SimpleTable instance

sub-table of the group; header, aliases and column metadata are preserved (linked to the master table).
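A short hedged example, assuming a hypothetical column named Z:

for z_value, subtab in t.groupby('Z'):
    print(z_value, subtab.nrows)   # one sub-table per distinct Z value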

hexbin(*args, **kwargs)

Make a 2D hexagonal binning plot of points x, y.

If C is None, the value of the hexagon is determined by the number of points in the hexagon. Otherwise, C specifies values at the coordinate (x[i], y[i]). For each hexagon, these values are reduced using reduce_C_function.

Parameters:
x, y: array-like

The data positions. x and y must be of the same length.

C: array-like, optional

If given, these values are accumulated in the bins. Otherwise, every point has a value of 1. Must be of the same length as x and y.

gridsize: int or (int, int), default: 100

If a single int, the number of hexagons in the x-direction. The number of hexagons in the y-direction is chosen such that the hexagons are approximately regular.

Alternatively, if a tuple (nx, ny), the number of hexagons in the x-direction and the y-direction. In the y-direction, counting is done along vertically aligned hexagons, not along the zig-zag chains of hexagons; see the following illustration.

To get approximately regular hexagons, choose nx = sqrt(3) * ny.

bins: ‘log’ or int or sequence, default: None

Discretization of the hexagon values.

  • If None, no binning is applied; the color of each hexagon directly corresponds to its count value.

  • If ‘log’, use a logarithmic scale for the colormap. Internally, log10(i+1) is used to determine the hexagon color. This is equivalent to norm=LogNorm().

  • If an integer, divide the counts in the specified number of bins, and color the hexagons accordingly.

  • If a sequence of values, the values of the lower bound of the bins to be used.

xscale: {‘linear’, ‘log’}, default: ‘linear’

Use a linear or log10 scale on the horizontal axis.

yscale: {‘linear’, ‘log’}, default: ‘linear’

Use a linear or log10 scale on the vertical axis.

mincnt: int > 0, default: None

If not None, only display cells with more than mincnt number of points in the cell.

marginals: bool, default: False

If marginals is True, plot the marginal density as colormapped rectangles along the bottom of the x-axis and left of the y-axis.

extent: 4-tuple of float, default: None

The limits of the bins (xmin, xmax, ymin, ymax). The default assigns the limits based on gridsize, x, y, xscale and yscale.

If xscale or yscale is set to ‘log’, the limits are expected to be the exponent for a power of 10. E.g. for x-limits of 1 and 50 in ‘linear’ scale and y-limits of 10 and 1000 in ‘log’ scale, enter (1, 50, 1, 3).

Returns:
~matplotlib.collections.PolyCollection

A .PolyCollection defining the hexagonal bins.

  • .PolyCollection.get_offsets contains a Mx2 array containing the x, y positions of the M hexagon centers.

  • .PolyCollection.get_array contains the values of the M hexagons.

If marginals is True, horizontal bar and vertical bar (both PolyCollections) will be attached to the return collection as attributes hbar and vbar.

Other Parameters:
cmap: str or ~matplotlib.colors.Colormap, default: :rc:`image.cmap`

The Colormap instance or registered colormap name used to map scalar data to colors.

norm: str or ~matplotlib.colors.Normalize, optional

The normalization method used to scale scalar data to the [0, 1] range before mapping to colors using cmap. By default, a linear scaling is used, mapping the lowest value to 0 and the highest to 1.

If given, this can be one of the following:

  • An instance of .Normalize or one of its subclasses (see /tutorials/colors/colormapnorms).

  • A scale name, i.e. one of “linear”, “log”, “symlog”, “logit”, etc. For a list of available scales, call matplotlib.scale.get_scale_names(). In that case, a suitable .Normalize subclass is dynamically generated and instantiated.

vmin, vmax: float, optional

When using scalar data and no explicit norm, vmin and vmax define the data range that the colormap covers. By default, the colormap covers the complete value range of the supplied data. It is an error to use vmin/vmax when a norm instance is given (but using a str norm name together with vmin/vmax is acceptable).

alpha: float between 0 and 1, optional

The alpha blending value, between 0 (transparent) and 1 (opaque).

linewidths: float, default: None

If None, defaults to 1.0.

edgecolors: {‘face’, ‘none’, None} or color, default: ‘face’

The color of the hexagon edges. Possible values are:

  • ‘face’: Draw the edges in the same color as the fill color.

  • ‘none’: No edges are drawn. This can sometimes lead to unsightly unpainted pixels between the hexagons.

  • None: Draw outlines in the default color.

  • An explicit color.

reduce_C_function: callable, default: numpy.mean

The function to aggregate C within the bins. It is ignored if C is not given. This must have the signature:

def reduce_C_function(C: array) -> float

Commonly used functions are:

  • numpy.mean: average of the points

  • numpy.sum: integral of the point values

  • numpy.amax: value taken from the largest point

data: indexable object, optional

If given, the following parameters also accept a string s, which is interpreted as data[s] (unless this raises an exception):

x, y, C

**kwargs: ~matplotlib.collections.PolyCollection properties

All other keyword arguments are passed on to .PolyCollection:

Properties: agg_filter: a filter function, which takes a (m, n, 3) float array and a dpi value, and returns a (m, n, 3) array and two offsets from the bottom left corner of the image alpha: array-like or scalar or None animated: bool antialiased or aa or antialiaseds: bool or list of bools array: array-like or None capstyle: .CapStyle or {‘butt’, ‘projecting’, ‘round’} clim: (vmin: float, vmax: float) clip_box: .Bbox clip_on: bool clip_path: Patch or (Path, Transform) or None cmap: .Colormap or str or None color: color or list of RGBA tuples edgecolor or ec or edgecolors: color or list of colors or ‘face’ facecolor or facecolors or fc: color or list of colors figure: .Figure gid: str hatch: {‘/’, ‘\’, ‘|’, ‘-’, ‘+’, ‘x’, ‘o’, ‘O’, ‘.’, ‘*’} in_layout: bool joinstyle: .JoinStyle or {‘miter’, ‘round’, ‘bevel’} label: object linestyle or dashes or linestyles or ls: str or tuple or list thereof linewidth or linewidths or lw: float or list of floats mouseover: bool norm: .Normalize or str or None offset_transform or transOffset: unknown offsets: (N, 2) or (2,) array-like path_effects: .AbstractPathEffect paths: list of array-like picker: None or bool or float or callable pickradius: unknown rasterized: bool sizes: numpy.ndarray or None sketch_params: (scale: float, length: float, randomness: float) snap: bool or None transform: .Transform url: str urls: list of str or None verts: list of array-like verts_and_codes: unknown visible: bool zorder: float

See also

hist2d

2D histogram rectangular bins

hist(*args, **kwargs)

Compute and plot a histogram.

This method uses numpy.histogram to bin the data in x and count the number of values in each bin, then draws the distribution either as a .BarContainer or .Polygon. The bins, range, density, and weights parameters are forwarded to numpy.histogram.

If the data has already been binned and counted, use ~.bar or ~.stairs to plot the distribution:

counts, bins = np.histogram(x)
plt.stairs(counts, bins)

Alternatively, plot pre-computed bins and counts using hist() by treating each bin as a single point with a weight equal to its count:

plt.hist(bins[:-1], bins, weights=counts)

The data input x can be a singular array, a list of datasets of potentially different lengths ([x0, x1, …]), or a 2D ndarray in which each column is a dataset. Note that the ndarray form is transposed relative to the list form. If the input is an array, then the return value is a tuple (n, bins, patches); if the input is a sequence of arrays, then the return value is a tuple ([n0, n1, …], bins, [patches0, patches1, …]).

Masked arrays are not supported.

Parameters:
x: (n,) array or sequence of (n,) arrays

Input values, this takes either a single array or a sequence of arrays which are not required to be of the same length.

bins: int or sequence or str, default: :rc:`hist.bins`

If bins is an integer, it defines the number of equal-width bins in the range.

If bins is a sequence, it defines the bin edges, including the left edge of the first bin and the right edge of the last bin; in this case, bins may be unequally spaced. All but the last (righthand-most) bin is half-open. In other words, if bins is:

[1, 2, 3, 4]

then the first bin is [1, 2) (including 1, but excluding 2) and the second [2, 3). The last bin, however, is [3, 4], which includes 4.

If bins is a string, it is one of the binning strategies supported by numpy.histogram_bin_edges: ‘auto’, ‘fd’, ‘doane’, ‘scott’, ‘stone’, ‘rice’, ‘sturges’, or ‘sqrt’.

range: tuple or None, default: None

The lower and upper range of the bins. Lower and upper outliers are ignored. If not provided, range is (x.min(), x.max()). Range has no effect if bins is a sequence.

If bins is a sequence or range is specified, autoscaling is based on the specified bin range instead of the range of x.

density: bool, default: False

If True, draw and return a probability density: each bin will display the bin’s raw count divided by the total number of counts and the bin width (density = counts / (sum(counts) * np.diff(bins))), so that the area under the histogram integrates to 1 (np.sum(density * np.diff(bins)) == 1).

If stacked is also True, the sum of the histograms is normalized to 1.

weights: (n,) array-like or None, default: None

An array of weights, of the same shape as x. Each value in x only contributes its associated weight towards the bin count (instead of 1). If density is True, the weights are normalized, so that the integral of the density over the range remains 1.

cumulativebool or -1, default: False

If True, then a histogram is computed where each bin gives the counts in that bin plus all bins for smaller values. The last bin gives the total number of datapoints.

If density is also True then the histogram is normalized such that the last bin equals 1.

If cumulative is a number less than 0 (e.g., -1), the direction of accumulation is reversed. In this case, if density is also True, then the histogram is normalized such that the first bin equals 1.

bottomarray-like, scalar, or None, default: None

Location of the bottom of each bin, i.e. bins are drawn from bottom to bottom + hist(x, bins). If a scalar, the bottom of each bin is shifted by the same amount. If an array, each bin is shifted independently and the length of bottom must match the number of bins. If None, defaults to 0.

histtype{‘bar’, ‘barstacked’, ‘step’, ‘stepfilled’}, default: ‘bar’

The type of histogram to draw.

  • ‘bar’ is a traditional bar-type histogram. If multiple data are given the bars are arranged side by side.

  • ‘barstacked’ is a bar-type histogram where multiple data are stacked on top of each other.

  • ‘step’ generates a lineplot that is by default unfilled.

  • ‘stepfilled’ generates a lineplot that is by default filled.

align{‘left’, ‘mid’, ‘right’}, default: ‘mid’

The horizontal alignment of the histogram bars.

  • ‘left’: bars are centered on the left bin edges.

  • ‘mid’: bars are centered between the bin edges.

  • ‘right’: bars are centered on the right bin edges.

orientation{‘vertical’, ‘horizontal’}, default: ‘vertical’

If ‘horizontal’, ~.Axes.barh will be used for bar-type histograms and the bottom kwarg will be the left edges.

rwidthfloat or None, default: None

The relative width of the bars as a fraction of the bin width. If None, automatically compute the width.

Ignored if histtype is ‘step’ or ‘stepfilled’.

logbool, default: False

If True, the histogram axis will be set to a log scale.

colorcolor or array-like of colors or None, default: None

Color or sequence of colors, one per dataset. Default (None) uses the standard line color sequence.

labelstr or None, default: None

String, or sequence of strings to match multiple datasets. Bar charts yield multiple patches per dataset, but only the first gets the label, so that ~.Axes.legend will work as expected.

stackedbool, default: False

If True, multiple data are stacked on top of each other. If False, multiple data are arranged side by side if histtype is ‘bar’, or on top of each other if histtype is ‘step’.

Returns:
narray or list of arrays

The values of the histogram bins. See density and weights for a description of the possible semantics. If input x is an array, then this is an array of length nbins. If input is a sequence of arrays [data1, data2, ...], then this is a list of arrays with the values of the histograms for each of the arrays in the same order. The dtype of the array n (or of its element arrays) will always be float even if no weighting or normalization is used.

binsarray

The edges of the bins. Length nbins + 1 (nbins left edges and right edge of last bin). Always a single array even when multiple data sets are passed in.

patches.BarContainer or list of a single .Polygon or list of such objects

Container of individual artists used to create the histogram or list of such containers if there are multiple input datasets.

Other Parameters:
dataindexable object, optional

If given, the following parameters also accept a string s, which is interpreted as data[s] (unless this raises an exception):

x, weights

**kwargs

~matplotlib.patches.Patch properties

See also

hist2d

2D histogram with rectangular bins

hexbin

2D histogram with hexagonal bins

stairs

Plot a pre-computed histogram

bar

Plot a pre-computed histogram

Notes

For large numbers of bins (>1000), plotting can be significantly accelerated by using ~.Axes.stairs to plot a pre-computed histogram (plt.stairs(*np.histogram(data))), or by setting histtype to ‘step’ or ‘stepfilled’ rather than ‘bar’ or ‘barstacked’.
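
As a minimal sketch of the parameters described above, using matplotlib directly (the data array is illustrative only):

import numpy as np
import matplotlib.pyplot as plt

x = np.random.normal(size=1000)
# normalized, unfilled histogram; 'step' avoids drawing one patch per bin
n, bins, patches = plt.hist(x, bins=30, density=True, histtype='step')
plt.show()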

hist2d(*args, **kwargs)

Make a 2D histogram plot.

Parameters:
x, yarray-like, shape (n, )

Input values

binsNone or int or [int, int] or array-like or [array, array]

The bin specification:

  • If int, the number of bins for the two dimensions (nx=ny=bins).

  • If [int, int], the number of bins in each dimension (nx, ny = bins).

  • If array-like, the bin edges for the two dimensions (x_edges=y_edges=bins).

  • If [array, array], the bin edges in each dimension (x_edges, y_edges = bins).

The default value is 10.

rangearray-like shape(2, 2), optional

The leftmost and rightmost edges of the bins along each dimension (if not specified explicitly in the bins parameters): [[xmin, xmax], [ymin, ymax]]. All values outside of this range will be considered outliers and not tallied in the histogram.

densitybool, default: False

Normalize histogram. See the documentation for the density parameter of ~.Axes.hist for more details.

weightsarray-like, shape (n, ), optional

An array of values w_i weighing each sample (x_i, y_i).

cmin, cmaxfloat, default: None

All bins that have a count less than cmin or more than cmax will not be displayed (set to NaN before passing to imshow), and these count values in the returned count histogram will also be set to NaN.

Returns:
h2D array

The bi-dimensional histogram of samples x and y. Values in x are histogrammed along the first dimension and values in y are histogrammed along the second dimension.

xedges1D array

The bin edges along the x-axis.

yedges1D array

The bin edges along the y-axis.

image~.matplotlib.collections.QuadMesh
Other Parameters:
cmapstr or ~matplotlib.colors.Colormap, default: :rc:`image.cmap`

The Colormap instance or registered colormap name used to map scalar data to colors.

normstr or ~matplotlib.colors.Normalize, optional

The normalization method used to scale scalar data to the [0, 1] range before mapping to colors using cmap. By default, a linear scaling is used, mapping the lowest value to 0 and the highest to 1.

If given, this can be one of the following:

  • An instance of .Normalize or one of its subclasses (see /tutorials/colors/colormapnorms).

  • A scale name, i.e. one of “linear”, “log”, “symlog”, “logit”, etc. For a list of available scales, call matplotlib.scale.get_scale_names(). In that case, a suitable .Normalize subclass is dynamically generated and instantiated.

vmin, vmaxfloat, optional

When using scalar data and no explicit norm, vmin and vmax define the data range that the colormap covers. By default, the colormap covers the complete value range of the supplied data. It is an error to use vmin/vmax when a norm instance is given (but using a str norm name together with vmin/vmax is acceptable).

alpha0 <= scalar <= 1 or None, optional

The alpha blending value.

dataindexable object, optional

If given, the following parameters also accept a string s, which is interpreted as data[s] (unless this raises an exception):

x, y, weights

**kwargs

Additional parameters are passed along to the ~.Axes.pcolormesh method and ~matplotlib.collections.QuadMesh constructor.

See also

hist

1D histogram plotting

hexbin

2D histogram with hexagonal bins

Notes

  • Currently hist2d calculates its own axis limits, and any limits previously set are ignored.

  • Rendering the histogram with a logarithmic color scale is accomplished by passing a .colors.LogNorm instance to the norm keyword argument. Likewise, power-law normalization (similar in effect to gamma correction) can be accomplished with .colors.PowerNorm.
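
As a short sketch of the log-color-scale note above (plain matplotlib; the data are illustrative):

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm

x = np.random.normal(size=10000)
y = np.random.normal(size=10000)
# counts mapped through a logarithmic color scale
h, xedges, yedges, image = plt.hist2d(x, y, bins=50, norm=LogNorm())
plt.colorbar(image)
plt.show()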

info()[source]
iterkeys()[source]

Iterator over the columns of the table

itervalues()[source]

Iterator over the rows of the table

join_by(r2, key, jointype='inner', r1postfix='1', r2postfix='2', defaults=None, asrecarray=False, asTable=True)[source]

Join arrays r1 and r2 on key key.

The key should be either a string or a sequence of strings corresponding to the fields used to join the arrays. An exception is raised if the key field cannot be found in the two input arrays. Neither r1 nor r2 should have any duplicates along key: the presence of duplicates will make the output quite unreliable. Note that duplicates are not looked for by the algorithm.

Parameters:
key: str or seq(str)

corresponding to the fields used for comparison.

r2: Table

Table to join with

jointype: str in {‘inner’, ‘outer’, ‘leftouter’}
  • ‘inner’ : returns the elements common to both r1 and r2.

  • ‘outer’ : returns the common elements as well as the elements of r1 not in r2 and the elements of r2 not in r1.

  • ‘leftouter’ : returns the common elements and the elements of r1 not in r2.

r1postfix: str

String appended to the names of the fields of r1 that are present in r2

r2postfix: str

String appended to the names of the fields of r2 that are present in r1

defaults: dict

Dictionary mapping field names to the corresponding default values.

Returns:
tab: Table

joined table

Note

  • The output is sorted along the key.

  • A temporary array is formed by dropping the fields not in the key for the two arrays and concatenating the result. This array is then sorted, and the common entries selected. The output is constructed by filling the fields with the selected entries. Matching is not preserved if there are some duplicates…
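
A usage sketch of the signature above. Building the tables from dictionaries is an assumption made for illustration; only the join_by call itself follows this documentation:

from pystellibs.simpletable import SimpleTable

t1 = SimpleTable({'id': [1, 2, 3], 'logT': [3.7, 3.8, 3.9]})  # assumed constructor
t2 = SimpleTable({'id': [2, 3, 4], 'logg': [4.4, 4.3, 4.2]})  # assumed constructor
# inner join on the common 'id' column
joined = t1.join_by(t2, 'id', jointype='inner')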

keys(regexp=None, full_match=False)[source]

Return the data column names or a subset of it

Parameters:
regexp: str

pattern to filter the keys with

full_match: bool

if set, use re.fullmatch() instead of re.match() (re.match only tries to apply the pattern at the start of the string, returning a match object, or None if no match was found)

Returns:
seq: sequence

sequence of keys
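
For instance, assuming a table tab with columns logT, logg and logL (a sketch, not taken from the library's test suite):

tab.keys()                          # ['logT', 'logg', 'logL']
tab.keys('log[Tg]')                 # ['logT', 'logg'], regular-expression filtering
tab.keys('logT', full_match=True)   # exact match only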

match(r2, key)[source]

Returns the indices at which the tables match; matching uses two columns that are compared by value

Parameters:
r2: Table

second table to use

key: str

fields used for comparison.

Returns:
indexes: tuple

tuple of both indices list where the two columns match.

property name

name of the table given by the Header[‘NAME’] attribute

property nbytes

number of bytes of the object

property ncols

number of columns

property nrows

number of rows

plot(*args, **kwargs)

Plot y versus x as lines and/or markers.

Call signatures:

plot([x], y, [fmt], *, data=None, **kwargs)
plot([x], y, [fmt], [x2], y2, [fmt2], ..., **kwargs)

The coordinates of the points or line nodes are given by x, y.

The optional parameter fmt is a convenient way for defining basic formatting like color, marker and linestyle. It’s a shortcut string notation described in the Notes section below.

>>> plot(x, y)        # plot x and y using default line style and color
>>> plot(x, y, 'bo')  # plot x and y using blue circle markers
>>> plot(y)           # plot y using x as index array 0..N-1
>>> plot(y, 'r+')     # ditto, but with red plusses

You can use .Line2D properties as keyword arguments for more control on the appearance. Line properties and fmt can be mixed. The following two calls yield identical results:

>>> plot(x, y, 'go--', linewidth=2, markersize=12)
>>> plot(x, y, color='green', marker='o', linestyle='dashed',
...      linewidth=2, markersize=12)

When conflicting with fmt, keyword arguments take precedence.

Plotting labelled data

There’s a convenient way for plotting objects with labelled data (i.e. data that can be accessed by index obj['y']). Instead of giving the data in x and y, you can provide the object in the data parameter and just give the labels for x and y:

>>> plot('xlabel', 'ylabel', data=obj)

All indexable objects are supported. This could e.g. be a dict, a pandas.DataFrame or a structured numpy array.

Plotting multiple sets of data

There are various ways to plot multiple sets of data.

  • The most straight forward way is just to call plot multiple times. Example:

    >>> plot(x1, y1, 'bo')
    >>> plot(x2, y2, 'go')
    
  • If x and/or y are 2D arrays a separate data set will be drawn for every column. If both x and y are 2D, they must have the same shape. If only one of them is 2D with shape (N, m) the other must have length N and will be used for every data set m.

    Example:

    >>> x = [1, 2, 3]
    >>> y = np.array([[1, 2], [3, 4], [5, 6]])
    >>> plot(x, y)
    

    is equivalent to:

    >>> for col in range(y.shape[1]):
    ...     plot(x, y[:, col])
    
  • The third way is to specify multiple sets of [x], y, [fmt] groups:

    >>> plot(x1, y1, 'g^', x2, y2, 'g-')
    

    In this case, any additional keyword argument applies to all datasets. Also, this syntax cannot be combined with the data parameter.

By default, each line is assigned a different style specified by a ‘style cycle’. The fmt and line property parameters are only necessary if you want explicit deviations from these defaults. Alternatively, you can also change the style cycle using :rc:`axes.prop_cycle`.

Parameters:
x, yarray-like or scalar

The horizontal / vertical coordinates of the data points. x values are optional and default to range(len(y)).

Commonly, these parameters are 1D arrays.

They can also be scalars, or two-dimensional (in that case, the columns represent separate data sets).

These arguments cannot be passed as keywords.

fmtstr, optional

A format string, e.g. ‘ro’ for red circles. See the Notes section for a full description of the format strings.

Format strings are just an abbreviation for quickly setting basic line properties. All of these and more can also be controlled by keyword arguments.

This argument cannot be passed as keyword.

dataindexable object, optional

An object with labelled data. If given, provide the label names to plot in x and y.

Note

Technically there’s a slight ambiguity in calls where the second label is a valid fmt. plot('n', 'o', data=obj) could be plt(x, y) or plt(y, fmt). In such cases, the former interpretation is chosen, but a warning is issued. You may suppress the warning by adding an empty format string plot('n', 'o', '', data=obj).

Returns:
list of .Line2D

A list of lines representing the plotted data.

Other Parameters:
scalex, scaleybool, default: True

These parameters determine if the view limits are adapted to the data limits. The values are passed on to ~.axes.Axes.autoscale_view.

**kwargs~matplotlib.lines.Line2D properties, optional

kwargs are used to specify properties like a line label (for auto legends), linewidth, antialiasing, marker face color. Example:

>>> plot([1, 2, 3], [1, 2, 3], 'go-', label='line 1', linewidth=2)
>>> plot([1, 2, 3], [1, 4, 9], 'rs', label='line 2')

If you specify multiple lines with one plot call, the kwargs apply to all those lines. In case the label object is iterable, each element is used as labels for each set of data.

Here is a list of available .Line2D properties:

Properties:

  • agg_filter: a filter function, which takes a (m, n, 3) float array and a dpi value, and returns a (m, n, 3) array and two offsets from the bottom left corner of the image
  • alpha: scalar or None
  • animated: bool
  • antialiased or aa: bool
  • clip_box: .Bbox
  • clip_on: bool
  • clip_path: Patch or (Path, Transform) or None
  • color or c: color
  • dash_capstyle: .CapStyle or {‘butt’, ‘projecting’, ‘round’}
  • dash_joinstyle: .JoinStyle or {‘miter’, ‘round’, ‘bevel’}
  • dashes: sequence of floats (on/off ink in points) or (None, None)
  • data: (2, N) array or two 1D arrays
  • drawstyle or ds: {‘default’, ‘steps’, ‘steps-pre’, ‘steps-mid’, ‘steps-post’}, default: ‘default’
  • figure: .Figure
  • fillstyle: {‘full’, ‘left’, ‘right’, ‘bottom’, ‘top’, ‘none’}
  • gapcolor: color or None
  • gid: str
  • in_layout: bool
  • label: object
  • linestyle or ls: {‘-’, ‘–’, ‘-.’, ‘:’, ‘’, (offset, on-off-seq), …}
  • linewidth or lw: float
  • marker: marker style string, ~.path.Path or ~.markers.MarkerStyle
  • markeredgecolor or mec: color
  • markeredgewidth or mew: float
  • markerfacecolor or mfc: color
  • markerfacecoloralt or mfcalt: color
  • markersize or ms: float
  • markevery: None or int or (int, int) or slice or list[int] or float or (float, float) or list[bool]
  • mouseover: bool
  • path_effects: .AbstractPathEffect
  • picker: float or callable[[Artist, Event], tuple[bool, dict]]
  • pickradius: unknown
  • rasterized: bool
  • sketch_params: (scale: float, length: float, randomness: float)
  • snap: bool or None
  • solid_capstyle: .CapStyle or {‘butt’, ‘projecting’, ‘round’}
  • solid_joinstyle: .JoinStyle or {‘miter’, ‘round’, ‘bevel’}
  • transform: unknown
  • url: str
  • visible: bool
  • xdata: 1D array
  • ydata: 1D array
  • zorder: float

See also

scatter

XY scatter plot with markers of varying size and/or color ( sometimes also called bubble chart).

Notes

Format Strings

A format string consists of a part for color, marker and line:

fmt = '[marker][line][color]'

Each of them is optional. If not provided, the value from the style cycle is used. Exception: If line is given, but no marker, the data will be a line without markers.

Other combinations such as [color][marker][line] are also supported, but note that their parsing may be ambiguous.

Markers

character   description
'.'         point marker
','         pixel marker
'o'         circle marker
'v'         triangle_down marker
'^'         triangle_up marker
'<'         triangle_left marker
'>'         triangle_right marker
'1'         tri_down marker
'2'         tri_up marker
'3'         tri_left marker
'4'         tri_right marker
'8'         octagon marker
's'         square marker
'p'         pentagon marker
'P'         plus (filled) marker
'*'         star marker
'h'         hexagon1 marker
'H'         hexagon2 marker
'+'         plus marker
'x'         x marker
'X'         x (filled) marker
'D'         diamond marker
'd'         thin_diamond marker
'|'         vline marker
'_'         hline marker

Line Styles

character   description
'-'         solid line style
'--'        dashed line style
'-.'        dash-dot line style
':'         dotted line style

Example format strings:

'b'    # blue markers with default shape
'or'   # red circles
'-g'   # green solid line
'--'   # dashed line with default color
'^k:'  # black triangle_up markers connected by a dotted line

Colors

The supported color abbreviations are the single letter codes

character   color
'b'         blue
'g'         green
'r'         red
'c'         cyan
'm'         magenta
'y'         yellow
'k'         black
'w'         white

and the 'CN' colors that index into the default property cycle.

If the color is the only part of the format string, you can additionally use any matplotlib.colors spec, e.g. full names ('green') or hex strings ('#008000').

plot_function(fn, *args, **kwargs)

Generate a plotting method of tab from a given function

Parameters:
tab: SimpleTable instance

table instance

fn: str or callable

if str, will try a function in matplotlib; if callable, calls the function directly

xname: str

expecting a column name from the table

yname: str, optional

if provided, another column to use for the plot

onlywhere: sequence or str, optional

if provided, selects only data matching this condition. The condition can be an ndarray slice or a string; when a string is given, the evaluation calls SimpleTable.where()

ax: matplotlib.Axes instance

if provided, the plots use this axis when a matplotlib function is used.

Returns:
r: object

anything returned by the called function

pop_columns(names)[source]

Pop several columns from the table

Parameters:
names: sequence

A list containing the names of the columns to remove

Returns:
values: tuple

list of columns

pprint(idx=None, fields=None, ret=False, all=False, full_match=False, headerChar='-', delim=' | ', endline='\n', **kwargs)[source]
Pretty print the table content

You can select the table parts to display using idx to select the rows and fields to display only some columns (ret is only for internal use)

Parameters:
idx: sequence, slice

sub selection to print

fields: str, sequence

if str can be a regular expression, and/or list of fields separated by spaces or commas

ret: bool

if set return the string representation instead of printing the result

all: bool

if set, force to show all rows

headerChar: char

Character to be used for the row separator line

delim: char

The column delimiter.
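
A usage sketch (column names are illustrative):

tab.pprint(idx=slice(0, 5), fields='logT,logg')   # first 5 rows, two columns
text = tab.pprint(fields='log.*', ret=True)       # return the string instead of printing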

pprint_entry(num, keys=None)[source]

print one line with key and values properly to be readable

Parameters:
num: int, slice

index selection

keys: sequence or str

if str, can be a regular expression; if sequence, the sequence of keys to print

remove_column(names)

Remove several columns from the table

Parameters:
names: sequence

A list containing the names of the columns to remove

remove_columns(names)[source]

Remove several columns from the table

Parameters:
names: sequence

A list containing the names of the columns to remove

resolve_alias(colname)[source]

Return the name of an aliased column.

Given an alias, return the column name it aliases. This function is a no-op if the alias is a column name itself.

Aliases are defined by using .define_alias()

reverse_alias(colname)[source]

Return aliases of a given column.

Given a colname, return a sequence of aliases associated with this column. Aliases are defined by using .define_alias()

scatter(*args, **kwargs)

A scatter plot of y vs. x with varying marker size and/or color.

Parameters:
x, yfloat or array-like, shape (n, )

The data positions.

sfloat or array-like, shape (n, ), optional

The marker size in points**2 (typographic points are 1/72 in.). Default is rcParams['lines.markersize'] ** 2.

carray-like or list of colors or color, optional

The marker colors. Possible values:

  • A scalar or sequence of n numbers to be mapped to colors using cmap and norm.

  • A 2D array in which the rows are RGB or RGBA.

  • A sequence of colors of length n.

  • A single color format string.

Note that c should not be a single numeric RGB or RGBA sequence because that is indistinguishable from an array of values to be colormapped. If you want to specify the same RGB or RGBA value for all points, use a 2D array with a single row. Otherwise, value-matching will have precedence in case of a size matching with x and y.

If you wish to specify a single color for all points prefer the color keyword argument.

Defaults to None. In that case the marker color is determined by the value of color, facecolor or facecolors. In case those are not specified or None, the marker color is determined by the next color of the Axes’ current “shape and fill” color cycle. This cycle defaults to :rc:`axes.prop_cycle`.

marker~.markers.MarkerStyle, default: :rc:`scatter.marker`

The marker style. marker can be either an instance of the class or the text shorthand for a particular marker. See matplotlib.markers for more information about marker styles.

cmapstr or ~matplotlib.colors.Colormap, default: :rc:`image.cmap`

The Colormap instance or registered colormap name used to map scalar data to colors.

This parameter is ignored if c is RGB(A).

normstr or ~matplotlib.colors.Normalize, optional

The normalization method used to scale scalar data to the [0, 1] range before mapping to colors using cmap. By default, a linear scaling is used, mapping the lowest value to 0 and the highest to 1.

If given, this can be one of the following:

  • An instance of .Normalize or one of its subclasses (see /tutorials/colors/colormapnorms).

  • A scale name, i.e. one of “linear”, “log”, “symlog”, “logit”, etc. For a list of available scales, call matplotlib.scale.get_scale_names(). In that case, a suitable .Normalize subclass is dynamically generated and instantiated.

This parameter is ignored if c is RGB(A).

vmin, vmaxfloat, optional

When using scalar data and no explicit norm, vmin and vmax define the data range that the colormap covers. By default, the colormap covers the complete value range of the supplied data. It is an error to use vmin/vmax when a norm instance is given (but using a str norm name together with vmin/vmax is acceptable).

This parameter is ignored if c is RGB(A).

alphafloat, default: None

The alpha blending value, between 0 (transparent) and 1 (opaque).

linewidthsfloat or array-like, default: :rc:`lines.linewidth`

The linewidth of the marker edges. Note: The default edgecolors is ‘face’. You may want to change this as well.

edgecolors{‘face’, ‘none’, None} or color or sequence of color, default: :rc:`scatter.edgecolors`

The edge color of the marker. Possible values:

  • ‘face’: The edge color will always be the same as the face color.

  • ‘none’: No patch boundary will be drawn.

  • A color or sequence of colors.

For non-filled markers, edgecolors is ignored. Instead, the color is determined like with ‘face’, i.e. from c, colors, or facecolors.

plotnonfinitebool, default: False

Whether to plot points with nonfinite c (i.e. inf, -inf or nan). If True the points are drawn with the bad colormap color (see .Colormap.set_bad).

Returns:
~matplotlib.collections.PathCollection
Other Parameters:
dataindexable object, optional

If given, the following parameters also accept a string s, which is interpreted as data[s] (unless this raises an exception):

x, y, s, linewidths, edgecolors, c, facecolor, facecolors, color

**kwargs~matplotlib.collections.Collection properties

See also

plot

To plot scatter plots when markers are identical in size and color.

Notes

  • The .plot function will be faster for scatterplots where markers don’t vary in size or color.

  • Any or all of x, y, s, and c may be masked arrays, in which case all masks will be combined and only unmasked points will be plotted.

  • Fundamentally, scatter works with 1D arrays; x, y, s, and c may be input as N-D arrays, but within scatter they will be flattened. The exception is c, which will be flattened only if its size matches the size of x and y.
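
A plain matplotlib sketch of the c/cmap mapping described above (the data are illustrative):

import numpy as np
import matplotlib.pyplot as plt

x = np.random.rand(200)
y = np.random.rand(200)
z = x + y                 # scalar values mapped to colors through cmap/norm
sc = plt.scatter(x, y, c=z, s=25, cmap='viridis', edgecolors='none')
plt.colorbar(sc)
plt.show()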

select(fields, indices=None, **kwargs)[source]

Select only a few fields in the table

Parameters:
fields: str or sequence

fields to keep in the resulting table

indices: sequence or slice

extract only on these indices

Returns:
tab: SimpleTable instance

resulting table

selectWhere(fields, condition, condvars=None, **kwargs)[source]
Read table data fulfilling the given condition.

Only the rows fulfilling the condition are included in the result.

Parameters:
fields: str or sequence

fields to keep in the resulting table

condition: str

expression to evaluate on the table; it may include mathematical operations and attribute (column) names

condvars: dictionary, optional

A dictionary that replaces the local operands in current frame.

Returns:
tab: SimpleTable instance

resulting table
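
A sketch combining select and selectWhere; the column names and condition are illustrative only:

hot = tab.selectWhere(['logT', 'logg'], 'logT > 4.0')             # filtered rows, two fields
sub = tab.select(['logT', 'logg', 'logL'], indices=slice(0, 100))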

setComment(colname, comment)

Set the comment of a column referenced by its name

Parameters:
colname: str

column name or registered alias

comment: str

column description

setUnit(colname, unit)

Set the unit of a column referenced by its name

Parameters:
colname: str

column name or registered alias

unit: str

unit description

set_alias(alias, colname)[source]

Define an alias to a column

Parameters:
alias: str

The new alias of the column

colname: str

The column being aliased

set_comment(colname, comment)[source]

Set the comment of a column referenced by its name

Parameters:
colname: str

column name or registered alias

comment: str

column description

set_unit(colname, unit)[source]

Set the unit of a column referenced by its name

Parameters:
colname: str

column name or registered alias

unit: str

unit description

property shape

shape of the data

sort(keys, copy=False)[source]

Sort the table in place according to one or more keys. Unless copy is set, this operates on the existing table and does not return a new table.

Parameters:
keys: str or seq(str)

The key(s) to order by

copy: bool

if set returns a sorted copy instead of working inplace

stack(r, *args, **kwargs)[source]

Superposes arrays field by field, in place

t.stack(t1, t2, t3, default=None, inplace=True)

Parameters:
r: Table
stats(fn=None, fields=None, fill=None)[source]

Make statistics on columns of a table

Returns:
tab: Table instance

collection of statistics, one column per function in fn and one row per column of the table

violinplot(*args, **kwargs)

Make a violin plot.

Make a violin plot for each column of dataset or each vector in sequence dataset. Each filled area extends to represent the entire data range, with optional lines at the mean, the median, the minimum, the maximum, and user-specified quantiles.

Parameters:
datasetArray or a sequence of vectors.

The input data.

positionsarray-like, default: [1, 2, …, n]

The positions of the violins. The ticks and limits are automatically set to match the positions.

vertbool, default: True.

If true, creates a vertical violin plot. Otherwise, creates a horizontal violin plot.

widthsarray-like, default: 0.5

Either a scalar or a vector that sets the maximal width of each violin. The default is 0.5, which uses about half of the available horizontal space.

showmeansbool, default: False

If True, will toggle rendering of the means.

showextremabool, default: True

If True, will toggle rendering of the extrema.

showmediansbool, default: False

If True, will toggle rendering of the medians.

quantilesarray-like, default: None

If not None, set a list of floats in interval [0, 1] for each violin, which stands for the quantiles that will be rendered for that violin.

pointsint, default: 100

Defines the number of points to evaluate each of the gaussian kernel density estimations at.

bw_methodstr, scalar or callable, optional

The method used to calculate the estimator bandwidth. This can be ‘scott’, ‘silverman’, a scalar constant or a callable. If a scalar, this will be used directly as kde.factor. If a callable, it should take a matplotlib.mlab.GaussianKDE instance as its only parameter and return a scalar. If None (default), ‘scott’ is used.

dataindexable object, optional

If given, the following parameters also accept a string s, which is interpreted as data[s] (unless this raises an exception):

dataset

Returns:
dict

A dictionary mapping each component of the violinplot to a list of the corresponding collection instances created. The dictionary has the following keys:

  • bodies: A list of the ~.collections.PolyCollection instances containing the filled area of each violin.

  • cmeans: A ~.collections.LineCollection instance that marks the mean values of each of the violin’s distribution.

  • cmins: A ~.collections.LineCollection instance that marks the bottom of each violin’s distribution.

  • cmaxes: A ~.collections.LineCollection instance that marks the top of each violin’s distribution.

  • cbars: A ~.collections.LineCollection instance that marks the centers of each violin’s distribution.

  • cmedians: A ~.collections.LineCollection instance that marks the median values of each of the violin’s distribution.

  • cquantiles: A ~.collections.LineCollection instance created to identify the quantile values of each of the violin’s distribution.
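
A plain matplotlib sketch of the parameters described above (the data are illustrative):

import numpy as np
import matplotlib.pyplot as plt

data = [np.random.normal(0, s, 200) for s in (0.5, 1.0, 2.0)]
parts = plt.violinplot(data, showmedians=True, showextrema=True)
for body in parts['bodies']:      # PolyCollection instances, see the keys above
    body.set_alpha(0.5)
plt.show()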

where(condition, condvars=None, *args, **kwargs)[source]

Read table data fulfilling the given condition. Only the rows fulfilling the condition are included in the result.

Parameters:
condition: str

expression to evaluate on the table; it may include mathematical operations and attribute (column) names

condvars: dictionary, optional

A dictionary that replaces the local operands in current frame.

Returns:
out: ndarray or tuple of ndarrays

result equivalent to np.where()

write(fname, **kwargs)[source]

Write the table into a file

Parameters:
fname: str

filename to export the table into

Note

additional keywords are forwarded to the corresponding writers: pyfits.writeto(), pyfits.append(), or np.savetxt()
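
A minimal sketch; selecting the writer from the file extension is an assumption here, not a statement of the supported formats:

tab.write('catalog.fits')   # keywords forwarded to pyfits.writeto() / pyfits.append()
tab.write('catalog.csv')    # keywords forwarded to np.savetxt()-style writers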

class pystellibs.simpletable.stats[source]

Bases: object

Methods

has_nan

max

mean

min

p16

p50

p84

std

var

classmethod has_nan(v)[source]
classmethod max(v)[source]
classmethod mean(v)[source]
classmethod min(v)[source]
classmethod p16(v)[source]
classmethod p50(v)[source]
classmethod p84(v)[source]
classmethod std(v)[source]
classmethod var(v)[source]
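
These class methods operate on plain arrays; reading p16, p50 and p84 as the 16th, 50th and 84th percentiles is an interpretation, not stated above:

import numpy as np
from pystellibs.simpletable import stats

v = np.random.normal(size=1000)
stats.mean(v), stats.std(v)
stats.p50(v)        # median under the interpretation above
stats.has_nan(v)    # False for this sample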

pystellibs.stellib module

Stellar library class

Intended as a generic module to manage stellar libraries from various sources.

The interpolation is implemented from the Pegase.2 Fortran algorithm converted to Python (this may not be very pythonic, though).

Note

A Cython version is available for speed-up and should be used transparently when available (run make once)

class pystellibs.stellib.AtmosphereLib(*args, **kwargs)[source]

Bases: Stellib

Almost identical to a spectral library. The difference lies in the units of the input libraries.

Attributes:
flux_units
nbytes

return the number of bytes of the object

wavelength

Methods

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual spectra for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates individual spectrum for the given stars APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation Stellar atmospheres are normalized to Radius = 1

plot_boundary([ax, dlogT, dlogg])

Parameters:

points_inside(xypoints[, dlogT, dlogg])

Returns if a point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

get_weights(logT, logg, logL, weights=None)[source]

Returns the proper weights for the interpolation. Stellar atmospheres are normalized to Radius = 1.

Parameters:
logT: float or ndarray

log-temperatures log(T/K)

logg: float or ndarray

log-gravity log(g)

logL: float or ndarray

bolometric luminosity (log (L/Lsun))

class pystellibs.stellib.CompositeStellib(osllist, *args, **kwargs)[source]

Bases: Stellib

Generates an object from the union of multiple individual libraries
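
A construction sketch based on the signature above; the two libraries combined here are simply examples from this package:

from pystellibs.btsettl import BTSettl
from pystellibs.tlusty import Tlusty
from pystellibs.stellib import CompositeStellib

# union of two individual libraries; the list order sets the priority used by which_osl
osl = CompositeStellib([BTSettl(), Tlusty()])
osl.wavelength      # common wavelength sampling across the libraries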

Attributes:
Teff
Z
flux_units
logT
logZ
logg
name
nbytes

return the number of bytes of the object

source
wavelength

return a common wavelength sampling to all libraries. This can be used to reinterpolate any spectrum onto a common definition

Methods

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual spectra for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates individual spectrum for the given stars APs and the stellar library

get_boundaries(**kwargs)

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation

plot_boundary([ax, dlogT, dlogg])

Parameters:

points_inside(xypoints[, dlogT, dlogg])

Returns if a point is inside the polygon defined by the boundary of the library

reinterpolate_spectra(l0, specs, **kwargs)

One-dimensional linear interpolation onto the common wavelength.

which_osl(xypoints, **kwargs)

Returns the library index that contains each point in xypoints

set_default_extrapolation_bounds

property Teff
property Z
property flux_units
generate_individual_spectra(stars, **kwargs)[source]

Generates individual spectra for the given stars and stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
stars: Table

contains at least (logT, logg, logL, Z) of the considered stars

Returns:
l0: ndarray, ndim=1

wavelength definition of the spectra (wavelength in AA)

s0: ndarray, shape=(len(stars), len(l0))

array of spectra, one per input star. Spectrum in ergs/s/AA or ergs/s/AA/Lsun

generate_individual_values(stars, values, **kwargs)[source]

Generates individual spectra for the given stars and stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
stars: Table

contains at least (logT, logg, logL, Z) of the considered stars

values: sequence or attribute name

value to interpolate

Returns:
values: sequence

interpolated values

generate_stellar_spectrum(logT, logg, logL, Z, raise_extrapolation=True, **kwargs)[source]

Generates an individual spectrum for the given star's APs (atmosphere parameters) and the stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
logT: float

temperature

logg: float

log-gravity

logL: float

log-luminosity

Z: float

metallicity

raise_extrapolation: bool

if set throw error on extrapolation

null: value

value of the flux when extrapolation and raise_extrapolation is not set

Returns:
s0: ndarray, shape=(len(stars), len(l0))

array of spectra, one per input star. Spectrum in ergs/s/AA or ergs/s/AA/Lsun

get_boundaries(**kwargs)[source]

Returns the closed boundary polygon around the stellar library with given margins

Parameters:
s: Stellib

Stellar library object

dlogT: float

margin in logT

dlogg: float

margin in logg

Returns:
b: ndarray[float, ndim=2]

(closed) boundary points: [logg, Teff] (or [Teff, logg] if swap is True)

Note

as computing the boundary could take time, it is saved in the object and only recomputed when parameters are updated

property logT
property logZ
property logg
property name
reinterpolate_spectra(l0, specs, **kwargs)[source]

One-dimensional linear interpolation onto the common wavelength.

Returns the one-dimensional interpolated spectrum

Parameters:
l01-D sequence of floats (with units or not)

wavelength of the spectrum to interpolate

specs1-D sequence of floats

spectrum to reinterpolate

leftfloat, optional

Value to return for x < xp[0], default is fp[0].

rightfloat, optional

Value to return for x > xp[-1], default is fp[-1].

periodNone or float, optional

A period for the x-coordinates. This parameter allows the proper interpolation of angular x-coordinates. Parameters left and right are ignored if period is specified.

Returns:
specndarray

The interpolated values

set_default_extrapolation_bounds(dlogT=None, dlogg=None)[source]
property source
property wavelength

return a common wavelength sampling to all libraries. This can be used to reinterpolate any spectrum onto a common definition

which_osl(xypoints, **kwargs)[source]

Returns the library index that contains each point in xypoints

The decision is made from a two step search:

  • first, each point is checked against the strict boundary of each library (i.e., dlogT = 0, dlogg = 0).

  • second, if points are not found in strict mode, the boundary is relaxed and a new search is made.

Each point is associated with the first library matching the above conditions.

Parameters:
xypoints: sequence

a sequence of N logg, logT pairs.

dlogT: float

margin in logT

dlogg: float

margin in logg

Returns:
res: ndarray(dtype=int)

a ndarray, 0 meaning no library covers the point, and 1, … n, for the n-th library
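
For instance (a sketch; osl is assumed to be a CompositeStellib built as above):

xypoints = [(4.3, 3.7), (2.0, 4.5)]   # (logg, logT) pairs
idx = osl.which_osl(xypoints)
# idx[i] == 0  -> no library covers point i
# idx[i] == k  -> point i falls in the k-th library (1-based, first match wins)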

class pystellibs.stellib.Stellib(*args, **kwargs)[source]

Bases: object

Basic stellar library class

Attributes:
interpolator: interpolator.BaseInterpolator

interpolator to use, default LeujeuneInterpolator

Methods

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual spectra for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates individual spectrum for the given stars APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation

plot_boundary([ax, dlogT, dlogg])

Parameters:

points_inside(xypoints[, dlogT, dlogg])

Returns if a point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property flux_units
generate_individual_spectra(stars, **kwargs)[source]

Generates individual spectra for the given stars and stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
stars: Table

contains at least (logT, logg, logL, Z) of the considered stars

dlogT: float

margin in logT

dlogg: float

margin in logg

Returns:
l0: ndarray, ndim=1

wavelength definition of the spectra (wavelength in AA)

s0: ndarray, shape=(len(stars), len(l0))

array of spectra, one per input star. Spectrum in ergs/s/AA or Lsun/AA

generate_individual_values(stars, values, **kwargs)[source]

Generates individual spectra for the given stars and stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
stars: Table

contains at least (logT, logg, logL, Z) of the considered stars

values: sequence or attribute name

value to interpolate

dlogT: float

margin in logT

dlogg: float

margin in logg

Returns:
values: sequence

interpolated values

generate_stellar_spectrum(logT, logg, logL, Z, raise_extrapolation=True, **kwargs)[source]

Generates an individual spectrum for the given star's APs (atmosphere parameters) and the stellar library

Returns NaN spectra if the boundary conditions are not met (no extrapolation)

Parameters:
logT: float

temperature

logg: float

log-gravity

logL: float

log-luminosity

Z: float

metallicity

raise_extrapolation: bool

if set throw error on extrapolation

null: value

value of the flux when extrapolation and raise_extrapolation is not set

Returns:
s0: ndarray, shape=(len(stars), len(l0))

array of spectra, one per input star. Spectrum in ergs/s/AA or ergs/s/AA/Lsun
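
A single-star sketch; the parameter values are illustrative, and BaSeL is used only because it is one of the libraries shipped with this package:

from pystellibs.basel import BaSeL

osl = BaSeL()
# arguments: logT, logg, logL (log L/Lsun), Z
spec = osl.generate_stellar_spectrum(4.0, 3.5, 0.0, 0.02)
wave = osl.wavelength    # corresponding wavelength definition, in AA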

get_boundaries(dlogT=0.1, dlogg=0.3, **kwargs)[source]

Returns the closed boundary polygon around the stellar library with given margins

Parameters:
s: Stellib

Stellar library object

dlogT: float

margin in logT

dlogg: float

margin in logg

Returns:
b: ndarray[float, ndim=2]

closed boundary edge points: [logT, logg]

Note

as computing the boundary could take time, it is saved in the object and only recomputed when parameters are updated

get_interpolation_data()[source]

Default interpolation

get_radius(logl, logt)[source]

Returns the radius of a star given its luminosity and temperature

Assuming a black body, it follows that:

\[R^2 = \frac{L}{4 \pi \sigma T^4},\]

with:

  • L, luminosity in W,

  • pi, 3.141592…

  • σ, the Stefan-Boltzmann constant in W * m**-2 * K**-4

  • T, temperature in K

Parameters:
logl: ndarray[float, ndim=1]

log luminosities from the isochrones, in Lsun

logt: ndarray[float, ndim=1]

log temperatures from the isochrones, in K

Returns:
radii: ndarray[float, ndim=1]

array of radii in m (SI units)
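
The formula can be checked numerically (a sketch; sigma is the Stefan-Boltzmann constant, and the solar values are used only as a sanity check):

import numpy as np

sigma = 5.670374419e-8     # W m**-2 K**-4
Lsun = 3.828e26            # W, nominal solar luminosity

logl, logt = 0.0, np.log10(5772.0)   # roughly solar values
L = 10 ** logl * Lsun
T = 10 ** logt
R = np.sqrt(L / (4 * np.pi * sigma * T ** 4))   # ~6.96e8 m, about one solar radius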

get_weights(logT, logg, logL, weights=None)[source]

Returns the proper weights for the interpolation

In spectral libraries the default is to have Lbol = 1 normalization

Parameters:
logT: float or ndarray

log-temperatures log(T/K)

logg: float or ndarray

log-gravity log(g)

logL: float or ndarray

bolometric luminosity (log (L/Lsun))

property nbytes

return the number of bytes of the object

plot_boundary(ax=None, dlogT=0.0, dlogg=0.0, **kwargs)[source]
Parameters:
dlogT: float

margin in logT (see get_boundaries)

dlogg: float

margin in logg (see get_boundaries)

See also

matplotlib.plot() for additional kwargs

points_inside(xypoints, dlogT=0.1, dlogg=0.5)[source]

Returns if a point is inside the polygon defined by the boundary of the library

Parameters:
xypoints: sequence

a sequence of N logg, logT pairs.

dlogT: float

margin in logT

dlogg: float

margin in logg

Returns:
r: ndarray(dtype=bool)

a boolean ndarray, True for points inside the polygon. A point on the boundary may be treated as inside or outside.

set_default_extrapolation_bounds(dlogT=None, dlogg=None)[source]
property wavelength

pystellibs.tlusty module

class pystellibs.tlusty.Tlusty(*args, **kwargs)[source]

Bases: AtmosphereLib

Tlusty O and B stellar atmospheres

  • NLTE

  • Plane-parallel geometry

  • line blanketing

References

Hubeny (1988) for the initial reference; Lanz, T., & Hubeny, I. (2003) for more recent (NLTE) developments

  • OSTAR2002 Grid: O-type stars, 27500 K <= Teff <= 55000 K
    • Reference: Lanz & Hubeny (2003)

  • BSTAR2006 Grid: Early B-type stars, 15000 K <= Teff <= 30000 K
    • Reference: Lanz & Hubeny (2007)

files are available at: http://nova.astro.umd.edu/Tlusty2002/database/

O and B stars rebinned to nearly 20,000 frequency points (for CLOUDY usage) http://nova.astro.umd.edu/Tlusty2002/database/obstar_merged_3d.ascii.gz

Attributes:
Teff
Z
flux_units
logT
logZ
logg
nbytes

return the number of bytes of the object

wavelength

Methods

bbox([dlogT, dlogg])

Boundary of Tlusty library

generate_individual_spectra(stars, **kwargs)

Generates individual spectra for the given stars and stellar library

generate_individual_values(stars, values, ...)

Generates individual spectra for the given stars and stellar library

generate_stellar_spectrum(logT, logg, logL, Z)

Generates individual spectrum for the given stars APs and the stellar library

get_boundaries([dlogT, dlogg])

Returns the closed boundary polygon around the stellar library with given margins

get_interpolation_data()

Default interpolation

get_radius(logl, logt)

Returns the radius of a star given its luminosity and temperature

get_weights(logT, logg, logL[, weights])

Returns the proper weights for the interpolation Stellar atmospheres are normalized to Radius = 1

plot_boundary([ax, dlogT, dlogg])

Parameters:

points_inside(xypoints[, dlogT, dlogg])

Returns if a point is inside the polygon defined by the boundary of the library

set_default_extrapolation_bounds

property Teff
property Z
bbox(dlogT=0.05, dlogg=0.25)[source]

Boundary of Tlusty library

Parameters:
dlogT: float

log-temperature tolerance before extrapolation limit

dlogg: float

log-g tolerance before extrapolation limit

Returns:
bbox: ndarray

(logT, logg) edges of the bounding polygon

property logT
property logZ
property logg

Module contents