
llmcompressor.metrics.logger

Contains code for loggers that help visualize the information from each modifier.

Classes:

  • BaseLogger

    Base class that all modifier loggers must implement.

  • LambdaLogger

    Logger that handles calling back to a lambda function with any logs.

  • LoggerManager

    Wrapper around loggers that handles log scheduling and handing off logs to the intended loggers.

  • PythonLogger

    Modifier logger that handles printing values into a Python logger instance.

  • SparsificationGroupLogger

    Modifier logger that handles outputting values to other supported systems.

  • TensorBoardLogger

    Modifier logger that handles outputting values into a TensorBoard log directory.

  • WANDBLogger

    Modifier logger that handles outputting values to Weights and Biases.

BaseLogger

BaseLogger(name: str, enabled: bool = True)

Bases: ABC

Base class that all modifier loggers must implement.

Parameters:

  • name

    (str) –

    name given to the logger, used for identification

  • enabled

    (bool, default: True ) –

    True to log, False otherwise

Methods:

  • log_hyperparams

    :param params: Each key-value pair in the dictionary is the name of the

  • log_scalar

    :param tag: identifying tag to log the value with

  • log_scalars

    :param tag: identifying tag to log the values with

  • log_string

    :param tag: identifying tag to log the values with

  • save

    :param file_path: path to a file to be saved

Attributes:

  • enabled (bool) –

    :return: True to log, False otherwise

  • name (str) –

    :return: name given to the logger, used for identification

Source code in llmcompressor/metrics/logger.py
def __init__(self, name: str, enabled: bool = True):
    self._name = name
    self._enabled = enabled

enabled property writable

enabled: bool

Returns:

  • bool

    True to log, False otherwise

name property

name: str

Returns:

  • str

    name given to the logger, used for identification

log_hyperparams

log_hyperparams(params: Dict[str, float]) -> bool

Parameters:

  • params

    (Dict[str, float]) –

    Each key-value pair in the dictionary is the name of the hyperparameter and its corresponding value.

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_hyperparams(self, params: Dict[str, float]) -> bool:
    """
    :param params: Each key-value pair in the dictionary is the name of the
        hyper parameter and it's corresponding value.
    :return: True if logged, False otherwise.
    """
    return False

log_scalar

log_scalar(
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    **kwargs
) -> bool

Parameters:

  • tag

    (str) –

    identifying tag to log the value with

  • value

    (float) –

    value to save

  • step

    (Optional[int], default: None ) –

    global step for when the value was taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the value was taken

  • kwargs

    additional logging arguments to support Python and custom loggers

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_scalar(
    self,
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    **kwargs,
) -> bool:
    """
    :param tag: identifying tag to log the value with
    :param value: value to save
    :param step: global step for when the value was taken
    :param wall_time: global wall time for when the value was taken
    :param kwargs: additional logging arguments to support Python and custom loggers
    :return: True if logged, False otherwise.
    """
    return False

log_scalars

log_scalars(
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    **kwargs
) -> bool

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • values

    (Dict[str, float]) –

    values to save

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken

  • kwargs

    additional logging arguments to support Python and custom loggers

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_scalars(
    self,
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    **kwargs,
) -> bool:
    """
    :param tag: identifying tag to log the values with
    :param values: values to save
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken
    :param kwargs: additional logging arguments to support Python and custom loggers
    :return: True if logged, False otherwise.
    """
    return False

log_string

log_string(
    tag: str,
    string: str,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    **kwargs
) -> bool

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • string

    (str) –

    string value to save

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken

  • kwargs

    additional logging arguments to support Python and custom loggers

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_string(
    self,
    tag: str,
    string: str,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    **kwargs,
) -> bool:
    """
    :param tag: identifying tag to log the values with
    :param string: string value to save
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken
    :param kwargs: additional logging arguments to support Python and custom loggers
    :return: True if logged, False otherwise.
    """
    return False

save

save(file_path: str, **kwargs) -> bool

Parameters:

  • file_path

    (str) –

    path to a file to be saved

  • kwargs

    additional arguments that a specific logger might use

Returns:

  • bool

    True if saved, False otherwise

Source code in llmcompressor/metrics/logger.py
def save(
    self,
    file_path: str,
    **kwargs,
) -> bool:
    """
    :param file_path: path to a file to be saved
    :param kwargs: additional arguments that a specific metrics might use
    :return: True if saved, False otherwise
    """
    return False
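The interface above is straightforward to satisfy with a small subclass: override the `log_*` hooks you care about and return `True` when a value was recorded. The sketch below mirrors the documented signatures with a simplified stand-in (it does not import llmcompressor itself); `ListLogger` is a hypothetical logger that collects scalars in memory, which is convenient for tests.

```python
from abc import ABC
from typing import List, Optional, Tuple


class BaseLogger(ABC):
    """Simplified stand-in mirroring the documented BaseLogger interface."""

    def __init__(self, name: str, enabled: bool = True):
        self._name = name
        self._enabled = enabled

    @property
    def name(self) -> str:
        return self._name

    @property
    def enabled(self) -> bool:
        return self._enabled

    def log_scalar(
        self,
        tag: str,
        value: float,
        step: Optional[int] = None,
        wall_time: Optional[float] = None,
        **kwargs,
    ) -> bool:
        # base class logs nothing and reports False, matching the docs above
        return False


class ListLogger(BaseLogger):
    """Hypothetical subclass: collects every scalar in memory."""

    def __init__(self, name: str = "list", enabled: bool = True):
        super().__init__(name, enabled)
        self.records: List[Tuple[str, float, Optional[int]]] = []

    def log_scalar(self, tag, value, step=None, wall_time=None, **kwargs) -> bool:
        if not self.enabled:
            return False
        self.records.append((tag, value, step))
        return True


logger = ListLogger()
assert logger.log_scalar("loss", 0.25, step=10)
```

Because every hook returns a bool, callers can tell whether a disabled logger silently dropped the value.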

LambdaLogger

LambdaLogger(
    lambda_func: Callable[
        [
            Optional[str],
            Optional[Union[float, str]],
            Optional[Dict[str, float]],
            Optional[int],
            Optional[float],
            Optional[int],
        ],
        bool,
    ],
    name: str = "lambda",
    enabled: bool = True,
)

Bases: BaseLogger

Logger that handles calling back to a lambda function with any logs.

Parameters:

  • lambda_func

    (Callable[[Optional[str], Optional[Union[float, str]], Optional[Dict[str, float]], Optional[int], Optional[float], Optional[int]], bool]) –

    the lambda function to call back into with any logs. The expected call sequence is (tag, value, values, step, wall_time) -> bool The return type is True if logged and False otherwise.

  • name

    (str, default: 'lambda' ) –

    name given to the logger, used for identification; defaults to lambda

  • enabled

    (bool, default: True ) –

    True to log, False otherwise

Methods:

  • log_hyperparams

    :param params: Each key-value pair in the dictionary is the name of the

  • log_scalar

    :param tag: identifying tag to log the value with

  • log_scalars

    :param tag: identifying tag to log the values with

Attributes:

  • lambda_func (Callable[[Optional[str], Optional[Union[float, str]], Optional[Dict[str, float]], Optional[int], Optional[float], Optional[int]], bool]) –

    :return: the lambda function to call back into with any logs.

Source code in llmcompressor/metrics/logger.py
def __init__(
    self,
    lambda_func: Callable[
        [
            Optional[str],
            Optional[Union[float, str]],
            Optional[Dict[str, float]],
            Optional[int],
            Optional[float],
            Optional[int],
        ],
        bool,
    ],
    name: str = "lambda",
    enabled: bool = True,
):
    super().__init__(name, enabled)
    self._lambda_func = lambda_func
    assert lambda_func, "lambda_func must be set to a callable function"

lambda_func property

lambda_func: Callable[
    [
        Optional[str],
        Optional[Union[float, str]],
        Optional[Dict[str, float]],
        Optional[int],
        Optional[float],
        Optional[int],
    ],
    bool,
]

Returns:

  • Callable[[Optional[str], Optional[Union[float, str]], Optional[Dict[str, float]], Optional[int], Optional[float], Optional[int]], bool]

    the lambda function to call back into with any logs. The expected call sequence is (tag, value, values, step, wall_time)

log_hyperparams

log_hyperparams(
    params: Dict, level: Optional[Union[int, str]] = None
) -> bool

Parameters:

  • params

    (Dict) –

    Each key-value pair in the dictionary is the name of the hyperparameter and its corresponding value.

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_hyperparams(
    self,
    params: Dict,
    level: Optional[Union[int, str]] = None,
) -> bool:
    """
    :param params: Each key-value pair in the dictionary is the name of the
        hyper parameter and it's corresponding value.
    :param level: minimum severity level for the log message
    :return: True if logged, False otherwise.
    """
    if not self.enabled:
        return False

    return self._lambda_func(
        tag=None,
        value=None,
        values=params,
        step=None,
        wall_time=None,
        level=level,
    )

log_scalar

log_scalar(
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
) -> bool

Parameters:

  • tag

    (str) –

    identifying tag to log the value with

  • value

    (float) –

    value to save

  • step

    (Optional[int], default: None ) –

    global step for when the value was taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the value was taken, defaults to time.time()

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

  • kwargs

    additional logging arguments to support Python and custom loggers

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_scalar(
    self,
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
) -> bool:
    """
    :param tag: identifying tag to log the value with
    :param value: value to save
    :param step: global step for when the value was taken
    :param wall_time: global wall time for when the value was taken,
        defaults to time.time()
    :param level: minimum severity level for the log message
    :param kwargs: additional logging arguments to support Python and custom loggers
    :return: True if logged, False otherwise.
    """
    if not wall_time:
        wall_time = time.time()

    return self._lambda_func(
        tag=tag,
        value=value,
        values=None,
        step=step,
        wall_time=wall_time,
        level=level,
    )

log_scalars

log_scalars(
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
) -> bool

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • values

    (Dict[str, float]) –

    values to save

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken, defaults to time.time()

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

  • kwargs

    additional logging arguments to support Python and custom loggers

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_scalars(
    self,
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
) -> bool:
    """
    :param tag: identifying tag to log the values with
    :param values: values to save
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken,
        defaults to time.time()
    :param level: minimum severity level for the log message
    :param kwargs: additional logging arguments to support Python and custom loggers
    :return: True if logged, False otherwise.
    """
    if not wall_time:
        wall_time = time.time()

    return self._lambda_func(
        tag=tag,
        value=None,
        values=values,
        step=step,
        wall_time=wall_time,
        level=level,
    )
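The callback contract above — the logger forwards each log to the callable as keyword arguments `(tag, value, values, step, wall_time, level)` and returns its bool — can be sketched with a simplified stand-in. The class below mirrors the documented behavior but is not the llmcompressor source, and `collect` is a hypothetical callback.

```python
import time
from typing import Callable, Optional, Union


class LambdaLogger:
    """Simplified stand-in mirroring the documented LambdaLogger behavior."""

    def __init__(
        self,
        lambda_func: Callable[..., bool],
        name: str = "lambda",
        enabled: bool = True,
    ):
        assert lambda_func, "lambda_func must be set to a callable function"
        self._lambda_func = lambda_func
        self.name = name
        self.enabled = enabled

    def log_scalar(
        self,
        tag: str,
        value: float,
        step: Optional[int] = None,
        wall_time: Optional[float] = None,
        level: Optional[Union[int, str]] = None,
    ) -> bool:
        if not self.enabled:
            return False
        if not wall_time:
            wall_time = time.time()  # default wall time, as documented
        return self._lambda_func(
            tag=tag, value=value, values=None,
            step=step, wall_time=wall_time, level=level,
        )


seen = []


def collect(tag=None, value=None, values=None, step=None,
            wall_time=None, level=None) -> bool:
    """Hypothetical callback: records each log and reports success."""
    seen.append((tag, value, step))
    return True


lam = LambdaLogger(collect)
assert lam.log_scalar("accuracy", 0.9, step=3)
```

Because the callback itself returns the bool, the lambda decides what counts as "logged".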

LoggerManager

LoggerManager(
    loggers: Optional[List[BaseLogger]] = None,
    log_frequency: Union[float, None] = 0.1,
    log_python: bool = True,
    name: str = "manager",
    mode: LoggingModeType = "exact",
    frequency_type: FrequencyType = "epoch",
)

Bases: ABC

Wrapper around loggers that handles log scheduling and handing off logs to intended loggers.

Parameters:

  • loggers

    (Optional[List[BaseLogger]], default: None ) –

    list of loggers assigned to this manager

  • log_frequency

    (Union[float, None], default: 0.1 ) –

    number of steps or fraction of steps to wait between logs

  • mode

    (LoggingModeType, default: 'exact' ) –

    The logging mode to use, either "on_change" or "exact". "on_change" logs only when the model has been updated since the last log; "exact" logs at the given frequency regardless of model updates. Defaults to "exact".

  • frequency_type

    (FrequencyType, default: 'epoch' ) –

    The frequency type to use: "epoch", "step", or "batch". This controls what the frequency manager tracks, e.g. with "epoch" it tracks the number of epochs that have passed since the last log; with "step" it tracks the number of optimizer steps.

Methods:

  • add_logger

    add a BaseLogger implementation to the loggers of this manager

  • log_hyperparams

    (Note: this method is deprecated and will be removed in a future version,

  • log_ready

    Check if there is a logger that is ready to accept a log

  • log_scalar

    (Note: this method is deprecated and will be removed in a future version,

  • log_scalars

    (Note: this method is deprecated and will be removed in a future version,

  • log_string

    (Note: this method is deprecated and will be removed in a future version,

  • log_written

    Update the frequency manager with the last log step written

  • model_updated

    Update the frequency manager with the last model update step

  • save

    :param file_path: path to a file to be saved

  • time

    Context manager to log the time it takes to run the block of code

Attributes:

  • log_frequency (Union[str, float, None]) –

    :return: number of epochs or fraction of epochs to wait between logs

  • loggers (List[BaseLogger]) –

    :return: list of loggers assigned to this manager

  • name (str) –

    :return: name given to the logger, used for identification

  • wandb (Optional[ModuleType]) –

    :return: wandb module if initialized

Source code in llmcompressor/metrics/logger.py
def __init__(
    self,
    loggers: Optional[List[BaseLogger]] = None,
    log_frequency: Union[float, None] = 0.1,
    log_python: bool = True,
    name: str = "manager",
    mode: LoggingModeType = "exact",
    frequency_type: FrequencyType = "epoch",
):
    self._name = name
    self._loggers = (
        loggers
        or SparsificationGroupLogger(
            python=log_python,
            name=name,
            tensorboard=False,
            wandb_=False,
        ).loggers
    )

    self.frequency_manager = FrequencyManager(
        mode=mode,
        frequency_type=frequency_type,
        log_frequency=log_frequency,
    )

    self.system = SystemLoggingWraper(
        loggers=self._loggers, frequency_manager=self.frequency_manager
    )
    self.metric = MetricLoggingWrapper(
        loggers=self._loggers, frequency_manager=self.frequency_manager
    )

log_frequency property writable

log_frequency: Union[str, float, None]

Returns:

  • Union[str, float, None]

    number of epochs or fraction of epochs to wait between logs

loggers property writable

loggers: List[BaseLogger]

Returns:

  • List[BaseLogger]

    list of loggers assigned to this manager

name property

name: str

Returns:

  • str

    name given to the metrics, used for identification

wandb property

wandb: Optional[ModuleType]

Returns:

  • Optional[ModuleType]

    wandb module if initialized

add_logger

add_logger(logger: BaseLogger)

add a BaseLogger implementation to the loggers of this manager

Parameters:

  • logger

    (BaseLogger) –

    logger object to add
Source code in llmcompressor/metrics/logger.py
def add_logger(self, logger: BaseLogger):
    """
    add a BaseLogger implementation to the loggers of this manager

    :param logger: metrics object to add
    """
    if not isinstance(logger, BaseLogger):
        raise ValueError(f"metrics {type(logger)} must be of type BaseLogger")
    self._loggers.append(logger)

log_hyperparams

log_hyperparams(
    params: Dict,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

(Note: this method is deprecated and will be removed in a future version, use LoggerManager().metric.log_hyperparams instead)

Parameters:

  • params

    (Dict) –

    Each key-value pair in the dictionary is the name of the hyperparameter and its corresponding value.

Source code in llmcompressor/metrics/logger.py
def log_hyperparams(
    self,
    params: Dict,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    (Note: this method is deprecated and will be removed in a future version,
    use LoggerManager().metric.log_hyperparams instead)

    :param params: Each key-value pair in the dictionary is the name of the
        hyper parameter and it's corresponding value.
    """

    self.metric.log_hyperparams(
        params=params,
        log_types=log_types,
        level=level,
    )

log_ready

log_ready(
    current_log_step,
    last_log_step=None,
    check_model_update: bool = False,
)

Check if there is a logger that is ready to accept a log

Parameters:

  • current_log_step

    current step log is requested at

  • last_log_step

    last time a log was recorded for this object. (Deprecated)

  • check_model_update

    (bool, default: False ) –

    if True, will check if the model has been updated, if False, will only check the log frequency

Returns:

  • True if a logger is ready to accept a log.

Source code in llmcompressor/metrics/logger.py
def log_ready(
    self, current_log_step, last_log_step=None, check_model_update: bool = False
):
    """
    Check if there is a metrics that is ready to accept a log

    :param current_log_step: current step log is requested at
    :param last_log_step: last time a log was recorded for this object. (Deprecated)
    :param check_model_update: if True, will check if the model has been updated,
        if False, will only check the log frequency
    :return: True if a metrics is ready to accept a log.
    """
    log_enabled = any(logger.enabled for logger in self.loggers)
    if last_log_step is not None:
        self.frequency_manager.log_written(step=last_log_step)

    return log_enabled and self.frequency_manager.log_ready(
        current_log_step=current_log_step,
        check_model_update=check_model_update,
    )
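The gating logic above — log only when at least one logger is enabled and the frequency manager says enough of the tracked unit has passed since the last write — can be modeled with a toy frequency check. `ToyFrequencyManager` below is a deliberately simplified, hypothetical stand-in for `FrequencyManager`, assuming "exact" mode and epoch-fraction frequencies.

```python
from typing import Optional


class ToyFrequencyManager:
    """Hypothetical simplified model of FrequencyManager ("exact" mode)."""

    def __init__(self, log_frequency: float = 0.1):
        self.log_frequency = log_frequency
        self.last_written: Optional[float] = None

    def log_written(self, step: float):
        # record the step at which the last log was actually written
        self.last_written = step

    def log_ready(self, current_log_step: float) -> bool:
        if self.last_written is None:
            return True  # nothing written yet, so always ready
        # ready once a full frequency interval has elapsed
        return current_log_step - self.last_written >= self.log_frequency


fm = ToyFrequencyManager(log_frequency=0.1)
assert fm.log_ready(0.0)       # first call is always ready
fm.log_written(0.0)
assert not fm.log_ready(0.05)  # less than one frequency interval
assert fm.log_ready(0.1)       # a full interval has passed
```

In the real manager, `check_model_update=True` additionally requires that the model changed since the last log; this sketch omits that branch.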

log_scalar

log_scalar(
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

(Note: this method is deprecated and will be removed in a future version, use LoggerManager().metric.log_scalar instead)

Parameters:

  • tag

    (str) –

    identifying tag to log the value with

  • value

    (float) –

    value to save

  • step

    (Optional[int], default: None ) –

    global step for when the value was taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the value was taken

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

  • kwargs

    additional logging arguments to support Python and custom loggers

Returns:

  • True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_scalar(
    self,
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    (Note: this method is deprecated and will be removed in a future version,
    use LoggerManager().metric.log_scalar instead)

    :param tag: identifying tag to log the value with
    :param value: value to save
    :param step: global step for when the value was taken
    :param wall_time: global wall time for when the value was taken
    :param level: minimum severity level for the log message
    :param kwargs: additional logging arguments to support Python and custom loggers
    :return: True if logged, False otherwise.
    """

    self.metric.log_scalar(
        tag=tag,
        value=value,
        step=step,
        wall_time=wall_time,
        log_types=log_types,
        level=level,
    )

log_scalars

log_scalars(
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

(Note: this method is deprecated and will be removed in a future version, use LoggerManager().metric.log_scalars instead)

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • values

    (Dict[str, float]) –

    values to save

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

  • kwargs

    additional logging arguments to support Python and custom loggers

Returns:

  • True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_scalars(
    self,
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    (Note: this method is deprecated and will be removed in a future version,
    use LoggerManager().metric.log_scalars instead)

    :param tag: identifying tag to log the values with
    :param values: values to save
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken
    :param level: minimum severity level for the log message
    :param kwargs: additional logging arguments to support Python and custom loggers
    :return: True if logged, False otherwise.
    """

    self.metric.log_scalars(
        tag=tag,
        values=values,
        step=step,
        wall_time=wall_time,
        log_types=log_types,
        level=level,
    )

log_string

log_string(
    tag: str,
    string: str,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

(Note: this method is deprecated and will be removed in a future version, use LoggerManager().system.log_string instead)

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • string

    (str) –

    string value to save

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken

  • kwargs

    additional logging arguments to support Python and custom loggers

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

Returns:

  • True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_string(
    self,
    tag: str,
    string: str,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    (Note: this method is deprecated and will be removed in a future version,
    use LoggerManager().system.log_string instead)

    :param tag: identifying tag to log the values with
    :param string: string value to save
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken
    :param kwargs: additional logging arguments to support Python and custom loggers
    :param level: minimum severity level for the log message
    :return: True if logged, False otherwise.
    """
    self.system.log_string(
        tag=tag,
        string=string,
        step=step,
        wall_time=wall_time,
        log_types=log_types,
        level=level,
    )

log_written

log_written(step: LogStepType)

Update the frequency manager with the last log step written

Parameters:

  • step

    (LogStepType) –

    step that was last logged

Source code in llmcompressor/metrics/logger.py
def log_written(self, step: LogStepType):
    """
    Update the frequency manager with the last log step written

    :param step: step that was last logged
    """
    self.frequency_manager.log_written(step=step)

model_updated

model_updated(step: LogStepType)

Update the frequency manager with the last model update step

Parameters:

  • step

    (LogStepType) –

    step that was last logged

Source code in llmcompressor/metrics/logger.py
def model_updated(self, step: LogStepType):
    """
    Update the frequency manager with the last model update step

    :param step: step that was last logged
    """
    self.frequency_manager.model_updated(step=step)

save

save(file_path: str, **kwargs)

Parameters:

  • file_path

    (str) –

    path to a file to be saved

  • kwargs

    additional arguments that a specific logger might use

Source code in llmcompressor/metrics/logger.py
def save(
    self,
    file_path: str,
    **kwargs,
):
    """
    :param file_path: path to a file to be saved
    :param kwargs: additional arguments that a specific metrics might use
    """
    for log in self._loggers:
        if log.enabled:
            log.save(file_path, **kwargs)

time

time(tag: Optional[str] = None, *args, **kwargs)

Context manager to log the time it takes to run the block of code

Usage:

with LoggerManager().time("my_block"):
    time.sleep(1)

Parameters:

  • tag

    (Optional[str], default: None ) –

    identifying tag to log the values with

Source code in llmcompressor/metrics/logger.py
@contextmanager
def time(self, tag: Optional[str] = None, *args, **kwargs):
    """
    Context manager to log the time it takes to run the block of code

    Usage:
    >>> with LoggerManager().time("my_block"):
    >>>    time.sleep(1)

    :param tag: identifying tag to log the values with
    """

    start = time.time()
    yield
    elapsed = time.time() - start
    if not tag:
        tag = f"{DEFAULT_TAG}_time_secs"
    self.log_scalar(tag=tag, value=float(f"{elapsed:.3f}"), *args, **kwargs)
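The same timing pattern can be reproduced standalone: a context manager that measures the wrapped block and hands the elapsed seconds to a scalar logger. The sketch below records into a plain dict instead of real loggers; `timed` and `scalars` are illustrative names, not llmcompressor API.

```python
import time
from contextlib import contextmanager

# stand-in for LoggerManager.log_scalar: record into a dict
scalars = {}


def log_scalar(tag: str, value: float):
    scalars[tag] = value


@contextmanager
def timed(tag: str = "default_time_secs"):
    """Log the wall-clock seconds taken by the wrapped block."""
    start = time.time()
    yield
    # mirror the rounding to three decimals seen in the source above
    log_scalar(tag, round(time.time() - start, 3))


with timed("my_block"):
    time.sleep(0.01)
```

`yield` with no try/finally means an exception inside the block skips the log, matching the source above.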

LoggingWrapperBase

LoggingWrapperBase(
    loggers: List[BaseLogger],
    frequency_manager: FrequencyManager,
)

Base class that holds a reference to the loggers and frequency manager

Source code in llmcompressor/metrics/logger.py
def __init__(self, loggers: List[BaseLogger], frequency_manager: FrequencyManager):
    self.loggers = loggers
    self._frequency_manager = frequency_manager

MetricLoggingWrapper

MetricLoggingWrapper(
    loggers: List[BaseLogger],
    frequency_manager: FrequencyManager,
)

Bases: LoggingWrapperBase

Wraps utilities and convenience methods for logging metrics to the system

Methods:

  • add_scalar

    Add a scalar value to the metrics

  • add_scalars

    Adds multiple scalar values to the metrics

  • log

    :param data: A dict of serializable python objects i.e str,

  • log_hyperparams

    :param params: Each key-value pair in the dictionary is the name of the

  • log_scalar

    :param tag: identifying tag to log the value with

  • log_scalars

    :param tag: identifying tag to log the values with

Source code in llmcompressor/metrics/logger.py
def __init__(self, loggers: List[BaseLogger], frequency_manager: FrequencyManager):
    self.loggers = loggers
    self._frequency_manager = frequency_manager

add_scalar

add_scalar(
    value,
    tag: str = DEFAULT_TAG,
    step: Optional[int] = None,
    wall_time: Union[int, float, None] = None,
    **kwargs
)

Add a scalar value to the metrics

Parameters:

  • value

    value to log

  • tag

    (str, default: DEFAULT_TAG ) –

    tag to log the value with, defaults to DEFAULT_TAG

  • step

    (Optional[int], default: None ) –

    global step for when the value was taken

  • wall_time

    (Union[int, float, None], default: None ) –

    global wall time for when the value was taken

  • kwargs

    additional logging arguments to pass through to the loggers

Source code in llmcompressor/metrics/logger.py
def add_scalar(
    self,
    value,
    tag: str = DEFAULT_TAG,
    step: Optional[int] = None,
    wall_time: Union[int, float, None] = None,
    **kwargs,
):
    """
    Add a scalar value to the metrics

    :param value: value to log
    :param tag: tag to log the value with, defaults to DEFAULT_TAG
    :param step: global step for when the value was taken
    :param wall_time: global wall time for when the value was taken
    :param kwargs: additional logging arguments to pass through to the
        loggers
    """
    self.log_scalar(tag=tag, value=value, step=step, wall_time=wall_time, **kwargs)

add_scalars

add_scalars(
    values: Dict[str, Any],
    tag: str = DEFAULT_TAG,
    step: Optional[int] = None,
    wall_time: Union[int, float, None] = None,
    **kwargs
)

Adds multiple scalar values to the metrics

Parameters:

  • values

    (Dict[str, Any]) –

    values to log; must be a dict of serializable Python objects, i.e. str, ints, floats, Tensors, dicts, etc.

  • tag

    (str, default: DEFAULT_TAG ) –

    tag to log the value with, defaults to DEFAULT_TAG

  • step

    (Optional[int], default: None ) –

    global step for when the value was taken

  • wall_time

    (Union[int, float, None], default: None ) –

    global wall time for when the value was taken

  • kwargs

    additional logging arguments to pass through to the metrics

Source code in llmcompressor/metrics/logger.py
def add_scalars(
    self,
    values: Dict[str, Any],
    tag: str = DEFAULT_TAG,
    step: Optional[int] = None,
    wall_time: Union[int, float, None] = None,
    **kwargs,
):
    """
    Adds multiple scalar values to the metrics

    :param values: values to log, must be a dict of serializable
        Python objects, i.e. `str`, `ints`, `floats`, `Tensors`, `dicts`, etc
    :param tag: tag to log the value with, defaults to DEFAULT_TAG
    :param step: global step for when the value was taken
    :param wall_time: global wall time for when the value was taken
    :param kwargs: additional logging arguments to pass through to the
        metrics
    """
    self.log_scalars(
        tag=tag, values=values, step=step, wall_time=wall_time, **kwargs
    )

log

log(
    data: Dict[str, Any],
    step: Optional[int] = None,
    tag: Optional[str] = DEFAULT_TAG,
    **kwargs
) -> None

Parameters:

  • data

    (Dict[str, Any]) –

    A dict of serializable Python objects, i.e. str, ints, floats, Tensors, dicts, etc.

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • tag

    (Optional[str], default: DEFAULT_TAG ) –

    identifying tag to log the values with, defaults to DEFAULT_TAG

  • kwargs

    additional logging arguments to support Python and custom loggers

Source code in llmcompressor/metrics/logger.py
def log(
    self,
    data: Dict[str, Any],
    step: Optional[int] = None,
    tag: Optional[str] = DEFAULT_TAG,
    **kwargs,
) -> None:
    """
    :param data: A dict of serializable Python objects, i.e. `str`,
            `ints`, `floats`, `Tensors`, `dicts`, etc
    :param step: global step for when the values were taken
    :param tag: identifying tag to log the values with, defaults to DEFAULT_TAG
    :param kwargs: additional logging arguments to support
        Python and custom loggers
    """
    self.log_scalars(tag=tag, values=data, step=step, **kwargs)

log_hyperparams

log_hyperparams(
    params: Dict,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

Parameters:

  • params

    (Dict) –

    Each key-value pair in the dictionary is the name of the hyperparameter and its corresponding value.

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

Source code in llmcompressor/metrics/logger.py
def log_hyperparams(
    self,
    params: Dict,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    :param params: Each key-value pair in the dictionary is the name of the
        hyperparameter and its corresponding value.
    :param level: minimum severity level for the log message
    """
    for log in self.loggers:
        if log.enabled and (log_types == ALL_TOKEN or log.name in log_types):
            log.log_hyperparams(params, level)

log_scalar

log_scalar(
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

Parameters:

  • tag

    (str) –

    identifying tag to log the value with

  • value

    (float) –

    value to save

  • step

    (Optional[int], default: None ) –

    global step for when the value was taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the value was taken

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

  • log_types

    (Union[str, List[str]], default: ALL_TOKEN ) –

    names of the loggers to send this log to; defaults to all loggers

Source code in llmcompressor/metrics/logger.py
def log_scalar(
    self,
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    :param tag: identifying tag to log the value with
    :param value: value to save
    :param step: global step for when the value was taken
    :param wall_time: global wall time for when the value was taken
    :param level: minimum severity level for the log message
    :param log_types: names of the loggers to send this log to;
        defaults to all loggers
    """
    for log in self.loggers:
        if log.enabled and (log_types == ALL_TOKEN or log.name in log_types):
            log.log_scalar(
                tag=tag,
                value=value,
                step=step,
                wall_time=wall_time,
                level=level,
            )

log_scalars

log_scalars(
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • values

    (Dict[str, float]) –

    values to save

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

  • log_types

    (Union[str, List[str]], default: ALL_TOKEN ) –

    names of the loggers to send this log to; defaults to all loggers

Source code in llmcompressor/metrics/logger.py
def log_scalars(
    self,
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    :param tag: identifying tag to log the values with
    :param values: values to save
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken
    :param level: minimum severity level for the log message
    :param log_types: names of the loggers to send this log to;
        defaults to all loggers
    """
    for log in self.loggers:
        if log.enabled and (log_types == ALL_TOKEN or log.name in log_types):
            log.log_scalars(
                tag=tag,
                values=values,
                step=step,
                wall_time=wall_time,
                level=level,
            )

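The manager-level methods above fan each call out to every enabled sub-logger, optionally filtered by name through `log_types`. A self-contained sketch of that dispatch loop (`StubLogger`, `Manager`, and the `ALL_TOKEN` sentinel value here are hypothetical stand-ins, not the real classes):

```python
ALL_TOKEN = "__ALL__"  # assumed sentinel; the real module defines its own

class StubLogger:
    """Minimal sub-logger that records what it receives."""

    def __init__(self, name, enabled=True):
        self.name = name
        self.enabled = enabled
        self.seen = []

    def log_scalar(self, tag, value, step=None, wall_time=None, level=None):
        self.seen.append((tag, value))

class Manager:
    """Sketch of the fan-out loop: each enabled logger whose name matches
    log_types receives the value."""

    def __init__(self, loggers):
        self.loggers = loggers

    def log_scalar(self, tag, value, step=None, wall_time=None,
                   log_types=ALL_TOKEN, level=None):
        for log in self.loggers:
            if log.enabled and (log_types == ALL_TOKEN or log.name in log_types):
                log.log_scalar(tag=tag, value=value, step=step,
                               wall_time=wall_time, level=level)

py = StubLogger("python")
tb = StubLogger("tensorboard", enabled=False)
mgr = Manager([py, tb])
mgr.log_scalar("loss", 0.1)                       # reaches python only (tb disabled)
mgr.log_scalar("lr", 3e-4, log_types=["python"])  # explicit routing by name
print(len(py.seen), len(tb.seen))  # 2 0
```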
PythonLogger

PythonLogger(name: str = 'python', enabled: bool = True)

Bases: LambdaLogger

Modifier metrics that handles printing values into a python metrics instance.

Parameters:

  • name

    (str, default: 'python' ) –

    name given to the metrics, used for identification; defaults to python

  • enabled

    (bool, default: True ) –

    True to log, False otherwise

Methods:

  • log_string

    :param tag: identifying tag to log the values with

Source code in llmcompressor/metrics/logger.py
def __init__(
    self,
    name: str = "python",
    enabled: bool = True,
):
    self._create_default_logger()

    super().__init__(
        lambda_func=self._log_lambda,
        name=name,
        enabled=enabled,
    )

log_string

log_string(
    tag: Optional[str],
    string: Optional[str],
    step: Optional[int],
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
) -> bool

Parameters:

  • tag

    (Optional[str]) –

    identifying tag to log the values with

  • string

    (Optional[str]) –

    string to log

  • step

    (Optional[int]) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken, defaults to time.time()

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

Returns:

  • bool

    True if logged, False otherwise.

Source code in llmcompressor/metrics/logger.py
def log_string(
    self,
    tag: Optional[str],
    string: Optional[str],
    step: Optional[int],
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
) -> bool:
    """
    :param tag: identifying tag to log the values with
    :param string: string to log
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken,
        defaults to time.time()
    :param level: minimum severity level for the log message
    :return: True if logged, False otherwise.
    """
    if not wall_time:
        wall_time = time.time()

    return self._lambda_func(
        tag=tag,
        value=string,
        values=None,
        step=step,
        level=level,
        wall_time=wall_time,
    )
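Note the lazy timestamp: when no `wall_time` is passed, `log_string` fills it in with `time.time()` at call time. A small standalone sketch of the pattern (a plain function, not the real method):

```python
import time

def log_string(tag, string, step=None, wall_time=None):
    """Sketch of the lazy wall_time default used by PythonLogger.log_string."""
    if not wall_time:          # a missing wall_time triggers a fresh timestamp
        wall_time = time.time()
    return {"tag": tag, "value": string, "step": step, "wall_time": wall_time}

entry = log_string("status", "epoch done", step=3)
print(entry["wall_time"] > 0)  # True
```

Because the check is `if not wall_time` rather than `if wall_time is None`, an explicit `wall_time=0` would also be replaced by the current time.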

SparsificationGroupLogger

SparsificationGroupLogger(
    lambda_func: Optional[
        Callable[
            [
                Optional[str],
                Optional[float],
                Optional[Dict[str, float]],
                Optional[int],
                Optional[float],
            ],
            bool,
        ]
    ] = None,
    python: bool = False,
    python_log_level: Optional[Union[int, str]] = "INFO",
    tensorboard: Optional[
        Union[bool, str, SummaryWriter]
    ] = None,
    wandb_: Optional[Union[bool, Dict]] = None,
    name: str = "sparsification",
    enabled: bool = True,
)

Bases: BaseLogger

Modifier metrics that handles outputting values to other supported systems. Supported ones include:

  • Python logging
  • Tensorboard
  • Weights and Biases
  • Lambda callback

All are optional and can be bulk disabled and enabled by this root.

Parameters:

  • lambda_func

    (Optional[Callable[[Optional[str], Optional[float], Optional[Dict[str, float]], Optional[int], Optional[float]], bool]], default: None ) –

    an optional lambda function to call back into with any logs. The expected call sequence is (tag, value, values, step, wall_time) -> bool. The return value is True if logged, False otherwise.

  • python

    (bool, default: False ) –

    a bool argument for logging to a Python metrics instance: True to create a metrics instance, False to not log anything

  • python_log_level

    (Optional[Union[int, str]], default: 'INFO' ) –

    if python is True, the level at which to log any incoming data on the loguru.logger instance

  • tensorboard

    (Optional[Union[bool, str, SummaryWriter]], default: None ) –

    an optional argument for logging to a tensorboard writer. May be a SummaryWriter instance to log to, a string representing the directory in which to create a new SummaryWriter to log to, True to create a new SummaryWriter, or a non-truthy value (False, None) to not log anything

  • wandb_

    (Optional[Union[bool, Dict]], default: None ) –

    an optional argument for logging to wandb. May be a dictionary to pass to the init call for wandb, True to log to wandb (will not call init), or a non-truthy value (False, None) to not log anything

  • name

    (str, default: 'sparsification' ) –

    name given to the metrics, used for identification; defaults to sparsification

  • enabled

    (bool, default: True ) –

    True to log, False otherwise

Methods:

  • enabled

    :param value: True to log, False otherwise

  • log_hyperparams

    :param params: Each key-value pair in the dictionary is the name of the

  • log_scalar

    :param tag: identifying tag to log the value with

  • log_scalars

    :param tag: identifying tag to log the values with

Attributes:

  • loggers (List[BaseLogger]) –

    :return: the created metrics sub instances for this metrics

Source code in llmcompressor/metrics/logger.py
def __init__(
    self,
    lambda_func: Optional[
        Callable[
            [
                Optional[str],
                Optional[float],
                Optional[Dict[str, float]],
                Optional[int],
                Optional[float],
            ],
            bool,
        ]
    ] = None,
    python: bool = False,
    python_log_level: Optional[Union[int, str]] = "INFO",
    tensorboard: Optional[Union[bool, str, SummaryWriter]] = None,
    wandb_: Optional[Union[bool, Dict]] = None,
    name: str = "sparsification",
    enabled: bool = True,
):
    super().__init__(name, enabled)
    self._loggers: List[BaseLogger] = []

    if lambda_func:
        self._loggers.append(
            LambdaLogger(lambda_func=lambda_func, name=name, enabled=enabled)
        )

    if python:
        self._loggers.append(
            PythonLogger(
                name=name,
                enabled=enabled,
            )
        )

    if tensorboard and TensorBoardLogger.available():
        self._loggers.append(
            TensorBoardLogger(
                log_path=tensorboard if isinstance(tensorboard, str) else None,
                writer=(
                    tensorboard if isinstance(tensorboard, SummaryWriter) else None
                ),
                name=name,
                enabled=enabled,
            )
        )

    if wandb_ and WANDBLogger.available():
        self._loggers.append(
            WANDBLogger(
                init_kwargs=wandb_ if isinstance(wandb_, Dict) else None,
                name=name,
                enabled=enabled,
            )
        )
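The constructor builds its sub-logger list conditionally: each backend is added only when requested, and the optional-dependency backends (TensorBoard, wandb) additionally require their `available()` check to pass. A toy sketch of that composition logic (all names here are illustrative stand-ins, not llmcompressor classes):

```python
class FakeLogger:
    """Stand-in for the real logger classes; only tracks its kind."""

    def __init__(self, kind):
        self.kind = kind

def build_loggers(lambda_func=None, python=False, tensorboard=None, wandb_=None,
                  tb_available=True, wandb_available=True):
    """Sketch of SparsificationGroupLogger's constructor: each backend is
    appended only if requested AND (for optional deps) importable."""
    loggers = []
    if lambda_func:
        loggers.append(FakeLogger("lambda"))
    if python:
        loggers.append(FakeLogger("python"))
    if tensorboard and tb_available:
        loggers.append(FakeLogger("tensorboard"))
    if wandb_ and wandb_available:
        loggers.append(FakeLogger("wandb"))
    return loggers

# wandb requested but its import failed -> silently skipped
built = build_loggers(python=True, tensorboard=True, wandb_=True,
                      wandb_available=False)
print([log.kind for log in built])  # ['python', 'tensorboard']
```

One consequence of this design worth noting: an unavailable backend is dropped silently rather than raising, so disabling a dependency never breaks a run.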

loggers property

loggers: List[BaseLogger]

Returns:

  • List[BaseLogger]

    the created metrics sub instances for this metrics

enabled

enabled(value: bool)

Parameters:

  • value

    (bool) –

    True to log, False otherwise

Source code in llmcompressor/metrics/logger.py
@BaseLogger.enabled.setter
def enabled(self, value: bool):
    """
    :param value: True to log, False otherwise
    """
    self._enabled = value

    for log in self._loggers:
        log.enabled = value

log_hyperparams

log_hyperparams(
    params: Dict, level: Optional[Union[int, str]] = None
)

Parameters:

  • params

    (Dict) –

    Each key-value pair in the dictionary is the name of the hyperparameter and its corresponding value.

Source code in llmcompressor/metrics/logger.py
def log_hyperparams(self, params: Dict, level: Optional[Union[int, str]] = None):
    """
    :param params: Each key-value pair in the dictionary is the name of the
        hyperparameter and its corresponding value.
    """
    for log in self._loggers:
        log.log_hyperparams(params, level)

log_scalar

log_scalar(
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
)

Parameters:

  • tag

    (str) –

    identifying tag to log the value with

  • value

    (float) –

    value to save

  • step

    (Optional[int], default: None ) –

    global step for when the value was taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the value was taken, defaults to time.time()

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

Source code in llmcompressor/metrics/logger.py
def log_scalar(
    self,
    tag: str,
    value: float,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
):
    """
    :param tag: identifying tag to log the value with
    :param value: value to save
    :param step: global step for when the value was taken
    :param wall_time: global wall time for when the value was taken,
        defaults to time.time()
    :param level: minimum severity level for the log message
    """
    for log in self._loggers:
        log.log_scalar(tag, value, step, wall_time, level)

log_scalars

log_scalars(
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
)

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • values

    (Dict[str, float]) –

    values to save

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken, defaults to time.time()

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

Source code in llmcompressor/metrics/logger.py
def log_scalars(
    self,
    tag: str,
    values: Dict[str, float],
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    level: Optional[Union[int, str]] = None,
):
    """
    :param tag: identifying tag to log the values with
    :param values: values to save
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken,
        defaults to time.time()
    :param level: minimum severity level for the log message
    """
    for log in self._loggers:
        log.log_scalars(tag, values, step, wall_time, level)

SystemLoggingWraper

SystemLoggingWraper(
    loggers: List[BaseLogger],
    frequency_manager: FrequencyManager,
)

Bases: LoggingWrapperBase

Wraps utilities and convenience methods for logging strings to the system

Methods:

  • critical

    logs a string message with level CRITICAL on all

  • debug

    logs a string message with level DEBUG on all

  • error

    logs a string message with level ERROR on all

  • info

    logs a string message with level INFO on all

  • log_string

    :param tag: identifying tag to log the values with

  • warning

    logs a string message with level WARNING on all

Source code in llmcompressor/metrics/logger.py
def __init__(self, loggers: List[BaseLogger], frequency_manager: FrequencyManager):
    self.loggers = loggers
    self._frequency_manager = frequency_manager

critical

critical(tag, string, *args, **kwargs)

logs a string message with level CRITICAL on all loggers that are enabled

Parameters:

  • tag

    Identifying tag to log the string with

  • string

    The string to log

  • args

    additional arguments to pass to the metrics, see log_string for more details

  • kwargs

    additional arguments to pass to the metrics, see log_string for more details

Source code in llmcompressor/metrics/logger.py
def critical(self, tag, string, *args, **kwargs):
    """
    logs a string message with level CRITICAL on all
    loggers that are enabled

    :param tag: Identifying tag to log the string with
    :param string: The string to log
    :param args: additional arguments to pass to the metrics,
        see `log_string` for more details
    :param kwargs: additional arguments to pass to the metrics,
        see `log_string` for more details
    """
    kwargs["level"] = "CRITICAL"
    self.log_string(tag=tag, string=string, *args, **kwargs)

debug

debug(tag, string, *args, **kwargs)

logs a string message with level DEBUG on all loggers that are enabled

Parameters:

  • tag

    Identifying tag to log the string with

  • string

    The string to log

  • args

    additional arguments to pass to the metrics, see log_string for more details

  • kwargs

    additional arguments to pass to the metrics, see log_string for more details

Source code in llmcompressor/metrics/logger.py
def debug(self, tag, string, *args, **kwargs):
    """
    logs a string message with level DEBUG on all
    loggers that are enabled

    :param tag: Identifying tag to log the string with
    :param string: The string to log
    :param args: additional arguments to pass to the metrics,
        see `log_string` for more details
    :param kwargs: additional arguments to pass to the metrics,
        see `log_string` for more details
    """
    kwargs["level"] = "DEBUG"
    self.log_string(tag=tag, string=string, *args, **kwargs)

error

error(tag, string, *args, **kwargs)

logs a string message with level ERROR on all loggers that are enabled

Parameters:

  • tag

    Identifying tag to log the string with

  • string

    The string to log

  • args

    additional arguments to pass to the metrics, see log_string for more details

  • kwargs

    additional arguments to pass to the metrics, see log_string for more details

Source code in llmcompressor/metrics/logger.py
def error(self, tag, string, *args, **kwargs):
    """
    logs a string message with level ERROR on all
    loggers that are enabled

    :param tag: Identifying tag to log the string with
    :param string: The string to log
    :param args: additional arguments to pass to the metrics,
        see `log_string` for more details
    :param kwargs: additional arguments to pass to the metrics,
        see `log_string` for more details
    """
    kwargs["level"] = "ERROR"
    self.log_string(tag=tag, string=string, *args, **kwargs)

info

info(tag, string, *args, **kwargs)

logs a string message with level INFO on all loggers that are enabled

Parameters:

  • tag

    Identifying tag to log the string with

  • string

    The string to log

  • args

    additional arguments to pass to the metrics, see log_string for more details

  • kwargs

    additional arguments to pass to the metrics, see log_string for more details

Source code in llmcompressor/metrics/logger.py
def info(self, tag, string, *args, **kwargs):
    """
    logs a string message with level INFO on all
    loggers that are enabled

    :param tag: Identifying tag to log the string with
    :param string: The string to log
    :param args: additional arguments to pass to the metrics,
        see `log_string` for more details
    :param kwargs: additional arguments to pass to the metrics,
        see `log_string` for more details
    """
    kwargs["level"] = "INFO"
    self.log_string(tag=tag, string=string, *args, **kwargs)

log_string

log_string(
    tag: str,
    string: str,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
)

Parameters:

  • tag

    (str) –

    identifying tag to log the values with

  • string

    (str) –

    string to log

  • step

    (Optional[int], default: None ) –

    global step for when the values were taken

  • wall_time

    (Optional[float], default: None ) –

    global wall time for when the values were taken

  • level

    (Optional[Union[int, str]], default: None ) –

    minimum severity level for the log message

  • log_types

    (Union[str, List[str]], default: ALL_TOKEN ) –

    names of the loggers to send this log to; defaults to all loggers

Source code in llmcompressor/metrics/logger.py
def log_string(
    self,
    tag: str,
    string: str,
    step: Optional[int] = None,
    wall_time: Optional[float] = None,
    log_types: Union[str, List[str]] = ALL_TOKEN,
    level: Optional[Union[int, str]] = None,
):
    """
    :param tag: identifying tag to log the values with
    :param string: string to log
    :param step: global step for when the values were taken
    :param wall_time: global wall time for when the values were taken
    :param level: minimum severity level for the log message
    :param log_types: names of the loggers to send this log to;
        defaults to all loggers
    """
    for log in self.loggers:
        if log.enabled and (log_types == ALL_TOKEN or log.name in log_types):
            log.log_string(
                tag=tag,
                string=string,
                step=step,
                wall_time=wall_time,
                level=level,
            )

warning

warning(tag, string, *args, **kwargs)

logs a string message with level WARNING on all loggers that are enabled

Parameters:

  • tag

    Identifying tag to log the string with

  • string

    The string to log

  • args

    additional arguments to pass to the metrics, see log_string for more details

  • kwargs

    additional arguments to pass to the metrics, see log_string for more details

Source code in llmcompressor/metrics/logger.py
def warning(self, tag, string, *args, **kwargs):
    """
    logs a string message with level WARNING on all
    loggers that are enabled

    :param tag: Identifying tag to log the string with
    :param string: The string to log
    :param args: additional arguments to pass to the metrics,
        see `log_string` for more details
    :param kwargs: additional arguments to pass to the metrics,
        see `log_string` for more details
    """
    kwargs["level"] = "WARNING"
    self.log_string(tag=tag, string=string, *args, **kwargs)
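Each severity helper (`debug`, `info`, `warning`, `error`, `critical`) just forces `kwargs["level"]` and defers to `log_string`. A reduced sketch of the pattern with two levels (the `Wrapper` class here is an illustrative stand-in, not the real `SystemLoggingWraper`):

```python
class Wrapper:
    """Sketch of the severity helpers: each one injects its level into
    kwargs and defers to log_string."""

    def __init__(self):
        self.records = []

    def log_string(self, tag, string, **kwargs):
        # terminal sink: record (level, tag, string)
        self.records.append((kwargs.get("level"), tag, string))

    def info(self, tag, string, **kwargs):
        kwargs["level"] = "INFO"
        self.log_string(tag=tag, string=string, **kwargs)

    def warning(self, tag, string, **kwargs):
        kwargs["level"] = "WARNING"
        self.log_string(tag=tag, string=string, **kwargs)

w = Wrapper()
w.info("train", "starting epoch")
w.warning("train", "loss spiked")
print([rec[0] for rec in w.records])  # ['INFO', 'WARNING']
```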

TensorBoardLogger

TensorBoardLogger(
    log_path: str = None,
    writer: SummaryWriter = None,
    name: str = "tensorboard",
    enabled: bool = True,
)

Bases: LambdaLogger

Modifier metrics that handles outputting values into a TensorBoard log directory for viewing in TensorBoard.

Parameters:

  • log_path

    (str, default: None ) –

    the path at which to create a SummaryWriter; writer must be None to use this. If neither is supplied (log_path is None and writer is None), a tensorboard directory will be created in the cwd

  • writer

    (SummaryWriter, default: None ) –

    the writer to log results to; if none is given, a new one is created at the log_path

  • name

    (str, default: 'tensorboard' ) –

    name given to the metrics, used for identification; defaults to tensorboard

  • enabled

    (bool, default: True ) –

    True to log, False otherwise

Methods:

  • available

    :return: True if tensorboard is available and installed, False otherwise

Attributes:

  • writer (SummaryWriter) –

    :return: the writer to log results to,

Source code in llmcompressor/metrics/logger.py
def __init__(
    self,
    log_path: str = None,
    writer: SummaryWriter = None,
    name: str = "tensorboard",
    enabled: bool = True,
):
    if tensorboard_import_error:
        raise tensorboard_import_error

    if writer and log_path:
        raise ValueError(
            (
                "log_path given:{} and writer object passed in, "
                "to create a writer at the log path set writer=None"
            ).format(log_path)
        )
    elif not writer and not log_path:
        log_path = os.path.join("", "tensorboard")

    if os.environ.get("NM_TEST_MODE"):
        test_log_root = os.environ.get("NM_TEST_LOG_DIR")
        log_path = (
            os.path.join(test_log_root, log_path) if log_path else test_log_root
        )

    if log_path:
        _create_dirs(log_path)

    self._writer = writer if writer is not None else SummaryWriter(log_path)
    super().__init__(
        lambda_func=self._log_lambda,
        name=name,
        enabled=enabled,
    )

writer property

writer: SummaryWriter

Returns:

  • SummaryWriter

    the writer to log results to; if none is given, a new one is created at the log_path

available staticmethod

available() -> bool

Returns:

  • bool

    True if tensorboard is available and installed, False otherwise

Source code in llmcompressor/metrics/logger.py
@staticmethod
def available() -> bool:
    """
    :return: True if tensorboard is available and installed, False otherwise
    """
    return not tensorboard_import_error

WANDBLogger

WANDBLogger(
    init_kwargs: Optional[Dict] = None,
    name: str = "wandb",
    enabled: bool = True,
    wandb_err: Optional[Exception] = wandb_err,
)

Bases: LambdaLogger

Modifier metrics that handles outputting values to Weights and Biases.

Parameters:

  • init_kwargs

    (Optional[Dict], default: None ) –

    the args to call into wandb.init with; ex: wandb.init(**init_kwargs). If not supplied, wandb.init() is called with no arguments

  • name

    (str, default: 'wandb' ) –

    name given to the metrics, used for identification; defaults to wandb

  • enabled

    (bool, default: True ) –

    True to log, False otherwise

Methods:

  • available

    :return: True if wandb is available and installed, False otherwise

  • save

    :param file_path: path to a file to be saved

Source code in llmcompressor/metrics/logger.py
def __init__(
    self,
    init_kwargs: Optional[Dict] = None,
    name: str = "wandb",
    enabled: bool = True,
    wandb_err: Optional[Exception] = wandb_err,
):
    if wandb_err:
        raise wandb_err

    super().__init__(
        lambda_func=self._log_lambda,
        name=name,
        enabled=enabled,
    )

    if os.environ.get("NM_TEST_MODE"):
        test_log_path = os.environ.get("NM_TEST_LOG_DIR")
        _create_dirs(test_log_path)
        if init_kwargs:
            init_kwargs["dir"] = test_log_path
        else:
            init_kwargs = {"dir": test_log_path}

    if init_kwargs:
        wandb.init(**init_kwargs)
    else:
        wandb.init()

    self.wandb = wandb

available staticmethod

available() -> bool

Returns:

  • bool

    True if wandb is available and installed, False otherwise

Source code in llmcompressor/metrics/logger.py
@staticmethod
def available() -> bool:
    """
    :return: True if wandb is available and installed, False otherwise
    """
    return wandb_available

save

save(file_path: str) -> bool

Parameters:

  • file_path

    (str) –

    path to a file to be saved

Source code in llmcompressor/metrics/logger.py
def save(
    self,
    file_path: str,
) -> bool:
    """
    :param file_path: path to a file to be saved
    """
    wandb.save(file_path)
    return True