llmcompressor.utils.fsdp.helpers

Functions:

- get_fsdp_parent – Gets the closest parent of layer_name that is wrapped by FSDP. If no FSDP wrapper is found, returns None.
- is_fsdp_model – Check if a model instance is wrapped by FSDP.
- maybe_get_wrapped – Given a model that may or may not have a distributed wrapper, return the underlying wrapped model.
- set_wrapped_model – Given a state with a model that may or may not have a distributed wrapper, set the underlying wrapped model.
get_fsdp_parent

Gets the closest parent of layer_name that is wrapped by FSDP. If no FSDP wrapper is found, returns None.

Parameters:

- layer_name (str) – layer name in model to get parent of
- model (Module) – pytorch module to search through

Returns:

- Optional[Module] – FSDP-wrapped parent of layer_name if available, otherwise None

Source code in llmcompressor/utils/fsdp/helpers.py
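The lookup can be pictured as walking the dotted layer name upward and returning the closest FSDP-wrapped ancestor. This is a minimal sketch of that idea, not the library's implementation: it uses stand-in FakeModule/FakeFSDP classes (assumptions) instead of torch's nn.Module and FullyShardedDataParallel.

```python
class FakeModule:
    """Stand-in for nn.Module: children are plain attributes."""
    def __init__(self, **children):
        for name, child in children.items():
            setattr(self, name, child)

class FakeFSDP(FakeModule):
    """Stand-in for an FSDP wrapper around a submodule."""

def get_module(model, dotted_name):
    # Resolve a dotted layer name ("layers.block0.attn") to a submodule.
    mod = model
    for part in dotted_name.split("."):
        mod = getattr(mod, part)
    return mod

def get_fsdp_parent_sketch(layer_name, model):
    parts = layer_name.split(".")
    # Check ancestors from the immediate parent up toward the root.
    for end in range(len(parts) - 1, 0, -1):
        parent = get_module(model, ".".join(parts[:end]))
        if isinstance(parent, FakeFSDP):
            return parent
    # The root itself may be the wrapper.
    if isinstance(model, FakeFSDP):
        return model
    return None

# Example tree: model.layers is FSDP-wrapped, model.head is not.
wrapped = FakeFSDP(block0=FakeModule(attn=FakeModule()))
model = FakeModule(layers=wrapped, head=FakeModule())

assert get_fsdp_parent_sketch("layers.block0.attn", model) is wrapped
assert get_fsdp_parent_sketch("head", model) is None
```

The search stops at the closest wrapped ancestor, so a layer nested several levels inside a single FSDP wrapper still resolves to that wrapper.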
is_fsdp_model

Check if a model instance is wrapped by FSDP.

Parameters:

- model (Module) – pytorch model to check

Returns:

- bool – True if module is wrapped, False otherwise
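The check amounts to asking whether an FSDP wrapper appears anywhere in the module tree, not just at the root. A minimal sketch, again with stand-in classes (assumptions) in place of nn.Module and FullyShardedDataParallel; `modules()` mirrors the nn.Module method that yields the module and all of its descendants.

```python
class FakeModule:
    """Stand-in for nn.Module with a modules() iterator."""
    def __init__(self, children=()):
        self._children = list(children)

    def modules(self):
        # Yields self and all descendants, like nn.Module.modules().
        yield self
        for child in self._children:
            yield from child.modules()

class FakeFSDP(FakeModule):
    """Stand-in for an FSDP wrapper."""

def is_fsdp_model_sketch(model):
    # True if the root or any submodule is FSDP-wrapped.
    return any(isinstance(m, FakeFSDP) for m in model.modules())

plain = FakeModule([FakeModule()])
sharded = FakeModule([FakeFSDP([FakeModule()])])

assert is_fsdp_model_sketch(plain) is False
assert is_fsdp_model_sketch(sharded) is True
```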
maybe_get_wrapped

Given a model that may or may not have a distributed wrapper, return the underlying wrapped model.

Parameters:

- model (Module) – input model to get wrapped model from

Returns:

- Module – wrapped model
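The unwrap-or-passthrough behavior can be sketched as follows. This assumes the wrapper exposes its inner module on a `_fsdp_wrapped_module` attribute, mirroring torch's FSDP; the FakeFSDP class is a stand-in, not the library's API.

```python
class FakeFSDP:
    """Stand-in FSDP wrapper holding an inner module."""
    def __init__(self, inner):
        self._fsdp_wrapped_module = inner

def maybe_get_wrapped_sketch(model):
    # Unwrap if the model carries a distributed wrapper, else pass through.
    if isinstance(model, FakeFSDP):
        return model._fsdp_wrapped_module
    return model

inner = object()
assert maybe_get_wrapped_sketch(FakeFSDP(inner)) is inner  # unwraps
plain = object()
assert maybe_get_wrapped_sketch(plain) is plain            # passthrough
```

Callers can therefore use the helper unconditionally, without first checking whether distributed training is active.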
set_wrapped_model

Given a state with a model that may or may not have a distributed wrapper, set the underlying wrapped model.

Parameters:

- state (State) – state to update model of
- updated_wrapped – model to inject into input_model
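A sketch of the setter side, the inverse of the getter above. It assumes a State object with a `model` attribute and a wrapper exposing `_fsdp_wrapped_module` (both stand-ins): if the current model is wrapped, the new module is injected inside the wrapper; otherwise it replaces the model directly.

```python
class FakeFSDP:
    """Stand-in FSDP wrapper holding an inner module."""
    def __init__(self, inner):
        self._fsdp_wrapped_module = inner

class FakeState:
    """Stand-in for a State object carrying the model."""
    def __init__(self, model):
        self.model = model

def set_wrapped_model_sketch(state, updated_wrapped):
    if isinstance(state.model, FakeFSDP):
        # Keep the wrapper, swap the module inside it.
        state.model._fsdp_wrapped_module = updated_wrapped
    else:
        state.model = updated_wrapped

new_module = object()
state = FakeState(FakeFSDP(object()))
set_wrapped_model_sketch(state, new_module)
assert state.model._fsdp_wrapped_module is new_module  # injected inside wrapper
```

Preserving the outer wrapper matters: replacing it wholesale would discard the sharding configuration that FSDP attached to the model.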