Releases: BloodAxe/pytorch-toolbelt
PyTorch Toolbelt 0.4.3
Modules
- Added missing `sigmoid` activation support to `get_activation_block`
- Make encoders support JIT & tracing
- Better support for encoders from `timm` (they are named with the `Timm` prefix)
Utils
- `rgb_image_from_tensor` now clips values
TTA & Ensembling
- `Ensembler` now supports arithmetic, geometric & harmonic averaging via the `reduction` parameter
- Brought geometric & harmonic averaging to all TTA functions as well
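The three reduction modes are the classical means; here is a minimal pure-Python sketch of what each one computes per output element (illustrative only — the actual `Ensembler` operates on torch tensors, and the helper names below are made up for this sketch):

```python
import math

def arithmetic_mean(preds):
    # Element-wise plain average over models.
    return [sum(vals) / len(vals) for vals in zip(*preds)]

def geometric_mean(preds, eps=1e-8):
    # Exponent of the mean log-value; eps guards against log(0).
    return [math.exp(sum(math.log(v + eps) for v in vals) / len(vals))
            for vals in zip(*preds)]

def harmonic_mean(preds, eps=1e-8):
    # Reciprocal of the mean reciprocal; penalizes disagreement harder.
    return [len(vals) / sum(1.0 / (v + eps) for v in vals)
            for vals in zip(*preds)]

# Two models' probabilities for three pixels:
model_a = [0.9, 0.2, 0.5]
model_b = [0.7, 0.4, 0.5]
print(arithmetic_mean([model_a, model_b]))  # approximately [0.8, 0.3, 0.5]
```

Geometric and harmonic averaging tend to be more conservative than the arithmetic mean: a single model assigning a near-zero probability pulls the combined score down sharply.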
Datasets
- Added `read_binary_mask`
- Refactored `SegmentationDataset` to support strided masks for deep supervision
- Added `RandomSubsetDataset` and `RandomSubsetWithMaskDataset` to sample a dataset based on some condition (e.g. take only samples of a particular class)
Other
As usual, more tests, better type annotations & comments
PyTorch Toolbelt 0.4.2
Breaking Changes
- Bump up minimal PyTorch version to 1.7.1
New features
- New dataset classes `ClassificationDataset`, `SegmentationDataset` for easy everyday use in Kaggle
- New losses: `FocalCosineLoss`, `BiTemperedLogisticLoss`, `SoftF1Loss`
- Support of new activations for `get_activation_block` (Silu, Softplus, Gelu)
- More encoders from the timm package: NFNets, NFRegNet, HRNet, DPN
- `RocAucMetricCallback` for Catalyst
- `MultilabelAccuracyCallback` and `AccuracyCallback` with DDP support
Bugfixes
- Fixed invalid prefix in the Catalyst registry from `tbt` to `tbt.`
PyTorch Toolbelt 0.4.1
New features
- Added Soft-F1 loss for direct optimization of F1 score (Binary case only)
- Fully reworked the TTA module for inference (kept backward compatibility where possible)
- Added support of `ignore_index` to Dice & Jaccard losses
- Improved Lovasz loss to work in `fp16` mode
- Added option to override selected params in `make_n_channel_input`
- More encoders from the `timm` package
- `FPNFuse` module now works on 2D, 3D and N-D inputs
- Added Global K-Max 2D pooling block
- Added Generalized Mean pooling 2D block
- Added `softmax_over_dim_X`, `argmax_over_dim_X` shorthand functions for use in metrics to get soft/hard labels without lambda functions
- Added helper visualization functions to add a fancy header to an image and to stack images of different sizes
- Improved rendering of the confusion matrix
Catalyst goodies
- Encoders & losses are available in the Catalyst registry
- Added `StopIfNanCallback`
- Added `OutputDistributionCallback` to log the distribution of predictions to TensorBoard
- Added `UMAPCallback` to visualize the embedding space in TensorBoard using UMAP
Breaking Changes
- Renamed `CudaTileMerger` to `TileMerger`
- `TileMerger` allows specifying the target device explicitly
- `tensor_from_rgb_image` removed in favor of `image_to_tensor`
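`TileMerger` reassembles predictions made on overlapping tiles of a large image. The placement arithmetic behind such tiling can be sketched as follows (a simplified, hypothetical helper for illustration — not the library's actual slicing code):

```python
def tile_starts(length: int, tile_size: int, step: int):
    # Start offsets of overlapping tiles that fully cover `length`;
    # the last tile is shifted back so it ends exactly at the border.
    starts = list(range(0, max(length - tile_size, 0) + 1, step))
    if starts[-1] + tile_size < length:
        starts.append(length - tile_size)
    return starts

# Covering a 1000-px side with 512-px tiles and 50% overlap:
print(tile_starts(1000, 512, 256))  # [0, 256, 488]
```

The overlap is what makes merging non-trivial: pixels covered by several tiles must be averaged (often with a weight matrix favoring tile centers), which is the merger's job.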
Bug fixes & Improvements
- Improved numeric stability of `focal_loss_with_logits` when `reduction="sum"`
- Prevent `NaN` in `FocalLoss` when all elements are equal to the `ignore_index` value
- A LOT of type hints
PyTorch Toolbelt 0.4.0
New features
- Memory-efficient `Swish` and `Mish` activation functions (credits go to http://github.com/rwightman/pytorch-image-models)
- Refactored EfficientNet encoders (no pretrained weights yet)
Fixes
- Fixed incorrect default value for `ignore_index` in `SoftCrossEntropyLoss`
Breaking changes
- All catalyst-related utils updated to be compatible with Catalyst 20.8.2
- Remove PIL package dependency
Improvements
- More comments, more type hints
PyTorch Toolbelt 0.3.2
New features
- Many helpful callbacks for the Catalyst library: `HyperParameterCallback` and `LossAdapter`, to name a few
- New losses for deep model supervision (helpful when the sizes of the target and output masks differ)
- Stacked Hourglass encoder
- Context Aggregation Network decoder
Breaking Changes
- The `ABN` module now resolves to `nn.Sequential(BatchNorm2d, Activation)` instead of a hand-crafted module. This enables easier conversion of batch normalization modules to `nn.SyncBatchNorm`.
- Almost every Encoder/Decoder implementation has been refactored for better clarity and flexibility. Please double-check your pipelines.
Important bugfixes
- Improved numerical stability of Dice / Jaccard losses (using `log_sigmoid() + exp()` instead of plain `sigmoid()`)
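The idea is to stay in log space: compute `log(sigmoid(x))` with a stable identity and only exponentiate at the end. A scalar sketch of why this matters (hypothetical helper names, not the library's code):

```python
import math

def log_sigmoid(x: float) -> float:
    # Stable log(sigmoid(x)): never exponentiates a large positive value,
    # using the identity log(sigmoid(x)) = min(x, 0) - log1p(exp(-|x|)).
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

def naive_log_sigmoid(x: float) -> float:
    # Direct formula: math.exp(-x) overflows for large negative x.
    return math.log(1.0 / (1.0 + math.exp(-x)))

print(log_sigmoid(-800.0))  # -800.0; the naive version raises OverflowError
```

Recovering the probability as `exp(log_sigmoid(x))` then saturates gracefully to 0 instead of producing infinities or NaNs inside the loss.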
Other
- Lots of comments for functions and modules
- Code cleanup, thanks to DeepSource
- Type annotations for modules and functions
- Update of README
PyTorch Toolbelt 0.3.1
Fixes
- Fixed a bug in the IoU metric computation in the `binary_dice_iou_score` function
- Fixed incorrect default value in `SoftCrossEntropyLoss` (#38)
Improvements
- The `draw_binary_segmentation_predictions` function now has an `image_format` parameter (`rgb`|`bgr`|`gray`) to specify the image format, so images are visualized correctly in TensorBoard
- More type annotations across the codebase
New features
- New visualization function `draw_multilabel_segmentation_predictions`
PyTorch Toolbelt 0.3.0
This release has a huge set of new features, bugfixes and breaking changes, so be careful when upgrading.
pip install pytorch-toolbelt==0.3.0
New features
Encoders
- HRNetV2
- DenseNets
- EfficientNet
- `Encoder` class has a `change_input_channels` method to change the number of channels in the input image
New losses
- `BCELoss` with support of `ignore_index`
- `SoftBCELoss` (label smoothing loss for the binary case, with support of `ignore_index`)
- `SoftCrossEntropyLoss` (label smoothing loss for the multiclass case, with support of `ignore_index`)
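Label smoothing replaces the one-hot target with a softened distribution before taking cross entropy. One common formulation, sketched in plain Python (not necessarily the exact `SoftCrossEntropyLoss` implementation, which operates on torch tensors):

```python
import math

def soft_cross_entropy(logits, target, smoothing=0.1):
    # Cross entropy against a smoothed one-hot target: the true class
    # gets 1 - smoothing, the remaining classes share `smoothing`.
    n = len(logits)
    m = max(logits)
    # Stable log-softmax via the log-sum-exp trick.
    log_norm = m + math.log(sum(math.exp(z - m) for z in logits))
    log_probs = [z - log_norm for z in logits]
    soft = [smoothing / (n - 1)] * n
    soft[target] = 1.0 - smoothing
    return -sum(p * lp for p, lp in zip(soft, log_probs))

# With smoothing=0.0 this reduces to ordinary cross entropy.
print(soft_cross_entropy([2.0, 1.0, 0.1], 0, smoothing=0.0))
```

The smoothed target keeps the model from driving logits to extremes, which tends to improve calibration and robustness to label noise.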
Catalyst goodies
- Online pseudolabeling callback
- Training signal annealing callback
Other
- New activation functions supported in the `ABN` block: Swish, Mish, HardSigmoid
- New decoders (Unet, FPN, DeeplabV3, PPM) to simplify creation of segmentation models
- `CREDITS.md` to include all the references to code/articles. The existing list is definitely not complete, so feel free to make PRs
- Object context block from OCNet
API changes
- Focal loss now supports normalized focal loss and reduced focal loss extensions.
- Optimize computation of pyramid weight matrix #34
- Default value `align_corners=False` in `F.interpolate` when doing bilinear upsampling
Bugfixes
- Fixed missing call to the batch normalization block in `FPNBottleneckBN`
- Fixed numerical stability of `DiceLoss` and `JaccardLoss` when `log_loss=True`
- Fixed numerical stability when computing the normalized focal loss
PyTorch Toolbelt 0.2.1
New features
- Added normalized focal loss
Bugfixes
- Fixed wrong shape of intermediate layers of DenseNet
PyTorch Toolbelt 0.2.0
This release is dedicated to housekeeping work. Dice/IoU metrics and losses have been redesigned to reduce the amount of duplicated code and bring more clarity. Code is now auto-formatted using Black.
pip install pytorch_toolbelt==0.2.0
Catalyst contrib
- Refactored Dice/IoU losses into a single metric callback `IoUMetricsCallback` with a few cool features:
  - `metric="dice|jaccard"` to choose which metric is used;
  - `mode=binary|multiclass|multilabel` to specify the problem type (binary, multiclass or multi-label segmentation);
  - `classes_of_interest=[1,2,4]` to select the set of classes for which the metric is computed;
  - `nan_score_on_empty=False` to compute "Dice Accuracy" (counts as 1.0 if both `y_true` and `y_pred` are empty; 0.0 if `y_pred` is not empty)
- Added an L-p regularization callback to apply L1 and L2 regularization to the model, with support for regularization strength scheduling
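The `nan_score_on_empty` switch only matters when both masks are empty; a plain-Python sketch of the scoring rule (illustrative only, not the callback's implementation):

```python
def binary_dice_score(y_true, y_pred, nan_score_on_empty=False, eps=1e-7):
    # y_true, y_pred: flat iterables of 0/1 mask values for one sample.
    intersection = sum(t * p for t, p in zip(y_true, y_pred))
    cardinality = sum(y_true) + sum(y_pred)
    if cardinality == 0:
        # Both masks empty: either "undefined" (NaN) or a perfect 1.0.
        return float("nan") if nan_score_on_empty else 1.0
    return 2.0 * intersection / (cardinality + eps)

print(binary_dice_score([0, 0, 0], [0, 0, 0]))  # 1.0 ("Dice Accuracy" mode)
print(binary_dice_score([0, 0, 0], [1, 0, 0]))  # 0.0 (false positives on an empty target)
```

Returning NaN instead lets the caller exclude empty-vs-empty samples from the average rather than inflate it with free 1.0 scores.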
Losses
- Refactored `DiceLoss`/`JaccardLoss` losses in the same fashion as the metrics
Models
- Added DenseNet encoders
- Bugfix: fixed missing BN+ReLU in `UNetDecoder`
- Global pooling modules can squeeze spatial channel dimensions if `flatten=True`
Misc
- Add more unit tests
- Code style is now managed with Black
- `to_numpy` now supports `int`, `float` scalar types