
incompatible weights #21

@melih-unsal

Description

When I run the test code, I get the following errors because the checkpoint weights do not match the shapes of the current model:

size mismatch for RPN.anchor_generator.cell_anchors.0: copying a param with shape torch.Size([3, 4]) from checkpoint, the shape in current model is torch.Size([15, 4]).
size mismatch for RPN.head.conv.weight: copying a param with shape torch.Size([256, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([2048, 2048, 3, 3]).
size mismatch for RPN.head.conv.bias: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for RPN.head.cls_logits.weight: copying a param with shape torch.Size([3, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([15, 2048, 1, 1]).
size mismatch for RPN.head.cls_logits.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([15]).
size mismatch for RPN.head.bbox_pred.weight: copying a param with shape torch.Size([12, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([60, 2048, 1, 1]).
size mismatch for RPN.head.bbox_pred.bias: copying a param with shape torch.Size([12]) from checkpoint, the shape in current model is torch.Size([60]).
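For reference, the shapes suggest a configuration mismatch rather than a corrupt file: the checkpoint appears to come from an FPN-style setup (256-channel RPN head, 3 anchors per location), while the model built by the test code uses 2048-channel features and 15 anchors per location. As a temporary way to see which parameters do load, here is a minimal sketch that skips the mismatched keys; it assumes a standard PyTorch `state_dict`-style checkpoint and a hypothetical `build_model()` standing in for however the test code constructs the detector:

```python
import torch

# Hypothetical paths/builders; replace with the checkpoint and model from the test code.
checkpoint = torch.load("model_checkpoint.pth", map_location="cpu")
state_dict = checkpoint.get("model", checkpoint)  # some checkpoints nest weights under "model"

model = build_model()  # placeholder for the detector construction in the test code
model_state = model.state_dict()

# Keep only parameters whose names and shapes match the current model,
# so load_state_dict does not raise the size-mismatch errors above.
filtered = {k: v for k, v in state_dict.items()
            if k in model_state and v.shape == model_state[k].shape}
skipped = len(state_dict) - len(filtered)
print(f"Skipping {skipped} mismatched/unknown keys; "
      f"{len(model_state) - len(filtered)} model keys stay randomly initialized.")

model.load_state_dict(filtered, strict=False)
```

This only sidesteps the error for inspection; the RPN head and anchor generator would still be uninitialized, so the real fix is presumably making the backbone/RPN settings in the config match the ones the checkpoint was trained with.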
