
Easier option to pretrain on own dataset #872

Open
IUsedToBeAPygmy opened this issue Aug 25, 2020 · 1 comment

@IUsedToBeAPygmy

I'm using a single faceset (data_src) as the input for many different outputs (data_dst).
Because of that, I'd like to pretrain a model on just my input faceset (data_src).

To facilitate this, I've made a faceset pack of my data_src, copied it over the "Pretrain_celebA" set, and then pretrained using my own set.
This lets me pretrain on my own dataset with better settings than a "normal" train that also includes a data_dst set: less memory is used, so I can bump the batch size up to 7 while pretraining.
Once the pretrain on my own dataset reaches an acceptable level, I copy the model to a new one and start a "normal" train with both data_src and data_dst, only this time the data_src side has already been pretrained, so it starts out with a big advantage.
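
For anyone who wants to script this, the workaround amounts to roughly the following (a minimal sketch: every path, and the `<name>_SAEHD_*` checkpoint naming pattern, are assumptions based on common DeepFaceLab builds, so adjust them for yours):

```python
# Minimal sketch of the workaround described above. All paths, and the
# "<name>_SAEHD_*" checkpoint naming pattern, are assumptions based on
# common DeepFaceLab builds; adjust them for your setup.
from pathlib import Path
import shutil

DFL = Path(r"C:\DeepFaceLab")  # assumption: install location

def swap_pretrain_faceset():
    """Step 1 (before pretraining): replace the stock CelebA pack with
    the packed data_src faceset, keeping a backup of the original."""
    src_pack = DFL / "workspace" / "data_src" / "aligned" / "faceset.pak"
    celeba_pack = DFL / "_internal" / "pretrain_CelebA" / "faceset.pak"
    backup = celeba_pack.with_suffix(".pak.bak")
    if not backup.exists():
        shutil.copy2(celeba_pack, backup)  # keep the stock pack restorable
    shutil.copy2(src_pack, celeba_pack)

def copy_model(old_name, new_name):
    """Step 2 (after pretraining): clone the pretrained model's files
    under a new name so a "normal" train can start from them."""
    model_dir = DFL / "workspace" / "model"
    for f in model_dir.glob(f"{old_name}_SAEHD_*"):
        target = model_dir / f.name.replace(old_name, new_name, 1)
        shutil.copy2(f, target)  # copy rather than move, to keep the original
        print(f"{f.name} -> {target.name}")

swap_pretrain_faceset()  # run this, pretrain, then:
# copy_model("pretrain", "projectA")
```

The backup copy in step 1 means the original CelebA pack can be restored whenever a stock pretrain is wanted again.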

I'd love to be able to do this kind of "single input" training more easily, so I can train on ONLY data_src or ONLY data_dst.
That way I could train partial input / output models and reuse or mix and match them later.

I hope you get what I mean :-)

@Lubsey1 commented Aug 26, 2020

You want to do a pretrain using your own data_src and the "Pretrain_celebA" set as the data_dst?
Is that what you're trying to do?
