How do I implement transfer learning in NiftyNet?
I'd like to perform some transfer learning using the NiftyNet stack, as my dataset of labeled images is rather small. In TensorFlow, this is possible--I can load a variety of pre-trained networks and directly work with their layers. To fine-tune the network, I could freeze training of the intermediate layers and only train the final layer, or I could just use the output of the intermediate layers as a feature vector to feed into another classifier.
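To make the second strategy concrete, here is a framework-agnostic toy sketch (hand-picked feature function and hand-rolled gradient descent, not NiftyNet or TensorFlow code) of training only a final linear head on top of frozen layers:

```python
# Toy sketch of the "frozen feature extractor" strategy: the pretrained
# layers are treated as a fixed function, and only a final linear layer
# is trained on their outputs.

def frozen_features(x):
    """Stand-in for the frozen pretrained layers (weights never updated)."""
    return [x[0] + x[1], x[0] * x[1]]  # fixed, hand-picked transform

def train_head(samples, labels, lr=0.05, steps=500):
    """Train only the final linear layer (w, b) with plain gradient descent."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(steps):
        for x, y in zip(samples, labels):
            f = frozen_features(x)                # forward through frozen layers
            pred = w[0] * f[0] + w[1] * f[1] + b  # trainable head only
            err = pred - y
            w[0] -= lr * err * f[0]               # update head weights only;
            w[1] -= lr * err * f[1]               # frozen_features is untouched
            b -= lr * err
    return w, b

def mse(w, b, samples, labels):
    """Mean squared error of the trained head over the dataset."""
    total = 0.0
    for x, y in zip(samples, labels):
        f = frozen_features(x)
        total += (w[0] * f[0] + w[1] * f[1] + b - y) ** 2
    return total / len(samples)
```

Freezing the intermediate layers instead (the first strategy) is the same idea, except the frozen function is most of the original network rather than a separate feature extractor.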
How do I do this in NiftyNet? The only mention of "transfer learning" in the documentation or the source code is in reference to the model zoo, but for my task (image classification), there are no networks available in the zoo. The ResNet architecture seems to be implemented and available to use, but as far as I can tell, it isn't trained on anything yet. In addition, it seems the only way I can train a network is by running `net_classify train` with the various `TRAIN` configuration options in the config file, none of which have options for freezing networks. The various layers in `niftynet.layer` also do not seem to have a flag for enabling or disabling training.
I suppose the questions I have are:

- Is it possible to port over a pre-trained TensorFlow network?
- If I manually recreate the layer architecture in NiftyNet, is there a way to import the weights from a pre-trained TF network?
- How do I access the intermediate weights and layers of a model? ("How can I get access to intermediate activation maps of the pre-trained models in NiftyNet?" refers to the model zoo, where models can be obtained using `net_download`, but not to an arbitrary model.)
- As an aside, the learning rate also seems to be a constant. To vary it over time, would I have to run the network for some number of iterations, change `lr`, then restart training from the last checkpoint?
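For the learning-rate aside, the restart workaround would amount to a manually stepped schedule. A plain-Python sketch of what that sequence of runs looks like (the numbers and the helper name are illustrative, not NiftyNet settings):

```python
# Sketch of the checkpoint-restart workaround for a constant lr:
# run N iterations at each rate, then resume from the last checkpoint
# after lowering lr in the config file.

def stepped_schedule(base_lr, decay, iters_per_run, n_runs):
    """Return (start_iteration, lr) for each training run in the sequence."""
    return [(run * iters_per_run, base_lr * decay ** run)
            for run in range(n_runs)]

schedule = stepped_schedule(base_lr=0.01, decay=0.5, iters_per_run=1000, n_runs=4)
# e.g. run 0 starts at iteration 0 with lr 0.01,
#      run 3 starts at iteration 3000 with lr 0.00125
```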
python machine-learning multilabel-classification niftynet
Possible duplicate of Implement transfer learning on niftynet – manza, Nov 27 '18 at 7:26
asked Jun 8 '18 at 21:54 – Max Zhou
1 Answer
[Edit]: Here are the docs for transfer learning with NiftyNet.

This feature is currently being worked on. See here for full details. Intended capabilities include the following:

- A command for printing all trainable variable names (with optional regular-expression matching)
- The ability to randomly initialize a subset of variables, where the subset is selected by regex name matching
- The ability to restore a subset of the variables from an existing checkpoint and continue updating them; if the optimization method is changed, method-specific variables (e.g. momentum) are handled
- The ability to restore the remaining variables from an existing checkpoint and freeze their trained weights
- Saving all trainable variables after training
- Configuration parameters for fine-tuning and the variable-name regex, plus unit tests
- A demo/tutorial
- Preprocessing checkpoints for compatibility issues
- Handling batch-norm and dropout layers (editing networks to remove batch-norm variables)
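The regex-driven selection in the first few items could look roughly like this. This is a hypothetical sketch, not the actual NiftyNet API; `partition_variables` is an illustrative name:

```python
# Hypothetical sketch of regex-based variable selection: given all
# trainable variable names, split them into a subset to reinitialise
# and fine-tune, and a subset to restore from a checkpoint and freeze.
import re

def partition_variables(var_names, finetune_regex):
    """Names matching the pattern are fine-tuned; the rest are frozen."""
    pattern = re.compile(finetune_regex)
    finetune = [n for n in var_names if pattern.search(n)]
    frozen = [n for n in var_names if not pattern.search(n)]
    return finetune, frozen

names = ["res_1/conv/w", "res_2/conv/w", "fc/w", "fc/b"]
finetune, frozen = partition_variables(names, r"^fc/")
# finetune == ["fc/w", "fc/b"]; the residual blocks stay frozen
```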
answered Sep 19 '18 at 4:41, edited Nov 23 '18 at 6:42 – Aleksandar Djuric