How do I implement transfer learning in NiftyNet?



























I'd like to perform some transfer learning using the NiftyNet stack, as my dataset of labeled images is rather small. In TensorFlow this is possible: I can load a variety of pre-trained networks and work directly with their layers. To fine-tune a network, I could freeze the intermediate layers and train only the final layer, or I could use the output of the intermediate layers as a feature vector to feed into another classifier.
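To make what I mean by "freezing" concrete, here is a toy, framework-free sketch: only parameters whose names match a pattern get updated, everything else stays fixed. All names and values below are made up; this is the selection behavior I'd want, not an actual NiftyNet or TensorFlow API.

```python
import re

# Hypothetical parameter names in the style of TF variable scopes;
# the values stand in for weights and gradients.
params = {"conv_1/w": 0.5, "conv_2/w": -0.3, "fc_final/w": 0.1}
grads  = {"conv_1/w": 0.2, "conv_2/w": 0.1, "fc_final/w": -0.4}

def sgd_step(params, grads, trainable_pattern, lr=0.1):
    """Update only parameters whose name matches trainable_pattern;
    everything else is frozen."""
    return {
        name: (value - lr * grads[name]
               if re.match(trainable_pattern, name) else value)
        for name, value in params.items()
    }

# Freeze the intermediate layers, train only the final layer.
updated = sgd_step(params, grads, r"fc_final/")
assert updated["conv_1/w"] == params["conv_1/w"]      # frozen
assert updated["fc_final/w"] != params["fc_final/w"]  # trained
```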



How do I do this in NiftyNet? The only mention of "transfer learning" in the documentation or the source code is in reference to the model zoo, but for my task (image classification) there are no networks available in the zoo. The ResNet architecture seems to be implemented and available, but as far as I can tell it hasn't been trained on anything yet. In addition, the only way to train a network seems to be running net_classify train with the various TRAIN options in the config file, none of which allow freezing parts of the network. The layers in niftynet.layer also don't seem to expose a flag marking them as trainable or not.



I suppose the questions I have are:




  1. Is it possible to port over a pre-trained TensorFlow network?


    • If I manually recreate the layer architecture in NiftyNet, is there a way to import the weights from a pre-trained TF network?



  2. How do I access the intermediate weights and layers of a model? (The question "How can I get access to intermediate activation maps of the pre-trained models in NiftyNet?" refers to the model zoo, where models can be obtained using net_download, but not to arbitrary models.)

  3. As an aside, the learning rate also seems to be a constant. To vary it over time, would I have to run the network for some number of iterations, change the learning rate, then restart training from the last checkpoint?
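For reference on point 3, the kind of schedule I'd be emulating by restarting from checkpoints is just a piecewise-constant decay; a minimal sketch in plain Python (nothing NiftyNet-specific, the numbers are examples):

```python
def step_decay(base_lr, decay, step_size, iteration):
    """Piecewise-constant schedule: multiply the learning rate by
    `decay` once every `step_size` iterations."""
    return base_lr * (decay ** (iteration // step_size))

# e.g. start at 0.01 and halve every 1000 iterations:
assert step_decay(0.01, 0.5, 1000, 0) == 0.01
assert step_decay(0.01, 0.5, 1000, 1500) == 0.005
assert step_decay(0.01, 0.5, 1000, 2500) == 0.0025
```

Each restart would then pick up the checkpoint and set lr to step_decay(...) for the current iteration count.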

































  • Possible duplicate of Implement transfer learning on niftynet

    – manza
    Nov 27 '18 at 7:26
















python machine-learning multilabel-classification niftynet
















asked Jun 8 '18 at 21:54









Max Zhou


























1 Answer






































[Edit]: Here are the docs for transfer learning with NiftyNet.



This feature is currently being worked on. See here for full details.



Intended capabilities include the following:




  • A command for printing all trainable variable names (with optional regular-expression matching)

  • The ability to randomly initialize a subset of variables, selected by regex name matching

  • The ability to restore a subset of variables from an existing checkpoint and continue updating them; if the optimization method changes, method-specific variables (e.g. momentum) are handled

  • The ability to restore the remaining variables from an existing checkpoint and freeze their trained weights

  • Saving all trainable variables after training

  • Configuration parameters for fine-tuning and the variable-name regex, plus unit tests

  • A demo/tutorial

  • Preprocessing of checkpoints to handle compatibility issues

  • Handling of batch-norm and dropout layers (editing networks to remove batch-norm variables)
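To illustrate the regex-driven selection that several of these capabilities rely on, here is a toy, framework-free sketch. The variable names are hypothetical (made up in the style of a TF checkpoint), and the partitioning function is mine, not a NiftyNet API:

```python
import re

# Hypothetical checkpoint variable names.
checkpoint_vars = [
    "ResNet/conv_bn_relu/conv_/w",
    "ResNet/block_1/conv_/w",
    "ResNet/fc/fc_/w",
    "ResNet/fc/fc_/b",
]

def partition(var_names, retrain_regex):
    """Split variables into a restore-and-freeze set and a
    (re)initialise-and-train set, by regex name matching."""
    retrain = [v for v in var_names if re.search(retrain_regex, v)]
    freeze = [v for v in var_names if v not in retrain]
    return freeze, retrain

# Retrain only the fully connected head; freeze everything else.
freeze, retrain = partition(checkpoint_vars, r"/fc/")
assert retrain == ["ResNet/fc/fc_/w", "ResNet/fc/fc_/b"]
assert len(freeze) == 2
```

The restore step would then build a saver over the freeze set and a fresh initializer plus optimizer over the retrain set.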






























































































        edited Nov 23 '18 at 6:42

























        answered Sep 19 '18 at 4:41









Aleksandar Djuric
































