registering hparam set in user dir


David Liebman

Feb 15, 2020, 1:47:53 PM
to tensor2tensor
Hi. I'm trying to use my own user dir. I need to register the hparam set so that I can use visualization later. How do I verify that the name is set properly? I get the following error:

KeyError: 'transformer_chat never registered with registry hparams'

'transformer_chat' is the name of my hparam set. After the error comes a long list of what I imagine must be the built-in sets. How do I check that mine is registered? I use the following code:

from tensor2tensor.models import transformer
from tensor2tensor.utils import registry

@registry.register_hparams('transformer_chat')
def transformer_chat():
    hparams = transformer.transformer_base_v2()
    hparams.num_hidden_layers = 6
    hparams.hidden_size = 512
    hparams.filter_size = 2048
    hparams.num_heads = 8
    hparams.attention_dropout = 0.6
    hparams.layer_prepostprocess_dropout = 0.1
    hparams.learning_rate = 0.05
    hparams.learning_rate_schedule = 'legacy'
    return hparams

I don't know where I saw the name declared like that. If it's wrong, please point out the correct method.

Thanks much

Martin Popel

Feb 16, 2020, 3:31:39 AM
to David Liebman, tensor2tensor
Hi,
see https://github.com/tensorflow/tensor2tensor/#adding-your-own-components
and don't forget the __init__.py file
https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/test_data/example_usr_dir/__init__.py
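
The key point is that __init__.py must import the module where your @registry.register_hparams function lives, otherwise the decorator never runs. A minimal sketch, assuming your hparams are in a file called my_hparams.py (that file name is just an example):

# my-usr-dir/__init__.py
# Importing the submodule executes the registration decorators,
# which is what makes 'transformer_chat' visible to the registry.
from . import my_hparams

Then point the tools at the directory, e.g. t2t-trainer --t2t_usr_dir=my-usr-dir ..., so tensor2tensor imports it for you.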

Martin

David Liebman

Feb 16, 2020, 9:25:09 AM
to tensor2tensor
Thanks for your quick response. I'm still having the same problem.

This is the notebook I'm looking at.


This is my user_dir and the file where I try to register my hparams.


Though very messy, this is the file where I train my transformer. I do believe the hparams are being applied, because the base learning rate that tensor2tensor prints to the terminal when I run the train option matches the one I set in the `problem.py` file above.


I looked at the 'user_dir' example and the 'adding-your-own-components' section you linked. They led me to change my code slightly, but I still can't get the notebook to run.

Do you have any other examples? Thanks. This is my new listing (in part):

@registry.register_hparams  # ('transformer_chat')
def transformer_chat():
    hparams = transformer.transformer_base_v2()
    hparams.num_hidden_layers = 6  # 2
    hparams.hidden_size = 512  # 128
    hparams.filter_size = 2048  # 512
    hparams.num_heads = 8  # 4
    hparams.attention_dropout = 0.6
    hparams.layer_prepostprocess_dropout = 0.1  # 0.6
    hparams.learning_rate = 0.05
    # hparams.learning_rate_constant = 0.05
    hparams.learning_rate_schedule = 'legacy'
    return hparams

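As far as I can tell, the bare decorator registers the set under the function's own name, so @registry.register_hparams and @registry.register_hparams('transformer_chat') should both end up as 'transformer_chat' here. A minimal sketch of the no-argument form, assuming I'm reading the registry right:

from tensor2tensor.models import transformer
from tensor2tensor.utils import registry

@registry.register_hparams  # no name given: registered under the function name
def transformer_chat():
    return transformer.transformer_base_v2()
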
D Liebman

David Liebman

Feb 16, 2020, 12:09:07 PM
to tensor2tensor
from tensor2tensor.utils import usr_dir
usr_dir.import_usr_dir('my-usr-dir')


I tried this as you suggested and I am seeing some output. Thanks.
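
In case it helps anyone else, this is the sanity check I ended up with. It assumes registry.list_hparams() is available in your version of tensor2tensor:

from tensor2tensor.utils import registry, usr_dir

# Import the user dir first so the registration decorators actually run.
usr_dir.import_usr_dir('my-usr-dir')

# The set should now show up by name...
print('transformer_chat' in registry.list_hparams())

# ...and fetching it directly raises the KeyError from my first post
# if it is still missing.
hparams = registry.hparams('transformer_chat')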
