tensor2tensor
Conversations
1-30 of 184
Welcome to Tensor2Tensor.
This group is dedicated to discussing issues related to the Tensor2Tensor library:
https://github.com/tensorflow/tensor2tensor
We discuss issues with the library, problems with training and using the models, other topics related to ML and TensorFlow, and recent Tensor2Tensor releases.
Feel free to post here!
T2T Team
subramaniyam cabila, Joydeep Mitra · 3 messages · 12/11/23
Clarification
I apologize for the error. On Mon, 13 Nov 2023, 09:13 Joydeep Mitra, <joyd...@gmail.com> wrote
Betizazu Alemu · 6/6/22
I need Just one Command
I know it's no longer supported, but I am just starting on NLP and working on a Translation
Lohith Naj N · 5/12/21
Unable to train the model
Hello, I used your code in the machine learning foundations video lecture on your YouTube channel and I
孟焯 · 8/7/20
Model Apply and Application - Image Classification
Dear experts, I want to use Tensor2Tensor for image classification tasks. After model training, I got
Rohan, Lukasz Kaiser · 2 messages · 20/3/20
GSOC 2020
Hi Rohan. We're not planning to convert T2T to TF2 as is. Instead, please contribute to Trax, the
Arben Sabani · 2 messages · 19/3/20
Transformer model with float inputs and outputs
I know that the word sequence in the translate problem is converted to tensors of floats, but I want to
Paloma Jimeno, Martin Popel · 3 messages · 20/2/20
tensorflow_model_server translation from translate_enes_wmt32k t2t library. "Could not parse example input"
Thank you, Martin. It is still a bad request (see picture). Could be an option to use the t2t-query-
David Liebman, Martin Popel · 4 messages · 16/2/20
registering hparam set in user dir
from tensor2tensor.utils import usr_dir usr_dir.import_usr_dir('my-usr-dir') I tried this as
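
For readers hitting the same issue: a minimal sketch of how a custom hparams set is typically registered inside a user directory and then imported. The directory name my-usr-dir and the hparams name my_transformer_hparams are made up for illustration.

```python
# my-usr-dir/__init__.py -- hypothetical user directory
from tensor2tensor.models import transformer
from tensor2tensor.utils import registry


@registry.register_hparams
def my_transformer_hparams():
    """Custom hparams set derived from the stock transformer_base set."""
    hparams = transformer.transformer_base()
    hparams.learning_rate = 0.1  # example override
    return hparams
```

With the directory importable as a package, usr_dir.import_usr_dir('my-usr-dir') (or passing --t2t_usr_dir=my-usr-dir to t2t-trainer) runs the registration, after which the set can be selected with --hparams_set=my_transformer_hparams.
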
Ravi Jain · 1/11/19
Regarding attention
Why is ``` Q*(K).t() ``` (t() means transpose) done in attention, and not ``` Q*(Q+K).t() ``` for
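
For context, the operation the question refers to is the standard scaled dot-product attention from the Transformer paper, where the scores come from the query-key product alone:

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V$$

Here $d_k$ is the dimensionality of the keys.
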
Sumeet Singh · 29/10/19
Invalid argument: In[0] and In[1] must have compatible batch dimensions: [1,8,1288,4608,4] vs. [1,8,1250,4,256]
This error is occurring inside the function dot_product_attention. I don't even know how
Sumeet Singh · 23/10/19
local_attention_2d vs unmasked_local_attention_2d_tpu
Hi All, I am writing a new model for images that uses 2D Local Self Attention for the encoder as
Sumeet Singh · 21/10/19
ImageMsCocoCharacters (image_ms_coco_characters)
Hi, Does anybody know what model goes along with the ImageMsCocoCharacters (image_ms_coco_characters)
Weiguang Guan, …, Sumeet Singh · 7 messages · 21/10/19
conflict between TF 2.0 and T2T 1.14.1
Got it to work with Python 3.6 using pipenv --skip-lock. Pasting the pipfile for what it's worth. [[
Aleksas Pielikis · 21/10/19
Defining Text Multi-label Classification Problem
Trying to define a text multi-label classification problem in t2t with kaggle jigsaw toxic comment
Tianyu Jiang · 8/10/19
how to indicate input and output node names
Hi everyone, I am trying to convert the t2t model to an Apple Core ML model (Transformer model). But I
Simon Mc Duff, …, Erik Chan · 8 messages · 3/10/19
How to add Input Parameter in T2T using Tensorflow Serving
Hi Simon, would you mind sharing your code? I am also trying to do Input Parameter in T2T using
陳裕政, Eugene Kuznetsov · 5 messages · 1/10/19
with Tensor2tensor when making a transformer training.......
I think the dataset is fine, because the training could be continued. The following is including
Jonny Saunders · 29/9/19
Conditional Language Modeling?
Hello! I've been playing around with T2T for a while now, still getting my feet around some of the
David Liebman, John Ed Alvinez · 2 messages · 29/9/19
learning rate 2.0
Hi David, Good day. I think the Base learning rate text is logged here: https://github.com/tensorflow
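
For background on the value discussed above: the warmup-then-decay schedule from the Transformer paper is

$$\mathrm{lrate} = d_{\text{model}}^{-0.5} \cdot \min\!\left(\mathit{step}^{-0.5},\; \mathit{step} \cdot \mathit{warmup\_steps}^{-1.5}\right)$$

In T2T, the base learning rate set in the hparams (the 2.0 in the thread title) acts, as far as I can tell, as an additional multiplicative constant on whichever schedule the hparams select.
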
siddhant sharma, Martin Popel · 2 messages · 1/8/19
Training a new model based on a previously trained model
Just use the same output_dir and continue training. Make sure the new dataset uses the same subword
Mrinal Roy, …, Praneet · 3 messages · 30/7/19
CIFAR-10, CIFAR-100 Models with Mesh TensorFlow
+1 On Thursday, 13 June 2019 14:09:11 UTC-4, Mrinal Roy wrote: Anyone tried or have pointers to
Suhas Shekhar, Praneet · 2 messages · 30/7/19
Model-Parallel Layout in Mesh-Tensorflow
+1, looking for an answer. Thank you. On Monday, 1 July 2019 20:39:07 UTC-4, Suhas Shekhar wrote: I am
Youwei Liang, …, Lukasz Kaiser · 5 messages · 27/7/19
Does anyone know some easy-to-train datasets?
Thank you. I want to train on an NLP dataset.
Eugene Kuznetsov, Lukasz Kaiser · 7 messages · 22/7/19
bfloat16 GPU performance
Found another fairly significant optimization. It turns out that, if you add a dense layer to
Seongjin Cho, Lukasz Kaiser · 2 messages · 20/7/19
"Generic conv implementation only supports NHWC tensor format for now" error when using t2t-decode
This looks very much like a TF or CUDA problem. I'm not sure what the solution is, but please ask
Martin Popel · 18/7/19
Re: Formula used to calculate BLEU
Hi Arben, t2t-bleu implementation (as well as SacreBLEU) uses the correct BLEU formula (as in the
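
For reference, the standard BLEU definition (Papineni et al., 2002) that the thread points to combines modified n-gram precisions with a brevity penalty:

$$\mathrm{BLEU} = \mathrm{BP} \cdot \exp\!\left(\sum_{n=1}^{N} w_n \log p_n\right), \qquad
\mathrm{BP} = \begin{cases} 1 & \text{if } c > r,\\ e^{\,1 - r/c} & \text{if } c \le r, \end{cases}$$

where $p_n$ is the modified n-gram precision, the weights $w_n$ are usually uniform ($1/N$ with $N = 4$), $c$ is the candidate length, and $r$ is the reference length.
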
Santosh Gupta, Lukasz Kaiser · 2 messages · 18/7/19
Is the Text2textCopyableTokens problem for summarization? If so, what to put for 'extra_label'?
I must admit I'm not sure what problem this file is about, does anyone know? Lukasz On Tue, Jul
Arben Sabani, Martin Popel · 3 messages · 13/7/19
mathematical formula used for t2t-bleu
Thanks a lot, Martin. That helps a lot. best Arben On Saturday, 13 July 2019 18:30:47 UTC+2, Arben
siddhant sharma, Martin Popel · 2 messages · 13/7/19
Generation of word embedding in tensor2tensor-transformer model
Hi Siddhant, by default, word embeddings are treated as all other weights in the Transformer model,
Arben Sabani, …, est namoc · 6 messages · 13/7/19
new translate problem which can deal with corrections
Thanks a lot, very interesting articles. I had a similar idea or approach how translation should work