Why the read-limited scope's expiration timestamps changed.


Andrew Gilmartin

Feb 1, 2018, 9:10:24 AM
to ORCID API Users
On Tuesday I posted about a problem Crossref had with the expiration values of our read-limited access tokens. (That posting has since been deleted because it contained access tokens; I approve of the deletion.) I wanted to follow up with the root cause of the problem, as it might affect others too.

On Monday I replaced the use of the 1.0 API /orcid-work/read-limited scope with the 2.0 API /read-limited scope, both in the code and in a MySql table. I immediately started seeing problems: the access tokens were unusable. I assumed this was because my scope change was invalid, and so reverted the change. I continued to see problems in the log, but assumed these were just the final draining of the invalid-scope errors. The following morning I saw that this was not the case, and found that the cause was instead that all the access token expirations had been set to Monday. I disabled the service -- it continued to collect data, but did not act on it while disabled -- and started looking for the root problem.

The root cause of the problem was a forgotten MySql feature. When the first TIMESTAMP column in a table is defined as NOT NULL without an explicit default value, that column is automatically updated to the current time whenever ANY other column in the row is modified. (While this behavior is bizarre, it is documented.) Normally the table containing the access tokens is only inserted into, so this feature had never been triggered. When I modified the scope column the feature was triggered and all the expiration dates were set to the current timestamp (2018-01-30 14:07:29). You can repeat this yourself using

drop table if exists test1;
create table test1 ( id int, ts timestamp not null );

insert into test1 ( id, ts ) values ( 1, '2000-01-01' );
select * from test1;

The data will be

| 1 |  2000-01-01 00:00:00 |

Now execute

update test1 set id = 2 where id = 1;
select * from test1;

and the data will be

| 2  | 2018-01-30 14:07:29 |

i.e., 2 and the current timestamp.
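One way to avoid this behavior is to declare the column with an explicit DEFAULT, which prevents MySql from implicitly adding the ON UPDATE CURRENT_TIMESTAMP attribute. A minimal sketch, using a hypothetical test2 table along the same lines as the example above:

```sql
-- An explicit DEFAULT suppresses the implicit
-- DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
-- that MySql adds to the first NOT NULL TIMESTAMP column.
drop table if exists test2;
create table test2 ( id int, ts timestamp not null default '2000-01-01' );

insert into test2 ( id, ts ) values ( 1, '2000-01-01' );
update test2 set id = 2 where id = 1;
select * from test2;
-- ts keeps its original value; only id changes.
```

You can check whether an existing table has the implicit attributes with SHOW CREATE TABLE, which will display the ON UPDATE CURRENT_TIMESTAMP clause if it is in effect.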

-- Andrew
