Related publication URL redirects to the current page


Shuxian Zhang

Apr 13, 2025, 11:27:14 PM
to Dataverse Users Community

Hello everyone,

We recently encountered an issue with the Related Publication URL field in our metadata. When we click the link, instead of being redirected to the correct external webpage (a publisher website URL or a handle link, starting with https://), the page just reloads the current dataset page.

Has anyone experienced this issue before?

We’ve checked our citation.tsv file, and the publicationURL field appears to be configured correctly (screenshot below).

Any suggestions or similar experiences would be greatly appreciated. Thank you!

publicationURL.jpg

Best regards,
Shuxian from NTU Library

Shuxian Zhang

Apr 13, 2025, 11:31:39 PM
to Dataverse Users Community
The screenshot in the previous email looks blurry. I am resending it as an attachment here.
publicationURL.png

Shuxian Zhang

Apr 13, 2025, 11:38:10 PM
to Dataverse Users Community
Attaching an example dataset.

Yuyun W

Apr 13, 2025, 11:39:59 PM
to Dataverse Users Community
Details:
We are on 5.11.1.
We noticed this behavior after reloading the metadata block. We modified displayOnCreate for Grant, but left the rest of citation.tsv untouched.

The Related Publication URL on the Metadata tab is broken, but the same URL works fine in the "summary" section.

James Myers

Apr 14, 2025, 9:21:51 AM
to dataverse...@googlegroups.com

I’d suspect some problem with your database entry for the formatting of that field. If I look in the page source for the example dataset you sent I see:

<a href="" https:="" hdl.handle.net="" 10356="" 174165""="" target="" _blank""="" rel="" noopener""="">https://hdl.handle.net/10356/174165</a>

 

This looks like it has doubled sets of quotation marks, etc. I think that can happen when a file containing quote marks is run through Excel. I’d suggest checking the specific copy of the TSV file you’re using and trying to reload the citation block with a clean file.
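A quick way to spot such Excel artifacts before reloading is to scan the raw TSV for cells that still carry literal quote characters. This is only a sketch (a hypothetical helper, not part of Dataverse), assuming the artifacts always contain a " character:

```python
import csv

def find_quoted_cells(tsv_path):
    """Return (line_no, column, value) for cells containing a literal
    quotation mark -- a common artifact of round-tripping a TSV
    through Excel."""
    hits = []
    with open(tsv_path, newline="", encoding="utf-8") as f:
        # QUOTE_NONE disables quote processing so we see raw cell text.
        reader = csv.reader(f, delimiter="\t", quoting=csv.QUOTE_NONE)
        for line_no, row in enumerate(reader, start=1):
            for col, value in enumerate(row):
                if '"' in value:
                    hits.append((line_no, col, value))
    return hits
```

Running it over the citation block file before a reload should return an empty list for a clean file.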

 

-- Jim

 

 



Shuxian Zhang

Apr 15, 2025, 3:59:55 AM
to Dataverse Users Community
Hi Jim,

Thanks a lot for your advice. 

Indeed, we had been using Excel to edit the TSV file. After switching to a text editor to edit and save the TSV, the extra quotation marks disappeared.

However, after reloading the new TSV, we ran into another issue: duplicate controlled vocabulary (CV) values were created for the language field. Screenshot attached below.

For some languages, the dropdown list shows two "identical" options, one with quotation marks and one without. The option with the quotation marks seems to come from the old TSV file we used earlier.

Any suggestions on how we can remove the duplicated CVs (those with the quotation marks)?

Best regards,
Shuxian
duplicate_language_list.png

James Myers

Apr 15, 2025, 8:38:09 AM
to dataverse...@googlegroups.com

Shuxian,

Unfortunately, these can’t be removed via the API. In the database, you can remove them by finding the incorrect terms in the controlledvocabularyvalue table and then deleting any related controlledvocabalternate rows and the entry itself, e.g.:

 

DELETE from controlledvocabalternate where controlledvocabularyvalue_id=<id of term being deleted>;

 

And

 

DELETE from controlledvocabularyvalue where id=<id of term being deleted>;

 

If the incorrect values have been used already, you’ll first need to change all references to the correct value, e.g.

 

UPDATE datasetfield_controlledvocabularyvalue set controlledvocabularyvalues_id=<correct vocab id> where controlledvocabularyvalues_id=<incorrect vocab id>;

 

As always, you should have a DB backup, check the SQL on a test system, etc.
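The delete order matters: alternates first, then the vocabulary value itself, or the foreign key will block you. Here is a toy sqlite3 rehearsal of those two DELETEs (simplified schema and made-up rows; the real instance is PostgreSQL and the tables have more columns):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # sqlite needs this opt-in
con.execute("CREATE TABLE controlledvocabularyvalue "
            "(id INTEGER PRIMARY KEY, strvalue TEXT)")
con.execute("CREATE TABLE controlledvocabalternate "
            "(id INTEGER PRIMARY KEY, strvalue TEXT, "
            " controlledvocabularyvalue_id INTEGER "
            " REFERENCES controlledvocabularyvalue(id))")
con.execute("INSERT INTO controlledvocabularyvalue VALUES (3623, '\"Bengali, Bangla\"')")
con.execute("INSERT INTO controlledvocabalternate VALUES "
            "(3289, 'bn', 3623), (3290, 'ben', 3623)")

bad_id = 3623
# Children first, then the parent row.
con.execute("DELETE FROM controlledvocabalternate "
            "WHERE controlledvocabularyvalue_id = ?", (bad_id,))
con.execute("DELETE FROM controlledvocabularyvalue WHERE id = ?", (bad_id,))
con.commit()
```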

Zhang Shuxian

Apr 16, 2025, 1:45:58 AM
to dataverse...@googlegroups.com, qqm...@hotmail.com, Toby Teng, Jeyalakshmi Sambasivam, Ong Hong Leong, Yuyun Wirawati, Nguyen Quynh Nga

Hi Jim,

 

Thanks a lot for the advice.

 

I’m not entirely sure about the database backup, so I’ve CCed our tech team here for their input.

 

Assuming that we can’t roll back to a backup and need to delete the incorrect values manually, may I confirm whether the following steps are appropriate? As far as I know, these CV values haven’t been used in any datasets yet (it would be great if you could suggest a way to verify this via the DB, too).

 

Step 1: SELECT id FROM datasetfieldtype WHERE name = 'language'

 

Step 2: SELECT id, value, identifier, displayorder FROM controlledvocabularyvalue WHERE datasetfieldtype_id = <ID from Step 1>

 

Step 3: From the result of Step 2, get a list of IDs corresponding to the incorrect CV values

 

Step 4: DELETE FROM controlledvocabalternate WHERE controlledvocabularyvalue_id IN (<IDs from Step 3>)

 

Step 5: DELETE FROM controlledvocabularyvalue WHERE id IN (<IDs from Step 3>)

 

Thanks.

 

Warm regards,

Shuxian

 

 



James Myers

Apr 16, 2025, 7:01:47 AM
to Zhang Shuxian, dataverse...@googlegroups.com, Toby Teng, Jeyalakshmi Sambasivam, Ong Hong Leong, Yuyun Wirawati, Nguyen Quynh Nga

Yes. In step 2, the value column is ‘strvalue’ (not ‘value’). You could also find the incorrect ones more quickly by adding something like strvalue LIKE '%"%' to the WHERE clause (assuming they all contain a " character).
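That LIKE filter can be tried on a toy table first. A sqlite3 sketch (simplified schema and made-up rows; the LIKE syntax is the same in PostgreSQL):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE controlledvocabularyvalue "
            "(id INTEGER PRIMARY KEY, strvalue TEXT, datasetfieldtype_id INTEGER)")
con.executemany(
    "INSERT INTO controlledvocabularyvalue VALUES (?, ?, ?)",
    [(1, 'French', 34),
     (3623, '"Bengali, Bangla"', 34),
     (3642, '"Spanish, Castilian"', 34)],
)
# Only the quote-wrapped strays match LIKE '%"%'.
bad = con.execute(
    "SELECT id FROM controlledvocabularyvalue "
    "WHERE datasetfieldtype_id = 34 AND strvalue LIKE '%\"%' ORDER BY id"
).fetchall()
print(bad)  # [(3623,), (3642,)]
```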

 

-- Jim

Zhang Shuxian

Apr 27, 2025, 11:03:17 PM
to James Myers, dataverse...@googlegroups.com, Toby Teng, Jeyalakshmi Sambasivam, Ong Hong Leong, Yuyun Wirawati, Nguyen Quynh Nga

Hi Jim,

 

Thanks a lot for your advice.

 

We have completed the first 3 steps mentioned earlier and have identified 23 CV values to delete (shown below).

 

SELECT * FROM controlledvocabularyvalue WHERE id IN (3623,3624,3625,3626,3627,3628,3629,3630,3631,3632,3633,3634,3635,3636,3637,3638,3639,3640,3641,3642,3643,3644,3645);

  id  | displayorder | identifier |                      strvalue                       | datasetfieldtype_id
------+--------------+------------+-----------------------------------------------------+---------------------
 3623 |           18 |            | "Bengali, Bangla"                                   |                  34
 3624 |           25 |            | "Catalan,Valencian"                                 |                  34
 3625 |           28 |            | "Chichewa, Chewa, Nyanja"                           |                  34
 3626 |           37 |            | "Divehi, Dhivehi, Maldivian"                        |                  34
 3627 |           48 |            | "Fula, Fulah, Pulaar, Pular"                        |                  34
 3628 |           55 |            | "Haitian, Haitian Creole"                           |                  34
 3629 |           74 |            | "Kalaallisut, Greenlandic"                          |                  34
 3630 |           80 |            | "Kikuyu, Gikuyu"                                    |                  34
 3631 |           87 |            | "Kwanyama, Kuanyama"                                |                  34
 3632 |           89 |            | "Luxembourgish, Letzeburgesch"                      |                  34
 3633 |           91 |            | "Limburgish, Limburgan, Limburger"                  |                  34
 3634 |          109 |            | "Navajo, Navaho"                                    |                  34
 3635 |          119 |            | "Ojibwe, Ojibwa"                                    |                  34
 3636 |          120 |            | "Old Church Slavonic,Church Slavonic,Old Bulgarian" |                  34
 3637 |          123 |            | "Ossetian, Ossetic"                                 |                  34
 3638 |          124 |            | "Panjabi, Punjabi"                                  |                  34
 3639 |          128 |            | "Pashto, Pushto"                                    |                  34
 3640 |          142 |            | "Scottish Gaelic, Gaelic"                           |                  34
 3641 |          144 |            | "Sinhala, Sinhalese"                                |                  34
 3642 |          149 |            | "Spanish, Castilian"                                |                  34
 3643 |          159 |            | "Tibetan Standard, Tibetan, Central"                |                  34
 3644 |          169 |            | "Uyghur, Uighur"                                    |                  34
 3645 |          183 |            | "Zhuang, Chuang"                                    |                  34
(23 rows)

 

 

We also checked the controlledvocabalternate table and found that most of the controlledvocabularyvalue_id values have more than one associated strvalue (shown below).

 

Although we will back up the DB before proceeding with the deletion, we would like to confirm two things first:

 

  1. Is it normal for one controlled vocabulary value to have multiple associated alternate values?

  2. Is it safe to delete the above 23 rows from the controlledvocabularyvalue table and the corresponding 45 rows from the controlledvocabalternate table?

We want to ensure that removing these rows will not cause issues in the Dataverse UI or backend functionalities.

 

 

SELECT * FROM controlledvocabalternate WHERE controlledvocabularyvalue_id IN (3623,3624,3625,3626,3627,3628,3629,3630,3631,3632,3633,3634,3635,3636,3637,3638,3639,3640,3641,3642,3643,3644,3645);

  id  | strvalue | controlledvocabularyvalue_id | datasetfieldtype_id
------+----------+------------------------------+---------------------
 3289 | bn       |                         3623 |                  34
 3290 | ben      |                         3623 |                  34
 3304 | cat ca   |                         3624 |                  34
 3309 | nya      |                         3625 |                  34
 3310 | ny       |                         3625 |                  34
 3330 | div      |                         3626 |                  34
 3331 | dv       |                         3626 |                  34
 3354 | ff       |                         3627 |                  34
 3355 | ful      |                         3627 |                  34
 3371 | hat ht   |                         3628 |                  34
 3409 | kl       |                         3629 |                  34
 3410 | kal      |                         3629 |                  34
 3421 | ki       |                         3630 |                  34
 3422 | kik      |                         3630 |                  34
 3433 | kua      |                         3631 |                  34
 3434 | kj       |                         3631 |                  34
 3437 | ltz      |                         3632 |                  34
 3438 | lb       |                         3632 |                  34
 3441 | li       |                         3633 |                  34
 3442 | lim      |                         3633 |                  34
 3479 | nav      |                         3634 |                  34
 3480 | nv       |                         3634 |                  34
 3497 | oj       |                         3635 |                  34
 3498 | oji      |                         3635 |                  34
 3499 | cu       |                         3636 |                  34
 3500 | chu      |                         3636 |                  34
 3505 | os       |                         3637 |                  34
 3506 | oss      |                         3637 |                  34
 3507 | pan      |                         3638 |                  34
 3508 | pa       |                         3638 |                  34
 3516 | ps       |                         3639 |                  34
 3517 | pus      |                         3639 |                  34
 3546 | gd       |                         3640 |                  34
 3547 | gla      |                         3640 |                  34
 3550 | sin      |                         3641 |                  34
 3551 | si       |                         3641 |                  34
 3561 | spa      |                         3642 |                  34
 3562 | es       |                         3642 |                  34
 3581 | bo       |                         3643 |                  34
 3582 | bod      |                         3643 |                  34
 3583 | tib      |                         3643 |                  34
 3602 | uig      |                         3644 |                  34
 3603 | ug       |                         3644 |                  34
 3631 | za       |                         3645 |                  34
 3632 | zha      |                         3645 |                  34
(45 rows)

 

 

Thanks again!!

 

Warm regards,

Shuxian

 


James Myers

Apr 28, 2025, 8:33:02 AM
to Zhang Shuxian, dataverse...@googlegroups.com, Toby Teng, Jeyalakshmi Sambasivam, Ong Hong Leong, Yuyun Wirawati, Nguyen Quynh Nga

Shuxian,

 

Deleting those should be fine. If someone has used those values in a dataset, you won’t be able to delete them unless/until you edit/delete such entries. (As you’d want – the database won’t allow references to values that are to be deleted.)

 

It is common to have multiple alternates. You can see them in the citation.tsv definition, e.g. in

                language              Bengali, Bangla  ben        766        ben        Bengali  Bangla   bn

Everything after the 766 (the display order) is an alternate. In your case you only have ben and bn; my example is from 6.6, and earlier versions did not have Bengali and Bangla as alternates, just bn and ben.

 

(I see you also have alternates like “cat ca” – looks to me like older versions of the citation block had a bug with a space between those two options instead of a tab character, making them one entry instead of two.)
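That space-for-tab bug is easy to reproduce with a plain tab split (just an illustration, not Dataverse's actual TSV parser):

```python
# Correct row fragment: two alternates separated by a tab.
good = "Catalan,Valencian\tcat\tca".split("\t")
# Buggy row fragment: a space where the tab should be.
bad = "Catalan,Valencian\tcat ca".split("\t")

print(good[1:])  # ['cat', 'ca']  -> two alternate entries
print(bad[1:])   # ['cat ca']     -> one merged "cat ca" entry
```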

Zhang Shuxian

Jun 2, 2025, 5:53:11 AM
to dataverse...@googlegroups.com, James Myers, Toby Teng, Jeyalakshmi Sambasivam, Ong Hong Leong, Yuyun Wirawati, Nguyen Quynh Nga

Hi Jim,

 

Thanks a lot for the information.

 

We tried to delete the problematic CV values and encountered an error (screenshot below).

 

 

According to the error message, the id field in the controlledvocabularyvalue table is also a foreign key in the datasetfield_controlledvocabularyvalue table.

 

Here is some relevant documentation for v5.14: https://guides.dataverse.org/en/5.14/schemaspy/tables/controlledvocabularyvalue.html. (We are on version 5.11.1, but I can’t find the documentation for v5.11.1.)

 

According to that documentation, the id field in the controlledvocabularyvalue table is also a foreign key in the dataversesubjects table.

 

  • Do we need to delete all the corresponding rows in these associated tables before deleting from the controlledvocabularyvalue table?

  • If yes, how would that affect our Dataverse UI and backend functionality?

 

 

 

 

Thanks again!

 

Warm regards,

Shuxian

James Myers

Jun 20, 2025, 12:45:38 PM
to Zhang Shuxian, dataverse...@googlegroups.com, Toby Teng, Jeyalakshmi Sambasivam, Ong Hong Leong, Yuyun Wirawati, Nguyen Quynh Nga

Shuxian,

Sorry for the delay – too much travel lately.

 

The errors you are seeing are from people having used those values. That’s what I meant by “you won’t be able to delete them unless/until you edit/delete such entries.” You have two options: either edit the database so that the datasetfield_controlledvocabularyvalue and dataversesubjects entries for the values being removed refer to the new/good controlledvocabularyvalue entries instead, or delete the entries in those tables. The latter really deletes the entries from the datasets/dataverses, so it is probably better to just repoint them to the new values. Once that’s done, you can go ahead and delete the bad CVV values and any associated alternate values.
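The repoint-then-delete sequence can be rehearsed on a toy sqlite3 database before touching PostgreSQL (simplified schema, made-up ids; a dataversesubjects reference would be handled the same way):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")
con.execute("CREATE TABLE controlledvocabularyvalue "
            "(id INTEGER PRIMARY KEY, strvalue TEXT)")
con.execute("CREATE TABLE datasetfield_controlledvocabularyvalue "
            "(datasetfield_id INTEGER, "
            " controlledvocabularyvalues_id INTEGER "
            " REFERENCES controlledvocabularyvalue(id))")
con.execute("INSERT INTO controlledvocabularyvalue VALUES "
            "(3623, '\"Bengali, Bangla\"'), (100, 'Bengali, Bangla')")
con.execute("INSERT INTO datasetfield_controlledvocabularyvalue VALUES (7, 3623)")

# Deleting the still-referenced bad value is blocked by the foreign key.
try:
    con.execute("DELETE FROM controlledvocabularyvalue WHERE id = 3623")
except sqlite3.IntegrityError:
    pass  # expected: a dataset field still points at it

# Repoint the reference to the good value; then the delete succeeds.
con.execute("UPDATE datasetfield_controlledvocabularyvalue "
            "SET controlledvocabularyvalues_id = 100 "
            "WHERE controlledvocabularyvalues_id = 3623")
con.execute("DELETE FROM controlledvocabularyvalue WHERE id = 3623")
con.commit()
```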

Zhang Shuxian

Jun 26, 2025, 10:04:19 PM
to James Myers, dataverse...@googlegroups.com, Toby Teng, Jeyalakshmi Sambasivam, Ong Hong Leong, Yuyun Wirawati, Nguyen Quynh Nga

Hi Jim,

 

Thanks a lot for your advice.

 

We have successfully removed the problematic CVVs from the UI, and the application is working fine so far.

 

Thank you! 😊
