# Per-subject weighted graph at threshold j (from the loop below)
g.tmp <- graph_from_adjacency_matrix(opt.norm.sub[[j]][, , inds[[i]][k]], mode = 'undirected', diag = F, weighted = T)
g.tmp <- set_brainGraph_attr(g.tmp, atlas = atlas, modality = modality, weighting = 'pearson', threshold = thresholds[j], subject = covars[groups[i], ID[k]], group = groups[i], use.parallel = FALSE, A = opt.norm.sub[[j]][, , inds[[i]][k]])
# 13 thresholds, from 0.80 down to 0.20 in steps of 0.05
thresholds <- rev(seq(0.2, 0.8, 0.05))
sub.thresh <- 0   # no consensus ("subject") thresholding
…
> ## STEP 1 of 7.3 Graph Creation ## OPT
>
> setwd(savedirA)
>
> g.group <- g <- fnames <- vector('list', length=length(groups))
>
> for (i in seq_along(groups)) {
+   for (j in seq_along(thresholds)) {
+     print(paste0('Threshold ', j, '/', length(thresholds), '; group ', i, '; ', format(Sys.time(), '%H:%M:%S')))
+     for (k in seq_along(inds[[i]])) {
+       g.tmp <- graph_from_adjacency_matrix(opt.norm.sub[[j]][, , inds[[i]][k]], mode = 'undirected', diag = F, weighted = T)
+       g.tmp <- set_brainGraph_attr(g.tmp, atlas = atlas, modality = modality, weighting = 'pearson', threshold = thresholds[j], subject = covars[groups[i], ID[k]], group = groups[i], use.parallel=FALSE, A = opt.norm.sub[[j]][, , inds[[i]][k]])
+       saveRDS(g.tmp, file = paste0(savedirA, sprintf('g%i_thr%02i_subj%03i%s', i, j, k, '.rds')))
+     }
+   }
+
+   # Group mean weighted graphs
+   print(paste0('Group', i, '; ', format(Sys.time(), '%H:%M:%S')))
+   g.group[[i]] <- lapply(seq_along(thresholds), function(x) graph_from_adjacency_matrix(opt.norm.mean[[x]][[i]], mode = 'undirected', diag = F, weighted = T))
+   g.group[[i]] <- llply(seq_along(thresholds), function(x) set_brainGraph_attr(g.group[[i]][[x]], atlas = atlas, modality = modality, weighting = 'pearson', threshold = thresholds[x], group = groups[i], A = opt.norm.mean[[x]][[i]], use.parallel = FALSE), .parallel = TRUE)
+ }
[1] "Threshold 1/13; group 1; 16:28:25"
[1] "Threshold 2/13; group 1; 16:31:35"
[1] "Threshold 3/13; group 1; 16:35:17"
[1] "Threshold 4/13; group 1; 16:39:08"
[1] "Threshold 5/13; group 1; 16:42:58"
[1] "Threshold 6/13; group 1; 16:46:49"
Error in if (nrow(adjmatrix) != ncol(adjmatrix)) { :
argument is of length zero
The code only works when I set sub.thresh to at least 0.5, which seems unusually high for fMRI data.
My guess is that the ROIs are so highly correlated that the graphs become completely connected at the lower thresholds whenever I use a sub.thresh < 0.5.
Thanks for your help
Hi Philipp, it looks like the graphs are in fact "too dense" so that an error is
thrown when calculating local efficiency. This occurs when calculating the
neighborhood of each vertex; in a fully connected graph, the neighborhood of
each vertex is the whole graph, and the local efficiency is identical for all
vertices. In this case, the value for local efficiency is equal to the global
efficiency.
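To make this concrete, here is a minimal igraph-only sketch (not brainGraph's actual implementation; the Latora and Marchiori definition of efficiency is assumed): in a complete graph every vertex's neighborhood induces another complete graph, so each vertex's local efficiency equals the global efficiency (both are 1).

library(igraph)

# Global efficiency: mean inverse shortest-path length over all vertex pairs
global_eff <- function(g) {
  d <- distances(g)
  mean(1 / d[upper.tri(d)])
}

# Local efficiency of vertex v: global efficiency of the subgraph induced on its neighbors
local_eff <- function(g, v) {
  nb <- neighbors(g, v)
  if (length(nb) < 2) return(0)
  global_eff(induced_subgraph(g, nb))
}

g.full <- make_full_graph(10)
global_eff(g.full)                                              # 1
sapply(seq_len(vcount(g.full)), function(v) local_eff(g.full, v))  # all 1s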
It first happens at threshold 9 because one of the subject graphs is fully
connected, and several others have very high densities. I think you will have to
use higher thresholds. Even for the first threshold, the average density is ~0.74.
Another option is to increase "sub.thresh".
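If it helps, a quick way to see how dense the subject graphs are at each threshold, before running the full loop, is a sketch like the one below (re-using the objects from your session, i.e. opt.norm.sub and thresholds; values near 1 mean essentially fully connected graphs):

library(igraph)

# Mean edge density across subjects at each threshold
dens <- sapply(seq_along(thresholds), function(j) {
  mean(sapply(seq_len(dim(opt.norm.sub[[j]])[3]), function(k) {
    g <- graph_from_adjacency_matrix(opt.norm.sub[[j]][, , k],
                                     mode = 'undirected', diag = FALSE, weighted = TRUE)
    edge_density(g)
  }))
})
data.frame(threshold = thresholds, mean.density = round(dens, 3))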
Chris
On Thu, Jul 11, 2019 at 11:17 PM, Chris Watson <chris....@gmail.com> wrote:
> subject: [brainGraph-help] Re: Use of Covariance Matrices from R
>
> So it looks like an issue when calculating local efficiency. I have seen a
> similar error from other users recently. If you send me the data I will look
> into it. I will try to push a fix to GitHub this weekend if it is
> straightforward enough.
>
I am not sure about doing global signal regression. I don't follow the rs-fMRI
preprocessing literature too closely, but I have seen quite a few papers in the
past few years that look specifically at these kinds of issues. See, for example:
* Chen et al., 2018, Human Brain Mapping 39(11)
* Andellini et al., 2015, J Neurosci Methods
* Bright et al., 2017, NeuroImage 154
* Murphy et al., 2017, NeuroImage 154
and the references therein. I am not sure whether consensus thresholding is common
for fMRI, but I think it is a good idea to use it; maybe not 0.5, but something
greater than 0.
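In case it is useful, here is a rough hand-rolled sketch of the idea (only an illustration of consensus thresholding, not brainGraph's exact code; A1 is the 3D array from earlier in the thread, and the weight threshold 0.3 and sub.thresh of 0.5 are just example values): an edge is kept only if it survives the weight threshold in at least sub.thresh of the subjects, and is zeroed for everyone otherwise.

# Proportion of subjects in which each edge survives the weight threshold
consensus_mask <- function(A, thr, sub.thresh) {
  present <- apply(A > thr, c(1, 2), mean)
  present >= sub.thresh
}

# Example: keep only edges present (weight > 0.3) in at least half of the subjects
mask <- consensus_mask(A1, thr = 0.3, sub.thresh = 0.5)
A.cons <- sweep(A1, c(1, 2), mask, `*`)   # zero out non-consensus edges for every subject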
Chris
# Read the list of connectivity matrices into an Nv x Nv x Nsubj array
A1 <- brainGraph:::read.array(CM.list)
Nv <- nrow(A1)
A1[is.nan(A1)] <- 0   # replace NaN correlations with 0

# For each threshold, zero out connections at or below it, group by group
A.norm.sub <- lapply(seq_along(thresholds), function(z) {
  #cat("Current threshold: ", thresholds[z], "\n")
  lapply(seq_along(inds), function(x) {
    # Threshold each subject's matrix in group x, then stack into a 3D array
    array(sapply(inds[[x]], function(y) {
      #cat(" --- current subject: ", y, "\n")
      ifelse(A1[, , y] > thresholds[z], A1[, , y], 0)
    }),
    dim = dim(A1[, , inds[[x]]]))
  })
})

# Recombine the per-group arrays into one Nv x Nv x Nsubj array per threshold
A.norm.sub <- lapply(A.norm.sub, function(x) do.call(abind::abind, x))
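A quick sanity check on the result might look like this (assuming the objects created above; each element of A.norm.sub should be an Nv x Nv x N-subjects array, with one slice per subject across all groups):

stopifnot(length(A.norm.sub) == length(thresholds))
sapply(A.norm.sub, dim)     # each column should be Nv, Nv, sum(lengths(inds))
range(A.norm.sub[[1]])      # zeros plus weights above the first threshold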