I read the import/export document, which mentions that the write.table
command converts NaN to NA. Is there any other way I can store the
NaNs? I tried the write() syntax, but it gives me errors.
Each data file has dimensions 1000 x 21.
I would appreciate any help in this regard.
Many thanks
______________________________________________
R-h...@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
> Hello,
> I am working with multiple simulated data sets with missing values. I would
> like to store these data sets in either tab-delimited or .csv
> format, with missing values marked as NaN instead of NA.
>
> I read the import/export document, which mentions that the write.table
> command converts NaN to NA. Is there any other way I can store the
> NaNs? I tried the write() syntax, but it gives me errors.
> Each data file has dimensions 1000 x 21.
>
> I would appreciate any help in this regard.
A feasible workaround is to convert your data to character before writing them.
Suppose that your data are in a data frame called "clyde". Set

mung <- as.data.frame(lapply(clyde, as.character))  # NaN becomes the string "NaN"
write.csv(mung, "mung.csv", row.names = FALSE, quote = FALSE)
# Check:
gorp <- read.csv("mung.csv")
all.equal(gorp,clyde)
[1] TRUE
HTH
cheers,
Rolf Turner
shankar-17 wrote:
>
> I read the import/export document, which mentions that the write.table
> command converts NaN to NA. Is there any other way I can store the
> NaNs? I tried the write() syntax, but it gives me errors.
> foo <- matrix(0,nrow=3,ncol=3)
> foo
[,1] [,2] [,3]
[1,] 0 0 0
[2,] 0 0 0
[3,] 0 0 0
> foo[3,3] <- NA
> foo
[,1] [,2] [,3]
[1,] 0 0 0
[2,] 0 0 0
[3,] 0 0 NA
> write.csv(foo, file = "tst.csv", na = "NaN", row.names = FALSE)
> readLines( 'tst.csv' )
[1] "\"V1\",\"V2\",\"V3\"" "0,0,0" "0,0,0"
[4] "0,0,NaN"
Seems to work fine for me. If you post a reproducible example, we could
probably figure out why it is not working for you.
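The OP also mentioned tab-delimited output; the same na= argument is accepted by write.table() directly (a quick sketch; the file name is arbitrary):

```r
# write.table() takes the same na= argument for tab-delimited output
foo <- matrix(0, nrow = 3, ncol = 3)
foo[3, 3] <- NaN
write.table(foo, file = "tst.tsv", sep = "\t", na = "NaN",
            row.names = FALSE, col.names = FALSE)
readLines("tst.tsv")
# the last line reads "0\t0\tNaN"
```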
-Charlie
-----
Charlie Sharpsteen
Undergraduate-- Environmental Resources Engineering
Humboldt State University
--
View this message in context: http://n4.nabble.com/Saving-tab-csv-delimited-data-with-NaN-s-tp1679673p1679844.html
Sent from the R help mailing list archive at Nabble.com.
This doesn't work if you have both NAs and NaNs in your data frame and you
want to distinguish between them: when you read the data back in,
all NAs will have been converted to NaNs.
Admittedly the OP said he wanted to represent all NAs as NaNs, so your
solution would seem to work for him.
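For the mixed case, the character-conversion workaround does keep the two markers apart, because as.character() renders NaN as the string "NaN" while leaving NA as NA, which write.csv then emits via its default na = "NA". A sketch (column and file names are arbitrary):

```r
# as.character() maps NaN to "NaN" but leaves NA as NA,
# so the default na = "NA" keeps the two markers distinct on disk
df   <- data.frame(x = c(1, NA, NaN))
mung <- as.data.frame(lapply(df, as.character))
write.csv(mung, "both.csv", row.names = FALSE, quote = FALSE)
readLines("both.csv")
# "x" "1" "NA" "NaN"

# Round trip: read.csv parses "NA" back to NA and "NaN" back to NaN
back <- read.csv("both.csv")
is.nan(back$x)   # FALSE FALSE TRUE
```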
cheers,
Rolf Turner
Aye, it still works if I replace the NA in my matrix with NaN. If there is
a mixture of NAs and NaNs, there will be some loss of distinction, as you
say.
However, I cannot tell whether this is the case from the original post, hence
the need for an example!
-Charlie