Downvoted. Considering that converting from epoch is one of the most common Splunk questions of all time, that this page has 46k views, and that each and every answer is entirely incorrect (and the actual question itself is misleading), this page is desperately in need of removal.
1) The question doesn't actually provide a standard epoch time; it provides a millisecond epoch time.
2) The answer with 16 votes (?????) fails to divide by 1000 OR to provide the correct format.
3) The answer with 3 votes (?????) fails to provide the correct format.
@somesoni2's comment of "%a,%d %b %Y %H:%M:%S" is correct, although technically you need to divide by 1000 if you are using the millisecond epoch time that the post provides. 99% of people who find this page are merely looking to convert epoch time to the default Splunk human-readable format, in which case what they are looking for is barely on this page. They are most likely looking for "%Y-%m-%d %H:%M:%S", which is mentioned nowhere, or possibly "%F %T", as mentioned in the comments.
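For anyone who just wants the arithmetic, here is a minimal Python sketch of the conversion this thread is about (divide the millisecond epoch by 1000, then format; the example value is arbitrary, and Splunk's own `strftime()` eval function expects seconds the same way):

```python
from datetime import datetime, timezone

epoch_ms = 1234567890123  # a millisecond epoch value, as in the original question
seconds = epoch_ms / 1000  # strftime-style formatting expects seconds, hence the divide
dt = datetime.fromtimestamp(seconds, tz=timezone.utc)

# The default Splunk-style human-readable format most readers want:
print(dt.strftime("%Y-%m-%d %H:%M:%S"))      # 2009-02-13 23:31:30
# The format from @somesoni2's comment:
print(dt.strftime("%a,%d %b %Y %H:%M:%S"))   # Fri,13 Feb 2009 23:31:30
```

In Splunk itself the equivalent is an eval along the lines of `strftime(field/1000, "%Y-%m-%d %H:%M:%S")`.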
I've been told that the initial question has not been retroactively edited in any way, which raises the question of what happened here. I understand comments from a comment chain were likely converted to answers without the correct context, but still. Part of the problem is that, in the comment chain, the parameters surrounding the initial question were changed by the asker. Smh. This is a giant mess.
As the title says, I have Unix timestamps on a sheet and I need to convert them to normal human-readable dates. So far I'm striking out; Google searches have turned up a few suggestions, but none have worked for me at all. Does anyone have a formula that works for converting these?
I prefer a formula that is more transparent, with fewer magic numbers. It makes it easier to see what's going on and also reduces the likelihood of a bug (like the one in the current top answer to this question, which is off by one day):
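The formula itself isn't reproduced here, but the transparent arithmetic it relies on is just "seconds since the epoch, added to the epoch date". A Python sketch of the same idea, assuming the timestamp is in seconds:

```python
from datetime import datetime, timedelta

unix_seconds = 1700000000       # example timestamp, in seconds
SECONDS_PER_DAY = 24 * 60 * 60  # 86400 -- the only constant needed, spelled out

# Same idea as a transparent spreadsheet formula such as
# =A1/86400 + DATE(1970,1,1) (cell formatted as a date):
# add the elapsed time to the epoch date, no magic serial-number offsets.
d = datetime(1970, 1, 1) + timedelta(seconds=unix_seconds)
print(d)  # 2023-11-14 22:13:20
```

The spreadsheet formula shown in the comment is an assumption about the shape of such formulas, not a quote from the hidden answer.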
UNIX Epoch time, or POSIX time, is a system for describing moments in time, defined as the number of seconds that have elapsed since 00:00:00 Coordinated Universal Time (UTC) on Thursday, 1 January 1970, not counting leap seconds.
Some logs and reports on DXi systems may use UNIX epoch time (such as the bash history under the DXicollect log, or reports generated by the redb_util script). This article gives you some suggestions on how to convert UNIX epoch time data to a human-readable format.
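As a quick sanity check of the definition above, a short Python snippet (any language with an epoch-aware date library behaves the same way):

```python
from datetime import datetime, timezone

# Epoch second 0 is exactly the definition's reference point:
print(datetime.fromtimestamp(0, tz=timezone.utc))           # 1970-01-01 00:00:00+00:00

# Any epoch value pulled from a log converts the same way:
print(datetime.fromtimestamp(1609459200, tz=timezone.utc))  # 2021-01-01 00:00:00+00:00
```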
I have a field that is represented in milliseconds since epoch. I want to use this field as the x axis of a histogram bar chart but I want the dates to be human readable. See the screenshot of what it looks like now. What are some ways to do this?
@murlin99 Thanks. That makes sense. Unfortunately, I didn't create the original mapping. It's created by a third party plugin to Jenkins build server. I'm just getting familiar with Kibana. Is this something I should be able to do from the Management tab in Kibana? The type doesn't seem to be editable.
I would recommend looking into Truffle. It's an excellent framework for building web3 applications, which are essentially just websites that can talk to contracts on the Ethereum blockchain. Then you can just make a web3 call to get block.timestamp in seconds, and use normal JavaScript to convert it to a human readable format and display it in a browser like Google Chrome, using extensions like Metamask.
You can directly convert the created_at date to a human-readable format like "1 minute ago" or "1 month ago" using the diffForHumans() function. Laravel Eloquent casts the created_at field to a Carbon instance by default.
We would like to push a meeting time to HubSpot via the Demodesk integration. Unfortunately, we can only get it as a UNIXTIMESTAMP in a single-line text. We want to convert this into a human readable property using custom code. According to the test run, it should work. However, when we send a contact through the workflow, the UNIXTIMESTAMP still remains. Can anyone help us here? Thanks in advance.
I created a test environment. The contact has "Meeting Time" and "Meeting Time DE" properties. I changed the "Create record" action to "Copy property value" to copy "cx___demodesk__meetinglink" to "Meeting Time DE", and the result is correct.
I imagine one of the challenges in displaying durations like this in a readable way is months, since we need additional metadata to go from a date difference to a number of months. There are two ways to go about this:
Hi @Lea_Verou, you are right that Coda is not doing a great job here. The duration function does not feel like it was given much thought. The AI is a bit of a help, but not always, since it makes mistakes. We need a programmed solution as long as AI fails us.
Converts an epoch/Unix timestamp into a human-readable date. It also lets you do the inverse, i.e. converts a human-readable date into an epoch/Unix timestamp. It also displays the current epoch/Unix timestamp in both seconds and milliseconds.
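Both directions of that conversion are a few lines in most languages; a Python sketch (the example values are arbitrary):

```python
from datetime import datetime, timezone

# Epoch -> human-readable date
ts = 1672531200
print(datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S"))
# 2023-01-01 00:00:00

# Human-readable date -> epoch (the inverse direction the tool offers),
# in both seconds and milliseconds:
dt = datetime(2023, 1, 1, tzinfo=timezone.utc)
print(int(dt.timestamp()), int(dt.timestamp() * 1000))  # 1672531200 1672531200000
```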
Barring any better options, I'll probably write a regex block to convert format strings from a human-readable format to the POSIX format so I can use the standard functions. Before I embark on that however, does anyone have other suggestions? Thanks.
CPAN has to have a date parser that works better for you than anything you can manage with regex. Date parsing is seriously thorny territory. Take a look at Date::Manip::Date or DateTime for starters. Maybe you already have and aren't happy with the lack of format control for the parsing; if so, I'm not sure what to recommend.
I didn't spend time on Date::Manip because I keep reading posts that say to stop using it... But looking at the documentation now, it seems like it might be exactly what I'm looking for, so I'll give it a try. Thanks!
Edit: On closer examination, while Date::Manip::Date does support a wide variety of date formats for input, it uses the POSIX standard for date format strings in output, so doesn't really help after all. :-(
I use DateTime to parse dates (and times) all the time. However, usually I'm parsing some specific pre-defined format, such as the format used by a particular RDBMS, the format used in email headers, or the format found in Apache log files; there are corresponding DateTime::Format::Foo modules for a wide variety of these.
General-purpose "whatever format the user types" date parsing is a fundamentally intractable problem, because the user will type junk like "6/8", and without further context there is absolutely no way to know what they mean. The five most likely answers are probably (not necessarily in this order) June of 2008, June 8th of the current year, August 6th of the current year, June 8th in the adjacent year (coming year if it's past June 8th already, previous year otherwise), and August 6th in the adjacent year. When getting date input from users, the only really reasonable approach I've discovered so far is to ask for the year, the month, and the day, each in its own appropriately labeled field.
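The "6/8" ambiguity is easy to demonstrate. A minimal Python sketch (the Perl modules discussed in this thread face the identical problem):

```python
from datetime import datetime

# The same string "6/8" parses to different dates depending on the
# format the program assumes -- the input alone cannot tell us which.
as_us = datetime.strptime("6/8", "%m/%d")  # month/day: June 8th
as_eu = datetime.strptime("6/8", "%d/%m")  # day/month: August 6th
print(as_us.month, as_us.day)  # 6 8
print(as_eu.month, as_eu.day)  # 8 6
```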
Answering my own question here, this turned out not to be such a big deal. For anyone interested in the future, here is the translate_date_format() function I came up with, and a Test script to demonstrate it:
Like POSIX::strftime(), anything not recognized as a formatting element is left as-is, yes (although 'yyy' in specific would translate to '%yy'). Unlike strftime(), I didn't add an escape sequence in case you wanted a literal 'yy' in the string. Exercise for the user I guess. :-)
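The original Perl function isn't reproduced here. As an illustration only, here is a hypothetical Python sketch of the same idea: translate a human-readable pattern such as 'yyyy-mm-dd' into a strftime/POSIX pattern by longest-match-first substitution, leaving unrecognized text as-is. The token table is an assumption, not the author's:

```python
import re

# Hypothetical token table: human-readable token -> strftime equivalent.
TOKENS = {
    "yyyy": "%Y", "yy": "%y",
    "mm": "%m", "dd": "%d",
    "HH": "%H", "MM": "%M", "SS": "%S",
}

def translate_date_format(fmt: str) -> str:
    """Replace known tokens, longest first; anything else passes through."""
    pattern = "|".join(sorted(map(re.escape, TOKENS), key=len, reverse=True))
    return re.sub(pattern, lambda m: TOKENS[m.group(0)], fmt)

print(translate_date_format("yyyy-mm-dd"))        # %Y-%m-%d
print(translate_date_format("dd/mm/yyyy HH:MM"))  # %d/%m/%Y %H:%M
print(translate_date_format("yyy"))               # %yy (as noted above)
```

Longest-first matching reproduces the 'yyy' → '%yy' behaviour described in the post: 'yy' is consumed first, and the leftover 'y' passes through untouched.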
"I'd like to work with human-readable date formats - i.e., allow the user to select the date format for input and output using a human-readable string such as 'yyyy-mm-dd'." What user, where, in what context? I think that is too ambiguous; it's better to provide the human a list of examples, or an interactive program, or a way to contact a programmer. OTOH, see DateTime::Format::Natural. :) As for using a non-human-friendly %X format: they're made for programmers, who are supposed to make life easier for "humans". :)
I have imported a CSV into LibreOffice Calc and some columns are in human-readable form (notably dates and currencies). Those are imported as text, but this way I cannot do anything useful with them, like graphing.
There are a number of ways, including choosing proper import settings; using regular expressions and cell formats (along with correct locales and LibreOffice date-detection settings) to change already-imported data; using formulas to extract relevant information into different columns; etc. It would help if you provided a link to a sample file.
--request-payer (string) Confirms that the requester knows that they will be charged for the request. Bucket owners need not specify this parameter in their requests. Documentation on downloading objects from requester pays buckets can be found at
By default, the AWS CLI uses SSL when communicating with AWS services. For each SSL connection, the AWS CLI will verify SSL certificates. This option overrides the default behavior of verifying SSL certificates.
The following ls command lists all of the buckets owned by the user. In this example, the user owns the buckets mybucket and mybucket2. The timestamp is the date the bucket was created, shown in your machine's time zone. This date can change when making changes to your bucket, such as editing its bucket policy. Note that if s3:// is used for the path argument, it will list all of the buckets as well.