Incorrect Data


Paul

Aug 3, 2024, 11:12:14 AM

Hi everyone, I'm moving hundreds of gigabytes to Dropbox. From what I can see in the Dropbox menu bar item on my Mac, the calculation of how much data remains to move is progressing slowly: I've been seeing roughly the same completion estimate since this afternoon. It's as if Dropbox keeps discovering new files to upload and keeps revising the estimated upload time as it goes. The result is that I can't get a clear idea of how long the transfer will actually take. Why is this happening, and how can I fix it? Thank you.

Keep in mind that the syncing time shown when you check the app's syncing status is an "instant" estimate based on the current upload speed. The value can change if you check it again just a few seconds later.

You also mentioned it's rather slow: have you tried changing the app's bandwidth settings using these steps? If you adjust the bandwidth settings and Dropbox still doesn't seem to make progress over the next few hours, please let me know.


Maybe I didn't explain myself properly, because your response seems a bit off-topic. The problem is that the time estimate Dropbox gives is not accurate. It might say syncing will finish in 10 minutes, but that's not true: plenty of files are still pending and the actual time is much, much longer. The sync hasn't even finished since yesterday, just to give you an idea. My connection speed has nothing to do with it. For example, I've now capped the speed at 10 MB/s, and it tells me it will finish in 4 minutes, but that's not the case; it will definitely take longer.

Hi @cloudres, that numerical estimate appears to be correct: assuming a fixed rate (with nothing else affecting the connection), it is possible to sync 2.4 GB in 4 minutes at 10 MB/s.

If it turns out to take longer than that, then your internet connection is being affected by usage elsewhere on your machine or on the local network. The app can only estimate based on how much bandwidth is available to it at the time; it can't make a 100% accurate guess, since numerous other apps or machines may be using up bandwidth elsewhere.

I decided to run a new test because the proposed explanation didn't convince me. I moved 200 GB onto my hard disk and am currently uploading it to Dropbox at a fixed (capped) rate of 2 MB/s. The Dropbox app tells me it will take 3 hours, but a quick calculation shows that 200 GB at a fixed 2 MB/s should take roughly 28 hours, far more than the app's estimate. It's clear to me that the predictive calculation is fundamentally flawed. Maybe, and I say maybe, the Dropbox app doesn't scan all new files to be synced immediately, but over time; in that case the prediction is necessarily based on a false, provisional premise. But then, what is the point of showing an upload time that won't be respected? Bear in mind that I've been uploading several gigabytes to Dropbox for days, gradually emptying my physical hard disks. I'm not making these observations up; they're the result of my direct experience.
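The arithmetic behind a fixed-rate estimate is just size divided by throughput. A minimal sketch in plain Python, using the figures from this thread and assuming decimal units (1 GB = 1000 MB):

```python
def transfer_seconds(size_gb: float, rate_mb_per_s: float) -> float:
    """Seconds needed to move size_gb gigabytes at a fixed rate_mb_per_s."""
    return (size_gb * 1000) / rate_mb_per_s

# 2.4 GB at 10 MB/s -> 240 s, i.e. exactly 4 minutes
print(transfer_seconds(2.4, 10) / 60)

# 200 GB at 2 MB/s -> 100,000 s, i.e. roughly 27.8 hours
print(transfer_seconds(200, 2) / 3600)
```

Any figure the app shows that diverges from this can only come from a different view of the remaining data (files not yet scanned) or a different assumed rate.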

I am also having this issue! I began uploading a 30 GB file at 10 PM and it didn't finish until around 10 AM. Yet the estimated time at the start of the upload was 55 minutes, and when I checked at 9 AM it said two minutes. I routinely have this same issue.

Wait a minute. This output is bothering me for a few reasons: a) those values were never supplied in the test cases; b) it took the API 111.36 seconds to respond to the input prompt; c) the text formatting was not requested, yet it was provided in the response. How can I turn all this noise off and get just the output format I want?

Input: provided are a few (n) HL7 messages in the form of a map with Integer keys and String values.
Analyze the messages and perform comparisons between them.
Starting from the A28 message (the base version), identify and record the changes made to the patient demographics only.
Perform a total of n-1 comparisons.
Extract the value of MSH.7, which represents the date and time of the message.
Extract the value of EVN.5, which represents the OperatorId who generated the message.
Based on the value of EVN.5, assign the appropriate username as follows:
If EVN.5 equals 1023, set username = LISA.
If EVN.5 equals 1024, set username = GREG.
If EVN.5 equals 1025, set username = SAMANTHA.
Give precise values: for every item you observed changed, record a delta, listed in ascending order of field name. Generate only a report, in the following format only:
Do not add any input information back into the response, such as the messages or syntax.
Do not compare the MSH and EVN segments.
Do not return duplicate entries.
Only compare PID segments.
Syntax: <field> updated from <old value> to <new value> on <MSH.7 date/time> by <username>
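For what it's worth, the comparison the prompt describes is mechanical enough that it doesn't need an LLM at all. A minimal sketch, with heavily simplified, hypothetical messages (real HL7 has component separators, repetitions, and escape rules this ignores):

```python
OPERATORS = {"1023": "LISA", "1024": "GREG", "1025": "SAMANTHA"}

def parse(message: str) -> dict:
    """Split an HL7 message into {segment name: field list} (first occurrence wins)."""
    segments = {}
    for line in message.strip().splitlines():
        fields = line.split("|")
        segments.setdefault(fields[0], fields)
    return segments

def pid_deltas(base_msg: str, other_msg: str) -> list:
    """Compare PID fields only; stamp each change with MSH.7 and the EVN.5 operator."""
    base_pid = parse(base_msg)["PID"]
    other = parse(other_msg)
    other_pid = other["PID"]
    when = other["MSH"][6]  # MSH.7: the '|' itself counts as MSH.1, so index 6
    who = OPERATORS.get(other["EVN"][5], "UNKNOWN")  # EVN.5 -> username
    deltas = []
    for i in range(1, max(len(base_pid), len(other_pid))):
        old = base_pid[i] if i < len(base_pid) else ""
        new = other_pid[i] if i < len(other_pid) else ""
        if old != new:
            deltas.append(f"PID.{i} updated from {old} to {new} on {when} by {who}")
    return deltas

# Hypothetical sample messages:
A28 = "MSH|^~\\&|HIS|LAB|X|Y|20240801120000\nEVN|A28||||1023\nPID|1||12345|DOE^JOHN"
A31 = "MSH|^~\\&|HIS|LAB|X|Y|20240802093000\nEVN|A31||||1024\nPID|1||12345|DOE^JANE"
print(pid_deltas(A28, A31))
```

Running the n-1 comparisons is then a loop of `pid_deltas(base, msg)` over the remaining messages; the model is only really needed if the input can't be parsed deterministically.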

Temperature: I gradually increased the temperature from 0 to 0.1, then to 0.2, and so on. Higher values (closer to 1) make the output more diverse and sometimes random, while lower values (closer to 0) make the model more deterministic and focused. Taken directly from ChatGPT 4.
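Under the hood, temperature simply rescales the model's token logits before sampling; lower values sharpen the distribution toward the top token, higher values flatten it. A self-contained sketch of that math (not the API call itself):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert logits to probabilities; temperature < 1 sharpens, > 1 flattens.
    As temperature -> 0 this approaches argmax (APIs special-case 0)."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

cold = softmax_with_temperature([2.0, 1.0, 0.5], 0.2)
hot = softmax_with_temperature([2.0, 1.0, 0.5], 1.0)
# cold puts almost all mass on the first token; hot spreads it out more.
```

This is why low temperatures help when you want a single deterministic report format rather than varied prose.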

Feedback prompting, or supplemental prompts: the first response needs to be submitted back to the API together with a supplemental prompt that asks it to generate a formatted report. So there are two prompts in total, fired one after the other, both with the role of user. I'm not sure what difference the role makes, but as per the docs I'm using user.
The reason for this is that structured data requires parsing and the computational power of GPT-3.5, which produces an unstructured textual report; from that unstructured text, the supplemental prompt can produce the formatted, structured JSON report I originally intended.
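The second call just extends the message list with the first exchange. A minimal sketch of building that list (hypothetical helper; the role/content message schema is the chat-completions convention, and the usual pattern is to return the model's first answer as an assistant turn rather than a second user turn, though this thread sends both as user):

```python
def build_followup(first_prompt: str, first_response: str, supplemental_prompt: str) -> list:
    """Message list for the second API call: the first exchange, then the
    supplemental ask (e.g. 'reformat the above as JSON in this schema...')."""
    return [
        {"role": "user", "content": first_prompt},
        {"role": "assistant", "content": first_response},
        {"role": "user", "content": supplemental_prompt},
    ]

messages = build_followup(
    "Compare these HL7 messages and describe the PID changes.",
    "<the model's unstructured textual report>",
    "Rewrite the report above as JSON, one object per delta, no extra text.",
)
```

Keeping the first answer in context is what lets the second prompt reformat rather than recompute.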

I passed my CISSP exam on 27 Feb 2024 and am currently in the process of completing my accreditation. While filling out the online application form, I'm hitting an error under the Job Experience section: "Data was submitted with an incorrect format", which is not letting me save and proceed. I have checked at least a few times to verify that the data format is correct, and it is. Can any of the members please advise whether you have faced this error, and offer some suggestions? I have also emailed the ISC2 team for help. Thanks a lot in advance.

I was having the same issue here and spent a long time working out how to get it to submit. Another user posted that they got past this by reducing the length of their Job Description, but that did not work for me.

I want to see the Invoice Sale (measure) for the account IDs even if the corresponding details are not present.
However, I am getting results like the below, with some incorrect values for account_name, because the account IDs between the two tables are not mapping correctly.



So far, I have tried changing the cardinality and cross-filter direction between the customer table and the detail table (set to both, and to single in each direction), but nothing worked. Also, if I remove Invoice Sale from the table visual, the mapping looks correct and account_name simply shows no records; but as soon as I add the measure back, I get weird data, as if the relationship doesn't work.

I would appreciate it if someone could help me here, please.
I've been stuck on this for a week now and I'm not sure what I'm doing wrong.

Thanks in advance.
@ChandeepChhabra

Glad to hear that your problem has been resolved. Could you please mark the helpful post as Answered, since it works now? That will help others in the community find the solution easily if they face the same problem. Thank you.

@alexa_0028 it will be easier to provide a solution if you share a sample .pbix file with the expected output. Usually you want to avoid a many-to-many relationship, but without looking at the data it is hard to explain what is going on.
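The symptom described, rows disappearing or mismatching once the measure is added, is what an inner-join-style relationship does to unmatched keys. Outside Power BI, the same idea can be shown with pandas on hypothetical tables (column names are assumptions, not from the report):

```python
import pandas as pd

# Detail table: has an account with no matching customer row (id 3).
invoices = pd.DataFrame({"account_id": [1, 2, 3],
                         "invoice_sale": [100.0, 250.0, 75.0]})
# Customer table: the lookup side of the relationship.
customers = pd.DataFrame({"account_id": [1, 2],
                          "account_name": ["Acme", "Globex"]})

# A left join keeps account_id 3 with a blank account_name,
# which is the 'show the measure even without details' behaviour wanted here.
report = invoices.merge(customers, on="account_id", how="left")

# An inner join would silently drop account_id 3 instead.
dropped = invoices.merge(customers, on="account_id", how="inner")
```

If the Power BI relationship (or the keys' data types) doesn't line up, the engine effectively falls into the second behaviour, so checking for trailing spaces or type mismatches in the key columns is a reasonable first step.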

Thus, a string value for OrdType in an inbound FIX message will cause QuickFIX to reject the message for containing invalid data. You will need to modify QuickFIX to allow string values for OrdType to be able to accommodate inbound and outbound messages with the CTS custom values.
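QuickFIX's per-field validation is driven by the session's data dictionary (e.g. FIX44.xml), so that is typically where the change lands. A hedged sketch of what the entry for tag 40 might look like after loosening the type; the standard entry uses type CHAR, and the custom CTS values shown are placeholders, not real ones:

```xml
<!-- FIX44.xml: redefine OrdType so multi-character values pass type validation -->
<field number="40" name="OrdType" type="STRING">
  <value enum="1" description="MARKET"/>
  <value enum="2" description="LIMIT"/>
  <!-- each custom CTS value would also need its own <value> entry here,
       otherwise enum validation will still reject it -->
</field>
```

Both counterparties' dictionaries need the same change, or the other side will reject what you now accept.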

Using ArcGIS Pro 2.5 against a portal (unsure what version) produces incorrect data statistics; it seems not to pick up updates made to the dataset. I have included a screenshot of the statistics: all non-selected rows are displayed, and clearly neither the sum nor the min is displayed correctly. Those values were updated and saved to the database.

That would appear to be the issue. I suppose I would expect some sort of indicator that I'm working on cached data. It's also bizarre to me that the cache didn't update when the data changed, given that I'm the one who made the change. At least I know now; that's half the battle and all.

Hi knmille7, please see how we work (in short: we run on volunteers like you!). The best thing is to fix the errors yourself: sign up, and you're welcome. Just please keep in mind that we are not allowed to use other databases or maps as sources for contributions to our data (with a few explicit exceptions).

@knmille7: could you please ask a new question, clearly mentioning the bulk aspect and that you do not want to fix it yourself? Thank you! :-) That keeps this discussion more useful for others with the same questions. -> Thanks for your second question.
