NHL 09 Rebuild


Rebecca Donnelly

Aug 5, 2024
Leverage the people power of AmeriCorps to grow the capacity of your organization. OneStar provides funding and resources for national service programs to help you jumpstart new initiatives, scale your services, and advance your mission in meaningful ways.

Donate now to support the families and communities affected by the Panhandle and South Plains wildfires in February 2024. The Panhandle Disaster Relief Fund is administered locally by Amarillo Area Foundation.


When disasters strike Texas communities, the Rebuild Texas Fund connects your helping hand to Texan communities in need. As the organization designated by the Office of the Governor to receive private donations following a disaster, we strategically collect and distribute funding and additional resources to eligible organizations in impacted areas to build resilience.


The Rebuild Texas Fund is not currently accepting applications for funding. Check back for future opportunities, and sign up for our newsletter to learn about Rebuild Texas Fund grants when they are announced.


In 2017, OneStar and the Michael & Susan Dell Foundation launched the Rebuild Texas Fund in response to the devastating impact of Hurricane Harvey. Through the Rebuild Texas Fund, OneStar continues to support Texas communities in times of disaster and crisis. Through this effort, together we have served more than 2 million Texans as they rebuild their lives and communities.


We worked collaboratively with local leaders in 41 Texas counties impacted by Hurricane Harvey to support recovery and rebuilding efforts. Our efforts focused on six key areas: community & economic development, education & child care, workforce & transportation, housing, health, and small business.


For every iteration it:

-makes a list of vectors from the fitting curve to the original, as said above (by subdividing both curves into equal-length segments)

-for every Greville point, makes a sub-list of the vector list, picking only the near ones (near by curve parameter)

-for every Greville point, moves the corresponding control point by the average of the vector sub-list (sketched in code below)
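
A minimal sketch of one such iteration, assuming the fitting curve is a SciPy BSpline and the original curve is given as a dense point sample. The equal-length subdivision is simplified here to uniform parameter sampling, and the names (greville, refit_step, window) are illustrative, not any Rhino API:

    import numpy as np
    from scipy.interpolate import BSpline

    def greville(t, k):
        # Greville abscissa for control point i: mean of knots t[i+1..i+k]
        n = len(t) - k - 1
        return np.array([t[i + 1:i + k + 1].mean() for i in range(n)])

    def refit_step(spline, target_pts, window=0.1):
        # Sample both curves at matching parameters; the error vectors
        # point from the fitting curve to the original.
        u = np.linspace(spline.t[spline.k], spline.t[-spline.k - 1],
                        len(target_pts))
        err = target_pts - spline(u)
        c = spline.c.copy()
        for i, gi in enumerate(greville(spline.t, spline.k)):
            near = np.abs(u - gi) < window       # sub-list of nearby vectors
            if near.any():
                c[i] += err[near].mean(axis=0)   # move control point by average
        return BSpline(spline.t, c, spline.k)

Repeating refit_step until the error vectors shrink below a tolerance gives the iteration described above.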


One thing you might want to try is translating the original curve out of the plane, and then lofting between the original and the refit. That will show you how much the parameterization is changing between the two.
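
In Rhino's Python scripting that check could look roughly like this (a sketch using rhinoscriptsyntax; the translation distance is arbitrary):

    import rhinoscriptsyntax as rs

    original = rs.GetObject("Select original curve", rs.filter.curve)
    refit = rs.GetObject("Select refit curve", rs.filter.curve)

    # Lift a copy of the original out of the plane, then loft it to the
    # refit; the ruling of the loft shows how the parameterization shifts.
    lifted = rs.CopyObject(original, translation=(0, 0, 10))
    rs.AddLoftSrf([lifted, refit])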


Riccardo,

As interesting as this discussion is, it might be in your best interest to fall silent on the Forum and pursue the matter privately. A smart(er) Rebuild command could be (in my view as an endlessly frustrated rebuilder) a real Godsend.

Michael


It recently came to my attention that data coming from a lookup within my accelerated data model was not populating correctly. The symptom was that I was finding blank fields where the lookup data should have been. I managed to resolve this issue by simply rebuilding the model, manually clicking the "rebuild" button. I have no idea why this happened, but I would like to be able to call this rebuild function for the model automatically so that I can avoid a recurrence in the future.


I searched datamodel.conf and did not find any setting that does this automatically, but it seems Splunk runs a kind of correction when it identifies that the data model is not up to date for acceleration. This is the only attribute I found when I searched for "rebuild".
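
Presumably the attribute in question is acceleration.manual_rebuilds. A sketch of the relevant datamodel.conf stanza, with the stanza name made up; check the spec file for your Splunk version:

    [My_Data_Model]
    acceleration = true
    # Default is false: Splunk detects outdated or corrupt acceleration
    # summaries and rebuilds them itself. Setting this to true leaves
    # those rebuilds to you.
    acceleration.manual_rebuilds = false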


Thanks for taking the time, I appreciate it! I also found that setting, and I'm assuming it's better left at the default value of "false". I mean, I think it's better to have Splunk rebuild the summaries.


The RAID rebuild status is at 16%. When I refresh the page, there is a number at the end of the status that does change (the UI does not display the entire number/message). At this rate, it will take a month to complete.


The rebuild finished the next day after configuring things differently. Strange that setting the minimum limit higher resolved the issue and got things done in less than a day, compared to over a month if I had done nothing. As pointed out, the maximum setting is usually the culprit, not the minimum. Thanks for the insight on how to resolve this issue.
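
For context, on a Linux mdadm-based NAS like these, the limits being discussed usually map to the kernel sysctls /proc/sys/dev/raid/speed_limit_min and speed_limit_max (stock defaults are typically 1000 and 200000 KB/s). A sketch, assuming root shell access to the box:

    # Read the current resync limits, then raise the floor (requires root).
    def read_sysctl(path):
        with open(path) as f:
            return f.read().strip()

    print(read_sysctl("/proc/sys/dev/raid/speed_limit_min"))  # e.g. 1000
    print(read_sysctl("/proc/sys/dev/raid/speed_limit_max"))  # e.g. 200000

    # md will then resync at no less than this rate (KB/s), even under
    # competing user I/O; write the old value back to revert afterwards.
    with open("/proc/sys/dev/raid/speed_limit_min", "w") as f:
        f.write("50000")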


WD should be ashamed to release a product that is not capable of the bare minimum and relies on the user community to work out all the issues. Is that where we are now? Release a product that does not provide the consumer the minimum functionality that is advertised, and that ostensibly justifies the price we pay, while providing little or no support? Thankfully we have a forum to work things out amongst ourselves, but this is not the way it used to be. Companies used to have their reputation and survival on the line; now everything is out the door. Hire cheap labor, let the chips fall where they may, and rely on knowledgeable consumers to help others out.


I had a similar issue last week. I had my EX4 configured with 2 x 3TB drives in a RAID-1 mirrored volume. I backed everything up and added a 3rd 3TB drive. I went through the wizard to change the RAID mode from RAID-1 to RAID-5 and migrate the data.


Thank you for your excellent advice. That is going to speed things up considerably. Once the sync is complete, do you recommend reducing the speed limit back to the original setting? If yes, what parameter would you use? If no, do you think the higher-speed RAID syncing will impact user upload/download performance?


Could the filtered decks, when opened, automatically rebuild themselves? I do this 999 out of 1,000 times I open those decks, both on desktop and in iOS. Also, if I forget to do it, I might study a stale filtered deck.
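
As a stopgap, this can be approximated with a small add-on. A sketch, assuming a recent Anki where the scheduler exposes rebuild_filtered_deck (older releases called it rebuildDyn); hook and field names should be verified against your Anki version:

    from aqt import gui_hooks, mw

    def auto_rebuild(overview, content):
        deck = mw.col.decks.current()
        if deck.get("dyn"):  # a nonzero "dyn" marks a filtered deck
            mw.col.sched.rebuild_filtered_deck(deck["id"])

    # Fires whenever a deck's overview screen is rendered. A real add-on
    # would also need to refresh the overview to show the new counts.
    gui_hooks.overview_will_render_content.append(auto_rebuild)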


Is it the case that these read-only queries also modify the database? I would also think that anyone making bad filtered decks and hitting rebuild over and over would trigger the same issues as doing it when opening the filtered deck.


Yes, for example if the filtered output generated a real deck, that would be fine too. But the beauty of filtered decks, auto folders, and the rest is the dynamic nature: when you open such a thing, it reflects updated values in the database (or file system).


Don't know how this is still not a thing. Why do sections and elevations have 'Auto-rebuild' and 'Manual rebuild', whereas details and worksheets are converted to dumb fills and lines? This means you essentially can't add detailing like dimensions, labels, or annotations to a floor plan callout without checking it every time a change is made on the associated floor plan. Making them consistent with floor plans, sections, and elevations should be a simple fix and a huge benefit to users.


Our current workaround is also horrible. We place a worksheet marker on the drawing, then save a floor plan view at 1:50 and place a 'background fill' around everything except the worksheet callout boundary. We then have separate layers used for dimensioning that callout. It's messy, you can't have callouts close together, and display order also becomes an issue.


Using sections as detail views has been the workaround for as long as I can remember. Agree, they should be live detail views. That's how far behind Archicad is compared to every other software in usability.


And remember you can stack views in layouts on top of each other, so you can crop one for just the walls of a room and then overlay it with a view for text and dimensions that isn't cropped (or not cropped quite so much).


@Barry Kelly that's what we currently do. We place the worksheet markers on the floor plan, but do not use the tool because it's so useless and impractical. We then have "callout 1", "callout 2", and "callout 3" layers which host a background fill that covers everything inside the worksheet area, as well as the dimensions and additional labels used for the items within that callout drawing. The reason we have three separate layers is so that callouts which are very close together can be dimensioned independently, without affecting or obstructing the others' views. It works, but it's definitely a workaround and not a seamless way of working.



Your last sentence proves my point. You have never used a worksheet and have very rarely used the detail tool, because they aren't any good. However, making them live views would make them useful tools that would help us document our plans without clunky workarounds.


When I open the report in Power BI Desktop and then click Refresh, I get a refresh error for one or more queries saying "Query ... references other queries or steps, so it may not directly access a data source. Please rebuild this data combination."


However, this is misleading, because after receiving the error messages, I can open Power Query, refresh any source whose preview was cancelled (the ones with question marks against the icons), then close and apply. After doing this, and nothing else, the data refresh completes without any errors.


I've read lots of the normal advice relating to this error message and have checked privacy levels for the various sources: everything is set to Organizational, and, as I've explained above, it works on the second attempt once the Power Query previews have been refreshed.


Two years on and I am still getting the same ridiculous error. Same case and no answers. Same privacy levels. I guess this has something to do with columns added via some M code, but this is just dumb: it is still a source, like any source created via Power Query. I have read and watched all the theory about query folding, but as always, these rules don't apply here. No idea what to do except turning the privacy level off. Still a bad idea.



For new builds, I've tried the method of separating merge sources into intermediate queries. This has maybe worked, but given the lack of clarity on why these errors appear in the first place, I'm not sure.
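
For what it's worth, the staging pattern looks roughly like this in M (query and source names here are made up; the point is that the final query references only other queries, never a raw source):

    // Query: Staging_Sales, which touches the data source and nothing else
    let
        Source = Excel.Workbook(
            Web.Contents("https://example.sharepoint.com/sales.xlsx")),
        Data = Source{[Item = "Sales", Kind = "Sheet"]}[Data]
    in
        Data

    // Query: Merged, which combines staged queries only, so the privacy
    // firewall sees no query that mixes a source with a reference
    let
        Merged = Table.NestedJoin(Staging_Sales, {"RegionID"},
            Staging_Regions, {"RegionID"}, "Region", JoinKind.LeftOuter)
    in
        Merged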


I am having the same issue. I also have a dataset built on Excel files saved in a SharePoint site location. I imported them in Power Query using the standard "SharePoint folder" option under more data sources. Even though I receive the error when using the global "Refresh" button, everything seems to update fine if I click Transform data and just refresh each table's preview in Power Query.
