If performance issues cannot be resolved, I have to give up Shapr3D

I use an M1 iPad. Ever since the History feature was released, I can barely use Shapr3D; every step is laggy.
I contacted Shapr3D by email, but the issue has not been resolved.
What is going on over there? I'm very confused.


Has no one seen this?

Have you tried collapsing history? That should speed things up at the expense of losing the parametric steps.


Yes, every step.

Preach it, brother. And it goes far beyond performance: you may eventually end up with a project file that you can't open anymore.

I have some pretty complex sketches (glyph cutouts) for film props. I've imported STL files along the way for sizing and alignment, then deleted those STL files out of the item tree and extruded the complex objects from the sketch.

I've noticed that subtractions from the resulting body (stamping glyphs into other bodies, which were also extruded in Shapr3D) can take 10-20 minutes.
They appear to be CPU-bound processes; the GPU on my Mac doesn't even blip.

My project (with 18k or so bodies) won’t open on my 6th gen iPad Pro but still opens on my Mac. I can merge histories and remove every single item from the item tree and still end up with a project that can’t be opened on that iPad Pro.

I opened a support case and asked for help, and just got told "don't use STL files." It's as if they didn't even read the support ticket; they skimmed it.

You can buy yourself some time by merging history, but I suggest you frequently export your project to .shapr save points in case you end up with a corrupted project file.

I know it sounds trite, but a swipe away / close and restart of the device may help in the short term. There are quite a few memory leaks in the Mac version of the app; I don't know how many of those are also present in the iPad version, but a fresh memory landscape never hurts.

I've been using Shapr3D since nearly the beginning, but I have to say I'm growing frustrated with it. Bugs are one thing, and often the result of short development sprints. I'm not mad at the support person; they probably lack the resources to be effective. Poor support is a choice made above their pay grade, and it makes me wonder where my subscription dollars have been going all these years.


Unfortunately, 18,000 parts (mesh or B-rep) are not supported in Shapr3D at the moment, especially not on an iPad. Editing large amounts of text in a sketch (glyphs) is also not a use case we optimized our sketch engine for. Based on what you told us in the support conversation, we suggested that Shapr3D is not the right tool for your workflow.


I don't want to hijack this guy's thread, but that doesn't really answer the question about project file stability, and about performance degradation as an early signal of a corrupted project file, does it?

FWIW, I didn't say the 18,000 bodies were mesh bodies. Just bodies, bona fide and extruded right here in Shapr3D, with extrusion faces and chamfer lines all intact, waiting for manipulation.

But following your logic: I can remove every single mesh in the project, merge histories, and it still crashes when trying to open on the iPad.

I get your core point though: time to find a different solution. Not because of mesh-body handling, but because of the lack of support here.


I understand. What I said is that 18,000 bodies (mesh or B-rep, any kind) are not supported at the moment.

… and if your project has ever, even temporarily, housed such anathema, there appears to be no way to actually purge it from the project file completely.
I get the technical challenges of resource-constrained devices handling the workload of processing a trillion polygons, but my question is much simpler: why doesn't deleting them solve the problem? It isn't unreasonable to expect deletes to… you know… delete.

It would be reasonable to expect me to delete those out and work them somewhere else, simplifying my Shapr3D project to the workload it was designed for. But I can't do that if the project doesn't actually delete them, or if having ever hosted them introduces instability.

Because it's in the parametric history. If you delete it and merge the history, it will be gone completely. Alternatively, you can delete the steps from the history that created those parts. You also mentioned the file size in your support ticket. File size doesn't matter; it is not directly related to performance.

I'm sorry, you just aren't listening. …as I said… you can delete nearly everything out of that project, merge histories after every delete, and you still have a file that crashes on import (.shapr) on the iPad Pro 6th gen.

See… that project was pretty well organized. So I went through these steps:

- exported a copy of it
- opened the copy
- deleted every single item except one parent folder (a fairly simple component)
- merged the history
- exported this as another .shapr project

The resulting simplified .shapr project file, with 95% of the items removed, still doesn't open on the iPad (it crashes) and is still suspiciously large. You say file size doesn't matter, but the file size, combined with the persistence of the corruption, seems to indicate that deleting items and merging histories isn't as complete a process as you claim.

One could:

- export the tree items in an intermediate modeling format (STEP, IGES, X_T, etc.)
- create a new Shapr3D project
- import those

And now you have a tiny, fresh project that won't crash (and, to the OP's purpose, would be day-one fast) but should by all measures be no different from the simplified .shapr project above. The fact that they aren't the same is proof that the parametric history purge (or item deletion) isn't working as completely as you claim.

This comes back around to the original poster's issue: if deletes and merges aren't actually removing everything from the project file (as appears to be the case), then solving this "ghost remains of project history" bug may also solve his (and others') performance issue.


This is a wrong assumption. There was a discussion about this on the forum before; you can look it up if you are interested in the technical details.

This is not true; performance issues can arise at any time, even with no parametric history. If you can share the design in .shapr format with our support team, we can look into the root cause of the issue.

Here it is:

I already provided support with a file, but they didn't bother to open it.

Let's see if I can help you get past the notion that file size doesn't affect performance, and that merging history and deleting items is a complete process.

Here is a simplified version of my project. Everything is deleted except four cut-out cylinders. This .shapr file will not open on my iPad Pro. Note that the histories are merged.

For anyone who wants to follow along, here's the project; it was an open-source project anyway:
https://www.dropbox.com/scl/fi/31rz931flj6fwgsphtn5m/Mk42ofTheseus.shapr?rlkey=3gr16h0zl10yca8fsf7zmze8f&dl=0

Just at a glance, HistoryNames, MetaData, and I'm sure other SQLite tables have extraordinary row counts for a four-body model.
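
For anyone who wants to check the counts themselves, here's roughly how I looked. A minimal sketch, assuming the .shapr export is a zip wrapping a single SQLite database; the member name inside the archive is a guess, so list it first and adjust:

```python
import sqlite3
import tempfile
import zipfile

# A .shapr export appears to be a zip archive wrapping a SQLite database.
# Assumption: the first member of the archive is the database file; print the
# listing and adjust the index if it is not.
with zipfile.ZipFile("Mk42ofTheseus.shapr") as archive:
    members = archive.namelist()
    print("archive members:", members)
    raw = archive.read(members[0])

# Write the database out so sqlite3 can open it from disk.
with tempfile.NamedTemporaryFile(suffix=".sqlite", delete=False) as tmp:
    tmp.write(raw)
    db_path = tmp.name

con = sqlite3.connect(db_path)
tables = [name for (name,) in con.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for table in tables:
    (rows,) = con.execute(f'SELECT COUNT(*) FROM "{table}"').fetchone()
    print(f"{table}: {rows} rows")
con.close()
```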


As I said:

deleting doesn't actually delete everything

…even if those are lightweight rows, that is a lot of rows to process on import, opening the door to [data type] overflows that are probably the cause of my crash. And contrary to your opinion that size doesn't contribute to performance degradation, a giant SQLite table absolutely has an impact on query performance. If you don't actually purge those rows as part of the merge, they are rows that must be traversed in a seek or carried in an index (though it doesn't look like you have any indexes defined).
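
That last part is easy to verify, for anyone following along (the path to the extracted database below is an assumption):

```python
import sqlite3

# Reusing the database extracted in the earlier snippet (path assumed).
con = sqlite3.connect("shapr_project.sqlite")
# sqlite_master lists indexes alongside tables; sql is NULL for the implicit
# indexes SQLite creates on its own (e.g. sqlite_autoindex_*).
for name, tbl_name, sql in con.execute(
        "SELECT name, tbl_name, sql FROM sqlite_master WHERE type = 'index'"):
    print(f"{name} on {tbl_name}: {sql}")
con.close()
```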

Thanks to your link above, now that I know it is a SQLite DB file inside a zip, I can probably reason out what I need to recover my project once I have a quiet few minutes to work out the entity relations in the DB.
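
Working out those relations would start with a schema dump; a minimal sketch, again with the extracted database path assumed:

```python
import sqlite3

# Dump the CREATE statements to see how the tables relate to one another
# (path to the extracted database is assumed).
con = sqlite3.connect("shapr_project.sqlite")
for (ddl,) in con.execute(
        "SELECT sql FROM sqlite_master WHERE sql IS NOT NULL"):
    print(ddl, end=";\n\n")
con.close()
```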

I'm pretty sure that… I could do qualified deletes from some history and metadata tables, remove all of the orphaned entries, run VACUUM, and take this 425 MB file down to under a megabyte. I'll poke around with it later.
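
The reclaim step would look something like the sketch below. The qualified deletes themselves depend on the schema, so this only shows the VACUUM part; the point is that SQLite never shrinks a file on its own after deletes:

```python
import sqlite3

# After qualified DELETEs, SQLite keeps the freed pages inside the file;
# only VACUUM rebuilds the database and actually shrinks it on disk.
con = sqlite3.connect("shapr_project.sqlite")  # extracted db, path assumed
before = con.execute("PRAGMA page_count").fetchone()[0]
con.execute("VACUUM")
after = con.execute("PRAGMA page_count").fetchone()[0]
page_size = con.execute("PRAGMA page_size").fetchone()[0]
print(f"{before * page_size} -> {after * page_size} bytes")
con.close()
```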

Back to the OP's post again. If their project has been around and has had a lot of work done to it, even a merge of the parametric history doesn't solve SQLite-related performance issues, since clearly Shapr3D doesn't remove everything in the merge.

Of course they did. Both designs contain 8000+ bodies.

I sent three files from the same source project, in three different states.

… and the last one contains four bodies and still crashes on iPad Pro load. They didn't even attempt to answer my "corrupted file" support case with any steps whatsoever to recover that project. Just the lazy "wrong tool for the job" response. I feel like Bill Murray in Ghostbusters, paraphrased: "You know I'm a customer, right? Aren't you supposed to support me?"

I'm thinking more broadly than my own support ticket here:

A .shapr project that once contained 8,000+ bodies and now has four bodies crashes just like a project that still has 8,000+ bodies.

Which proves: merging and deleting doesn't get everything.

Effectively, your customers can work themselves into a nearly unrecoverable state if at any point in the history of the project they import less-than-ideal files.

Deleting those files leaves the DB with orphaned records.
Excessive orphaned records in a project can absolutely impact performance.

I haven't looked at the DB structure long enough, but I will probably be able to recover my own file to a working state, so at least I can get back to breaking it into smaller projects. This is so easy on the face of it that I assume there must be a relational data design decision that makes this extra cleanup hard for your devs. Probably relational keys buried in JSON values that need to be iterated over in HistoryNames, given the size of the thing relative to a small number of objects. But that's only a guess from a glance, based on my own dev work and hard lessons learned.
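
If the keys really are buried in JSON, SQLite's built-in json_extract would be the way to surface them. To be clear, the column name and JSON path below are pure guesses for illustration; the real ones would have to come from the schema dump above:

```python
import sqlite3

con = sqlite3.connect("shapr_project.sqlite")  # extracted db, path assumed
# Hypothetical: "Value" and "$.bodyId" are placeholder names; the real column
# and JSON path must come from inspecting the actual schema first.
for (ref,) in con.execute(
        "SELECT json_extract(Value, '$.bodyId') FROM HistoryNames LIMIT 20"):
    print(ref)
con.close()
```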

I mentioned size early on because the growth pattern was a clear indication of an underlying problem: deleted items continuing to have an impact. Strictly speaking, file size may not matter, but record counts absolutely do where performance is concerned.

You seem to have this canned answer that file size doesn't matter, and frankly you seem defensive about it. Why not actually roadmap a clean-and-optimize feature for the product and address a problem that others are clearly also having? I don't get the apathy and mild hostility here; I'm trying to help you make the product better.


I have access to two designs that you shared with us. One has 8k parts; the other has 16k parts. If you'd like to share the one that you think is problematic, I can take a look at that too.

For any ST:TNG fans that stumble onto this later…

[image]

(Meant to mean: I'm not crazy, there is a problem here.)


I don't know what this is, but please share a design where you deleted something, merged the history, and the bodies are still present. I can't reproduce the issue, and the two designs that you shared with our support team, which I have access to, do not support this claim.

https://www.dropbox.com/scl/fi/31rz931flj6fwgsphtn5m/Mk42ofTheseus.shapr?rlkey=3gr16h0zl10yca8fsf7zmze8f&dl=0

This is the link from above, and from the support case.

"you deleted something, merged the history, and the bodies are still present. "

  • that isn’t what I said.

Expectation:

Starting from a very complex .shapr file with (apparently) too many bodies, I am attempting to find a way to break that file into more manageable chunks without losing progress.

  • How to proceed?

It seemed reasonable that breaking the project into smaller chunks might be a temporary path forward. You know: to avoid losing the helpful organizational structure between final bodies and their interim templates. The nature of the parametric history playback appears to prevent this.

Steps to reproduce:

(Breaking the file apart manually)

With the second model in my support case as a starting point (this file still opens on my Mac; it crashes when opening on the iPad Pro 6th gen):

I make a copy of the project and open the copy.

I identify a set of folders whose contents I’d like to export into a simpler project file, one that represents a subassembly of the overall project.

I delete everything but these.
I merge histories.

I export this as a new .shapr file -> SubAssembly[n].shapr.

The result:

Multiple .shapr projects that still cannot be opened on the iPad Pro, and that still exhibit the same degraded performance characteristics as the larger whole-project file.

It doesn't matter how simple the exported project is; any .shapr project generated this way is effectively poisoned going forward, because it includes history unrelated to the objects I exported, despite my having merged the histories.

The Dropbox link above is an example of this process, where only four cylinders were targeted to be saved in a .shapr project export. Everything else was removed and the histories were merged. Still, this very simple four-cylinder .shapr project cannot be opened on the iPad Pro (it crashes) and is slow to load. The fact that the four-cylinder export is the same size as the originating project, plus this recent peek at the table structure, leads me to this conclusion:

Merging Histories doesn’t actually delete all histories.

I found this thread because I felt the OP's performance observations are related to the historical-playback design: without a (real) way to clear that history, you are on a path to degraded performance the longer you work with a project.

This has gnarly implications. If you ever accidentally import one project into another, you can't really undo it. If you import second-class mesh files to trace or match sizes, there are historical traces of every single action, and merging histories doesn't get them out of the table (I've shown that above).
