How scalable is this bookmarks app? Can it handle 1 million+ links? #2021
Replies: 5 comments 6 replies
-
Hey! I haven't done intensive testing on scalability so far. I know @muppeth was running a big instance in the past. Not sure if you're still running that. Maybe you can shed some light on this :)
-
Heh, just reread my comment and realized I didn't even say hi. Sorry for my rudeness, I typed it in a rush.
-
Did you give it a whirl to test this? I think an important aspect here is how many links there are per user.
-
So, I have data from an instance that has ~1500 users and ~500,000 links, which works smoothly.
-
I'm sorry this is only half an answer... I've just checked my browser, which is syncing perfectly, and my bookmark HTML export runs to about 7 MB on disk. I appreciate how imprecise that is, given the variable nature of the file. If someone can suggest a link-specific HTML tag, I could load my bookmarks file into e.g. a word processor and "Find" the tag - it should be easy to get a count that way.

I would also like to add that if you really need a million bookmarks, your bottleneck may well become your browser rather than this solution. I know that Firefox uses SQLite, for instance, which in theory can easily handle that many bookmarks... although performance guidance for large tables in that product carries some interesting warnings about performance drop-off if you have inefficient tables, get your indexing wrong, or don't keep your tables clean [as in, the need to avoid disjointed index splits].

Of course, this raises the question of how to test in practice. It must be possible to synthetically create URLs with a general form like this:-

You could create a text file with that pattern and 10 million rows with a couple of lines of Python. The next part of the problem would be getting these into a web browser's bookmarks repository. If there is no easy way to do so by simply writing directly to the browser's SQLite database, maybe you could set up Apache JMeter to automate it for you? [Sorry, not sufficiently familiar with JMeter myself...] But once you've got the URLs loaded into a browser, proving/disproving the scalability should be pretty straightforward.
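The synthetic-bookmark idea above can be sketched in a few lines of Python. The URL pattern `https://example.com/page/<n>` and the function names here are my own placeholders, not anything from this thread. The sketch writes the Netscape bookmark file format, which most browsers can import directly (Bookmarks → Import from HTML), sidestepping both direct SQLite writes and JMeter automation; a second helper counts `<A HREF=` occurrences, which is the "link-specific tag" a word-processor Find would otherwise be hunting for.

```python
def write_synthetic_bookmarks(path, n):
    """Write n synthetic bookmarks in the Netscape bookmark file
    format, which most browsers can import directly."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("<!DOCTYPE NETSCAPE-Bookmark-file-1>\n")
        f.write("<TITLE>Bookmarks</TITLE>\n<H1>Bookmarks</H1>\n<DL><p>\n")
        for i in range(n):
            # https://example.com/page/<i> is an assumed pattern;
            # any unique, well-formed URL would do.
            f.write(f'<DT><A HREF="https://example.com/page/{i}">Link {i}</A>\n')
        f.write("</DL><p>\n")

def count_links(path):
    """Rough link count for an existing bookmark HTML export:
    each bookmark is one <DT><A HREF="..."> entry, so counting
    that tag gives the number of links."""
    with open(path, encoding="utf-8", errors="ignore") as f:
        return f.read().count("<A HREF=")

# e.g. write_synthetic_bookmarks("synthetic_bookmarks.html", 10_000_000)
```

Generating 10 million rows this way takes well under a minute; whether the browser survives importing them is exactly the scalability question being tested.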
-
Title is self-explanatory. I'm searching for a solution that's scalable. I know performance will depend on the host machine's specs, so imagine a high-end, last-gen domestic rig running Nextcloud and the Bookmarks app. Would it be able to handle that many links and, say, 10k tags?