
  • Presearch is not fully decentralized.

    The services that manage advertising, staking/marketplace/rewards functionality, and unnamed “other critical Presearch services” are all “centrally managed by Presearch,” according to their own documentation.

    The nodes that actually help scrape and serve content are also reliant on Presearch’s centralized servers. Every search must go through Presearch’s “Node Gateway Server,” which is centrally managed by them and which strips identifying metadata and IP info from the request.

    That central server then determines where your request goes. It could be routed to open nodes run by volunteers, or to Presearch’s own nodes. You cannot verify which, because of how the network is structured.

    Presearch’s search index is not decentralized. It’s a frontend for other indexes (e.g., it outsources queries to other search engines, databases, and APIs it’s configured to use). This means it does not actually have an index that is independent of those central services. I’ll give it a pass for this, since most search engines are like this today, but many of them are developing their own indexes that are much more robust than what Presearch seems to be doing.

    A node can return whatever results it wants to the gateway. There doesn’t seem to be any way for the gateway to verify that what it’s being given is actually what was available on the open web. For example, a node could send back results whose links are all affiliate links to services it thinks are vaguely relevant to the query, and the gateway would assume those results are valid.

    For the gateway to verify the results are accurate, it would have to scrape those services itself, which would defeat the entire purpose of the nodes. The docs claim it can “ensure that each node is only running trusted Presearch software,” but Presearch does not control the root of trust, so this has the same pitfalls games have run into for years trying to enforce anticheat. That is to say, it’s simply impossible to guarantee unless Presearch could do all the processing inside a TPM they entirely control, which they don’t (not to mention that doing so would cause a number of privacy issues).

    A better model would be one where nodes are used solely for hosting, to take the burden of storing the index off a central server. Chunks sent to nodes would be hashed, with the hash stored on the central server. When the central server needs a chunk of data to answer a query, it requests it from a node, verifies the hash matches, then forwards it to the user. That takes the storage burden off the main server and makes bandwidth the only cost bottleneck, but that’s not what Presearch is doing here. (There’s a rough sketch of what I mean at the end of this comment.)

    This doesn’t make Presearch bad in itself, but it’s most definitely not decentralized. All core search functionality relies on their servers alone, and the design adds extra risk of bad actors being able to manipulate search results.
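
    To make that chunk-hashing model concrete, here’s a rough sketch in Python. To be clear, this is my own hypothetical illustration with made-up names, not anything Presearch actually implements: the central server keeps only hashes, volunteer nodes keep the data, and anything a node returns is checked against the stored hash before being forwarded to the user.

        import hashlib

        class CentralIndexServer:
            def __init__(self):
                self.known_hashes = {}      # chunk_id -> SHA-256 hash recorded at upload time
                self.chunk_locations = {}   # chunk_id -> nodes believed to hold that chunk

            def offload_chunk(self, chunk_id, data, node):
                # Hash the chunk (bytes) before handing it to a volunteer node for storage.
                self.known_hashes[chunk_id] = hashlib.sha256(data).hexdigest()
                self.chunk_locations.setdefault(chunk_id, []).append(node)
                node.store(chunk_id, data)

            def fetch_chunk(self, chunk_id):
                # Pull the chunk back and verify it wasn't tampered with
                # before forwarding it to the user.
                for node in self.chunk_locations.get(chunk_id, []):
                    data = node.retrieve(chunk_id)
                    if data is not None and hashlib.sha256(data).hexdigest() == self.known_hashes[chunk_id]:
                        return data
                raise LookupError(f"no node returned a valid copy of {chunk_id}")

        class VolunteerNode:
            # Storage only: a node can refuse to answer or drop data, but it can't
            # silently swap in affiliate links without failing the hash check.
            def __init__(self):
                self._chunks = {}

            def store(self, chunk_id, data):
                self._chunks[chunk_id] = data

            def retrieve(self, chunk_id):
                return self._chunks.get(chunk_id)

    The tradeoff is that the central server still has to do the crawling and indexing itself; the nodes only relieve it of storage (and some bandwidth via replication), which is a much more modest role for nodes than the one Presearch advertises.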



  • Not to mention that the remaining sites that can still hold on, but have to cut costs, will just start using language models like Google’s to generate content on their websites. That will only worsen the quality of Google’s own answers over time, which will then be used to generate even worse articles, and so on.

    It doesn’t just create a monetization death spiral; it also makes it harder and harder for answers to be sourced reliably, degrading Google’s own service while all the sites hanging on rely on that degraded service to exist.


  • This is fundamentally worse than a lot of what we’ve seen already, though, is it not?

    AI overviews are parasitic to traffic itself. If AI overviews become where people go for information, websites get no ad revenue, no subscription revenue, and not even the traffic that could improve their ranking in search.

    Previous changes just did things like pulling slightly better context previews from sites, which only somewhat decreased traffic, or adding more ads, which just made browsing worse. This, however, eliminates the entire business model of every website if Google continues pushing down this path.

    It centralizes all actual traffic into Google alone, yet Google would still rely for its information on the very sites whose traffic it’s eliminating. Those sites cut costs by replacing human writers with more and more AI models, search quality gets ever worse as it sources from articles that were themselves sourced from nothing, and then most websites, no longer receiving enough traffic to be profitable, collapse.






  • This seems like it could be a viable replacement for many plastics, but I feel it isn’t the silver bullet the article makes it out to be.

    From the linked article in the post:

    the new material is as strong as petroleum-based plastics but breaks down into its original components when exposed to salt.

    Those components can then be further processed by naturally occurring bacteria, thereby avoiding generating microplastics

    The plastic is non-toxic, non-flammable, and does not emit carbon dioxide, he added.

    This is great. Good stuff. Wonderful.

    From another article (which also shows this isn’t as recent as it seems; the news is from many months ago):

    the team was able to generate plastics that had varying hardnesses and tensile strengths, all comparable or better than conventional plastics.

    Plastics like these can be used in 3D printing as well as medical or health-related applications.

    Wide applications and uses, much better than a lot of other proposed solutions. Still good so far.

    After dissolving the initial new plastic in salt water, they were able to recover 91% of the hexametaphosphate and 82% of the guanidinium as powders, indicating that recycling is easy and efficient.

    Easy to recycle and reclaim material from. Great! Not perfect, but still pretty damn good.

    In soil, sheets of the new plastic degraded completely over the course of 10 days, supplying the soil with phosphorous and nitrogen similar to a fertilizer.

    You could compost these in your backyard. Who needs the local recycling pickup for plastics when you can just chuck it in a bin in the back? Still looking good.

    using polysaccharides that form cross-linked salt bridges with guanidinium monomers.

    Polysaccharides are literally carbohydrates found in food.

    This is really good. A commonly found compound, easy to actually reintegrate into the environment. But now the problems start. They don’t specify which specific guanidinium monomers are used in their research, so it’s hard to say the exact implications, but…

    …they often appear to be toxic, sometimes especially to marine life, soil quality, and plant growth, and they have been used in medicine with mixed results as to their effectiveness and safety.

    I’m a bit disappointed they didn’t talk about this more in the articles, to be honest. It seems this would definitely be better than traditional plastic in terms of its ecological effects, but still much worse than not dumping it in the ocean at all. In my opinion, in practice this would simply make the recycling process much more efficient (as mentioned before, 91% and 82% recovery rates are much better than the current average of less than 10%) while reducing the overall harm from plastic dumped in the ocean, even if it’s still not good enough to eliminate that harm altogether.