Google's shortened links will stop working next year (theverge.com)
133 points by uladzislau on July 20, 2024 | hide | past | favorite | 65 comments


Given how relatively easy it is to run a redirect service, and how many links this will break, this is vandalism.


It's in Google's interest that coming generations will find any URL as weird as we find an IPv6 address. Everybody should use their search^W profit generation engine.


Indexing and deeplinking into walled gardens is a pretty fragile affair.

Can search live without an open web?


If most of the knowledge retreats into a finite number of silos, they can probably afford to pay for backend data access to them.


Google Search Appliance was their attempt at indexing silos, and it failed. Would the current Google do any better?

https://en.wikipedia.org/wiki/Google_Search_Appliance


One or more pages will still be generated, but ideally just a tiny info box to drape the ads around.


For a company that couldn't have started itself without a functioning hypertext media ecosystem, it seems even more callous and destructive.

As a Xoogler, I'm very disappointed.


Is there a service to look up and cache redirect-service links?


> When Google announced in 2018 that it was shutting down goo.gl, the company encouraged developers to migrate to Firebase Dynamic Links (FDL) — which has also since been deprecated.

Why wouldn't they just change the backend and leave the service alive for the end users? It seems nuts to give up all that sweet sweet browsing data.


THIS!

Seriously... if they retire it, make the backend read-only; that way it can be highly optimized and run at minimal cost (from a mammoth company's perspective).

I don't know, make it an interview question and deploy the best answer? They put more effort into torturing aspirants than into EOL-ing some of their cheap-ass services in a reasonable way.


The problem is that Google infra requires everything running to be new. There is a build horizon of 6 months. Everything built with code older than 6 months is not able to run on Borg. And since Google deprecates many internal infra tools/libraries routinely, a team is required to make sure the service remains up-to-date. Google doesn't want to pay for such maintenance.


This is what I'd suspected as the pressure to discontinue services that could otherwise be virtually maintenance-free at low cost.


In S3, you can implement this with a single bucket and no code at all: objects can cause redirects using x-amz-website-redirect-location.

Since Google buckets don't seem to implement this feature, maybe they should point goo.gl at S3 :-)


Oh cool, I've done this sort of thing (mass redirects) with Lambda@Edge which allows for more flexibility, but probably costs more.


Note that when Google made a blog post telling people to migrate from goo.gl to the also now deprecated Firebase Dynamic Links, the post states explicitly[1]:

"While most features of goo.gl will eventually sunset, [bold]all existing links will continue to redirect to the intended destination.[/bold]"

The [bold] section is bold in the original post.

[1] https://developers.googleblog.com/en/transitioning-google-ur...


Every time I change a URL (or a set of URLs), I put a test for the redirect into my end-2-end tests, which run once per day. So I know all my URLs will work forever.

I have not thought about it for years now. Just checked for my first ever Show HN from 10 years ago:

https://news.ycombinator.com/item?id=7465980

The URL has long changed, but the redirect still works. Phew :) So all seems to be good. Here's to the next 10 years!

Google should do the same. Set up a separate server for the redirect service itself. And then I guess they have multi-project end-2-end tests running somewhere in their infrastructure. Just add testing this service and that's it. The amount of work per year to keep it up should be less than an hour, right?
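The parent's redirect check can be sketched with only the Python stdlib. Everything here is illustrative: the mapping, paths, and target URL are invented, and the local server merely stands in for a real redirect service.

```python
import http.server
import threading
import urllib.error
import urllib.request

# Hypothetical mapping: old path -> where it should redirect today.
REDIRECTS = {"/old-url": "https://example.com/new-url"}

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Stand-in for the real redirect service under test."""
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target is None:
            self.send_error(404)
            return
        self.send_response(301)
        self.send_header("Location", target)
        self.end_headers()
    def log_message(self, *args):  # keep test output quiet
        pass

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so we can inspect the hop itself."""
    def redirect_request(self, *args, **kwargs):
        return None

def check_redirect(url, expected):
    """True if `url` answers with a 301/302 pointing at `expected`."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open(url)
    except urllib.error.HTTPError as e:
        return e.code in (301, 302) and e.headers.get("Location") == expected
    return False  # got a non-redirect response

# Spin up the stand-in service on an ephemeral port and run one check.
server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

ok = check_redirect(base + "/old-url", "https://example.com/new-url")
print(ok)  # True when the hop is a 301 to the expected target
server.shutdown()
```

In a real end-2-end suite, `check_redirect` would be pointed at the public URLs instead of a local stand-in.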


It will easily take 1-2 engineers to maintain it at Google.

Why 1-2 engineers? Security patches, internal service deprecations, migrations, use of deprecated dependencies, etc.


The sad truth is that no one is getting a promotion to staff for just maintaining a service.

I wish this wasn’t so. At a previous job I had a VP tell me that my team was like a public utility and I took that as a compliment. Later my boss explained they were saying that they only noticed my team when something was broken. Sort of explained my lack of career progression in retrospect.


They could outsource this to someone only too grateful to keep it running for less than the cost of an internal engineer.


This is where you start your own company and sell "maintenance" to the company you work for.


Are links generated from Google apps like

https://maps.app.goo.gl/xxxxxxx

going to continue working?


There is also `g.page` which they have suspended: https://support.google.com/business/answer/9273900?hl=en-GB


> migrate to Firebase Dynamic Links (FDL)

The announcement of “we’re breaking your stuff” contains an appeal to trust them on next round?

They must think their clients are complete morons



For those not wanting to click the link:

> On August 25th, 2025, Firebase Dynamic Links will shut down. All links served by Firebase Dynamic Links (both hosted on custom domains and page.link subdomains) will stop working and you will no longer be able to create new links.


It's an old code, but might it be more appropriate to fess up and say the resource has been deleted?

410 “Gone” https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/410

But alas, even more apt:

417 “Expectation failed” https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/417


Expectation failed is specifically a response to an unknown value for the Expect header.


It was an ironic joke about the longevity of Google’s services :)


HTTP 421 (Misdirected Request) might be appropriate.


The real reason is probably maintenance, due to some hidden cost like conflicting infrastructure, and they couldn't justify migrating it.

Related discussion (2 days ago): https://news.ycombinator.com/item?id=40998549

Discussion of the previous announcement in 2018: https://news.ycombinator.com/item?id=16719272


Honestly, I bet they could have 2 interns port the thing to Google App Engine and then migrate the database.

A link shortener, as much as it has analytics and such in the background, is not rocket science.


> I bet they could have 2 interns port the thing to Google App Engine and then migrate the database

How can you possibly have this assessment without looking at the code/infra?

There are many things that affect cost beyond the visible features. The project isn't in a vacuum. It's interlocked with their other services' infrastructure.

You can judge Google however you want, but they're not stupid or amateurs. These types of announcements immensely damage their image and affect their customers; if they could avoid it as easily as you imagine, why wouldn't they?

They've built the service and run it for many years for billions of people. A more realistic guess is that, for whatever reason, the price is higher than what's visible on the surface and they're not willing to pay it.


>It's interlocked with their other services' infrastructure.

then they could fucking disinterlock it from the other services and leave it in read-only mode instead of killing it.

>You can judge Google however you want, but they're not stupid or amateurs.

they are not amateurs, because an amateur would have no problem maintaining a basic bitch KV store that probably fits in RAM on a single machine
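For scale, the "KV store plus 301s" the commenter describes fits in a few dozen lines of Python stdlib. This is only a sketch: the short codes and targets are invented, and a None value marks a deliberately retired link, which answers 410 Gone rather than pretending it never existed.

```python
import http.server
import threading
import urllib.error
import urllib.request

# Hypothetical shortlink table; a real one would be loaded from a dump
# of the goo.gl database. None marks a link retired on purpose.
LINKS = {
    "/abc123": "https://example.com/some-page",
    "/dead42": None,
}

class Shortener(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path not in LINKS:
            self.send_error(404)   # never existed
        elif LINKS[self.path] is None:
            self.send_error(410)   # existed, deliberately gone
        else:
            self.send_response(301)
            self.send_header("Location", LINKS[self.path])
            self.end_headers()
    def log_message(self, *args):  # silence request logging
        pass

# Demo: serve on an ephemeral port and probe a retired link.
server = http.server.HTTPServer(("127.0.0.1", 0), Shortener)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

try:
    urllib.request.urlopen(base + "/dead42")
    status = 200
except urllib.error.HTTPError as e:
    status = e.code
print(status)  # 410
server.shutdown()
```

Loaded into a plain dict like this, even billions of short links are just an in-RAM lookup in front of a 301.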


Just to be clear, I'm not saying they can port everything, only the basic functionality needed to keep the links alive (and then iterate from there).

> These types of announcements immensely damage their image and affect their customers; if they could avoid it as easily as you imagine, why wouldn't they?

You're assuming they care. And the answer to how much they care is: can this be used to further my (that is, an engineer's or manager's) promotion? If not, then no.

Google has become dysfunctional


You are assuming hidden costs; I am assuming hidden incentives. It's not that they are stupid or incompetent, but bad incentives within the org can and do produce stupid outcomes.


If they used AWS, this would involve no code and no maintenance: host the bucket out of S3 and enable redirects.

GCP doesn't support that, but they could get pretty close using a cloud function: stick with the Python stdlib & SQLite or DBM for the mappings, or use an Apache redirect map, and you'd have many years before you needed to touch it again.
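That cloud-function idea can be sketched with the stdlib's sqlite3. The table, short code, and target below are invented; in a real deployment the function would be wrapped by GCP's functions framework and the code taken from the request path.

```python
import sqlite3

# In-memory stand-in for a read-only mappings database shipped
# alongside the function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (code TEXT PRIMARY KEY, target TEXT)")
conn.execute("INSERT INTO links VALUES ('abc123', 'https://example.com/target')")

def redirect(code):
    """Return a Flask-style (body, status, headers) response tuple."""
    row = conn.execute(
        "SELECT target FROM links WHERE code = ?", (code,)
    ).fetchone()
    if row is None:
        return "not found", 404, {}
    return "", 301, {"Location": row[0]}

print(redirect("abc123"))  # ('', 301, {'Location': 'https://example.com/target'})
```

Since the mapping set is frozen once the service goes read-only, the database never needs schema changes, which is exactly what makes the "touch it once a decade" maintenance profile plausible.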


> These types of announcements immensely damage their image and affect their customers; if they could avoid it as easily as you imagine, why wouldn't they?

I believe they don't care. What are you gonna do, boycott them?


>These types of announcements immensely damage their image and affect their customers; if they could avoid it as easily as you imagine, why wouldn't they?

laziness, greed, apathy


What's happened here is that you've erroneously assumed there's a good reason. It's fun to hold nonsense like this up against testimony from the ministers and officials at the Horizon enquiry, all of whom can be relied upon to say that "with the benefit of hindsight" obviously what they did was wrong but insist that they were too stupid to realise there was a problem and thought they were powerless to do anything.

Remember, on average, other humans are just as stupid and lazy as you are. Most often there aren't "good reasons" for what happened, if there are reasons at all.


I wonder if there's a story here involving a URL shortener service having hidden costs? I can imagine there being something in the abuse space that makes it feel more expensive than just the hosting costs to operate.


Probably career product managers finding it untenable to write self-reviews with such a low impact, low maintenance product...

And if nobody wants to take it on...


Google was killed by growing too large.

Google the company was designed with really high coordination requirements, which has made the marginal coordination cost of adding a new engineer higher than the value they add.


It's so sad that this multi-trillion-dollar company can't spare the resources to serve some 301s.


Having products scale through time is an engineering problem, and they seem unable to recognize it as such.

As long as they don't understand this, they won't be able to expand their product offering (and thus revenue) significantly faster than their headcount.


Many years ago, there was an industry group (or maybe just a campaign, I can't remember the details) that promised to provide a protection/transfer service if one of their members shut down.

I tried searching the news but can't find any reference to it.


301works.org might be what you mean.

https://archive.org/details/301works


[dupe]

Discussion on official post: https://news.ycombinator.com/item?id=40998549


Like many have said, it's a shame they refuse to maintain the minimal requirements to keep the links working.

Google offers cloud services. It's like AWS saying they won't spare a few EC2 instances to keep some links working. If Google knew how to use their own cloud products, they could deploy some instances, failover, and monitoring, leave it alone, and dogfood their own cloud products in the process.


I host my own URL-shortening service. It's 2 data columns in SQLite and a few lines of JavaScript.

The reason I host my own is that Google specifically taught me not to trust the longevity of cloud-hosted services. So I didn't trust tinyurl.com or whatever to be there in the future. Thank you, Google, for confirming the wisdom of that decision.


They could charge for custom video urls.

https://youtu.be/bestVideo1 for $1. I bet a lot of people would pay


They could also charge for premium, junk-free, search engine results competing with Kagi.



Is it possible to download the entire database? That way we could undo the breakage using a browser extension.


Archive Team is on the case! Run their Warrior VM to help with the cause.

https://wiki.archiveteam.org/index.php/ArchiveTeam_Warrior

Choose the URLTeam project.


Maybe the links direct to undesirable/illegal content, either now or in the future. Perhaps it's just CYA.


That said, I regularly visit old discussions (5-10 years old), and most of the links are dead anyway.


That is sad, especially because I don't think it's a service that would take much effort to keep up.

I've seen things you people wouldn't believe... Attack ships on fire off the shoulder of Orion... I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain...


"a candle that burns twice as bright, burns half as long"


"Fool me once, shame on you; fool me twice, shame on me."


Can someone who's mourning name me one single valuable link that will be lost?


Seems google doesn't care about pURLs


Not sure if this is a general remonstrance about Google not caring about permanent URLs (pURLs) or a very specific reference: In Knitting for beginners, 3rd edition (Imagine Publishing, 2015), the basic knitting techniques have a link to accompanying video demonstrations on Youtube, and they used Google's link shortener. Of course they cover the purl stitch, but the pURL for the purl video will now be broken by Google. For anyone googling this in the future, here's the redirect:

goo.gl/Z64Spk -> https://www.youtube.com/playlist?list=PLgkXzADBVZXtmB9zdaf2W...


It was generic, but this quite specific example is very fine.


Yup, yet another service people rely on being shut down. Don’t rely on Google.


Yet another reminder to avoid the offerings of these mega corporations.



