On the Web, content is usually accessed via a URL that points to an HTTP endpoint from which the content can be retrieved. If that host goes down, the data is no longer accessible.
This also makes it very easy to deny access to certain content by blocking access to a particular host.
Monitoring / Alerting
In A More Decentralized Vision for Linked Data, the authors propose monitoring that alerts when resources go down.
This is not a real solution but a symptomatic treatment.
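Such monitoring amounts to little more than polling each resource and collecting the ones that fail. A minimal sketch of the idea (the function name and the injectable fetcher are illustrative, not from the cited paper):

```python
from urllib.request import urlopen
from urllib.error import URLError

def find_unreachable(urls, fetch=None):
    """Return the subset of urls that fail to respond.

    The fetch function (url -> HTTP status code) is injectable so the
    logic can be exercised without network access.
    """
    if fetch is None:
        def fetch(url):
            with urlopen(url, timeout=5) as resp:
                return resp.status
    down = []
    for url in urls:
        try:
            if fetch(url) >= 400:  # treat client/server errors as down
                down.append(url)
        except (URLError, OSError):  # DNS failure, refused connection, timeout
            down.append(url)
    return down
```

In a real deployment this would run on a schedule and raise an alert, which is exactly the criticism: the resource still disappears, you merely find out sooner.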
Don't use URLs
Instead of specifying content by its location, identify it in some other way that allows it to be retrieved regardless of the status of any particular host.
The most reasonable way to do this is with content-addressed storage.
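The core idea in a minimal in-memory sketch: the identifier is derived from the content itself (here its SHA-256 digest), so any host holding the bytes can serve them, and the address is location-independent and self-verifying.

```python
import hashlib

store: dict[str, bytes] = {}  # stand-in for any host that happens to hold the bytes

def put(content: bytes) -> str:
    """Store content and return its content address (SHA-256 hex digest)."""
    cid = hashlib.sha256(content).hexdigest()
    store[cid] = content
    return cid

def get(cid: str) -> bytes:
    """Retrieve content by its address, verifying integrity on the way out."""
    content = store[cid]
    # The address doubles as a checksum: a tampered or corrupted copy
    # from any host can be detected and rejected.
    assert hashlib.sha256(content).hexdigest() == cid
    return content

cid = put(b"Hello, Linked Data")
assert get(cid) == b"Hello, Linked Data"
```

Because the address says nothing about *where* the bytes live, blocking one host neither invalidates the identifier nor prevents retrieval from another copy.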
Linked Data Fragments
SPARQL Web-Querying Infrastructure: Ready for Action?
The answer: "Not yet."
An inquiry into, and monitoring of, the availability of public SPARQL endpoints.
Problems they identified: discoverability, interoperability, performance, and availability.
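Availability surveys of this kind typically probe endpoints with the cheapest possible request, such as an `ASK {}` query sent over the SPARQL Protocol (the query passed as a URL-encoded `query` parameter). A sketch of building such a probe URL; the endpoint in the comment is only an example:

```python
from urllib.parse import urlencode

def ask_probe_url(endpoint: str) -> str:
    """Build a GET URL for a trivial ASK query, usable as a liveness check.

    ASK {} matches the empty graph pattern, so a healthy endpoint can
    answer it with minimal work.
    """
    return endpoint + "?" + urlencode({"query": "ASK {}"})

# e.g. ask_probe_url("https://dbpedia.org/sparql")
```

Issuing this URL periodically and recording success or failure is essentially the monitoring described above, with all the same limitations.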