Sunday, 8 February 2015

What is the most secret way to remotely host content on the Internet?



By secret, I mean discoverable by as few other entities as possible outside of you and whoever you want to know about and access that content.


The stipulation for 'remote' is because the content must have 100% uptime, for reliable 24/7 asynchronous accessibility (across the sharing parties). This means that even if it were encrypted / steganographed / anonymized, P2P sharing of locally-hosted content is out of the question, due to too much downtime. (All parties need to be free to go offline at any time, and none should have to act as a server.)


I have the following candidates, ranked from lowest to what I so far believe is the highest level of secrecy for hosted Internet content. Secrecy means secret from the NSA and any other entity in the world.


(Obviously, a regular domain-name website, no matter how secretly shared, is way out of the question, due to the extreme publicness of everything to do with an ICANN domain: the act of domain registration itself, DNS servers, unavoidable unencrypted requests when accessing the site, and the snoopability and crawlability of all that typically unencrypted DNS information and traffic.)


CANDIDATES:



  • Tor hidden service. Weaknesses: 1. the content is still hosted on a central server, which may have security vulnerabilities no matter how much you harden it (especially without access to the hardware), and which is noticeable to the server's ISP: even if the entire server is encrypted, its existence is still 'known', particularly from its connection traffic to the IPs that access it (i.e. Tor relays), which gives an idea of what it may be; and 2. Tor hidden services have their own version of 'DNS', a hierarchical system of Hidden Service Directories, which learn the site's onion address when its descriptor is first published to them (by necessity), and again whenever any user looks it up. Conclusion: out of the question. Tor hidden services are highly public, despite being sensationally called the 'dark web'.


(i2p eepsites fall into the same category as Tor hidden services, in that they are centralized and typically referenced through a form of index / 'domain' as well - though I do not know whether it is possible to have secret eepsites that other relays are unaware of; if it is, that would make i2p a notch up from Tor at least.)
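To make the HSDir 'DNS' point concrete: a (v2) onion address is derived purely from a hash of the service's public key, so any directory that receives the descriptor learns the address. A minimal sketch in Python - the key bytes below are a made-up stand-in, not a real DER-encoded RSA key:

```python
import base64
import hashlib

def onion_address(pubkey_der: bytes) -> str:
    # v2 .onion address: base32 of the first 80 bits (10 bytes)
    # of the SHA-1 digest of the service's DER-encoded public key.
    digest = hashlib.sha1(pubkey_der).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"

# Stand-in bytes for illustration only; a real hidden service
# hashes its actual 1024-bit RSA identity key.
fake_key = b"not a real DER-encoded key"
addr = onion_address(fake_key)
print(addr)  # a 16-character base32 label followed by ".onion"
```

The point is that the address is deterministic from the key: once a descriptor reaches an HSDir, that directory's operator knows the site exists.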




  • Raw TCP/IP server hosting (like the 'warez scene': FTP, a dedicated SSH server, etc. - no DNS or other name lookup / logging involved; a user has to know the direct IP address to connect). Weaknesses: 1. the content is still hosted on a central server, and thus not terribly secret from its ISP, with all the usual security / secrecy weaknesses of centralized hosting; and 2. the server is still quite visible to the NSA or other sniffers, even if its traffic looks like arbitrary SFTP/SSH connections. Although the content itself is secret, the correlation between the server's location (one point of 'discovery') and its users (another), or their method / protocol of access, is highly observable, and there is no built-in protection to stop privacy-compromising IP addresses from connecting to it - which may hint at what the content is, or give the NSA an avenue to stumble upon the server's existence in the first place by surveillance linkage. This is unlike Tor hidden services, which require 'onion'-encrypted obfuscation of IP addresses to access the resource at all. Conclusion: not secret enough.
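As an illustration of what stays observable, this is roughly the metadata a passive on-path observer records for every flow, even when every payload byte is SSH ciphertext (the field names are mine, not from any real capture tool):

```python
from dataclasses import dataclass

@dataclass
class FlowRecord:
    # Visible to any on-path observer, encryption or not:
    src_ip: str      # the user (the privacy-compromising endpoint)
    dst_ip: str      # the server's fixed location
    dst_port: int    # port 22 already suggests SSH/SFTP
    bytes_sent: int  # rough size of what moved
    start_time: float

# Example addresses drawn from the RFC 5737 documentation ranges.
flow = FlowRecord("203.0.113.7", "198.51.100.9", 22, 48213, 1423400000.0)
# The payload is hidden, but this record alone links a user to the
# server; repeated records build exactly the 'surveillance linkage'
# described above.
print(flow)
```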




  • Freenet freesite. Content hosting (whether a simple HTML site, files, or both) is 100% decentralized: content is stored in a network-wide DHT, split into encrypted pieces across many Freenet users' data stores, and only reassemblable into the original uploaded data by a requester who knows its content hash key, which maps out which pieces to fetch from the different Freenet users who (unwittingly) hold them. Advantages: fully decentralized hosting (avoiding the central-server weakness of even raw TCP/IP hosting), and a built-in distance between the requester's IP and the storage of the content itself. What I am unsure of: just as HSDirs collectively know of every .onion site in existence (and it is easy to become an HSDir), can Freenet nodes also harvest the content hash keys of otherwise quietly uploaded content? It may be serverless and very anonymous, but will the existence of the content itself still be discoverable by other node operators passing along and storing its key?
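The content-hash-key mechanism can be sketched with a toy. This is not Freenet's actual implementation - the XOR keystream stands in for real AES encryption, and real CHKs split data into fixed-size blocks - but it shows the principle: nodes store ciphertext addressed by a hash they cannot decrypt, and only someone holding the full key can recover the plaintext.

```python
import hashlib

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher (encrypt == decrypt): XOR against a
    # SHA-256-derived keystream. Illustration only, not secure AES.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def insert(content: bytes, store: dict) -> tuple:
    # Encrypt under a key derived from the plaintext itself, then
    # store the ciphertext under the hash of the *ciphertext*
    # (the routing key other nodes see and serve blindly).
    crypto_key = hashlib.sha256(content).digest()
    ciphertext = xor_stream(content, crypto_key)
    routing_key = hashlib.sha256(ciphertext).digest()
    store[routing_key] = ciphertext
    return routing_key, crypto_key   # together: the 'CHK' you share

def fetch(routing_key: bytes, crypto_key: bytes, store: dict) -> bytes:
    return xor_stream(store[routing_key], crypto_key)

store = {}
chk = insert(b"my secret freesite", store)
print(fetch(*chk, store))  # round-trips to the original bytes
```

Note the design choice: nodes holding `store[routing_key]` see only ciphertext, so possession alone reveals nothing about the plaintext - but the routing key itself is still visible to them, which is exactly the harvesting worry raised above.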




It may simply be impossible to have 'undiscoverable' data when other people carry it for you (which is unavoidable if you are to communicate the data over the Internet at all).


But if layered / end-to-end encryption can obfuscate what the Freenet keys themselves ARE as they are being requested - e.g. an HTTPS-like tunnel 'coating' of the requests themselves, so that the whole request-and-retrieval process is itself automated and encrypted, only the requester can know what key was asked for, and every node merely transports the request as opaque ciphertext - then maybe it can be truly undiscoverable communication, even by the NSA.
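That idea can be sketched as onion-style layering of the request itself. Again a toy, not any deployed protocol: the XOR keystream stands in for real per-hop symmetric ciphers, the hop keys are assumed to be pre-shared, and a real design would also carry per-hop routing information.

```python
import hashlib

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher (encrypt == decrypt); illustration only.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def wrap(request: bytes, hop_keys: list) -> bytes:
    # Add one encryption layer per hop, last hop's layer innermost.
    msg = request
    for key in reversed(hop_keys):
        msg = xor_stream(msg, key)
    return msg

hop_keys = [b"key-shared-with-relay-A", b"key-shared-with-relay-B"]
request = b"GET <some-content-key>"   # hypothetical placeholder request
onion = wrap(request, hop_keys)

# Relay A peels its own layer but still sees only ciphertext:
after_a = xor_stream(onion, hop_keys[0])
# Only after the final layer is removed is the requested key visible:
recovered = xor_stream(after_a, hop_keys[1])
print(recovered == request)
```

Each intermediate node learns only that *a* request passed through, never which key was requested - which is the property the paragraph above is reaching for.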


Assume opsec and local system security are good, and that the NSA isn't targeting you in the first place (via side channels or targeted network attacks); this is simply about which hosting system / protocol itself provides the best secrecy currently available.


Lastly - the content doesn't have to be web content or require a browser; files and non-web protocols are fine. Indeed, FTP is part of a candidate I have mentioned above.





1 comment:

  1. Englaise:
    You can't choose to be 'discoverable': I have opted for that ever since Drive replaced Blogger and stopped my papers from being available on the net. You can select that setting, but your papers still don't show up.
