What will happen if somebody tries to host illegal content on the ThreeFold Grid?

Hello, on behalf of Russian-speaking community members, I wanted to open this thread.

What will happen if somebody tries to host illegal content on the ThreeFold Grid?
Is it possible to host illegal content on the ThreeFold Grid? If not, can you please explain how the filtering mechanism will work? What happens to content that is legal in one country and illegal in another?

2 Likes

I’ll give my two cents on this (as it’s a pretty broad question, and also a sensitive one):

The ThreeFold Grid is simply the lower capacity layer, hosting data in a way that there is a full separation between the hardware provider side and the capacity user who generates content on the grid. So yes, a user can post any content onto the TFGrid, including content that is considered ‘illegal’ in certain jurisdictions. Who are we (and again, “we” doesn’t really exist here, as everything is decentralised) to judge which content is illegal and which one isn’t?
Now, one action that could be taken is blocking IP addresses in case a jurisdiction doesn’t agree with certain content (this can be implemented for both IPv4 and IPv6 addresses).
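
To make that concrete, here is a minimal sketch (purely illustrative, not an existing ThreeFold mechanism) of how such a blocklist lookup could treat IPv4 and IPv6 addresses the same way; the networks and the `is_blocked` helper are invented for the example:

```python
# Illustrative only: a jurisdiction-side blocklist of networks, with a lookup
# that works identically for IPv4 and IPv6. Addresses are documentation ranges.
import ipaddress

BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),      # example IPv4 range
    ipaddress.ip_network("2001:db8:bad::/48"),   # example IPv6 range
]

def is_blocked(destination: str) -> bool:
    """Return True if the destination address falls inside any blocked network."""
    addr = ipaddress.ip_address(destination)
    # Membership checks across address families simply return False.
    return any(addr in net for net in BLOCKED_NETWORKS)

print(is_blocked("203.0.113.7"))       # True
print(is_blocked("2001:db8:bad::1"))   # True
print(is_blocked("198.51.100.1"))      # False
```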

There might however be some changes in how people behave in the future. I imagine there will be tooling in the future to allow people’s identity to be revealed, at least to some extent (real personal info can of course always stay private). And some jurisdictions might act on that, e.g. filter out content for which the originator’s identity can’t be revealed. And if people agree to reveal their identity, I imagine they aren’t eager to post illegal content either, no?

1 Like

I do wonder if it is not worth creating a mechanism whereby workloads / data can be expunged from the network.

This is obviously not something that can be done lightly or unilaterally, but a scenario jumps to mind:

Someone is using ThreeFold to store, I don’t know, highly illegal pornography.
The police in some jurisdiction want this content removed (I don’t mean “want” in a lighthearted way; all processes have been followed and the data must be removed).

I think a mechanism where the DAO can exercise a vote and, on “success”, the content is removed could be a suitable solution (roughly sketched at the end of this post).

Secondly, and again I’m not saying this lightly, but given this is a “community” and the intent is to be for the greater good, the ability to expunge data / workloads that are completely against the ethos of ThreeFold from the network (by DAO vote) doesn’t seem unreasonable.

In terms of “who decides” what is legal and where, that’s not the primary concern. But in terms of removing genuinely damaging content by vote and decentralised due process, I think we should consider it.
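
To illustrate the vote idea above, here is a rough, hypothetical sketch of a removal proposal with a quorum and an approval threshold. The class name, the threshold values and the voter names are assumptions for the example, not an existing DAO feature:

```python
# Hypothetical sketch of a DAO removal vote: token holders (or farmers) vote on
# expunging a specific workload, and removal only proceeds if both a quorum and
# an approval threshold are met. All values here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class RemovalProposal:
    workload_id: str
    reason: str
    votes: dict = field(default_factory=dict)  # voter address -> True (yes) / False (no)

    def cast_vote(self, voter: str, approve: bool) -> None:
        self.votes[voter] = approve  # the last vote per voter counts

    def passes(self, eligible_voters: int, quorum: float = 0.5, threshold: float = 0.66) -> bool:
        """Proposal passes if enough voters took part and enough of them approved."""
        if not self.votes or eligible_voters == 0:
            return False
        turnout = len(self.votes) / eligible_voters
        approvals = sum(self.votes.values()) / len(self.votes)
        return turnout >= quorum and approvals >= threshold

proposal = RemovalProposal("contract-42", "court-ordered takedown")
proposal.cast_vote("voter-a", True)
proposal.cast_vote("voter-b", True)
proposal.cast_vote("voter-c", False)
print(proposal.passes(eligible_voters=4))  # True: 75% turnout, ~67% approval
```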

1 Like

This is a very difficult thing that we should all hash out. Once the door to censorship opens… it can be hard to close. Is the data being sharded such that any individual 3Node holds no complete content, “illegal” or otherwise? I think this is more for localized policing agencies to address, specifically controlling/accessing the data in a specific location where it is not proper. The other argument can be made for not becoming the de facto system attracting immoral, unethical content.

My return question with regard to this is: how does this work in today’s world? We have thousands of petabytes in public and private storage solutions, and I am pretty sure that storage space has “less ideal” content stored on it as well, as determined by the local jurisdiction.

Just to be clear, I am not a lawyer nor an expert in this matter; I’m sharing what I know of the current situation, as I worked for managed service providers in a previous life:

So as far as I know, it starts with finding the person or entity that is (allegedly, at that point in time) not following the law. Then a search for evidence leads to investigation of real-world and digital data points. It is at that point that the owner of the (cloud) infrastructure is identified and more detailed investigations are conducted in datacenters/on servers.

For the grid I believe the exact same process applies; however, there is more than one “capacity” provider in play … So nothing new under the sun, IMHO.

2 Likes

Cloud providers do have terms and conditions though, and they can also unilaterally remove data and workloads and ban accounts when compelled to do so.

This does not happen lightly, but they have the tools and mechanisms in place to remove genuinely problematic workloads / content etc when required.

Another, more detailed question from the Russian-speaking community:
What happens if, in some autocratic countries, special services try to find out who has deployed content that is morally compliant but disliked by the autocratic government?
On the other side, what will happen if somebody hosts not-so-nice content, like an illegal drug or gun shop?
Also, who is responsible for keeping a deployment/node/server? Can law enforcement come to the node owner?

Maybe guns and drugs shouldn’t be regulated anyway. Regulation just leads to organized crime.

When you sign up to use the TF Grid, you sign the (blockchain-based) T&Cs of the grid. The same goes for the farmer that contributes capacity. You can find the T&Cs here in the library.

With regard to content that is in breach of regulations, the same investigation methods and rules apply as in any other online service/cloud. This is no different.

How are the Ts and Cs enforced on TFGrid?

Which mechanisms are planned / currently available there to enforce them?
How are these mechanisms triggered?
How are these mechanisms executed?

There is a big difference in this regard I’m afraid.

T&Cs are never enforced. The consumer (or farmer) agrees with them and is then held responsible for acting in line with the T&Cs. When a person / entity breaks them, they are liable and open to being pushed off the grid, or prosecuted.

With large central organizations there is some “policing and enforcing” capability, as they own and control everything end to end (including your data ;-)), but with a decentralized system this is different. We are all responsible for keeping an eye on what is happening, and we all have the capability to highlight and make abuse visible to authorities. One of those authorities is the Decentralized Autonomous Organization (DAO) that ThreeFold is creating.

BTW - just for the record I am not a lawyer so this is very quickly going beyond my knowledge.

Terms and conditions are absolutely enforced by other providers / CoLos and need to be enforceable here.

If they are violated, accounts are closed / banned, workloads removed, hardware shut down / removed from CoLo datacenters, etc.

If there is only a “please don’t run dodgy workloads” policy on our network, with no way of booting offenders, then:

a: The statement “this is no different from other providers” does not hold water.
b: It can and will likely be misused.
c: We are saying that we are happy to take the risk.

My point in my first post was exactly a suggestion as to how the DAO could potentially be empowered to “democratically” protect the grid.

I would suggest approaching legal counsel for input on this matter.

I will obviously be happy with whatever the consensus is, but right now it feels like either:

  • Not enough thought / research has been done on this particular aspect (including obtaining actual legal opinion)
  • ThreeFold is saying: if there are illegal workloads running, those workloads / data cannot be removed from the grid unless the offenders themselves actually do so (which, even when identified and prosecuted, they may still not do) or their reservations run out.

Enforcement we can do - through the DAO, as you said, just like a decentralized CoLo provider could. But what a CoLo provider cannot do is search for and find illegal content; they have no rights / access to hosted machines.

Same with us - we can act, but only in reaction to things being brought to the attention of the DAO.

I said I am not the expert on the legal side, so I will find the expert and get some further details. The DAO will be able to act, as stated. There is always more that needs to be looked at; this is certainly an area that clearly needs more discussion and attention.

1 Like

I’m absolutely in agreement that we should not search for content 🙂

I would just like to understand our options once something has been identified. 🙂

I’d like to add a few thoughts on this question, which comes up from time to time and I’ve thought about quite a bit.

Liability for farmers
Farmers who don’t provide public IPs for their nodes or for workloads running on them will be very difficult to link to content their nodes are storing. That’s true even if it’s stored in a plain way, but in the case of Quantum Safe Storage the data simply isn’t there. Taken in combination with the fact that prosecution for illegal content involves demonstrating intent in many jurisdictions, I think that farming without public config is generally a low risk activity (not legal advice).
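
For intuition on the “data simply isn’t there” point: Quantum Safe Storage uses its own dispersal algorithm, but even a toy XOR-parity scheme (below; an illustration only, not the real QSS code) shows the idea that each node stores only a fragment, and the original can still be rebuilt when one fragment goes missing:

```python
# Toy illustration of dispersed storage (NOT the actual Quantum Safe Storage
# algorithm): data is split into k fragments plus one XOR parity fragment and
# spread over k+1 nodes. No single node holds the complete content, and any
# k of the k+1 fragments are enough to rebuild the original.
from functools import reduce

def disperse(data: bytes, k: int = 3):
    """Split data into k equal-size fragments plus one XOR parity fragment."""
    size = -(-len(data) // k)  # ceiling division
    shards = [data[i * size:(i + 1) * size].ljust(size, b"\0") for i in range(k)]
    parity = bytes(reduce(lambda x, y: x ^ y, column) for column in zip(*shards))
    return shards, parity

def reconstruct(shards, parity, missing: int, original_len: int) -> bytes:
    """Rebuild the original even though the fragment at index `missing` is gone."""
    survivors = [s for i, s in enumerate(shards) if i != missing]
    rebuilt = bytes(reduce(lambda x, y: x ^ y, col) for col in zip(parity, *survivors))
    restored = list(shards)
    restored[missing] = rebuilt
    return b"".join(restored)[:original_len]

data = b"example content spread across several 3Nodes"
shards, parity = disperse(data)
damaged = list(shards)
damaged[1] = None  # simulate the node holding fragment 1 going offline
print(shards[0])   # a single fragment, not the whole file
print(reconstruct(damaged, parity, missing=1, original_len=len(data)) == data)  # True
```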

One thing I’m not sure about is how this looks over the Yggdrasil/Planetary network, wherein all nodes can serve public functions. That is, can the Yggdrasil IPs of workloads be traced to the nodes they’re running on and the farmers who own them?

Role of gateways and public nodes
Nodes with public config and the owners of their IP addresses can be directly linked to content served from them. This is where I see a real possibility of law enforcement linking some illegal activity to a farmer. What can a farmer do in this case?

Farmers with public configs can probably take steps like blocking traffic to specific IPs that are being leased to workloads, or to ports used by gateways, without interrupting other workloads on their nodes. Of course, whoever is being investigated can simply spin up replacement workloads on another farm with public config.
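
As a thought experiment on what such a farmer-side step could look like (not a documented Zero-OS or ThreeFold procedure), here is a small sketch that emits ordinary nftables commands for flagged leased IPs and gateway ports. The “inet filter” table and the “forward”/“input” chains are assumptions about the node’s firewall layout, and whether such rules could be applied without interfering with Zero-OS networking is an open question:

```python
# Thought experiment only: given public IPs leased to a flagged workload and
# the gateway ports it uses, emit nftables commands for a farmer to review.
# The "inet filter" table and "forward"/"input" chains are assumed to exist.
import ipaddress

def blocking_rules(flagged_ips, gateway_ports):
    rules = []
    for ip in flagged_ips:
        family = "ip6" if ipaddress.ip_address(ip).version == 6 else "ip"
        rules.append(f"nft add rule inet filter forward {family} daddr {ip} drop")
    for port in gateway_ports:
        rules.append(f"nft add rule inet filter input tcp dport {port} drop")
    return rules

for rule in blocking_rules(["203.0.113.7", "2001:db8::7"], [4443]):
    print(rule)
```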

How to enforce?
Then there’s the question of how enforcement could be carried out by the DAO to actually remove offending workloads. This would probably involve modifying TF Chain in such a way that nodes would proceed to decommission said workloads. The user’s account could also be blocked from making new workloads, TFT frozen, etc.
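
A rough, hypothetical sketch of what that chain-driven decommissioning could look like is below. None of the class or method names correspond to real TF Chain or Zero-OS APIs; they are stand-ins to make the flow concrete:

```python
# Hypothetical sketch (not an existing TF Chain or Zero-OS feature): a node
# periodically reads the contract IDs the DAO has voted to remove and
# decommissions any matching local workloads. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Workload:
    contract_id: int
    name: str

@dataclass
class Node:
    workloads: list = field(default_factory=list)

    def decommission(self, workload: Workload) -> None:
        # In reality this would stop the workload and wipe its data.
        self.workloads.remove(workload)
        print(f"decommissioned {workload.name} (contract {workload.contract_id})")

def enforce_dao_decisions(node: Node, banned_contract_ids: set) -> None:
    """Remove every local workload whose contract the DAO has voted to expunge."""
    for workload in list(node.workloads):
        if workload.contract_id in banned_contract_ids:
            node.decommission(workload)

node = Node([Workload(41, "blog"), Workload(42, "flagged-site")])
enforce_dao_decisions(node, banned_contract_ids={42})
print([w.name for w in node.workloads])  # ['blog']
```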

But how do you actually stop someone from continuing to use the Grid and make new deployments? That’s a much more difficult proposition, when a single individual or group can control many accounts with no obvious link between them. What would a censorship mechanism actually look like on the Grid?

1 Like

There is one decentralised model that could probably serve as a solution:

  1. Farmers have their own T&Cs and are responsible for blocking unwanted content from their nodes. Farmers can block certain users or certain groups of users from reserving their hardware.
  2. Since all workloads are tied to end users and there is no anonymity on the ThreeFold Grid, responsibility can, if needed, be traced back to content owners.
  3. Anyone can create a public list of bad users and apply it as a block filter to their farm, or publish this list to others. In this way, over time, community authorities will emerge who can receive complaints about certain bad actors and blacklist them for anyone who follows their recommendations (a rough sketch of this follows below).

What do you think about such a proposal?
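
A minimal sketch of point 3, under the assumption that block lists are simply published sets of account/twin IDs that a farmer chooses to subscribe to (the list names and IDs are invented for the example):

```python
# Sketch of a farmer-side block-list subscription (hypothetical, not an
# existing Grid feature): the node refuses deployment requests from any
# twin ID that appears on a subscribed list.
SUBSCRIBED_BLOCKLISTS = {
    "community-abuse-list": {1001, 1002},
    "my-own-list": {2077},
}

def accepts_deployment(twin_id: int) -> bool:
    """Return False if the requesting twin ID appears on any subscribed block list."""
    return all(twin_id not in ids for ids in SUBSCRIBED_BLOCKLISTS.values())

print(accepts_deployment(1002))  # False: listed on the community abuse list
print(accepts_deployment(3000))  # True: not on any subscribed list
```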

I think having farmers responsible for blocking unwanted content on their nodes is more responsibility than farmers are willing to take on.

3 Likes

Not actually the responsibility, but a right to choose. The idea is that farmers can decide for themselves which filter lists to follow. Today in most countries, if something happens, the authorities ultimately come to the capacity provider and ask them to take down some piece of content if they cannot find the content owner. So this idea is just to give a farmer the power to do so if it is required by authorities.

Actually, in the current Grid implementation, there is anonymity for both farmers and deployers. Farmers can be linked back to their public IPs, but these could also be obfuscated with a VPN.

As @Geert was also suggesting, adding some kind of “Know Your Counterparty” system might make sense in the future. I think we’ll see decentralized identity protocols that can be inclusive and privacy preserving while providing some levels of assurance that a given digital identity is linked to a real person.

I think giving farmers the power to choose who deploys on their nodes makes sense. As I outlined above, the greatest responsibility will fall on farmers who provide public access points. Of course, this opens up other questions, like, will farmers then choose to block deployers who consume a lot of bandwidth, for example?

Any measures taken in this regard should also be weighed with their potential to concentrate censorship power. Giving farmers the ultimate choice helps here, but it’s possible that some entity who becomes the default blacklisting provider gains too much control over who uses the Grid and what they’re able to publish.

1 Like

We don’t want ThreeFold associated with criminal activity; it would be bad for adoption. If we become known for servers hosting malware and CP, that will not be good. There are some activities, such as those just mentioned, that are not political, and their removal is not considered censorship. These items should be actively purged when instruction to do so is given by authorities. Perhaps a simple vote by the DAO would be a way to remove content. Voters would have an incentive to keep the network clean. I know that if I suspected CP was being hosted on my servers and there was no avenue for the reporting and removal of that material, I would destroy my servers and leave the network.