synthetikal.com Forum Index


open access to information
synthetikal.com Forum Index -> Ethos
auttie

Joined: 02 Jun 2005
Posts: 18
588.80 Points

Sat Jun 04, 2005 12:18 pm

what about entropy? (http://entropy.stop1984.com) why not utilize tools we already have?
think... if we had 10 badass servers, or maybe 200 cable users nationwide running entropy... or if every member on this site went and downloaded entropy right now and ran it all the time, many of our hosting difficulties could be solved. they have come up with an entropy forum system as well. instead of trying to search out that cure-all, let's make all that information free: let's set up anonymous internets, let's set up anonymous secure communication servers, get rid of your insecure windows machines. let's all move into a new age and take advantage of what we have. i know that i do my part on a regular basis, and i'm always willing to teach... half the time that's the only way i can learn.
loki
guinea pig
Joined: 09 Mar 2005
Posts: 391
14167.88 Points

Sat Jun 04, 2005 4:17 pm

well well, i have been following the p2p net business for some time now, and was terribly disappointed that the only option available was bloody freenet... this looks good, and wouldn't you know it, it's in the gentoo portage system... now building... i feel that if we could get 20-30 hardcore folks, online 24/7 or at least 90% of the time, to run this, it really could be quite something... i'm gonna get my head around this thing so i can help others get it going Very Happy
loki
guinea pig
Joined: 09 Mar 2005
Posts: 391
14167.88 Points

Sat Jun 04, 2005 5:44 pm

goddammit!

it uses SHA-1 hashes. a method for finding SHA-1 collisions much faster than brute force was announced earlier this year, so the hash is now considered broken for security purposes, and anything that relies on it is suspect.

just thought i should let you all know that.

o well, back to using my webserver for the time being...

i really should get around to building an application which does exactly what we need: a forum, chat and file repository over p2p, all encrypted with nice strong encryption and distributed. it's been on my list of things to do for a few months now; i'm really just waiting on sorting out my medications, and then maybe i'll be able to do the work... not sure how to implement it, but my feeling at this point is to go with php, mysql and apache, running it as a small cut-down webserver feeding off the mysql database.

one of the problems with these systems is that since the cache is on your computer, in theory you can inspect it. even if it's encrypted, i don't think it's really possible to entirely lock it down against being opened up, when it's gotta be opened up to send out files anyway...


Another option is a cluster of webservers with a distributed database system that is kept separate... aargh! there's really no easy way to do it.

Well, a load-balancing system and round-robin DNS is a start, and keeping the databases in step with each other is not that hard: just a periodic synchronisation of the mysql data directory containing the db. i'm not sure exactly how that would work, but i guess if there was a function to dump an sql backup of the database (could be done as a special web address with authentication), each server could diff that backup against the other server's copy, and vice versa; if both do it at the same time, they end up synchronised.

for security, run all of the server nodes on tor and use a hidden service address, which stops the data from leaking out of the tor network; then clients only need to run tor to access it. i personally have a good 10gb of spare bandwidth per month, and i'm sure other users do too. if people would like to set up a little cabal to do it i would love to be involved. for security, the database can be stored on an encrypted volume, set up so the password has to be entered when powering up the machine to unlock it, with a killswitch of some sort to unmount it...
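the dump-and-diff idea could be sketched roughly like this (a minimal sketch in python, with rows modelled as tuples in sets; everything here is invented for illustration, and a real version would have to fetch and parse actual mysqldump output rather than compare in-memory sets):

```python
# Sketch of the periodic synchronisation idea: each server exposes a
# dump of its rows behind an authenticated address; a peer diffs that
# dump against its own copy and inserts whatever it is missing.
# Rows are modelled as hashable tuples here, not real SQL.

def missing_rows(local_rows, remote_rows):
    """Rows the peer has that we don't -- the 'diff' step."""
    return set(remote_rows) - set(local_rows)

def reconcile(server_a, server_b):
    """Two-way sync: compute both diffs first, then apply them,
    so both servers end up with the union of their rows."""
    a_needs = missing_rows(server_a, server_b)
    b_needs = missing_rows(server_b, server_a)
    server_a.update(a_needs)
    server_b.update(b_needs)

a = {("post", 1, "hello"), ("post", 2, "entropy thread")}
b = {("post", 1, "hello"), ("post", 3, "tor hidden services")}
reconcile(a, b)
assert a == b  # both now hold all three posts
```

note that computing both diffs before applying either is what makes the "both do it at the same time" part safe in this toy version.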

it would be a worthy project... not quite p2p, but at this point in time writing a whole new application would take ages, whereas rigging something like this up wouldn't be hard: the only coding needed is the synchronisation system, the rest is just configuration and administration work. the payoff for people contributing servers would be a dramatic acceleration of the board on their local network.

Just thinking some more: for data synchronisation, the data could first be inserted into the database on the server being accessed, and then a second program which monitors database access could, whenever new data is added, send it out to all the other servers' databases, which insert it in turn. a quick hack would be to mess around with the forum code and change all the database writes to call a separate script which handles inserting into the local database and sending the update out to the other databases.
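the write-interception hack might look something like this (a sketch only: `write_post`, `send_to_peer` and the peer addresses are all invented names, and the network call is faked with a local insert -- a real version would POST the row over tor to each peer):

```python
# Sketch of intercepting forum writes: the forum code calls
# write_post() instead of inserting directly; the wrapper inserts
# into the local database first, then fans the same write out to
# every peer so all copies stay in step.

PEERS = ["node2.example", "node3.example"]  # hypothetical addresses

def insert_local(db, row):
    db.append(row)

def send_to_peer(peer, row, peer_dbs):
    # Stand-in for a network call to the peer's insert script.
    insert_local(peer_dbs[peer], row)

def write_post(db, row, peer_dbs):
    """Insert locally first, then forward the write to all peers."""
    insert_local(db, row)
    for peer in PEERS:
        send_to_peer(peer, row, peer_dbs)

local = []
peers = {p: [] for p in PEERS}
write_post(local, ("auttie", "what about entropy?"), peers)
assert all(db == local for db in peers.values())
```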


well, if others are interested in doing something like this, come to my silc server chatroom and let's organise it. it would only take about 20 of us to match, and more than match, the bandwidth limit of this server.

silc server is at: m0c.no-ip.org port 706 the channel is 'monastery'
loki
guinea pig
Joined: 09 Mar 2005
Posts: 391
14167.88 Points

Sat Jun 04, 2005 6:06 pm

it occurs to me that it would make sense to split server contributions into two types: servers which only redirect users to running servers containing the bbs, files etc., and servers which do the actual serving. the redirector server(s) could be run on regular web hosts with round-robin dns; they would only contain the code which keeps track of live servers, and the servers could be queried for their load, or some other means of arbitrating access could be used.

It could even be possible to split the serving load three ways: webserver/database servers, redirector servers, and file servers. file servers would forward newly uploaded files on to the other file servers; the redirector servers could also run a page which tells the webserver what address prefix to put on image and data files, distributing the load. there would only need to be a few redirector servers, and their load would be minimal compared to the webserver/database systems and file servers.
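the redirector logic itself might be as simple as this sketch (server names and the 0.0-1.0 load scale are made up; a real redirector would issue an http redirect to the chosen address):

```python
# Sketch of a redirector: keep a table of known servers with their
# last-reported liveness and load, and send the client to the
# least-loaded server that is still answering.

def pick_server(servers):
    """servers: dict of address -> (alive, load).
    Returns the live address with the lowest load."""
    live = {addr: load for addr, (alive, load) in servers.items() if alive}
    if not live:
        raise RuntimeError("no live servers to redirect to")
    return min(live, key=live.get)

status = {
    "node1.example": (True, 0.80),
    "node2.example": (True, 0.25),
    "node3.example": (False, 0.00),  # down, so never chosen
}
assert pick_server(status) == "node2.example"
```

this also answers the dead-server worry below: the redirector simply skips nodes that fail their liveness check, which plain round-robin dns on its own can't do.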

the question of redirector servers all depends on how round-robin dns works; i'm not absolutely certain whether it is usable in a situation where some of the servers may go down now and then, or what to do in that situation to send the user to another address that *is* working...

it's a big project for sure, but i know it could be done, and it would be rather cool to have it running, especially over ssl AND as hidden services on the tor network... it might be a year or two before something like this is fully up and running, but that should be timely enough: with the way some things are heading out there, it would be not just desirable but necessary.
loki
guinea pig
Joined: 09 Mar 2005
Posts: 391
14167.88 Points

Sat Jun 04, 2005 6:35 pm

some info about a load-balancing system under development which could perform this function:

http://eddie.sourceforge.net/what.html
Quote:
Eddie is a high availability clustering tool. It is an open source, 100% software solution written primarily in the functional programming language Erlang (www.erlang.org) and is available for Solaris, Linux and *BSD.

Eddie provides advanced automatic traffic management and configuration of geographically distributed server sites, consisting of one or more Local Area Networks.
Overview

At each site, certain servers are designated as Front End Servers, (shown in blue). These servers are responsible for controlling and distributing incoming traffic across designated Back End Servers (shown in black), and tracking the availability of Back End Web Servers within the site. Back End Servers may support a range of Web servers, including Apache.

Currently, Eddie consists of two main software packages:

* The Enhanced DNS server which provides load balancing and monitoring of site accessibility for geographically distributed web sites. This gives round the clock access to the entire available capacity of the web site, no matter where it is located.
* An Intelligent HTTP Gateway which provides site based
o Load Balancing,
o Reliability,
o Scalability, and
o Quality of Service.

The Eddie white papers describe the need for products such as Eddie, and outline the Eddie approach.


This would solve the distribution of the servers; all that would be needed, beyond setting eddie up (and preferably a tor server as well), is writing the code to distribute the database and file storage, which becomes a lot easier with something like eddie running. i'd say eddie could be queried for a list of live node addresses to distribute new database data and files to, and each server should, upon re-entering the network, synchronise with another (preferably nearby) server via a backend web-based system.
All times are GMT + 5.5 Hours
Page 2 of 2

 



Powered by phpBB 2.0.11 © 2001, 2002 phpBB Group

Igloo Theme Version 1.0 :: Created By: Andrew Charron