hdfs-over-ftp-hadoop-0.20.0
Category: Java programming
Development tool: Java
File size: 3529 KB
Downloads: 141
Upload date: 2010-12-08 00:23:39
Uploader: facebook
Description: Implements an FTP service on top of the Hadoop distributed file system (HDFS).
File list:
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\ftp.jks (1347, 2009-02-26)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\hdfs-over-ftp.conf (376, 2009-06-24)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\hdfs-over-ftp.sh (753, 2009-03-18)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\ftplet-api-1.0.0.jar (22827, 2009-03-16)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\ftpserver-core-1.0.0.jar (271249, 2009-03-16)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\hadoop-core-0.20.0.jar (2585066, 2009-06-22)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\hdfs-over-ftp-1.0-SNAPSHOT.jar (20708, 2009-06-24)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\jcl-over-slf4j-1.5.2.jar (16750, 2009-03-17)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\log4j-1.2.14.jar (367444, 2009-02-27)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\mina-core-2.0.0-M4.jar (634627, 2009-03-16)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\slf4j-api-1.5.2.jar (17384, 2009-02-27)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\lib\slf4j-log4j12-1.5.2.jar (9501, 2009-02-27)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\log4j.xml (777, 2009-03-18)
hdfs-over-ftp-hadoop-0.20.0\hdfs-over-ftp\users.conf (928, 2009-03-30)
hdfs-over-ftp-hadoop-0.20.0\pom.xml (3886, 2009-06-24)
hdfs-over-ftp-hadoop-0.20.0\src\main\assembly\distr.xml (726, 2009-03-27)
hdfs-over-ftp-hadoop-0.20.0\src\main\bin\hdfs-over-ftp.sh (753, 2009-03-18)
hdfs-over-ftp-hadoop-0.20.0\src\main\conf\ftp.jks (1347, 2009-02-26)
hdfs-over-ftp-hadoop-0.20.0\src\main\conf\hdfs-over-ftp.conf (376, 2009-06-24)
hdfs-over-ftp-hadoop-0.20.0\src\main\conf\log4j.xml (777, 2009-03-18)
hdfs-over-ftp-hadoop-0.20.0\src\main\conf\users.conf (928, 2009-03-30)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp\HdfsFileSystemFactory.java (529, 2009-03-30)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp\HdfsFileSystemView.java (2578, 2009-03-30)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp\HdfsFtpFile.java (10826, 2009-04-01)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp\HdfsOverFtpServer.java (6186, 2009-03-27)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp\HdfsOverFtpSystem.java (1495, 2009-03-30)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp\HdfsUser.java (5201, 2009-03-27)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp\HdfsUserManager.java (16012, 2009-04-06)
hdfs-over-ftp-hadoop-0.20.0\src\test\java\org\apache\hadoop\contrib\ftp\HdfsFileSystemViewTest.java (2685, 2009-04-08)
hdfs-over-ftp-hadoop-0.20.0\src\test\java\org\apache\hadoop\contrib\ftp\HdfsFtpFileTest.java (6737, 2009-04-08)
hdfs-over-ftp-hadoop-0.20.0\src\test\java\org\apache\hadoop\contrib\ftp\HdfsUserTest.java (627, 2009-04-06)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib\ftp (0, 2010-02-20)
hdfs-over-ftp-hadoop-0.20.0\src\test\java\org\apache\hadoop\contrib\ftp (0, 2010-02-20)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop\contrib (0, 2010-02-20)
hdfs-over-ftp-hadoop-0.20.0\src\test\java\org\apache\hadoop\contrib (0, 2010-02-20)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache\hadoop (0, 2010-02-20)
hdfs-over-ftp-hadoop-0.20.0\src\test\java\org\apache\hadoop (0, 2010-02-20)
hdfs-over-ftp-hadoop-0.20.0\src\main\java\org\apache (0, 2010-02-20)
hdfs-over-ftp-hadoop-0.20.0\src\test\java\org\apache (0, 2010-02-20)
... ...
HDFS over FTP Server
HDFS over FTP Server allows you to expose HDFS through the FTP protocol.
Server management
To start the server: ./hdfs-over-ftp.sh start
To stop the server: ./hdfs-over-ftp.sh stop
Mount under Linux
Under Linux you can mount the FTP server using curlftpfs:
sudo curlftpfs -o allow_other ftp://user:pass@localhost:2222 ftpfs
Server Configuration
The server is configured by hdfs-over-ftp.conf:
#port to connect to the ftp server
port = 2222
#ftp data ports range
data-ports = 2223-2225
#if you want to use a secure connection (FTPS protocol), enable the
#following options and provide an ftp.jks file as a keystore.
#You also need an ftp client which supports FTPS, for example FileZilla.
ssl-port = 2226
ssl-data-ports = 2227-2229
keystore-password = 333333
#path to HDFS
hdfs-uri = hdfs://croatia:9000
# has to be the user which runs HDFS
# this allows you to start the ftp server as root in order to use port 21
# and still access hdfs as a superuser
superuser = agladyshev
User Configuration
Users are configured by users.conf
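For illustration, a complete user entry might look like the following; each property is described in the list below. The username "hduser" and the home directory are made-up values, and the password shown is the MD5 hash of the word "secret":

```
ftpserver.user.hduser.homedirectory=/user/hduser
ftpserver.user.hduser.userpassword=5ebe2294ecd0e0f08eab7690d2a6ee69
ftpserver.user.hduser.enableflag=true
ftpserver.user.hduser.writepermission=true
ftpserver.user.hduser.idletime=0
ftpserver.user.hduser.maxloginnumber=0
ftpserver.user.hduser.maxloginperip=0
ftpserver.user.hduser.uploadrate=0
ftpserver.user.hduser.downloadrate=0
ftpserver.user.hduser.groups=hduser
```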
ftpserver.user.{username}.homedirectory
  Path to the home directory for the user, based on the file system implementation used.

ftpserver.user.{username}.userpassword
  The password for the user. Should be an MD5 hash.

ftpserver.user.{username}.enableflag
  true if the user is enabled, false otherwise.

ftpserver.user.{username}.writepermission
  true if the user is allowed to upload files and create directories, false otherwise.

ftpserver.user.{username}.idletime
  The number of seconds the user is allowed to be idle before being disconnected.
  0 disables the idle timeout.

ftpserver.user.{username}.maxloginnumber
  The maximum number of concurrent logins by the user. 0 disables the check.

ftpserver.user.{username}.maxloginperip
  The maximum number of concurrent logins from the same IP address by the user. 0 disables the check.

ftpserver.user.{username}.uploadrate
  The maximum number of bytes per second the user is allowed to upload. 0 disables the check.

ftpserver.user.{username}.downloadrate
  The maximum number of bytes per second the user is allowed to download. 0 disables the check.

ftpserver.user.{username}.groups
  Groups the user belongs to. Comma-separated list. The first group is the main group, i.e. created files will have that group.
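Since the userpassword property expects an MD5 hash, a hash for a new user can be generated with the JDK's built-in MessageDigest. This is a minimal sketch (the class name Md5Password is our own, not part of this project):

```java
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5Password {

    /** Returns the 32-character lowercase hex MD5 digest of the password. */
    static String md5Hex(String password) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(password.getBytes(StandardCharsets.UTF_8));
            // Left-pad with zeros to a full 32 hex characters.
            return String.format("%032x", new BigInteger(1, digest));
        } catch (NoSuchAlgorithmException e) {
            // MD5 is required to be present in every JDK.
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // Prints the value to paste into users.conf, e.g. for password "secret":
        System.out.println(md5Hex("secret"));
    }
}
```

Paste the printed value into the user's userpassword line in users.conf.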