p5-AI-XGBoost
Category: Numerical Algorithms / Artificial Intelligence
Development tool: Perl
File size: 66KB
Downloads: 0
Upload date: 2017-08-20 21:00:45
Uploader: sh-1993
Description: p5-AI-XGBoost, a Perl wrapper for the XGBoost library.
File list:
.perltidyrc (38, 2017-08-21)
Changes (157, 2017-08-21)
dist.ini (887, 2017-08-21)
examples (0, 2017-08-21)
examples\agaricus.txt.test (183611, 2017-08-21)
examples\agaricus.txt.train (742257, 2017-08-21)
examples\basic.pl (1039, 2017-08-21)
examples\capi.pl (564, 2017-08-21)
examples\capi_dump_model.pl (496, 2017-08-21)
examples\capi_raw.pl (1070, 2017-08-21)
examples\featmap.txt (3203, 2017-08-21)
examples\iris.pl (1729, 2017-08-21)
lib (0, 2017-08-21)
lib\AI (0, 2017-08-21)
lib\AI\XGBoost.pm (2331, 2017-08-21)
lib\AI\XGBoost (0, 2017-08-21)
lib\AI\XGBoost\Booster.pm (4433, 2017-08-21)
lib\AI\XGBoost\CAPI.pm (12382, 2017-08-21)
lib\AI\XGBoost\CAPI (0, 2017-08-21)
lib\AI\XGBoost\CAPI\RAW.pm (10096, 2017-08-21)
lib\AI\XGBoost\DMatrix.pm (6996, 2017-08-21)
misc (0, 2017-08-21)
misc\using_capi.c (859, 2017-08-21)
t (0, 2017-08-21)
t\00-load.t (126, 2017-08-21)
t\10-cast_arguments.t (239, 2017-08-21)
t\20-dmatrix.t (893, 2017-08-21)
weaver.ini (27, 2017-08-21)
# NAME
AI::XGBoost - Perl wrapper for XGBoost library [https://github.com/dmlc/xgboost](https://github.com/dmlc/xgboost)
# VERSION
version 0.11
# SYNOPSIS
```perl
use 5.010;
use aliased 'AI::XGBoost::DMatrix';
use AI::XGBoost qw(train);
# We are going to solve a binary classification problem:
# Mushroom poisonous or not
my $train_data = DMatrix->From(file => 'agaricus.txt.train');
my $test_data = DMatrix->From(file => 'agaricus.txt.test');
# With XGBoost we can solve this problem using 'gbtree' booster
# and as loss function a logistic regression 'binary:logistic'
# (Gradient Boosting Regression Tree)
# XGBoost Tree Booster has a lot of parameters that we can tune
# (https://github.com/dmlc/xgboost/blob/master/doc/parameter.md)
my $booster = train(data => $train_data, number_of_rounds => 10, params => {
    objective => 'binary:logistic',
    eta       => 1.0,
    max_depth => 2,
    silent    => 1
});
# For binary classification, predictions are probability confidence scores in [0, 1]
# indicating that the label is positive (1 in the first column of agaricus.txt.test)
my $predictions = $booster->predict(data => $test_data);
say join "\n", @$predictions[0 .. 10];

# Second example: multiclass classification on the classic Iris dataset
use aliased 'AI::XGBoost::DMatrix';
use AI::XGBoost qw(train);
use Data::Dataset::Classic::Iris;
# We are going to solve a multiple classification problem:
# determining plant species using a set of flower's measures
# XGBoost uses number for "class" so we are going to codify classes
my %class = (
setosa => 0,
versicolor => 1,
virginica => 2
);
my $iris = Data::Dataset::Classic::Iris::get();
# Split train and test, label and features
# (for simplicity this example reuses the full dataset for both train and test)
my $train_dataset = [map {$iris->{$_}} grep {$_ ne 'species'} keys %$iris];
my $test_dataset  = [map {$iris->{$_}} grep {$_ ne 'species'} keys %$iris];
sub transpose {
# Transposing without using PDL, Data::Table, Data::Frame or other modules
# to keep minimal dependencies
my $array = shift;
my @aux = ();
for my $row (@$array) {
for my $column (0 .. scalar @$row - 1) {
push @{$aux[$column]}, $row->[$column];
}
}
return \@aux;
}
$train_dataset = transpose($train_dataset);
$test_dataset = transpose($test_dataset);
my $train_label = [map {$class{$_}} @{$iris->{'species'}}];
my $test_label = [map {$class{$_}} @{$iris->{'species'}}];
my $train_data = DMatrix->From(matrix => $train_dataset, label => $train_label);
my $test_data = DMatrix->From(matrix => $test_dataset, label => $test_label);
# Multiclass problems need a different objective function and the number
# of classes; in this case we are using 'multi:softprob' and
# num_class => 3
my $booster = train(data => $train_data, number_of_rounds => 20, params => {
    max_depth => 3,
    eta       => 0.3,
    silent    => 1,
    objective => 'multi:softprob',
    num_class => 3
});
my $predictions = $booster->predict(data => $test_data);
```
# DESCRIPTION
Perl wrapper for XGBoost library.
The easiest way to use the wrapper is through `train`, but first the data
to be used must be wrapped in a `DMatrix` object.
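A minimal sketch of that workflow, assuming a LibSVM-style text file as input (the file name below is a placeholder; the SYNOPSIS above uses the real example data shipped in `examples/`):

```perl
use aliased 'AI::XGBoost::DMatrix';
use AI::XGBoost qw(train);

# Wrap the training data in a DMatrix first ('my_train.txt' is a placeholder)
my $dtrain = DMatrix->From(file => 'my_train.txt');

# Then pass it to train() together with the booster parameters
my $booster = train(
    data             => $dtrain,
    number_of_rounds => 10,
    params           => { objective => 'binary:logistic' },
);
```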
This is a work in progress; feedback, comments, issues, suggestions and
pull requests are welcome!
The XGBoost library is used via [Alien::XGBoost](https://metacpan.org/pod/Alien::XGBoost), which means it is downloaded,
compiled and installed automatically if it's not already available on your system.
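If you want to check which shared library ended up being used, something like the following may work, assuming Alien::XGBoost exposes the usual Alien::Base query interface (an assumption, not documented here):

```perl
use 5.010;
use Alien::XGBoost;

# Assumes the standard Alien::Base interface: dynamic_libs should list the
# libxgboost shared objects available to the wrapper.
say for Alien::XGBoost->dynamic_libs;
```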
# FUNCTIONS
## train
Performs gradient boosting using the data and parameters passed.
Returns the trained AI::XGBoost::Booster object (see the example call after the parameter list).
### Parameters
- params
Parameters for the booster object.
Full list available at [https://github.com/dmlc/xgboost/blob/master/doc/parameter.md](https://github.com/dmlc/xgboost/blob/master/doc/parameter.md)
- data
AI::XGBoost::DMatrix object used for training
- number\_of\_rounds
Number of boosting iterations
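Putting the three parameters together, a call might look like this (the values are illustrative, not recommendations):

```perl
my $booster = train(
    data             => $dmatrix,   # an AI::XGBoost::DMatrix with the training data
    number_of_rounds => 50,         # boosting iterations
    params           => {
        objective => 'binary:logistic',
        max_depth => 4,
        eta       => 0.1,
    },
);
```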
# ROADMAP
The goal is to make a full wrapper for XGBoost.
## VERSIONS
- 0.2
Full C API that is "easy" to use, with PDL support, as [AI::XGBoost::CAPI](https://metacpan.org/pod/AI::XGBoost::CAPI).
"Easy" means clients don't have to use [FFI::Platypus](https://metacpan.org/pod/FFI::Platypus) or modules dealing
with C structures.
- 0.25
Alien package for libxgboost.so/xgboost.dll
- 0.3
Moose-based object-oriented API with DMatrix and Booster classes
- 0.4
Complete object oriented API
- 0.5
Use Perl subroutine signatures ([https://metacpan.org/pod/distribution/perl/pod/perlexperiment.pod#Subroutine-signatures](https://metacpan.org/pod/distribution/perl/pod/perlexperiment.pod#Subroutine-signatures))
# SEE ALSO
- [AI::MXNet](https://metacpan.org/pod/AI::MXNet)
- [FFI::Platypus](https://metacpan.org/pod/FFI::Platypus)
- [NativeCall](https://metacpan.org/pod/NativeCall)
# AUTHOR
Pablo Rodriguez Gonzalez
# COPYRIGHT AND LICENSE
Copyright (c) 2017 by Pablo Rodriguez Gonzalez.
# CONTRIBUTOR
Ruben