DBD::SQLite - Self Contained RDBMS in a DBI Driver
use DBI;
my $dbh = DBI->connect("dbi:SQLite:dbname=dbfile","","");
SQLite is a public domain relational database engine that you can find at http://www.hwaci.com/sw/sqlite/.
Rather than asking you to install SQLite first, DBD::SQLite includes the entire thing in the distribution, which is possible because SQLite is public domain. So to get a fast, transaction-capable RDBMS working for your Perl project, you simply have to install this module, and nothing else.
SQLite supports the following features:
There's lots more to it, so please refer to the docs on the SQLite web page, listed above, for SQL details. Also refer to DBI for details on how to use DBI itself.
The API works like every DBI module does. Please see DBI for more details about core features.
Currently many statement attributes are not implemented or are limited by the typeless nature of the SQLite database.
If set to a true value, DBD::SQLite will turn the UTF-8 flag on for all text strings coming out of the database. For more details on the UTF-8 flag see perlunicode. The default is for the UTF-8 flag to be turned off.
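For illustration, here is a minimal sketch of checking the flag on retrieved data; the table and column names are made up for this example:

use DBI;
use Encode qw(is_utf8);

my $dbh = DBI->connect("dbi:SQLite:dbname=dbfile", "", "");
$dbh->{unicode} = 1;

# Text fetched from the database now carries the UTF-8 flag
# (mytable/textcolumn are made-up names for this example)
my ($text) = $dbh->selectrow_array("SELECT textcolumn FROM mytable");
print "UTF-8 flag is on\n" if is_utf8($text);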
Also note that, due to some bizarreness in SQLite's type system (see http://www.sqlite.org/datatype3.html), if you want to retain blob-style behavior for some columns under $dbh->{unicode} = 1 (say, to store images in the database), you have to say so explicitly using the three-argument form of DBI's bind_param when doing updates:
use DBI qw(:sql_types);
$dbh->{unicode} = 1;
my $sth = $dbh->prepare("INSERT INTO mytable (blobcolumn) VALUES (?)");
$sth->bind_param(1, $binary_data, SQL_BLOB); # binary_data will be stored as-is
Defining the column type as BLOB in the DDL is not sufficient.
This method returns the last inserted rowid. If you specify an INTEGER PRIMARY KEY as the first column in your table, that is the column that is returned. Otherwise, it is the hidden ROWID column. See the sqlite docs for details.
Note: You can now use $dbh->last_insert_id() if you have a recent version of DBI.
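A short sketch of both interfaces (the table and column names are assumptions for the example):

$dbh->do("INSERT INTO mytable (name) VALUES ('foo')");   # example table/column

# via the driver-private method...
my $rowid = $dbh->func('last_insert_rowid');

# ...or via the standard DBI API on recent DBI versions
$rowid = $dbh->last_insert_id(undef, undef, undef, undef);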
Retrieve the current busy timeout.
Set the current busy timeout. The timeout is in milliseconds.
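For example, a sketch using the same func() interface as the other driver-private methods:

# retrieve the current timeout (in milliseconds)
my $timeout = $dbh->func('busy_timeout');

# wait up to 5 seconds for a locked database before giving up
$dbh->func(5000, 'busy_timeout');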
This method will register a new function which will be usable in SQL queries. The method's parameters are:
For example, here is how to define a now() function which returns the current number of seconds since the epoch:
$dbh->func( 'now', 0, sub { return time }, 'create_function' );
After this, it can be used from SQL as:
INSERT INTO mytable VALUES ( now() );
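The registered function can also be called from Perl through DBI in the usual way, for example:

# call the registered now() function through DBI
my ($epoch) = $dbh->selectrow_array("SELECT now()");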
This method will register a new aggregate function which can then be used from SQL. The method's parameters are:
The aggregator interface consists of defining three methods: new(), step() and finalize().
Here is a simple aggregate function which returns the variance (example adapted from pysqlite):
package variance;

sub new { bless [], shift; }

sub step {
    my ( $self, $value ) = @_;
    push @$self, $value;
}

sub finalize {
    my $self = $_[0];
    my $n = @$self;

    # Variance is NULL unless there is more than one row
    return undef unless $n > 1;

    my $mu = 0;
    foreach my $v ( @$self ) {
        $mu += $v;
    }
    $mu /= $n;

    my $sigma = 0;
    foreach my $v ( @$self ) {
        $sigma += ($v - $mu)**2;
    }
    $sigma = $sigma / ($n - 1);

    return $sigma;
}

$dbh->func( "variance", 1, 'variance', "create_aggregate" );
The aggregate function can then be used as:
SELECT group_name, variance(score) FROM results GROUP BY group_name;
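From Perl, the query runs through DBI as usual; here is a sketch, assuming a results table with group_name and score columns as in the example above:

my $rows = $dbh->selectall_arrayref(
    "SELECT group_name, variance(score) FROM results GROUP BY group_name"
);
foreach my $row (@$rows) {
    my ($group, $variance) = @$row;
    printf "%s: %s\n", $group, defined $variance ? $variance : "NULL";
}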
As of version 1.11, blobs should "just work" in SQLite as text columns. However this will cause the data to be treated as a string, so SQL statements such as length(x) will return the length of the column as a NUL terminated string, rather than the size of the blob in bytes. In order to store natively as a BLOB use the following code:
use DBI qw(:sql_types);
my $dbh = DBI->connect("dbi:SQLite:/path/to/db");
my $blob = `cat foo.jpg`;
my $sth = $dbh->prepare("INSERT INTO mytable VALUES (1, ?)");
$sth->bind_param(1, $blob, SQL_BLOB);
$sth->execute();
And then retrieval just works:
$sth = $dbh->prepare("SELECT * FROM mytable WHERE id = 1"); $sth->execute(); my $row = $sth->fetch; my $blobo = $row->[1]; # now $blobo == $blob
To access the database from the command line, try using dbish which comes with the DBI module. Just type:
dbish dbi:SQLite:foo.db
on the command line to access the file foo.db.
Alternatively, you can install SQLite from the link above without conflicting with DBD::SQLite and use the supplied sqlite command line tool.
SQLite is fast, very fast. I recently processed my 72MB log file with it, inserting the data (400,000+ rows) by using transactions and only committing every 1000 rows (otherwise the insertion is quite slow), and then performing queries on the data.
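For illustration, here is a sketch of that batched-commit pattern; the log file name, table layout, and field parsing are assumptions, not the actual analysis code:

$dbh->{AutoCommit} = 0;   # manage transactions ourselves

open my $log_fh, '<', 'access.log' or die $!;   # assumed log file name

my $sth = $dbh->prepare("INSERT INTO access_log (url, bytes) VALUES (?, ?)");
my $count = 0;
while (my $line = <$log_fh>) {
    # crude common-log-format split; adjust for your log layout
    my ($url, $bytes) = (split ' ', $line)[6, 9];
    $sth->execute($url, $bytes);
    $dbh->commit unless ++$count % 1000;   # commit every 1000 rows
}
$dbh->commit;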
Queries like count(*) and avg(bytes) took fractions of a second to return, but what surprised me most of all was:
SELECT url, count(*) as count FROM access_log GROUP BY url ORDER BY count desc LIMIT 20
to discover the top 20 hit URLs on the site (http://axkit.org), and it returned within 2 seconds. I'm seriously considering switching my log analysis code to use this little speed demon!
Oh yeah, and that was with no indexes on the table, on a 400MHz PIII.
For best performance, be sure to tune your hdparm settings if you are using Linux. Also, you might want to set:
PRAGMA default_synchronous = OFF
which will prevent SQLite from doing fsyncs when writing (which slows down non-transactional writes significantly) at the expense of some peace of mind. Also try playing with the cache_size pragma.
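Both pragmas can simply be issued through DBI after connecting; a sketch (the cache_size value here is arbitrary):

# trade crash safety for write speed
$dbh->do("PRAGMA default_synchronous = OFF");

# give SQLite a larger page cache (the value is a number of pages)
$dbh->do("PRAGMA cache_size = 4000");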
Likely to be many; please use http://rt.cpan.org/ for reporting bugs.
Matt Sergeant, matt@sergeant.org
Perl extension functions contributed by Francis J. Lacoste <flacoste@logreport.org> and Wolfgang Sourdeau <wolfgang@logreport.org>
DBI.