Checkout of Instiki Trunk 1/21/2007.

This commit is contained in:
Jacques Distler 2007-01-22 07:43:50 -06:00
commit 69b62b6f33
1138 changed files with 139586 additions and 0 deletions

2608
vendor/rails/activerecord/CHANGELOG vendored Normal file

File diff suppressed because it is too large.

20
vendor/rails/activerecord/MIT-LICENSE vendored Normal file

@ -0,0 +1,20 @@
Copyright (c) 2004 David Heinemeier Hansson
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

360
vendor/rails/activerecord/README vendored Executable file

@ -0,0 +1,360 @@
= Active Record -- Object-relational mapping put on rails
Active Record connects business objects and database tables to create a persistable
domain model where logic and data are presented in one wrapping. It's an implementation
of the object-relational mapping (ORM) pattern[http://www.martinfowler.com/eaaCatalog/activeRecord.html]
of the same name, as described by Martin Fowler:
"An object that wraps a row in a database table or view, encapsulates
the database access, and adds domain logic on that data."
Active Record's main contribution to the pattern is to relieve the original of two stunting problems:
lack of associations and inheritance. By adding a simple domain-language-like set of macros to describe
the former and integrating the Single Table Inheritance pattern for the latter, Active Record narrows the
gap of functionality between the Data Mapper and Active Record approaches.
A short rundown of the major features:
* Automated mapping between classes and tables, attributes and columns.
class Product < ActiveRecord::Base; end
...is automatically mapped to the table named "products", such as:
CREATE TABLE products (
id int(11) NOT NULL auto_increment,
name varchar(255),
PRIMARY KEY (id)
);
...which in turn gives you Product#name and Product#name=(new_name)
{Learn more}[link:classes/ActiveRecord/Base.html]
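When the naming convention can't be followed (a legacy schema, for example), the inferred
table name can be overridden explicitly. A minimal sketch, assuming a hypothetical legacy
table named "tbl_products":

  class Product < ActiveRecord::Base
    set_table_name "tbl_products"  # hypothetical legacy table name
  end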
* Associations between objects controlled by simple meta-programming macros.
class Firm < ActiveRecord::Base
has_many :clients
has_one :account
belongs_to :conglomerate
end
{Learn more}[link:classes/ActiveRecord/Associations/ClassMethods.html]
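The macros also generate the accessors for each association. A brief usage sketch, assuming
the Firm class above (the record values are made up):

  firm.account                                  # the associated Account, or nil
  firm.clients                                  # collection of the firm's Client records
  firm.clients.create("name" => "37signals")    # builds and saves an associated client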
* Aggregations of value objects controlled by simple meta-programming macros.
class Account < ActiveRecord::Base
composed_of :balance, :class_name => "Money",
:mapping => %w(balance amount)
composed_of :address,
:mapping => [%w(address_street street), %w(address_city city)]
end
{Learn more}[link:classes/ActiveRecord/Aggregations/ClassMethods.html]
* Validation rules that can differ for new or existing objects.
class Account < ActiveRecord::Base
validates_presence_of :subdomain, :name, :email_address, :password
validates_uniqueness_of :subdomain
validates_acceptance_of :terms_of_service, :on => :create
validates_confirmation_of :password, :email_address, :on => :create
end
{Learn more}[link:classes/ActiveRecord/Validations.html]
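The declared validations run when the record is saved. A sketch of inspecting the outcome,
assuming the Account class above (attribute values are made up):

  account = Account.new("name" => "37signals", "subdomain" => "")
  unless account.save
    account.errors.on("subdomain")    # message for the failed validation, if any
    account.errors.full_messages      # all validation messages as strings
  end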
* Acts that can make records work as lists or trees:
class Item < ActiveRecord::Base
belongs_to :list
acts_as_list :scope => :list
end
item.move_higher
item.move_to_bottom
Learn about {acts_as_list}[link:classes/ActiveRecord/Acts/List/ClassMethods.html], {the instance methods acts_as_list provides}[link:classes/ActiveRecord/Acts/List/InstanceMethods.html], and
{acts_as_tree}[link:classes/ActiveRecord/Acts/Tree/ClassMethods.html]
* Callbacks as methods or queues on the entire lifecycle (instantiation, saving, destroying, validating, etc).
class Person < ActiveRecord::Base
def before_destroy # is called just before Person#destroy
CreditCard.find(credit_card_id).destroy
end
end
class Account < ActiveRecord::Base
after_find :eager_load, 'self.class.announce(#{id})'
end
{Learn more}[link:classes/ActiveRecord/Callbacks.html]
* Observers for the entire lifecycle
class CommentObserver < ActiveRecord::Observer
def after_create(comment) # is called just after Comment#save
Notifications.deliver_new_comment("david@loudthinking.com", comment)
end
end
{Learn more}[link:classes/ActiveRecord/Observer.html]
* Inheritance hierarchies
class Company < ActiveRecord::Base; end
class Firm < Company; end
class Client < Company; end
class PriorityClient < Client; end
{Learn more}[link:classes/ActiveRecord/Base.html]
* Transaction support on both a database and object level. The latter is implemented
by using Transaction::Simple[http://www.halostatue.ca/ruby/Transaction__Simple.html]
# Just database transaction
Account.transaction do
david.withdrawal(100)
mary.deposit(100)
end
# Database and object transaction
Account.transaction(david, mary) do
david.withdrawal(100)
mary.deposit(100)
end
{Learn more}[link:classes/ActiveRecord/Transactions/ClassMethods.html]
* Reflections on columns, associations, and aggregations
reflection = Firm.reflect_on_association(:clients)
reflection.klass # => Client (class)
Firm.columns # Returns an array of column descriptors for the firms table
{Learn more}[link:classes/ActiveRecord/Reflection/ClassMethods.html]
* Direct manipulation (instead of service invocation)
So instead of (Hibernate[http://www.hibernate.org/] example):
long pkId = 1234;
DomesticCat pk = (DomesticCat) sess.load( Cat.class, new Long(pkId) );
// something interesting involving a cat...
sess.save(cat);
sess.flush(); // force the SQL INSERT
Active Record lets you:
pkId = 1234
cat = Cat.find(pkId)
# something even more interesting involving the same cat...
cat.save
{Learn more}[link:classes/ActiveRecord/Base.html]
* Database abstraction through simple adapters (~100 lines) with a shared connector
ActiveRecord::Base.establish_connection(:adapter => "sqlite", :database => "dbfile")
ActiveRecord::Base.establish_connection(
:adapter => "mysql",
:host => "localhost",
:username => "me",
:password => "secret",
:database => "activerecord"
)
{Learn more}[link:classes/ActiveRecord/Base.html#M000081] and read about the built-in support for
MySQL[link:classes/ActiveRecord/ConnectionAdapters/MysqlAdapter.html], PostgreSQL[link:classes/ActiveRecord/ConnectionAdapters/PostgreSQLAdapter.html], SQLite[link:classes/ActiveRecord/ConnectionAdapters/SQLiteAdapter.html], Oracle[link:classes/ActiveRecord/ConnectionAdapters/OCIAdapter.html], SQLServer[link:classes/ActiveRecord/ConnectionAdapters/SQLServerAdapter.html], and DB2[link:classes/ActiveRecord/ConnectionAdapters/DB2Adapter.html].
* Logging support for Log4r[http://log4r.sourceforge.net] and Logger[http://www.ruby-doc.org/stdlib/libdoc/logger/rdoc]
ActiveRecord::Base.logger = Logger.new(STDOUT)
ActiveRecord::Base.logger = Log4r::Logger.new("Application Log")
== Simple example (1/2): Defining tables and classes (using MySQL)
Data definitions are specified only in the database. Active Record queries the database for
the column names (which then serve to determine which attributes are valid) on regular
object instantiation through the new constructor, and relies on the column names in the rows
returned by the finders.
# CREATE TABLE companies (
# id int(11) unsigned NOT NULL auto_increment,
# client_of int(11),
# name varchar(255),
# type varchar(100),
# PRIMARY KEY (id)
# )
Active Record automatically links the "Company" object to the "companies" table
class Company < ActiveRecord::Base
has_many :people, :class_name => "Person"
end
class Firm < Company
has_many :clients
def people_with_all_clients
clients.inject([]) { |people, client| people + client.people }
end
end
The :foreign_key option is only necessary because we didn't use "firm_id" in the data definition
class Client < Company
belongs_to :firm, :foreign_key => "client_of"
end
# CREATE TABLE people (
# id int(11) unsigned NOT NULL auto_increment,
# name text,
# company_id text,
# PRIMARY KEY (id)
# )
Active Record will also automatically link the "Person" object to the "people" table
class Person < ActiveRecord::Base
belongs_to :company
end
== Simple example (2/2): Using the domain
Picking a database connection for all the Active Records
ActiveRecord::Base.establish_connection(
:adapter => "mysql",
:host => "localhost",
:username => "me",
:password => "secret",
:database => "activerecord"
)
Create some fixtures
firm = Firm.new("name" => "Next Angle")
# SQL: INSERT INTO companies (name, type) VALUES("Next Angle", "Firm")
firm.save
client = Client.new("name" => "37signals", "client_of" => firm.id)
# SQL: INSERT INTO companies (name, client_of, type) VALUES("37signals", 1, "Client")
client.save
Lots of different finders
# SQL: SELECT * FROM companies WHERE id = 1
next_angle = Company.find(1)
# SQL: SELECT * FROM companies WHERE id = 1 AND type = 'Firm'
next_angle = Firm.find(1)
# SQL: SELECT * FROM companies WHERE id = 1 AND name = 'Next Angle'
next_angle = Company.find_first "name = 'Next Angle'"
next_angle = Firm.find_by_sql("SELECT * FROM companies WHERE id = 1").first
The supertype, Company, will return subtype instances
Firm === next_angle
All the dynamic methods added by the has_many macro
next_angle.clients.empty? # true
next_angle.clients.size # total number of clients
all_clients = next_angle.clients
Constrained finds make access security easier when the ID comes from a web app
# SQL: SELECT * FROM companies WHERE client_of = 1 AND type = 'Client' AND id = 2
thirty_seven_signals = next_angle.clients.find(2)
Bi-directional associations thanks to the "belongs_to" macro
thirty_seven_signals.firm.nil? # false
== Examples
Active Record ships with a couple of examples that should give you a good feel for
everyday usage. Be sure to edit the <tt>examples/shared_setup.rb</tt> file for your
own database before running the examples. You may also need to adjust the table
definition SQL in the examples themselves.
It's also highly recommended to have a look at the unit tests. Read more in link:files/RUNNING_UNIT_TESTS.html
== Philosophy
Active Record attempts to provide a coherent wrapper as a solution for the inconvenience that is
object-relational mapping. The prime directive for this mapping has been to minimize
the amount of code needed to build a real-world domain model. This is made possible
by relying on a number of conventions that make it easy for Active Record to infer
complex relations and structures from a minimal amount of explicit direction.
Convention over Configuration:
* No XML-files!
* Lots of reflection and run-time extension
* Magic is not inherently a bad word
Admit the Database:
* Lets you drop down to SQL for odd cases and performance
* Doesn't attempt to duplicate or replace data definitions
== Download
The latest version of Active Record can be found at
* http://rubyforge.org/project/showfiles.php?group_id=182
Documentation can be found at
* http://ar.rubyonrails.com
== Installation
The preferred method of installing Active Record is through its GEM file. You'll need to have
RubyGems[http://rubygems.rubyforge.org/wiki/wiki.pl] installed for that, though. If you have,
then use:
% [sudo] gem install activerecord-1.10.0.gem
You can also install Active Record the old-fashioned way with the following command:
% [sudo] ruby install.rb
from its distribution directory.
== License
Active Record is released under the MIT license.
== Support
The Active Record homepage is http://www.rubyonrails.com. You can find the Active Record
RubyForge page at http://rubyforge.org/projects/activerecord. And as Jim from Rake says:
Feel free to submit commits or feature requests. If you send a patch,
remember to update the corresponding unit tests. In fact, I prefer
new features to be submitted in the form of new unit tests.
For other information, feel free to ask on the ruby-talk mailing list
(which is mirrored to comp.lang.ruby) or contact mailto:david@loudthinking.com.


@ -0,0 +1,46 @@
== Creating the test database
The default names for the test databases are "activerecord_unittest" and
"activerecord_unittest2". If you want to use another database name then be sure
to update the connection adapter setups you want to test with in
test/connections/<your database>/connection.rb.
When you have the database online, you can import the fixture tables with
the test/fixtures/db_definitions/*.sql files.
Make sure that you create the database objects with the same user that you specified in
connection.rb; otherwise (on PostgreSQL, at least) tests for default values will fail.
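As a rough sketch (not the literal contents of the shipped file), such a connection.rb
typically just establishes a connection to the test database; for MySQL it might look like:

  ActiveRecord::Base.establish_connection(
    :adapter  => "mysql",
    :host     => "localhost",
    :username => "rails",          # adjust to the user that created the database objects
    :database => "activerecord_unittest"
  )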
== Running with Rake
The easiest way to run the unit tests is through Rake. The default task runs
the test suite for the MySQL, SQLite and PostgreSQL adapters. You can also run the suite on just
one adapter by using the tasks test_mysql, test_sqlite, test_sqlite3, or test_postgresql
(the Rakefile defines one such task per adapter). For more information, check out the full array of rake tasks with "rake -T"
Rake can be found at http://rake.rubyforge.org
== Running by hand
Unit tests are located in the test directory. If you only want to run a single test suite,
or don't want to bother with Rake, you can do so with something like:
cd test; ruby -I "connections/native_mysql" base_test.rb
That'll run the base suite using the MySQL-Ruby adapter. Change the adapter
and test suite name as needed.
You can also run all the suites on a specific adapter with:
cd test; all.sh "connections/native_mysql"
== Faster tests
If you are using a database that supports transactions, you can set the
"AR_TX_FIXTURES" environment variable to "yes" to use transactional fixtures.
This gives a very large speed boost. With rake:
rake AR_TX_FIXTURES=yes
Or, by hand:
AR_TX_FIXTURES=yes ruby -I connections/native_sqlite3 base_test.rb

181
vendor/rails/activerecord/Rakefile vendored Executable file

@ -0,0 +1,181 @@
require 'rubygems'
require 'rake'
require 'rake/testtask'
require 'rake/rdoctask'
require 'rake/packagetask'
require 'rake/gempackagetask'
require 'rake/contrib/rubyforgepublisher'
require File.join(File.dirname(__FILE__), 'lib', 'active_record', 'version')
PKG_BUILD = ENV['PKG_BUILD'] ? '.' + ENV['PKG_BUILD'] : ''
PKG_NAME = 'activerecord'
PKG_VERSION = ActiveRecord::VERSION::STRING + PKG_BUILD
PKG_FILE_NAME = "#{PKG_NAME}-#{PKG_VERSION}"
RELEASE_NAME = "REL #{PKG_VERSION}"
RUBY_FORGE_PROJECT = "activerecord"
RUBY_FORGE_USER = "webster132"
PKG_FILES = FileList[
"lib/**/*", "test/**/*", "examples/**/*", "doc/**/*", "[A-Z]*", "install.rb", "Rakefile"
].exclude(/\bCVS\b|~$/)
desc "Default Task"
task :default => [ :test_mysql, :test_sqlite, :test_postgresql ]
# Run the unit tests
for adapter in %w( mysql postgresql sqlite sqlite3 firebird sqlserver sqlserver_odbc db2 oracle sybase openbase )
Rake::TestTask.new("test_#{adapter}") { |t|
t.libs << "test" << "test/connections/native_#{adapter}"
t.pattern = "test/*_test{,_#{adapter}}.rb"
t.verbose = true
}
end
SCHEMA_PATH = File.join(File.dirname(__FILE__), *%w(test fixtures db_definitions))
desc 'Build the MySQL test databases'
task :build_mysql_databases do
%x( mysqladmin create activerecord_unittest )
%x( mysqladmin create activerecord_unittest2 )
%x( mysql activerecord_unittest < #{File.join(SCHEMA_PATH, 'mysql.sql')} )
%x( mysql activerecord_unittest < #{File.join(SCHEMA_PATH, 'mysql2.sql')} )
end
desc 'Drop the MySQL test databases'
task :drop_mysql_databases do
%x( mysqladmin -f drop activerecord_unittest )
%x( mysqladmin -f drop activerecord_unittest2 )
end
desc 'Rebuild the MySQL test databases'
task :rebuild_mysql_databases => [:drop_mysql_databases, :build_mysql_databases]
desc 'Build the PostgreSQL test databases'
task :build_postgresql_databases do
%x( createdb activerecord_unittest )
%x( createdb activerecord_unittest2 )
%x( psql activerecord_unittest -f #{File.join(SCHEMA_PATH, 'postgresql.sql')} )
%x( psql activerecord_unittest2 -f #{File.join(SCHEMA_PATH, 'postgresql2.sql')} )
end
desc 'Drop the PostgreSQL test databases'
task :drop_postgresql_databases do
%x( dropdb activerecord_unittest )
%x( dropdb activerecord_unittest2 )
end
desc 'Rebuild the PostgreSQL test databases'
task :rebuild_postgresql_databases => [:drop_postgresql_databases, :build_postgresql_databases]
# Generate the RDoc documentation
Rake::RDocTask.new { |rdoc|
rdoc.rdoc_dir = 'doc'
rdoc.title = "Active Record -- Object-relation mapping put on rails"
rdoc.options << '--line-numbers' << '--inline-source' << '-A cattr_accessor=object'
rdoc.template = "#{ENV['template']}.rb" if ENV['template']
rdoc.rdoc_files.include('README', 'RUNNING_UNIT_TESTS', 'CHANGELOG')
rdoc.rdoc_files.include('lib/**/*.rb')
rdoc.rdoc_files.exclude('lib/active_record/vendor/*')
rdoc.rdoc_files.include('dev-utils/*.rb')
}
# Enhance rdoc task to copy referenced images also
task :rdoc do
FileUtils.mkdir_p "doc/files/examples/"
FileUtils.copy "examples/associations.png", "doc/files/examples/associations.png"
end
# Create compressed packages
dist_dirs = [ "lib", "test", "examples", "dev-utils" ]
spec = Gem::Specification.new do |s|
s.name = PKG_NAME
s.version = PKG_VERSION
s.summary = "Implements the ActiveRecord pattern for ORM."
s.description = %q{Implements the ActiveRecord pattern (Fowler, PoEAA) for ORM. It ties database tables and classes together for business objects, like Customer or Subscription, that can find, save, and destroy themselves without resorting to manual SQL.}
s.files = [ "Rakefile", "install.rb", "README", "RUNNING_UNIT_TESTS", "CHANGELOG" ]
dist_dirs.each do |dir|
s.files = s.files + Dir.glob( "#{dir}/**/*" ).delete_if { |item| item.include?( "\.svn" ) }
end
s.add_dependency('activesupport', '= 1.3.1' + PKG_BUILD)
s.files.delete "test/fixtures/fixture_database.sqlite"
s.files.delete "test/fixtures/fixture_database_2.sqlite"
s.files.delete "test/fixtures/fixture_database.sqlite3"
s.files.delete "test/fixtures/fixture_database_2.sqlite3"
s.require_path = 'lib'
s.autorequire = 'active_record'
s.has_rdoc = true
s.extra_rdoc_files = %w( README )
s.rdoc_options.concat ['--main', 'README']
s.author = "David Heinemeier Hansson"
s.email = "david@loudthinking.com"
s.homepage = "http://www.rubyonrails.org"
s.rubyforge_project = "activerecord"
end
Rake::GemPackageTask.new(spec) do |p|
p.gem_spec = spec
p.need_tar = true
p.need_zip = true
end
task :lines do
lines, codelines, total_lines, total_codelines = 0, 0, 0, 0
for file_name in FileList["lib/active_record/**/*.rb"]
next if file_name =~ /vendor/
f = File.open(file_name)
while line = f.gets
lines += 1
next if line =~ /^\s*$/
next if line =~ /^\s*#/
codelines += 1
end
puts "L: #{sprintf("%4d", lines)}, LOC #{sprintf("%4d", codelines)} | #{file_name}"
total_lines += lines
total_codelines += codelines
lines, codelines = 0, 0
end
puts "Total: Lines #{total_lines}, LOC #{total_codelines}"
end
# Publishing ------------------------------------------------------
desc "Publish the beta gem"
task :pgem => [:package] do
Rake::SshFilePublisher.new("davidhh@wrath.rubyonrails.org", "public_html/gems/gems", "pkg", "#{PKG_FILE_NAME}.gem").upload
`ssh davidhh@wrath.rubyonrails.org './gemupdate.sh'`
end
desc "Publish the API documentation"
task :pdoc => [:rdoc] do
Rake::SshDirPublisher.new("davidhh@wrath.rubyonrails.org", "public_html/ar", "doc").upload
end
desc "Publish the release files to RubyForge."
task :release => [ :package ] do
`rubyforge login`
for ext in %w( gem tgz zip )
release_command = "rubyforge add_release #{PKG_NAME} #{PKG_NAME} 'REL #{PKG_VERSION}' pkg/#{PKG_NAME}-#{PKG_VERSION}.#{ext}"
puts release_command
system(release_command)
end
end


@ -0,0 +1,26 @@
$:.unshift(File.dirname(__FILE__) + '/../lib')
if ARGV[2]
require 'rubygems'
require_gem 'activerecord', ARGV[2]
else
require 'active_record'
end
ActiveRecord::Base.establish_connection(:adapter => "mysql", :database => "basecamp")
class Post < ActiveRecord::Base; end
require 'benchmark'
RUNS = ARGV[0].to_i
if ARGV[1] == "profile" then require 'profile' end
runtime = Benchmark::measure {
RUNS.times {
Post.find_all(nil,nil,100).each { |p| p.title }
}
}
puts "Runs: #{RUNS}"
puts "Avg. runtime: #{runtime.real / RUNS}"
puts "Requests/second: #{RUNS / runtime.real}"


@ -0,0 +1,19 @@
require 'mysql'
conn = Mysql::real_connect("localhost", "root", "", "basecamp")
require 'benchmark'
require 'profile' if ARGV[1] == "profile"
RUNS = ARGV[0].to_i
runtime = Benchmark::measure {
RUNS.times {
result = conn.query("SELECT * FROM posts LIMIT 100")
result.each_hash { |p| p["title"] }
}
}
puts "Runs: #{RUNS}"
puts "Avg. runtime: #{runtime.real / RUNS}"
puts "Requests/second: #{RUNS / runtime.real}"

Binary file not shown (image, 40 KiB).


@ -0,0 +1,87 @@
require File.dirname(__FILE__) + '/shared_setup'
logger = Logger.new(STDOUT)
# Database setup ---------------
logger.info "\nCreate tables"
[ "DROP TABLE companies", "DROP TABLE people", "DROP TABLE people_companies",
"CREATE TABLE companies (id int(11) auto_increment, client_of int(11), name varchar(255), type varchar(100), PRIMARY KEY (id))",
"CREATE TABLE people (id int(11) auto_increment, name varchar(100), PRIMARY KEY (id))",
"CREATE TABLE people_companies (person_id int(11), company_id int(11), PRIMARY KEY (person_id, company_id))",
].each { |statement|
# The tables don't necessarily exist yet
begin; ActiveRecord::Base.connection.execute(statement); rescue ActiveRecord::StatementInvalid; end
}
# Class setup ---------------
class Company < ActiveRecord::Base
has_and_belongs_to_many :people, :class_name => "Person", :join_table => "people_companies", :table_name => "people"
end
class Firm < Company
has_many :clients, :foreign_key => "client_of"
def people_with_all_clients
clients.inject([]) { |people, client| people + client.people }
end
end
class Client < Company
belongs_to :firm, :foreign_key => "client_of"
end
class Person < ActiveRecord::Base
has_and_belongs_to_many :companies, :join_table => "people_companies"
def self.table_name() "people" end
end
# Usage ---------------
logger.info "\nCreate fixtures"
Firm.new("name" => "Next Angle").save
Client.new("name" => "37signals", "client_of" => 1).save
Person.new("name" => "David").save
logger.info "\nUsing Finders"
next_angle = Company.find(1)
next_angle = Firm.find(1)
next_angle = Company.find_first "name = 'Next Angle'"
next_angle = Firm.find_by_sql("SELECT * FROM companies WHERE id = 1").first
Firm === next_angle
logger.info "\nUsing has_many association"
next_angle.has_clients?
next_angle.clients_count
all_clients = next_angle.clients
thirty_seven_signals = next_angle.find_in_clients(2)
logger.info "\nUsing belongs_to association"
thirty_seven_signals.has_firm?
thirty_seven_signals.firm?(next_angle)
logger.info "\nUsing has_and_belongs_to_many association"
david = Person.find(1)
david.add_companies(thirty_seven_signals, next_angle)
david.companies.include?(next_angle)
david.companies_count == 2
david.remove_companies(next_angle)
david.companies_count == 1
thirty_seven_signals.people.include?(david)


@ -0,0 +1,15 @@
# Be sure to change the MySQL connection details and create a database for the example
$: << File.dirname(__FILE__) + '/../lib'
require 'active_record'
require 'logger'; class Logger; def format_message(severity, timestamp, msg, progname) "#{msg}\n" end; end
ActiveRecord::Base.logger = Logger.new(STDOUT)
ActiveRecord::Base.establish_connection(
:adapter => "mysql",
:host => "localhost",
:username => "root",
:password => "",
:database => "activerecord_examples"
)


@ -0,0 +1,85 @@
require File.dirname(__FILE__) + '/shared_setup'
logger = Logger.new(STDOUT)
# Database setup ---------------
logger.info "\nCreate tables"
[ "DROP TABLE people",
"CREATE TABLE people (id int(11) auto_increment, name varchar(100), pass varchar(100), email varchar(100), PRIMARY KEY (id))"
].each { |statement|
begin; ActiveRecord::Base.connection.execute(statement); rescue ActiveRecord::StatementInvalid; end # The tables don't necessarily exist yet
}
# Class setup ---------------
class Person < ActiveRecord::Base
# Using
def self.authenticate(name, pass)
# find_first "name = '#{name}' AND pass = '#{pass}'" would be open to sql-injection (in a web-app scenario)
find_first [ "name = '%s' AND pass = '%s'", name, pass ]
end
def self.name_exists?(name, id = nil)
if id.nil?
condition = [ "name = '%s'", name ]
else
# Check if anyone other than the person identified by id already has that name
condition = [ "name = '%s' AND id <> %d", name, id ]
end
!find_first(condition).nil?
end
def email_address_with_name
"\"#{name}\" <#{email}>"
end
protected
def validate
errors.add_on_empty(%w(name pass email))
errors.add("email", "must be valid") unless email_address_valid?
end
def validate_on_create
if attribute_present?("name") && Person.name_exists?(name)
errors.add("name", "is already taken by another person")
end
end
def validate_on_update
if attribute_present?("name") && Person.name_exists?(name, id)
errors.add("name", "is already taken by another person")
end
end
private
def email_address_valid?() email =~ /\w[-.\w]*\@[-\w]+[-.\w]*\.\w+/ end
end
# Usage ---------------
logger.info "\nCreate fixtures"
david = Person.new("name" => "David Heinemeier Hansson", "pass" => "", "email" => "")
unless david.save
puts "There was #{david.errors.count} error(s)"
david.errors.each_full { |error| puts error }
end
david.pass = "something"
david.email = "invalid_address"
unless david.save
puts "There was #{david.errors.count} error(s)"
puts "It was email with: " + david.errors.on("email")
end
david.email = "david@loudthinking.com"
if david.save then puts "David finally made it!" end
another_david = Person.new("name" => "David Heinemeier Hansson", "pass" => "xc", "email" => "david@loudthinking")
unless another_david.save
puts "Error on name: " + another_david.errors.on("name")
end

30
vendor/rails/activerecord/install.rb vendored Normal file

@ -0,0 +1,30 @@
require 'rbconfig'
require 'find'
require 'ftools'
include Config
# this was adapted from rdoc's install.rb by way of Log4r
$sitedir = CONFIG["sitelibdir"]
unless $sitedir
version = CONFIG["MAJOR"] + "." + CONFIG["MINOR"]
$libdir = File.join(CONFIG["libdir"], "ruby", version)
$sitedir = $:.find {|x| x =~ /site_ruby/ }
if !$sitedir
$sitedir = File.join($libdir, "site_ruby")
elsif $sitedir !~ Regexp.quote(version)
$sitedir = File.join($sitedir, version)
end
end
# the actual gruntwork
Dir.chdir("lib")
Find.find("active_record", "active_record.rb") { |f|
if f[-3..-1] == ".rb"
File::install(f, File.join($sitedir, *f.split(/\//)), 0644, true)
else
File::makedirs(File.join($sitedir, *f.split(/\//)))
end
}


@ -0,0 +1,79 @@
#--
# Copyright (c) 2004 David Heinemeier Hansson
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#++
$:.unshift(File.dirname(__FILE__)) unless
$:.include?(File.dirname(__FILE__)) || $:.include?(File.expand_path(File.dirname(__FILE__)))
unless defined?(ActiveSupport)
begin
$:.unshift(File.dirname(__FILE__) + "/../../activesupport/lib")
require 'active_support'
rescue LoadError
require 'rubygems'
require_gem 'activesupport'
end
end
require 'active_record/base'
require 'active_record/observer'
require 'active_record/validations'
require 'active_record/callbacks'
require 'active_record/reflection'
require 'active_record/associations'
require 'active_record/aggregations'
require 'active_record/transactions'
require 'active_record/timestamp'
require 'active_record/acts/list'
require 'active_record/acts/tree'
require 'active_record/acts/nested_set'
require 'active_record/locking'
require 'active_record/migration'
require 'active_record/schema'
require 'active_record/calculations'
ActiveRecord::Base.class_eval do
include ActiveRecord::Validations
include ActiveRecord::Locking
include ActiveRecord::Callbacks
include ActiveRecord::Observing
include ActiveRecord::Timestamp
include ActiveRecord::Associations
include ActiveRecord::Aggregations
include ActiveRecord::Transactions
include ActiveRecord::Reflection
include ActiveRecord::Acts::Tree
include ActiveRecord::Acts::List
include ActiveRecord::Acts::NestedSet
include ActiveRecord::Calculations
end
unless defined?(RAILS_CONNECTION_ADAPTERS)
RAILS_CONNECTION_ADAPTERS = %w( mysql postgresql sqlite firebird sqlserver db2 oracle sybase openbase )
end
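# Because of the "unless defined?" guard above, a caller can limit which adapters are loaded
# by defining the constant before requiring this file. A sketch:
#
#   RAILS_CONNECTION_ADAPTERS = %w( mysql )
#   require 'active_record'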
RAILS_CONNECTION_ADAPTERS.each do |adapter|
require "active_record/connection_adapters/" + adapter + "_adapter"
end
require 'active_record/query_cache'
require 'active_record/schema_dumper'


@ -0,0 +1,233 @@
module ActiveRecord
module Acts #:nodoc:
module List #:nodoc:
def self.append_features(base)
super
base.extend(ClassMethods)
end
# This act provides the capabilities for sorting and reordering a number of objects in a list.
# The class that has this specified needs to have a "position" column defined as an integer on
# the mapped database table.
#
# Todo list example:
#
# class TodoList < ActiveRecord::Base
# has_many :todo_items, :order => "position"
# end
#
# class TodoItem < ActiveRecord::Base
# belongs_to :todo_list
# acts_as_list :scope => :todo_list
# end
#
# todo_list.first.move_to_bottom
# todo_list.last.move_higher
module ClassMethods
# Configuration options are:
#
# * +column+ - specifies the column name to use for keeping the position integer (default: position)
# * +scope+ - restricts what is to be considered a list. Given a symbol, it'll attach "_id"
# (if that hasn't been already) and use that as the foreign key restriction. It's also possible
# to give it an entire string that is interpolated if you need a tighter scope than just a foreign key.
# Example: <tt>acts_as_list :scope => 'todo_list_id = #{todo_list_id} AND completed = 0'</tt>
def acts_as_list(options = {})
configuration = { :column => "position", :scope => "1 = 1" }
configuration.update(options) if options.is_a?(Hash)
configuration[:scope] = "#{configuration[:scope]}_id".intern if configuration[:scope].is_a?(Symbol) && configuration[:scope].to_s !~ /_id$/
if configuration[:scope].is_a?(Symbol)
scope_condition_method = %(
def scope_condition
if #{configuration[:scope].to_s}.nil?
"#{configuration[:scope].to_s} IS NULL"
else
"#{configuration[:scope].to_s} = \#{#{configuration[:scope].to_s}}"
end
end
)
else
scope_condition_method = "def scope_condition() \"#{configuration[:scope]}\" end"
end
class_eval <<-EOV
include ActiveRecord::Acts::List::InstanceMethods
def acts_as_list_class
::#{self.name}
end
def position_column
'#{configuration[:column]}'
end
#{scope_condition_method}
after_destroy :remove_from_list
before_create :add_to_list_bottom
EOV
end
end
# All the methods available to a record that has had <tt>acts_as_list</tt> specified. Each method works
# by assuming the object to be the item in the list, so <tt>chapter.move_lower</tt> would move that chapter
# lower in the list of all chapters. Likewise, <tt>chapter.first?</tt> would return true if that chapter is
# the first in the list of all chapters.
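# A brief usage sketch (assuming a Chapter model that declared <tt>acts_as_list</tt>):
#
#   chapter.insert_at(3)    # place this chapter at position 3
#   chapter.higher_item     # the item one position above, or nil
#   chapter.last?           # true if this chapter is at the bottom of its list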
module InstanceMethods
def insert_at(position = 1)
insert_at_position(position)
end
def move_lower
return unless lower_item
acts_as_list_class.transaction do
lower_item.decrement_position
increment_position
end
end
def move_higher
return unless higher_item
acts_as_list_class.transaction do
higher_item.increment_position
decrement_position
end
end
def move_to_bottom
return unless in_list?
acts_as_list_class.transaction do
decrement_positions_on_lower_items
assume_bottom_position
end
end
def move_to_top
return unless in_list?
acts_as_list_class.transaction do
increment_positions_on_higher_items
assume_top_position
end
end
def remove_from_list
decrement_positions_on_lower_items if in_list?
end
def increment_position
return unless in_list?
update_attribute position_column, self.send(position_column).to_i + 1
end
def decrement_position
return unless in_list?
update_attribute position_column, self.send(position_column).to_i - 1
end
def first?
return false unless in_list?
self.send(position_column) == 1
end
def last?
return false unless in_list?
self.send(position_column) == bottom_position_in_list
end
def higher_item
return nil unless in_list?
acts_as_list_class.find(:first, :conditions =>
"#{scope_condition} AND #{position_column} = #{(send(position_column).to_i - 1).to_s}"
)
end
def lower_item
return nil unless in_list?
acts_as_list_class.find(:first, :conditions =>
"#{scope_condition} AND #{position_column} = #{(send(position_column).to_i + 1).to_s}"
)
end
def in_list?
!send(position_column).nil?
end
private
def add_to_list_top
increment_positions_on_all_items
end
def add_to_list_bottom
self[position_column] = bottom_position_in_list.to_i + 1
end
# Overwrite this method to define the scope of the list changes
def scope_condition() "1" end
def bottom_position_in_list(except = nil)
item = bottom_item(except)
item ? item.send(position_column) : 0
end
def bottom_item(except = nil)
conditions = scope_condition
conditions = "#{conditions} AND #{self.class.primary_key} != #{except.id}" if except
acts_as_list_class.find(:first, :conditions => conditions, :order => "#{position_column} DESC")
end
def assume_bottom_position
update_attribute(position_column, bottom_position_in_list(self).to_i + 1)
end
def assume_top_position
update_attribute(position_column, 1)
end
# This has the effect of moving all the higher items up one.
def decrement_positions_on_higher_items(position)
acts_as_list_class.update_all(
"#{position_column} = (#{position_column} - 1)", "#{scope_condition} AND #{position_column} <= #{position}"
)
end
# This has the effect of moving all the lower items up one.
def decrement_positions_on_lower_items
return unless in_list?
acts_as_list_class.update_all(
"#{position_column} = (#{position_column} - 1)", "#{scope_condition} AND #{position_column} > #{send(position_column).to_i}"
)
end
# This has the effect of moving all the higher items down one.
def increment_positions_on_higher_items
return unless in_list?
acts_as_list_class.update_all(
"#{position_column} = (#{position_column} + 1)", "#{scope_condition} AND #{position_column} < #{send(position_column).to_i}"
)
end
# This has the effect of moving all the lower items down one.
def increment_positions_on_lower_items(position)
acts_as_list_class.update_all(
"#{position_column} = (#{position_column} + 1)", "#{scope_condition} AND #{position_column} >= #{position}"
)
end
def increment_positions_on_all_items
acts_as_list_class.update_all(
"#{position_column} = (#{position_column} + 1)", "#{scope_condition}"
)
end
def insert_at_position(position)
remove_from_list
increment_positions_on_lower_items(position)
self.update_attribute(position_column, position)
end
end
end
end
end


@ -0,0 +1,212 @@
module ActiveRecord
module Acts #:nodoc:
module NestedSet #:nodoc:
def self.append_features(base)
super
base.extend(ClassMethods)
end
# This act provides Nested Set functionality. Nested Set is similar to Tree, but with
# the added feature that you can select the children and all of their descendants with
# a single query. A good use case for this is a threaded post system, where you want
# to display every reply to a comment without multiple selects.
#
# A Google search for "Nested Set" should point you toward material explaining the
# database theory. I figured out a bunch of this from
# http://threebit.net/tutorials/nestedset/tutorial1.html
#
# Instead of picturing a leaf node structure with children pointing back to their parent,
# the best way to imagine how this works is to think of the parent entity surrounding all
# of its children, and its parent surrounding it, etc. Assuming that they are lined up
# horizontally, we store the left and right boundaries in the database.
#
# Imagine:
# root
# |_ Child 1
# |_ Child 1.1
# |_ Child 1.2
# |_ Child 2
# |_ Child 2.1
# |_ Child 2.2
#
# If my circles-in-circles description didn't make sense, check out this sweet
# ASCII art:
#
# ___________________________________________________________________
# | Root |
# | ____________________________ ____________________________ |
# | | Child 1 | | Child 2 | |
# | | __________ _________ | | __________ _________ | |
# | | | C 1.1 | | C 1.2 | | | | C 2.1 | | C 2.2 | | |
# 1 2 3_________4 5________6 7 8 9_________10 11_______12 13 14
# | |___________________________| |___________________________| |
# |___________________________________________________________________|
#
# The numbers represent the left and right boundaries. The table then might
# look like this:
# ID | PARENT | LEFT | RIGHT | DATA
# 1 | 0 | 1 | 14 | root
# 2 | 1 | 2 | 7 | Child 1
# 3 | 2 | 3 | 4 | Child 1.1
# 4 | 2 | 5 | 6 | Child 1.2
# 5 | 1 | 8 | 13 | Child 2
# 6 | 5 | 9 | 10 | Child 2.1
# 7 | 5 | 11 | 12 | Child 2.2
#
# So, to get all children of an entry, you
# SELECT * WHERE CHILD.LEFT IS BETWEEN PARENT.LEFT AND PARENT.RIGHT
#
# To get the count of nested children, it's (RIGHT - LEFT - 1)/2, etc.
#
# To get the direct parent, it falls back to using the PARENT_ID field.
#
# There are instance methods for all of these.
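#
# Worked example using the table above: "Child 1" has LEFT = 2 and RIGHT = 7, so its
# nested child count is (7 - 2 - 1)/2 = 2, and
#   SELECT * ... WHERE lft BETWEEN 2 AND 7
# returns "Child 1" together with "Child 1.1" and "Child 1.2".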
#
# The structure is good if you need to group things together; the downside is that
# keeping data integrity is a pain, and both adding and removing an entry
# require a full table write.
#
# This sets up a before_destroy trigger to prune the tree correctly if one of its
# elements gets deleted.
#
module ClassMethods
# Configuration options are:
#
# * +parent_column+ - specifies the column name to use for the parent reference (default: parent_id)
# * +left_column+ - column name for left boundary data, default "lft"
# * +right_column+ - column name for right boundary data, default "rgt"
# * +scope+ - restricts what is to be considered a tree. Given a symbol, it'll attach "_id"
# (if that hasn't been already) and use that as the foreign key restriction. It's also possible
# to give it an entire string that is interpolated if you need a tighter scope than just a foreign key.
# Example: <tt>acts_as_nested_set :scope => 'todo_list_id = #{todo_list_id} AND completed = 0'</tt>
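# A brief usage sketch (assuming a Category model with name, parent_id, lft and rgt columns):
#
#   class Category < ActiveRecord::Base
#     acts_as_nested_set
#   end
#
#   root  = Category.create("name" => "root")
#   child = Category.create("name" => "child")
#   root.add_child(child)
#   root.children_count   # => 1
#   root.full_set         # => root and child (order depends on the database)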
def acts_as_nested_set(options = {})
configuration = { :parent_column => "parent_id", :left_column => "lft", :right_column => "rgt", :scope => "1 = 1" }
configuration.update(options) if options.is_a?(Hash)
configuration[:scope] = "#{configuration[:scope]}_id".intern if configuration[:scope].is_a?(Symbol) && configuration[:scope].to_s !~ /_id$/
if configuration[:scope].is_a?(Symbol)
scope_condition_method = %(
def scope_condition
if #{configuration[:scope].to_s}.nil?
"#{configuration[:scope].to_s} IS NULL"
else
"#{configuration[:scope].to_s} = \#{#{configuration[:scope].to_s}}"
end
end
)
else
scope_condition_method = "def scope_condition() \"#{configuration[:scope]}\" end"
end
class_eval <<-EOV
include ActiveRecord::Acts::NestedSet::InstanceMethods
#{scope_condition_method}
def left_col_name() "#{configuration[:left_column]}" end
def right_col_name() "#{configuration[:right_column]}" end
def parent_column() "#{configuration[:parent_column]}" end
EOV
end
end
module InstanceMethods
# Returns true if this is a root node.
def root?
parent_id = self[parent_column]
(parent_id == 0 || parent_id.nil?) && (self[left_col_name] == 1) && (self[right_col_name] > self[left_col_name])
end
# Returns true if this is a child node.
def child?
parent_id = self[parent_column]
!(parent_id == 0 || parent_id.nil?) && (self[left_col_name] > 1) && (self[right_col_name] > self[left_col_name])
end
# Returns true if we have no idea what this is
def unknown?
!root? && !child?
end
# Adds a child to this object in the tree. If this object hasn't been initialized,
# it gets set up as a root node. Otherwise, this method will update all of the
# other elements in the tree and shift them to the right, keeping everything
# balanced.
def add_child( child )
self.reload
child.reload
if child.root?
raise "Adding sub-tree isn\'t currently supported"
else
if ( (self[left_col_name] == nil) || (self[right_col_name] == nil) )
# Looks like we're now the root node! Woo
self[left_col_name] = 1
self[right_col_name] = 4
# What to do about validation?
return nil unless self.save
child[parent_column] = self.id
child[left_col_name] = 2
child[right_col_name]= 3
return child.save
else
# OK, we need to add and shift everything else to the right
child[parent_column] = self.id
right_bound = self[right_col_name]
child[left_col_name] = right_bound
child[right_col_name] = right_bound + 1
self[right_col_name] += 2
self.class.transaction {
self.class.update_all( "#{left_col_name} = (#{left_col_name} + 2)", "#{scope_condition} AND #{left_col_name} >= #{right_bound}" )
self.class.update_all( "#{right_col_name} = (#{right_col_name} + 2)", "#{scope_condition} AND #{right_col_name} >= #{right_bound}" )
self.save
child.save
}
end
end
end
# Returns the number of nested children of this object.
def children_count
return (self[right_col_name] - self[left_col_name] - 1)/2
end
# Returns a set of itself and all of its nested children
def full_set
self.class.find(:all, :conditions => "#{scope_condition} AND (#{left_col_name} BETWEEN #{self[left_col_name]} and #{self[right_col_name]})" )
end
# Returns a set of all of its children and nested children
def all_children
self.class.find(:all, :conditions => "#{scope_condition} AND (#{left_col_name} > #{self[left_col_name]}) and (#{right_col_name} < #{self[right_col_name]})" )
end
# Returns a set of only this entry's immediate children
def direct_children
self.class.find(:all, :conditions => "#{scope_condition} and #{parent_column} = #{self.id}")
end
# Prunes a branch off of the tree, shifting all of the elements on the right
# back to the left so the counts still work.
def before_destroy
return if self[right_col_name].nil? || self[left_col_name].nil?
dif = self[right_col_name] - self[left_col_name] + 1
self.class.transaction {
self.class.delete_all( "#{scope_condition} and #{left_col_name} > #{self[left_col_name]} and #{right_col_name} < #{self[right_col_name]}" )
self.class.update_all( "#{left_col_name} = (#{left_col_name} - #{dif})", "#{scope_condition} AND #{left_col_name} >= #{self[right_col_name]}" )
self.class.update_all( "#{right_col_name} = (#{right_col_name} - #{dif} )", "#{scope_condition} AND #{right_col_name} >= #{self[right_col_name]}" )
}
end
end
end
end
end


@ -0,0 +1,90 @@
module ActiveRecord
module Acts #:nodoc:
module Tree #:nodoc:
def self.append_features(base)
super
base.extend(ClassMethods)
end
# Specify this act if you want to model a tree structure by providing a parent association and a children
# association. This act requires that you have a foreign key column, which by default is called parent_id.
#
# class Category < ActiveRecord::Base
# acts_as_tree :order => "name"
# end
#
# Example :
# root
# \_ child1
# \_ subchild1
# \_ subchild2
#
# root = Category.create("name" => "root")
# child1 = root.children.create("name" => "child1")
# subchild1 = child1.children.create("name" => "subchild1")
#
# root.parent # => nil
# child1.parent # => root
# root.children # => [child1]
# root.children.first.children.first # => subchild1
#
# In addition to the parent and children associations, the following instance methods are added to the class
# after specifying the act:
# * siblings : Returns all the children of the parent, excluding the current node ([ subchild2 ] when called from subchild1)
# * self_and_siblings : Returns all the children of the parent, including the current node ([ subchild1, subchild2 ] when called from subchild1)
# * ancestors : Returns all the ancestors of the current node ([child1, root] when called from subchild2)
# * root : Returns the root of the current node (root when called from subchild2)
module ClassMethods
# Configuration options are:
#
# * <tt>foreign_key</tt> - specifies the column name to use for tracking of the tree (default: parent_id)
# * <tt>order</tt> - makes it possible to sort the children according to this SQL snippet.
# * <tt>counter_cache</tt> - keeps a count in a children_count column if set to true (default: false).
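#
# The act also defines roots and root class methods (see the class_eval below). A sketch,
# reusing the Category example above:
#
#   Category.roots   # => all records whose parent_id IS NULL
#   Category.root    # => the first such record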
def acts_as_tree(options = {})
configuration = { :foreign_key => "parent_id", :order => nil, :counter_cache => nil }
configuration.update(options) if options.is_a?(Hash)
belongs_to :parent, :class_name => name, :foreign_key => configuration[:foreign_key], :counter_cache => configuration[:counter_cache]
has_many :children, :class_name => name, :foreign_key => configuration[:foreign_key], :order => configuration[:order], :dependent => :destroy
class_eval <<-EOV
include ActiveRecord::Acts::Tree::InstanceMethods
def self.roots
find(:all, :conditions => "#{configuration[:foreign_key]} IS NULL", :order => #{configuration[:order].nil? ? "nil" : %Q{"#{configuration[:order]}"}})
end
def self.root
find(:first, :conditions => "#{configuration[:foreign_key]} IS NULL", :order => #{configuration[:order].nil? ? "nil" : %Q{"#{configuration[:order]}"}})
end
EOV
end
end
module InstanceMethods
# Returns list of ancestors, starting from parent until root.
#
# subchild1.ancestors # => [child1, root]
def ancestors
node, nodes = self, []
nodes << node = node.parent until not node.has_parent?
nodes
end
def root
node = self
node = node.parent until not node.has_parent?
node
end
def siblings
self_and_siblings - [self]
end
def self_and_siblings
has_parent? ? parent.children : self.class.roots
end
end
end
end
end


@ -0,0 +1,167 @@
module ActiveRecord
module Aggregations # :nodoc:
def self.included(base)
base.extend(ClassMethods)
end
def clear_aggregation_cache #:nodoc:
self.class.reflect_on_all_aggregations.to_a.each do |assoc|
instance_variable_set "@#{assoc.name}", nil
end unless self.new_record?
end
# Active Record implements aggregation through a macro-like class method called +composed_of+ for representing attributes
# as value objects. It expresses relationships like "Account [is] composed of Money [among other things]" or "Person [is]
# composed of [an] address". Each call to the macro adds a description of how the value objects are created from the
# attributes of the entity object (when the entity is initialized either as a new object or from finding an existing object)
# and how it can be turned back into attributes (when the entity is saved to the database). Example:
#
# class Customer < ActiveRecord::Base
# composed_of :balance, :class_name => "Money", :mapping => %w(balance amount)
# composed_of :address, :mapping => [ %w(address_street street), %w(address_city city) ]
# end
#
# The customer class now has the following methods to manipulate the value objects:
# * <tt>Customer#balance, Customer#balance=(money)</tt>
# * <tt>Customer#address, Customer#address=(address)</tt>
#
# These methods will operate with value objects like the ones described below:
#
# class Money
# include Comparable
# attr_reader :amount, :currency
# EXCHANGE_RATES = { "USD_TO_DKK" => 6 }
#
# def initialize(amount, currency = "USD")
# @amount, @currency = amount, currency
# end
#
# def exchange_to(other_currency)
# exchanged_amount = (amount * EXCHANGE_RATES["#{currency}_TO_#{other_currency}"]).floor
# Money.new(exchanged_amount, other_currency)
# end
#
# def ==(other_money)
# amount == other_money.amount && currency == other_money.currency
# end
#
# def <=>(other_money)
# if currency == other_money.currency
# amount <=> other_money.amount
# else
# amount <=> other_money.exchange_to(currency).amount
# end
# end
# end
#
# class Address
# attr_reader :street, :city
# def initialize(street, city)
# @street, @city = street, city
# end
#
# def close_to?(other_address)
# city == other_address.city
# end
#
# def ==(other_address)
# city == other_address.city && street == other_address.street
# end
# end
#
# Now it's possible to access attributes from the database through the value objects instead. If you choose to name the
# composition the same as the attribute's name, it will be the only way to access that attribute. That's the case with our
# +balance+ attribute. You interact with the value objects just like you would any other attribute, though:
#
# customer.balance = Money.new(20) # sets the Money value object and the attribute
# customer.balance # => Money value object
# customer.balance.exchange_to("DKK") # => Money.new(120, "DKK")
# customer.balance > Money.new(10) # => true
# customer.balance == Money.new(20) # => true
# customer.balance < Money.new(5) # => false
#
# Value objects can also be composed of multiple attributes, such as the case of Address. The order of the mappings will
# determine the order of the parameters. Example:
#
# customer.address_street = "Hyancintvej"
# customer.address_city = "Copenhagen"
# customer.address # => Address.new("Hyancintvej", "Copenhagen")
# customer.address = Address.new("May Street", "Chicago")
# customer.address_street # => "May Street"
# customer.address_city # => "Chicago"
#
# == Writing value objects
#
# Value objects are immutable and interchangeable objects that represent a given value, such as a Money object representing
# $5. Two Money objects both representing $5 should be equal (through methods such as == and <=> from Comparable if ranking
# makes sense). This is unlike entity objects where equality is determined by identity. An entity class such as Customer can
# easily have two different objects that both have an address on Hyancintvej. Entity identity is determined by object or
# relational unique identifiers (such as primary keys). Normal ActiveRecord::Base classes are entity objects.
#
# It's also important to treat the value objects as immutable. Don't allow the Money object to have its amount changed after
# creation. Create a new money object with the new value instead. This is exemplified by the Money#exchange_to method that
# returns a new value object instead of changing its own values. Active Record won't persist value objects that have been
# changed through other means than the writer method.
#
# The immutable requirement is enforced by Active Record by freezing any object assigned as a value object. Attempting to
# change it afterwards will result in a TypeError.
#
# Read more about value objects on http://c2.com/cgi/wiki?ValueObject and on the dangers of not keeping value objects
# immutable on http://c2.com/cgi/wiki?ValueObjectsShouldBeImmutable
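#
# A small sketch of that enforcement, based on the Customer example above:
#
#   customer.address = Address.new("May Street", "Chicago")
#   customer.address.frozen?    # => true
#   # any attempt to mutate the frozen Address now raises TypeError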
module ClassMethods
# Adds a reader and writer method for manipulating a value object, so
# <tt>composed_of :address</tt> would add <tt>address</tt> and <tt>address=(new_address)</tt>.
#
# Options are:
# * <tt>:class_name</tt> - specify the class name of the association. Use it only if that name can't be inferred
# from the part id. So <tt>composed_of :address</tt> will by default be linked to the +Address+ class, but
# if the real class name is +CompanyAddress+, you'll have to specify it with this option.
# * <tt>:mapping</tt> - specifies a number of mapping arrays (attribute, parameter) that bind an attribute name
# to a constructor parameter on the value class.
#
# Option examples:
# composed_of :temperature, :mapping => %w(reading celsius)
# composed_of :balance, :class_name => "Money", :mapping => %w(balance amount)
# composed_of :address, :mapping => [ %w(address_street street), %w(address_city city) ]
# composed_of :gps_location
def composed_of(part_id, options = {})
options.assert_valid_keys(:class_name, :mapping)
name = part_id.id2name
class_name = options[:class_name] || name_to_class_name(name)
mapping = options[:mapping] || [ name, name ]
reader_method(name, class_name, mapping)
writer_method(name, class_name, mapping)
create_reflection(:composed_of, part_id, options, self)
end
private
def name_to_class_name(name)
name.capitalize.gsub(/_(.)/) { |s| $1.capitalize }
end
def reader_method(name, class_name, mapping)
module_eval <<-end_eval
def #{name}(force_reload = false)
if @#{name}.nil? || force_reload
@#{name} = #{class_name}.new(#{(Array === mapping.first ? mapping : [ mapping ]).collect{ |pair| "read_attribute(\"#{pair.first}\")"}.join(", ")})
end
return @#{name}
end
end_eval
end
def writer_method(name, class_name, mapping)
module_eval <<-end_eval
def #{name}=(part)
@#{name} = part.freeze
#{(Array === mapping.first ? mapping : [ mapping ]).collect{ |pair| "@attributes[\"#{pair.first}\"] = part.#{pair.last}" }.join("\n")}
end
end_eval
end
end
end
end

File diff suppressed because it is too large.


@ -0,0 +1,160 @@
require 'set'
module ActiveRecord
module Associations
class AssociationCollection < AssociationProxy #:nodoc:
def to_ary
load_target
@target.to_ary
end
def reset
@target = []
@loaded = false
end
# Add +records+ to this association. Returns +self+ so method calls may be chained.
# Since << flattens its argument list and inserts each record, +push+ and +concat+ behave identically.
def <<(*records)
result = true
load_target
@owner.transaction do
flatten_deeper(records).each do |record|
raise_on_type_mismatch(record)
callback(:before_add, record)
result &&= insert_record(record) unless @owner.new_record?
@target << record
callback(:after_add, record)
end
end
result && self
end
alias_method :push, :<<
alias_method :concat, :<<
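# A brief usage sketch (hypothetical firm/client records in the style of the README):
#
#   firm.clients << Client.new("name" => "37signals")   # inserts the record and appends it to the collection
#   firm.clients.concat(client_a, client_b)             # concat and push behave just like <<
#   firm.clients.size                                    # issues SELECT COUNT(*) unless the collection is loaded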
# Remove all records from this association
def delete_all
load_target
delete(@target)
@target = []
end
# Remove +records+ from this association. Does not destroy +records+.
def delete(*records)
records = flatten_deeper(records)
records.each { |record| raise_on_type_mismatch(record) }
records.reject! { |record| @target.delete(record) if record.new_record? }
return if records.empty?
@owner.transaction do
records.each { |record| callback(:before_remove, record) }
delete_records(records)
records.each do |record|
@target.delete(record)
callback(:after_remove, record)
end
end
end
# Removes all records from this association. Returns +self+ so method calls may be chained.
def clear
return self if length.zero? # forces load_target if it hasn't happened already
if @reflection.options[:dependent] && @reflection.options[:dependent] == :delete_all
destroy_all
else
delete_all
end
self
end
def destroy_all
@owner.transaction do
each { |record| record.destroy }
end
@target = []
end
def create(attributes = {})
# Can't use Base.create since the foreign key may be a protected attribute.
if attributes.is_a?(Array)
attributes.collect { |attr| create(attr) }
else
record = build(attributes)
record.save unless @owner.new_record?
record
end
end
# Returns the size of the collection by executing a SELECT COUNT(*) query if the collection hasn't been loaded and
# calling collection.size if it has. If it's more likely than not that the collection does have a size larger than zero
# and you need to fetch that collection afterwards, it'll take one less SELECT query if you use length.
def size
if loaded? then @target.size else count_records end
end
# Returns the size of the collection by loading it and calling size on the array. If you want to use this method to check
# whether the collection is empty, use collection.length.zero? instead of collection.empty?
def length
load_target.size
end
def empty?
size.zero?
end
def uniq(collection = self)
collection.inject([]) { |uniq_records, record| uniq_records << record unless uniq_records.include?(record); uniq_records }
end
# Replace this collection with +other_array+
# This will perform a diff and delete/add only records that have changed.
def replace(other_array)
other_array.each { |val| raise_on_type_mismatch(val) }
load_target
other = other_array.size < 100 ? other_array : other_array.to_set
current = @target.size < 100 ? @target : @target.to_set
@owner.transaction do
delete(@target.select { |v| !other.include?(v) })
concat(other_array.select { |v| !current.include?(v) })
end
end
private
# Array#flatten has problems with recursive arrays. Going one level deeper solves the majority of the problems.
def flatten_deeper(array)
array.collect { |element| element.respond_to?(:flatten) ? element.flatten : element }.flatten
end
def callback(method, record)
callbacks_for(method).each do |callback|
case callback
when Symbol
@owner.send(callback, record)
when Proc, Method
callback.call(@owner, record)
else
if callback.respond_to?(method)
callback.send(method, @owner, record)
else
raise ActiveRecordError, "Callbacks must be a symbol denoting the method to call, a string to be evaluated, a block to be invoked, or an object responding to the callback method."
end
end
end
end
def callbacks_for(callback_name)
full_callback_name = "#{callback_name}_for_#{@reflection.name}"
@owner.class.read_inheritable_attribute(full_callback_name.to_sym) || []
end
end
end
end
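
A usage sketch of the collection proxy above, assuming a hypothetical Project model declaring has_many :milestones (models and columns are illustrative):

  project = Project.find(1)
  project.milestones << Milestone.new(:title => "beta")  # before_add/after_add callbacks run,
                                                          # the insert happens in a transaction
  project.milestones.size      # SELECT COUNT(*) while the collection is unloaded
  project.milestones.length    # always loads the collection, then Array#size
  project.milestones.delete(project.milestones.first)    # unlinks without destroying the record
  project.milestones.clear     # delete_all, or destroy_all when :dependent => :delete_all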


@ -0,0 +1,139 @@
module ActiveRecord
module Associations
class AssociationProxy #:nodoc:
attr_reader :reflection
alias_method :proxy_respond_to?, :respond_to?
alias_method :proxy_extend, :extend
instance_methods.each { |m| undef_method m unless m =~ /(^__|^nil\?|^proxy_respond_to\?|^proxy_extend|^send)/ }
def initialize(owner, reflection)
@owner, @reflection = owner, reflection
proxy_extend(reflection.options[:extend]) if reflection.options[:extend]
reset
end
def respond_to?(symbol, include_priv = false)
proxy_respond_to?(symbol, include_priv) || (load_target && @target.respond_to?(symbol, include_priv))
end
# Explicitly proxy === because the instance method removal above
# doesn't catch it.
def ===(other)
load_target
other === @target
end
def aliased_table_name
@reflection.klass.table_name
end
def conditions
@conditions ||= eval("%(#{@reflection.active_record.send :sanitize_sql, @reflection.options[:conditions]})") if @reflection.options[:conditions]
end
alias :sql_conditions :conditions
def reset
@target = nil
@loaded = false
end
def reload
reset
load_target
end
def loaded?
@loaded
end
def loaded
@loaded = true
end
def target
@target
end
def target=(target)
@target = target
loaded
end
protected
def dependent?
@reflection.options[:dependent] || false
end
def quoted_record_ids(records)
records.map { |record| record.quoted_id }.join(',')
end
def interpolate_sql_options!(options, *keys)
keys.each { |key| options[key] &&= interpolate_sql(options[key]) }
end
def interpolate_sql(sql, record = nil)
@owner.send(:interpolate_sql, sql, record)
end
def sanitize_sql(sql)
@reflection.klass.send(:sanitize_sql, sql)
end
def extract_options_from_args!(args)
@owner.send(:extract_options_from_args!, args)
end
def set_belongs_to_association_for(record)
if @reflection.options[:as]
record["#{@reflection.options[:as]}_id"] = @owner.id unless @owner.new_record?
record["#{@reflection.options[:as]}_type"] = @owner.class.base_class.name.to_s
else
record[@reflection.primary_key_name] = @owner.id unless @owner.new_record?
end
end
def merge_options_from_reflection!(options)
options.reverse_merge!(
:group => @reflection.options[:group],
:limit => @reflection.options[:limit],
:offset => @reflection.options[:offset],
:joins => @reflection.options[:joins],
:include => @reflection.options[:include],
:select => @reflection.options[:select]
)
end
private
def method_missing(method, *args, &block)
load_target
@target.send(method, *args, &block)
end
def load_target
if !@owner.new_record? || foreign_key_present
begin
@target = find_target if !loaded?
rescue ActiveRecord::RecordNotFound
reset
end
end
loaded if target
target
end
# Can be overwritten by associations that might have the foreign key available for an association without
# having the object itself (and still being a new record). Currently, only belongs_to presents this scenario.
def foreign_key_present
false
end
def raise_on_type_mismatch(record)
unless record.is_a?(@reflection.klass)
raise ActiveRecord::AssociationTypeMismatch, "#{@reflection.class_name} expected, got #{record.class}"
end
end
end
end
end
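
Because nearly every instance method is undefined above, the proxy stays transparent: anything it does not answer itself falls through method_missing to the lazily loaded target. A sketch with a hypothetical Post model that declares has_many :comments:

  post     = Post.find(1)
  comments = post.comments      # an association proxy; no SQL has run yet
  comments.loaded?              # => false (loaded? is defined on the proxy itself)
  comments.class                # => Array -- `class` was undef'd, so method_missing
                                #    calls load_target and forwards to @target
  comments.respond_to?(:each)   # => true, merged from the proxy and the loaded target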


@ -0,0 +1,56 @@
module ActiveRecord
module Associations
class BelongsToAssociation < AssociationProxy #:nodoc:
def create(attributes = {})
replace(@reflection.klass.create(attributes))
end
def build(attributes = {})
replace(@reflection.klass.new(attributes))
end
def replace(record)
counter_cache_name = @reflection.counter_cache_column
if record.nil?
if counter_cache_name && @owner[counter_cache_name] && !@owner.new_record?
@reflection.klass.decrement_counter(counter_cache_name, @owner[@reflection.primary_key_name]) if @owner[@reflection.primary_key_name]
end
@target = @owner[@reflection.primary_key_name] = nil
else
raise_on_type_mismatch(record)
if counter_cache_name && !@owner.new_record?
@reflection.klass.increment_counter(counter_cache_name, record.id)
@reflection.klass.decrement_counter(counter_cache_name, @owner[@reflection.primary_key_name]) if @owner[@reflection.primary_key_name]
end
@target = (AssociationProxy === record ? record.target : record)
@owner[@reflection.primary_key_name] = record.id unless record.new_record?
@updated = true
end
loaded
record
end
def updated?
@updated
end
private
def find_target
@reflection.klass.find(
@owner[@reflection.primary_key_name],
:conditions => conditions,
:include => @reflection.options[:include]
)
end
def foreign_key_present
!@owner[@reflection.primary_key_name].nil?
end
end
end
end
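
A sketch of the replace behaviour above, using hypothetical Comment/Post models where Comment declares belongs_to :post, :counter_cache => true:

  comment = Comment.find(1)
  comment.post                 # find_target: Post.find(comment.post_id) on first access
  comment.post = Post.find(2)  # writes comment.post_id and, because of :counter_cache,
                               # increments posts.comments_count for the new post and
                               # decrements it for the old one
  comment.post = nil           # clears post_id and decrements the old counter
  comment.save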


@ -0,0 +1,50 @@
module ActiveRecord
module Associations
class BelongsToPolymorphicAssociation < AssociationProxy #:nodoc:
def replace(record)
if record.nil?
@target = @owner[@reflection.primary_key_name] = @owner[@reflection.options[:foreign_type]] = nil
else
@target = (AssociationProxy === record ? record.target : record)
unless record.new_record?
@owner[@reflection.primary_key_name] = record.id
@owner[@reflection.options[:foreign_type]] = record.class.base_class.name.to_s
end
@updated = true
end
loaded
record
end
def updated?
@updated
end
private
def find_target
return nil if association_class.nil?
if @reflection.options[:conditions]
association_class.find(
@owner[@reflection.primary_key_name],
:conditions => conditions,
:include => @reflection.options[:include]
)
else
association_class.find(@owner[@reflection.primary_key_name], :include => @reflection.options[:include])
end
end
def foreign_key_present
!@owner[@reflection.primary_key_name].nil?
end
def association_class
@owner[@reflection.options[:foreign_type]] ? @owner[@reflection.options[:foreign_type]].constantize : nil
end
end
end
end
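
A sketch of how the polymorphic variant stores both the id and the type, assuming hypothetical Note/Event models and notable_id/notable_type columns:

  class Note < ActiveRecord::Base
    belongs_to :notable, :polymorphic => true
  end

  note = Note.new
  note.notable = Event.find(3)  # sets notable_id = 3 and notable_type = "Event"
  note.notable                  # association_class constantizes "Event", then Event.find(3)
  note.notable = nil            # clears both notable_id and notable_type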


@ -0,0 +1,169 @@
module ActiveRecord
module Associations
class HasAndBelongsToManyAssociation < AssociationCollection #:nodoc:
def initialize(owner, reflection)
super
construct_sql
end
def build(attributes = {})
load_target
record = @reflection.klass.new(attributes)
@target << record
record
end
def find_first
load_target.first
end
def find(*args)
options = Base.send(:extract_options_from_args!, args)
# If using a custom finder_sql, scan the entire collection.
if @reflection.options[:finder_sql]
expects_array = args.first.kind_of?(Array)
ids = args.flatten.compact.uniq
if ids.size == 1
id = ids.first.to_i
record = load_target.detect { |record| id == record.id }
expects_array ? [record] : record
else
load_target.select { |record| ids.include?(record.id) }
end
else
conditions = "#{@finder_sql}"
if sanitized_conditions = sanitize_sql(options[:conditions])
conditions << " AND (#{sanitized_conditions})"
end
options[:conditions] = conditions
options[:joins] = @join_sql
options[:readonly] = finding_with_ambigious_select?(options[:select])
if options[:order] && @reflection.options[:order]
options[:order] = "#{options[:order]}, #{@reflection.options[:order]}"
elsif @reflection.options[:order]
options[:order] = @reflection.options[:order]
end
merge_options_from_reflection!(options)
# Pass through args exactly as we received them.
args << options
@reflection.klass.find(*args)
end
end
def push_with_attributes(record, join_attributes = {})
raise_on_type_mismatch(record)
join_attributes.each { |key, value| record[key.to_s] = value }
callback(:before_add, record)
insert_record(record) unless @owner.new_record?
@target << record
callback(:after_add, record)
self
end
alias :concat_with_attributes :push_with_attributes
def size
@reflection.options[:uniq] ? count_records : super
end
protected
def method_missing(method, *args, &block)
if @target.respond_to?(method) || (!@reflection.klass.respond_to?(method) && Class.respond_to?(method))
super
else
@reflection.klass.with_scope(:find => { :conditions => @finder_sql, :joins => @join_sql, :readonly => false }) do
@reflection.klass.send(method, *args, &block)
end
end
end
def find_target
if @reflection.options[:finder_sql]
records = @reflection.klass.find_by_sql(@finder_sql)
else
records = find(:all)
end
@reflection.options[:uniq] ? uniq(records) : records
end
def count_records
load_target.size
end
def insert_record(record)
if record.new_record?
return false unless record.save
end
if @reflection.options[:insert_sql]
@owner.connection.execute(interpolate_sql(@reflection.options[:insert_sql], record))
else
columns = @owner.connection.columns(@reflection.options[:join_table], "#{@reflection.options[:join_table]} Columns")
attributes = columns.inject({}) do |attributes, column|
case column.name
when @reflection.primary_key_name
attributes[column.name] = @owner.quoted_id
when @reflection.association_foreign_key
attributes[column.name] = record.quoted_id
else
if record.attributes.has_key?(column.name)
value = @owner.send(:quote, record[column.name], column)
attributes[column.name] = value unless value.nil?
end
end
attributes
end
sql =
"INSERT INTO #{@reflection.options[:join_table]} (#{@owner.send(:quoted_column_names, attributes).join(', ')}) " +
"VALUES (#{attributes.values.join(', ')})"
@owner.connection.execute(sql)
end
return true
end
def delete_records(records)
if sql = @reflection.options[:delete_sql]
records.each { |record| @owner.connection.execute(interpolate_sql(sql, record)) }
else
ids = quoted_record_ids(records)
sql = "DELETE FROM #{@reflection.options[:join_table]} WHERE #{@reflection.primary_key_name} = #{@owner.quoted_id} AND #{@reflection.association_foreign_key} IN (#{ids})"
@owner.connection.execute(sql)
end
end
def construct_sql
interpolate_sql_options!(@reflection.options, :finder_sql)
if @reflection.options[:finder_sql]
@finder_sql = @reflection.options[:finder_sql]
else
@finder_sql = "#{@reflection.options[:join_table]}.#{@reflection.primary_key_name} = #{@owner.quoted_id} "
@finder_sql << " AND (#{conditions})" if conditions
end
@join_sql = "INNER JOIN #{@reflection.options[:join_table]} ON #{@reflection.klass.table_name}.#{@reflection.klass.primary_key} = #{@reflection.options[:join_table]}.#{@reflection.association_foreign_key}"
end
# Join tables with additional columns on top of the two foreign keys must be considered ambiguous unless a select
# clause has been explicitly defined. Otherwise you can get broken records back if, say, the join table also has
# an id column, which will then overwrite the id column of the records coming back.
def finding_with_ambigious_select?(select_clause)
!select_clause && @owner.connection.columns(@reflection.options[:join_table], "Join Table Columns").size != 2
end
end
end
end
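
A sketch of push_with_attributes and the join-table insert above, assuming hypothetical Project/Developer models joined through a developers_projects table that carries an extra joined_on column:

  project   = Project.find(1)
  developer = Developer.find(1)

  project.developers << developer
  # INSERT INTO developers_projects (project_id, developer_id) VALUES (...)

  project.developers.push_with_attributes(developer, :joined_on => Date.today)
  # the extra attribute is copied onto the record, so insert_record also fills
  # the joined_on column of the join row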


@ -0,0 +1,190 @@
module ActiveRecord
module Associations
class HasManyAssociation < AssociationCollection #:nodoc:
def initialize(owner, reflection)
super
construct_sql
end
def build(attributes = {})
if attributes.is_a?(Array)
attributes.collect { |attr| build(attr) }
else
load_target
record = @reflection.klass.new(attributes)
set_belongs_to_association_for(record)
@target << record
record
end
end
# DEPRECATED.
def find_all(runtime_conditions = nil, orderings = nil, limit = nil, joins = nil)
if @reflection.options[:finder_sql]
@reflection.klass.find_by_sql(@finder_sql)
else
conditions = @finder_sql
conditions += " AND (#{sanitize_sql(runtime_conditions)})" if runtime_conditions
orderings ||= @reflection.options[:order]
@reflection.klass.find_all(conditions, orderings, limit, joins)
end
end
# DEPRECATED. Find the first associated record. All arguments are optional.
def find_first(conditions = nil, orderings = nil)
find_all(conditions, orderings, 1).first
end
# Count the number of associated records. All arguments are optional.
def count(runtime_conditions = nil)
if @reflection.options[:counter_sql]
@reflection.klass.count_by_sql(@counter_sql)
elsif @reflection.options[:finder_sql]
@reflection.klass.count_by_sql(@finder_sql)
else
sql = @finder_sql
sql += " AND (#{sanitize_sql(runtime_conditions)})" if runtime_conditions
@reflection.klass.count(sql)
end
end
def find(*args)
options = Base.send(:extract_options_from_args!, args)
# If using a custom finder_sql, scan the entire collection.
if @reflection.options[:finder_sql]
expects_array = args.first.kind_of?(Array)
ids = args.flatten.compact.uniq
if ids.size == 1
id = ids.first
record = load_target.detect { |record| id == record.id }
expects_array ? [ record ] : record
else
load_target.select { |record| ids.include?(record.id) }
end
else
conditions = "#{@finder_sql}"
if sanitized_conditions = sanitize_sql(options[:conditions])
conditions << " AND (#{sanitized_conditions})"
end
options[:conditions] = conditions
if options[:order] && @reflection.options[:order]
options[:order] = "#{options[:order]}, #{@reflection.options[:order]}"
elsif @reflection.options[:order]
options[:order] = @reflection.options[:order]
end
merge_options_from_reflection!(options)
# Pass through args exactly as we received them.
args << options
@reflection.klass.find(*args)
end
end
protected
def method_missing(method, *args, &block)
if @target.respond_to?(method) || (!@reflection.klass.respond_to?(method) && Class.respond_to?(method))
super
else
@reflection.klass.with_scope(
:find => {
:conditions => @finder_sql,
:joins => @join_sql,
:readonly => false
},
:create => {
@reflection.primary_key_name => @owner.id
}
) do
@reflection.klass.send(method, *args, &block)
end
end
end
def find_target
if @reflection.options[:finder_sql]
@reflection.klass.find_by_sql(@finder_sql)
else
find(:all)
end
end
def count_records
count = if has_cached_counter?
@owner.send(:read_attribute, cached_counter_attribute_name)
elsif @reflection.options[:counter_sql]
@reflection.klass.count_by_sql(@counter_sql)
else
@reflection.klass.count(@counter_sql)
end
@target = [] and loaded if count == 0
if @reflection.options[:limit]
count = [ @reflection.options[:limit], count ].min
end
return count
end
def has_cached_counter?
@owner.attribute_present?(cached_counter_attribute_name)
end
def cached_counter_attribute_name
"#{@reflection.name}_count"
end
def insert_record(record)
set_belongs_to_association_for(record)
record.save
end
def delete_records(records)
if @reflection.options[:dependent]
records.each { |r| r.destroy }
else
ids = quoted_record_ids(records)
@reflection.klass.update_all(
"#{@reflection.primary_key_name} = NULL",
"#{@reflection.primary_key_name} = #{@owner.quoted_id} AND #{@reflection.klass.primary_key} IN (#{ids})"
)
end
end
def target_obsolete?
false
end
def construct_sql
case
when @reflection.options[:finder_sql]
@finder_sql = interpolate_sql(@reflection.options[:finder_sql])
when @reflection.options[:as]
@finder_sql =
"#{@reflection.klass.table_name}.#{@reflection.options[:as]}_id = #{@owner.quoted_id} AND " +
"#{@reflection.klass.table_name}.#{@reflection.options[:as]}_type = #{@owner.class.quote @owner.class.base_class.name.to_s}"
@finder_sql << " AND (#{conditions})" if conditions
else
@finder_sql = "#{@reflection.klass.table_name}.#{@reflection.primary_key_name} = #{@owner.quoted_id}"
@finder_sql << " AND (#{conditions})" if conditions
end
if @reflection.options[:counter_sql]
@counter_sql = interpolate_sql(@reflection.options[:counter_sql])
elsif @reflection.options[:finder_sql]
# replace the SELECT clause with COUNT(*), preserving any hints within /* ... */
@reflection.options[:counter_sql] = @reflection.options[:finder_sql].sub(/SELECT (\/\*.*?\*\/ )?(.*)\bFROM\b/im) { "SELECT #{$1}COUNT(*) FROM" }
@counter_sql = interpolate_sql(@reflection.options[:counter_sql])
else
@counter_sql = @finder_sql
end
end
end
end
end
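
When an association supplies :finder_sql but no :counter_sql, construct_sql above derives the counter query by rewriting the SELECT clause. A sketch with a hypothetical custom finder:

  class Customer < ActiveRecord::Base
    has_many :paid_orders, :class_name => "Order",
             :finder_sql => 'SELECT * FROM orders WHERE customer_id = #{id} AND paid = 1'
  end

  # Customer.find(42).paid_orders.size on an unloaded collection then runs,
  # after #{id} is interpolated:
  #   SELECT COUNT(*) FROM orders WHERE customer_id = 42 AND paid = 1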


@ -0,0 +1,147 @@
module ActiveRecord
module Associations
class HasManyThroughAssociation < AssociationProxy #:nodoc:
def initialize(owner, reflection)
super
reflection.check_validity!
@finder_sql = construct_conditions
construct_sql
end
def find(*args)
options = Base.send(:extract_options_from_args!, args)
conditions = "#{@finder_sql}"
if sanitized_conditions = sanitize_sql(options[:conditions])
conditions << " AND (#{sanitized_conditions})"
end
options[:conditions] = conditions
if options[:order] && @reflection.options[:order]
options[:order] = "#{options[:order]}, #{@reflection.options[:order]}"
elsif @reflection.options[:order]
options[:order] = @reflection.options[:order]
end
options[:select] = construct_select(options[:select])
options[:from] ||= construct_from
options[:joins] = construct_joins(options[:joins])
options[:include] = @reflection.source_reflection.options[:include] if options[:include].nil?
merge_options_from_reflection!(options)
# Pass through args exactly as we received them.
args << options
@reflection.klass.find(*args)
end
def reset
@target = []
@loaded = false
end
protected
def method_missing(method, *args, &block)
if @target.respond_to?(method) || (!@reflection.klass.respond_to?(method) && Class.respond_to?(method))
super
else
@reflection.klass.with_scope(construct_scope) { @reflection.klass.send(method, *args, &block) }
end
end
def find_target
@reflection.klass.find(:all,
:select => construct_select,
:conditions => construct_conditions,
:from => construct_from,
:joins => construct_joins,
:order => @reflection.options[:order],
:limit => @reflection.options[:limit],
:group => @reflection.options[:group],
:include => @reflection.options[:include] || @reflection.source_reflection.options[:include]
)
end
def construct_conditions
conditions = if @reflection.through_reflection.options[:as]
"#{@reflection.through_reflection.table_name}.#{@reflection.through_reflection.options[:as]}_id = #{@owner.quoted_id} " +
"AND #{@reflection.through_reflection.table_name}.#{@reflection.through_reflection.options[:as]}_type = #{@owner.class.quote @owner.class.base_class.name.to_s}"
else
"#{@reflection.through_reflection.table_name}.#{@reflection.through_reflection.primary_key_name} = #{@owner.quoted_id}"
end
conditions << " AND (#{sql_conditions})" if sql_conditions
return conditions
end
def construct_from
@reflection.table_name
end
def construct_select(custom_select = nil)
selected = custom_select || @reflection.options[:select] || "#{@reflection.table_name}.*"
end
def construct_joins(custom_joins = nil)
polymorphic_join = nil
if @reflection.through_reflection.options[:as] || @reflection.source_reflection.macro == :belongs_to
reflection_primary_key = @reflection.klass.primary_key
source_primary_key = @reflection.source_reflection.primary_key_name
else
reflection_primary_key = @reflection.source_reflection.primary_key_name
source_primary_key = @reflection.klass.primary_key
if @reflection.source_reflection.options[:as]
polymorphic_join = "AND %s.%s = %s" % [
@reflection.table_name, "#{@reflection.source_reflection.options[:as]}_type",
@owner.class.quote(@reflection.through_reflection.klass.name)
]
end
end
"INNER JOIN %s ON %s.%s = %s.%s %s #{@reflection.options[:joins]} #{custom_joins}" % [
@reflection.through_reflection.table_name,
@reflection.table_name, reflection_primary_key,
@reflection.through_reflection.table_name, source_primary_key,
polymorphic_join
]
end
def construct_scope
{
:find => { :from => construct_from, :conditions => construct_conditions, :joins => construct_joins, :select => construct_select },
:create => { @reflection.primary_key_name => @owner.id }
}
end
def construct_sql
case
when @reflection.options[:finder_sql]
@finder_sql = interpolate_sql(@reflection.options[:finder_sql])
@finder_sql = "#{@reflection.klass.table_name}.#{@reflection.primary_key_name} = #{@owner.quoted_id}"
@finder_sql << " AND (#{conditions})" if conditions
end
if @reflection.options[:counter_sql]
@counter_sql = interpolate_sql(@reflection.options[:counter_sql])
elsif @reflection.options[:finder_sql]
# replace the SELECT clause with COUNT(*), preserving any hints within /* ... */
@reflection.options[:counter_sql] = @reflection.options[:finder_sql].sub(/SELECT (\/\*.*?\*\/ )?(.*)\bFROM\b/im) { "SELECT #{$1}COUNT(*) FROM" }
@counter_sql = interpolate_sql(@reflection.options[:counter_sql])
else
@counter_sql = @finder_sql
end
end
def conditions
@conditions ||= [
(interpolate_sql(@reflection.active_record.send(:sanitize_sql, @reflection.options[:conditions])) if @reflection.options[:conditions]),
(interpolate_sql(@reflection.active_record.send(:sanitize_sql, @reflection.through_reflection.options[:conditions])) if @reflection.through_reflection.options[:conditions])
].compact.collect { |condition| "(#{condition})" }.join(' AND ') unless (!@reflection.options[:conditions] && !@reflection.through_reflection.options[:conditions])
end
alias_method :sql_conditions, :conditions
end
end
end
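
A sketch of the query built by construct_conditions and construct_joins above, for a hypothetical Doctor/Appointment/Patient setup where Appointment belongs_to both ends:

  class Doctor < ActiveRecord::Base
    has_many :appointments
    has_many :patients, :through => :appointments
  end

  Doctor.find(1).patients
  # find_target builds roughly:
  #   SELECT patients.* FROM patients
  #   INNER JOIN appointments ON patients.id = appointments.patient_id
  #   WHERE appointments.doctor_id = 1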


@ -0,0 +1,80 @@
module ActiveRecord
module Associations
class HasOneAssociation < BelongsToAssociation #:nodoc:
def initialize(owner, reflection)
super
construct_sql
end
def create(attributes = {}, replace_existing = true)
record = build(attributes, replace_existing)
record.save
record
end
def build(attributes = {}, replace_existing = true)
record = @reflection.klass.new(attributes)
if replace_existing
replace(record, true)
else
record[@reflection.primary_key_name] = @owner.id unless @owner.new_record?
self.target = record
end
record
end
def replace(obj, dont_save = false)
load_target
unless @target.nil?
if dependent? && !dont_save && @target != obj
@target.destroy unless @target.new_record?
@owner.clear_association_cache
else
@target[@reflection.primary_key_name] = nil
@target.save unless @owner.new_record? || @target.new_record?
end
end
if obj.nil?
@target = nil
else
raise_on_type_mismatch(obj)
set_belongs_to_association_for(obj)
@target = (AssociationProxy === obj ? obj.target : obj)
end
@loaded = true
unless @owner.new_record? or obj.nil? or dont_save
return (obj.save ? self : false)
else
return (obj.nil? ? nil : self)
end
end
private
def find_target
@reflection.klass.find(:first,
:conditions => @finder_sql,
:order => @reflection.options[:order],
:include => @reflection.options[:include]
)
end
def construct_sql
case
when @reflection.options[:as]
@finder_sql =
"#{@reflection.klass.table_name}.#{@reflection.options[:as]}_id = #{@owner.quoted_id} AND " +
"#{@reflection.klass.table_name}.#{@reflection.options[:as]}_type = #{@owner.class.quote @owner.class.base_class.name.to_s}"
else
@finder_sql = "#{@reflection.table_name}.#{@reflection.primary_key_name} = #{@owner.quoted_id}"
end
@finder_sql << " AND (#{conditions})" if conditions
end
end
end
end
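
A sketch of the replace semantics above, assuming hypothetical User/Profile models with has_one :profile:

  user = User.find(1)
  user.profile = Profile.new(:bio => "hello")
  # the new profile is saved with user_id set; the previously associated profile,
  # if any, has its user_id nulled out (or is destroyed when :dependent is set)
  user.profile = nil   # detaches (or destroys) the current profile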

File diff suppressed because it is too large

@ -0,0 +1,229 @@
module ActiveRecord
module Calculations #:nodoc:
CALCULATIONS_OPTIONS = [:conditions, :joins, :order, :select, :group, :having, :distinct, :limit, :offset]
def self.included(base)
base.extend(ClassMethods)
end
module ClassMethods
# Count operates using three different approaches.
#
# * Count all: By not passing any parameters to count, it will return a count of all the rows for the model.
# * Count by conditions or joins: For backwards compatibility, you can pass in +conditions+ and +joins+ as individual parameters.
# * Count using options will find the row count matched by the options used.
#
# The last approach, count using options, accepts an option hash as the only parameter. The options are:
#
# * <tt>:conditions</tt>: An SQL fragment like "administrator = 1" or [ "user_name = ?", username ]. See conditions in the intro.
# * <tt>:joins</tt>: An SQL fragment for additional joins like "LEFT JOIN comments ON comments.post_id = id". (Rarely needed).
# The records will be returned read-only since they will have attributes that do not correspond to the table's columns.
# * <tt>:include</tt>: Named associations that should be loaded alongside using LEFT OUTER JOINs. The symbols named refer
# to already defined associations. When using named associations, count returns the number of DISTINCT items for the model you're counting.
# See eager loading under Associations.
# * <tt>:order</tt>: An SQL fragment like "created_at DESC, name" (really only used with GROUP BY calculations).
# * <tt>:group</tt>: An attribute name by which the result should be grouped. Uses the GROUP BY SQL-clause.
# * <tt>:select</tt>: By default, this is * as in SELECT * FROM, but can be changed if you, for example, want to do a join but not
# include the joined columns.
# * <tt>:distinct</tt>: Set this to true to make this a distinct calculation, such as SELECT COUNT(DISTINCT posts.id) ...
#
# Examples for counting all:
# Person.count # returns the total count of all people
#
# Examples for count by +conditions+ and +joins+ (for backwards compatibility):
# Person.count("age > 26") # returns the number of people older than 26
#   Person.count("age > 26 AND job.salary > 60000", "LEFT JOIN jobs on jobs.person_id = person.id") # returns the total number of rows matching the conditions and joins fetched by SELECT COUNT(*).
#
# Examples for count with options:
# Person.count(:conditions => "age > 26")
# Person.count(:conditions => "age > 26 AND job.salary > 60000", :include => :job) # because of the named association, it finds the DISTINCT count using LEFT OUTER JOIN.
# Person.count(:conditions => "age > 26 AND job.salary > 60000", :joins => "LEFT JOIN jobs on jobs.person_id = person.id") # finds the number of rows matching the conditions and joins.
# Person.count('id', :conditions => "age > 26") # Performs a COUNT(id)
# Person.count(:all, :conditions => "age > 26") # Performs a COUNT(*) (:all is an alias for '*')
#
# Note: Person.count(:all) will not work because it will use :all as the condition. Use Person.count instead.
def count(*args)
options = {}
column_name = :all
# For backwards compatibility, we need to handle count(conditions=nil, joins=nil), count(options={}), and count(column_name=:all, options={}).
if args.size >= 0 && args.size <= 2
if args.first.is_a?(Hash)
options = args.first
elsif args[1].is_a?(Hash)
options = args[1]
column_name = args.first
else
# Handle legacy parameter options: def count(conditions=nil, joins=nil)
options.merge!(:conditions => args[0]) if args.length > 0
options.merge!(:joins => args[1]) if args.length > 1
end
else
raise(ArgumentError, "Unexpected parameters passed to count(*args): expected either count(conditions=nil, joins=nil) or count(options={})")
end
if options[:include] || scope(:find, :include)
count_with_associations(options)
else
calculate(:count, column_name, options)
end
end
# Calculates the average value on a given column. The value is returned as a float. See #calculate for examples with options.
#
# Person.average('age')
def average(column_name, options = {})
calculate(:avg, column_name, options)
end
# Calculates the minimum value on a given column. The value is returned with the same data type as the column. See #calculate for examples with options.
#
# Person.minimum('age')
def minimum(column_name, options = {})
calculate(:min, column_name, options)
end
# Calculates the maximum value on a given column. The value is returned with the same data type as the column. See #calculate for examples with options.
#
# Person.maximum('age')
def maximum(column_name, options = {})
calculate(:max, column_name, options)
end
# Calculates the sum of values on a given column. The value is returned with the same data type as the column. See #calculate for examples with options.
#
# Person.sum('age')
def sum(column_name, options = {})
calculate(:sum, column_name, options)
end
# This calculates aggregate values in the given column. Methods for count, sum, average, minimum, and maximum have been added as shortcuts.
# Options such as :conditions, :order, :group, :having, and :joins can be passed to customize the query.
#
# There are two basic forms of output:
# * Single aggregate value: The single value is type cast to Fixnum for COUNT, Float for AVG, and the given column's type for everything else.
# * Grouped values: This returns an ordered hash of the values and groups them by the :group option. It takes either a column name, or the name
# of a belongs_to association.
#
# values = Person.maximum(:age, :group => 'last_name')
# puts values["Drake"]
# => 43
#
# drake = Family.find_by_last_name('Drake')
# values = Person.maximum(:age, :group => :family) # Person belongs_to :family
# puts values[drake]
# => 43
#
# values.each do |family, max_age|
# ...
# end
#
# Options:
# * <tt>:conditions</tt>: An SQL fragment like "administrator = 1" or [ "user_name = ?", username ]. See conditions in the intro.
# * <tt>:joins</tt>: An SQL fragment for additional joins like "LEFT JOIN comments ON comments.post_id = id". (Rarely needed).
# The records will be returned read-only since they will have attributes that do not correspond to the table's columns.
# * <tt>:order</tt>: An SQL fragment like "created_at DESC, name" (really only used with GROUP BY calculations).
# * <tt>:group</tt>: An attribute name by which the result should be grouped. Uses the GROUP BY SQL-clause.
# * <tt>:select</tt>: By default, this is * as in SELECT * FROM, but can be changed if you, for example, want to do a join but not
# include the joined columns.
# * <tt>:distinct</tt>: Set this to true to make this a distinct calculation, such as SELECT COUNT(DISTINCT posts.id) ...
#
# Examples:
# Person.calculate(:count, :all) # The same as Person.count
# Person.average(:age) # SELECT AVG(age) FROM people...
# Person.minimum(:age, :conditions => ['last_name != ?', 'Drake']) # Selects the minimum age for everyone with a last name other than 'Drake'
# Person.minimum(:age, :having => 'min(age) > 17', :group => :last_name) # Selects the minimum age for any family without any minors
def calculate(operation, column_name, options = {})
validate_calculation_options(operation, options)
column_name = options[:select] if options[:select]
column_name = '*' if column_name == :all
column = column_for column_name
aggregate = select_aggregate(operation, column_name, options)
aggregate_alias = column_alias_for(operation, column_name)
if options[:group]
execute_grouped_calculation(operation, column_name, column, aggregate, aggregate_alias, options)
else
execute_simple_calculation(operation, column_name, column, aggregate, aggregate_alias, options)
end
end
protected
def construct_calculation_sql(aggregate, aggregate_alias, options) #:nodoc:
scope = scope(:find)
sql = "SELECT #{aggregate} AS #{aggregate_alias}"
sql << ", #{options[:group_field]} AS #{options[:group_alias]}" if options[:group]
sql << " FROM #{table_name} "
add_joins!(sql, options, scope)
add_conditions!(sql, options[:conditions], scope)
sql << " GROUP BY #{options[:group_field]}" if options[:group]
sql << " HAVING #{options[:having]}" if options[:group] && options[:having]
sql << " ORDER BY #{options[:order]}" if options[:order]
add_limit!(sql, options)
sql
end
def execute_simple_calculation(operation, column_name, column, aggregate, aggregate_alias, options) #:nodoc:
value = connection.select_value(construct_calculation_sql(aggregate, aggregate_alias, options))
type_cast_calculated_value(value, column, operation)
end
def execute_grouped_calculation(operation, column_name, column, aggregate, aggregate_alias, options) #:nodoc:
group_attr = options[:group].to_s
association = reflect_on_association(group_attr.to_sym)
associated = association && association.macro == :belongs_to # only count belongs_to associations
group_field = (associated ? "#{options[:group]}_id" : options[:group]).to_s
group_alias = column_alias_for(group_field)
group_column = column_for group_field
sql = construct_calculation_sql(aggregate, aggregate_alias, options.merge(:group_field => group_field, :group_alias => group_alias))
calculated_data = connection.select_all(sql)
if association
key_ids = calculated_data.collect { |row| row[group_alias] }
key_records = association.klass.base_class.find(key_ids)
key_records = key_records.inject({}) { |hsh, r| hsh.merge(r.id => r) }
end
calculated_data.inject(OrderedHash.new) do |all, row|
key = associated ? key_records[row[group_alias].to_i] : type_cast_calculated_value(row[group_alias], group_column)
value = row[aggregate_alias]
all << [key, type_cast_calculated_value(value, column, operation)]
end
end
private
def validate_calculation_options(operation, options = {})
if operation.to_s == 'count'
options.assert_valid_keys(CALCULATIONS_OPTIONS + [:include])
else
options.assert_valid_keys(CALCULATIONS_OPTIONS)
end
end
def select_aggregate(operation, column_name, options)
"#{operation}(#{'DISTINCT ' if options[:distinct]}#{column_name})"
end
# converts a given key to the value that the database adapter returns as
#
# users.id #=> users_id
# sum(id) #=> sum_id
# count(distinct users.id) #=> count_distinct_users_id
# count(*) #=> count_all
def column_alias_for(*keys)
keys.join(' ').downcase.gsub(/\*/, 'all').gsub(/\W+/, ' ').strip.gsub(/ +/, '_')
end
def column_for(field)
field_name = field.to_s.split('.').last
columns.detect { |c| c.name.to_s == field_name }
end
def type_cast_calculated_value(value, column, operation = nil)
operation = operation.to_s.downcase
case operation
when 'count' then value.to_i
when 'avg' then value.to_f
else column ? column.type_cast(value) : value
end
end
end
end
end
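
A worked sketch of a grouped calculation, assuming a hypothetical Order model with amount and status columns:

  totals = Order.sum(:amount, :group => :status)
  # construct_calculation_sql produces roughly:
  #   SELECT sum(amount) AS sum_amount, status AS status FROM orders GROUP BY status
  totals["shipped"]   # => the summed amount for that group, type cast via the amount column
  totals.each { |status, sum| puts "#{status}: #{sum}" }   # an ordered hash of group => value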


@ -0,0 +1,378 @@
require 'observer'
module ActiveRecord
# Callbacks are hooks into the lifecycle of an Active Record object that allow you to trigger logic
# before or after an alteration of the object state. This can be used to make sure that associated and
# dependent objects are deleted when destroy is called (by overwriting before_destroy) or to massage attributes
# before they're validated (by overwriting before_validation). As an example of the callbacks initiated, consider
# the Base#save call:
#
# * (-) save
# * (-) valid?
# * (1) before_validation
# * (2) before_validation_on_create
# * (-) validate
# * (-) validate_on_create
# * (3) after_validation
# * (4) after_validation_on_create
# * (5) before_save
# * (6) before_create
# * (-) create
# * (7) after_create
# * (8) after_save
#
# That's a total of eight callbacks, which gives you immense power to react and prepare for each state in the
# Active Record lifecycle.
#
# Examples:
# class CreditCard < ActiveRecord::Base
# # Strip everything but digits, so the user can specify "555 234 34" or
# # "5552-3434" or both will mean "55523434"
# def before_validation_on_create
# self.number = number.gsub(/[^0-9]/, "") if attribute_present?("number")
# end
# end
#
# class Subscription < ActiveRecord::Base
# before_create :record_signup
#
# private
# def record_signup
# self.signed_up_on = Date.today
# end
# end
#
# class Firm < ActiveRecord::Base
# # Destroys the associated clients and people when the firm is destroyed
# before_destroy { |record| Person.destroy_all "firm_id = #{record.id}" }
# before_destroy { |record| Client.destroy_all "client_of = #{record.id}" }
# end
#
# == Inheritable callback queues
#
# Besides the overwriteable callback methods, it's also possible to register callbacks through the use of the callback macros.
# Their main advantage is that the macros add behavior into a callback queue that is kept intact down through an inheritance
# hierarchy. Example:
#
# class Topic < ActiveRecord::Base
# before_destroy :destroy_author
# end
#
# class Reply < Topic
# before_destroy :destroy_readers
# end
#
# Now, when Topic#destroy is run, only +destroy_author+ is called. When Reply#destroy is run, both +destroy_author+ and
# +destroy_readers+ are called. Contrast this to the situation where we've implemented the save behavior through overwriteable
# methods:
#
# class Topic < ActiveRecord::Base
# def before_destroy() destroy_author end
# end
#
# class Reply < Topic
# def before_destroy() destroy_readers end
# end
#
# In that case, Reply#destroy would only run +destroy_readers+ and _not_ +destroy_author+. So use the callback macros when
# you want to ensure that a certain callback is called for the entire hierarchy and the regular overwriteable methods when you
# want to leave it up to each descendent to decide whether they want to call +super+ and trigger the inherited callbacks.
#
# *IMPORTANT:* In order for inheritance to work for the callback queues, you must specify the callbacks before specifying the
# associations. Otherwise, you might trigger the loading of a child before the parent has registered the callbacks and they won't
# be inherited.
#
# == Types of callbacks
#
# There are four types of callbacks accepted by the callback macros: Method references (symbol), callback objects,
# inline methods (using a proc), and inline eval methods (using a string). Method references and callback objects are the
# recommended approaches, inline methods using a proc are sometimes appropriate (such as for creating mix-ins), and inline
# eval methods are deprecated.
#
# The method reference callbacks work by specifying a protected or private method available in the object, like this:
#
# class Topic < ActiveRecord::Base
# before_destroy :delete_parents
#
# private
# def delete_parents
# self.class.delete_all "parent_id = #{id}"
# end
# end
#
# The callback objects have methods named after the callback called with the record as the only parameter, such as:
#
# class BankAccount < ActiveRecord::Base
# before_save EncryptionWrapper.new("credit_card_number")
# after_save EncryptionWrapper.new("credit_card_number")
# after_initialize EncryptionWrapper.new("credit_card_number")
# end
#
# class EncryptionWrapper
# def initialize(attribute)
# @attribute = attribute
# end
#
# def before_save(record)
# record.credit_card_number = encrypt(record.credit_card_number)
# end
#
# def after_save(record)
# record.credit_card_number = decrypt(record.credit_card_number)
# end
#
# alias_method :after_find, :after_save
#
# private
# def encrypt(value)
# # Secrecy is committed
# end
#
# def decrypt(value)
# # Secrecy is unveiled
# end
# end
#
# So you specify the object you want messaged on a given callback. When that callback is triggered, the object has
# a method by the name of the callback messaged.
#
# The callback macros usually accept a symbol for the method they're supposed to run, but you can also pass a "method string",
# which will then be evaluated within the binding of the callback. Example:
#
# class Topic < ActiveRecord::Base
# before_destroy 'self.class.delete_all "parent_id = #{id}"'
# end
#
# Notice that single quotes (') are used so the #{id} part isn't evaluated until the callback is triggered. Also note that these
# inline callbacks can be stacked just like the regular ones:
#
# class Topic < ActiveRecord::Base
# before_destroy 'self.class.delete_all "parent_id = #{id}"',
# 'puts "Evaluated after parents are destroyed"'
# end
#
# == The after_find and after_initialize exceptions
#
# Because after_find and after_initialize are called for each object found and instantiated by a finder, such as Base.find(:all), we've had
# to implement a simple performance constraint (50% more speed on a simple test case). Unlike all the other callbacks, after_find and
# after_initialize will only be run if an explicit implementation is defined (<tt>def after_find</tt>). In that case, all of the
# callback types will be called.
#
# == Cancelling callbacks
#
# If a before_* callback returns false, all the later callbacks and the associated action are cancelled. If an after_* callback returns
# false, all the later callbacks are cancelled. Callbacks are generally run in the order they are defined, with the exception of callbacks
# defined as methods on the model, which are called last.
module Callbacks
CALLBACKS = %w(
after_find after_initialize before_save after_save before_create after_create before_update after_update before_validation
after_validation before_validation_on_create after_validation_on_create before_validation_on_update
after_validation_on_update before_destroy after_destroy
)
def self.append_features(base) #:nodoc:
super
base.extend(ClassMethods)
base.class_eval do
class << self
include Observable
alias_method :instantiate_without_callbacks, :instantiate
alias_method :instantiate, :instantiate_with_callbacks
end
alias_method :initialize_without_callbacks, :initialize
alias_method :initialize, :initialize_with_callbacks
alias_method :create_or_update_without_callbacks, :create_or_update
alias_method :create_or_update, :create_or_update_with_callbacks
alias_method :valid_without_callbacks, :valid?
alias_method :valid?, :valid_with_callbacks
alias_method :create_without_callbacks, :create
alias_method :create, :create_with_callbacks
alias_method :update_without_callbacks, :update
alias_method :update, :update_with_callbacks
alias_method :destroy_without_callbacks, :destroy
alias_method :destroy, :destroy_with_callbacks
end
CALLBACKS.each do |method|
base.class_eval <<-"end_eval"
def self.#{method}(*callbacks, &block)
callbacks << block if block_given?
write_inheritable_array(#{method.to_sym.inspect}, callbacks)
end
end_eval
end
end
module ClassMethods #:nodoc:
def instantiate_with_callbacks(record)
object = instantiate_without_callbacks(record)
if object.respond_to_without_attributes?(:after_find)
object.send(:callback, :after_find)
end
if object.respond_to_without_attributes?(:after_initialize)
object.send(:callback, :after_initialize)
end
object
end
end
# Is called when the object was instantiated by one of the finders, like Base.find.
#def after_find() end
# Is called after the object has been instantiated by a call to Base.new.
#def after_initialize() end
def initialize_with_callbacks(attributes = nil) #:nodoc:
initialize_without_callbacks(attributes)
result = yield self if block_given?
callback(:after_initialize) if respond_to_without_attributes?(:after_initialize)
result
end
# Is called _before_ Base.save (regardless of whether it's a create or update save).
def before_save() end
# Is called _after_ Base.save (regardless of whether it's a create or update save).
#
# class Contact < ActiveRecord::Base
# after_save { logger.info( 'New contact saved!' ) }
# end
def after_save() end
def create_or_update_with_callbacks #:nodoc:
return false if callback(:before_save) == false
result = create_or_update_without_callbacks
callback(:after_save)
result
end
# Is called _before_ Base.save on new objects that haven't been saved yet (no record exists).
def before_create() end
# Is called _after_ Base.save on new objects that haven't been saved yet (no record exists).
def after_create() end
def create_with_callbacks #:nodoc:
return false if callback(:before_create) == false
result = create_without_callbacks
callback(:after_create)
result
end
# Is called _before_ Base.save on existing objects that have a record.
def before_update() end
# Is called _after_ Base.save on existing objects that have a record.
def after_update() end
def update_with_callbacks #:nodoc:
return false if callback(:before_update) == false
result = update_without_callbacks
callback(:after_update)
result
end
# Is called _before_ Validations.validate (which is part of the Base.save call).
def before_validation() end
# Is called _after_ Validations.validate (which is part of the Base.save call).
def after_validation() end
# Is called _before_ Validations.validate (which is part of the Base.save call) on new objects
# that haven't been saved yet (no record exists).
def before_validation_on_create() end
# Is called _after_ Validations.validate (which is part of the Base.save call) on new objects
# that haven't been saved yet (no record exists).
def after_validation_on_create() end
# Is called _before_ Validations.validate (which is part of the Base.save call) on
# existing objects that have a record.
def before_validation_on_update() end
# Is called _after_ Validations.validate (which is part of the Base.save call) on
# existing objects that have a record.
def after_validation_on_update() end
def valid_with_callbacks #:nodoc:
return false if callback(:before_validation) == false
if new_record? then result = callback(:before_validation_on_create) else result = callback(:before_validation_on_update) end
return false if result == false
result = valid_without_callbacks
callback(:after_validation)
if new_record? then callback(:after_validation_on_create) else callback(:after_validation_on_update) end
return result
end
# Is called _before_ Base.destroy.
#
# Note: If you need to _destroy_ or _nullify_ associated records first,
# use the _:dependent_ option on your associations.
def before_destroy() end
# Is called _after_ Base.destroy (and all the attributes have been frozen).
#
# class Contact < ActiveRecord::Base
# after_destroy { |record| logger.info( "Contact #{record.id} was destroyed." ) }
# end
def after_destroy() end
def destroy_with_callbacks #:nodoc:
return false if callback(:before_destroy) == false
result = destroy_without_callbacks
callback(:after_destroy)
result
end
private
def callback(method)
notify(method)
callbacks_for(method).each do |callback|
result = case callback
when Symbol
self.send(callback)
when String
eval(callback, binding)
when Proc, Method
callback.call(self)
else
if callback.respond_to?(method)
callback.send(method, self)
else
raise ActiveRecordError, "Callbacks must be a symbol denoting the method to call, a string to be evaluated, a block to be invoked, or an object responding to the callback method."
end
end
return false if result == false
end
result = send(method) if respond_to_without_attributes?(method)
return result
end
def callbacks_for(method)
self.class.read_inheritable_attribute(method.to_sym) or []
end
def invoke_and_notify(method)
notify(method)
send(method) if respond_to_without_attributes?(method)
end
def notify(method) #:nodoc:
self.class.changed
self.class.notify_observers(method, self)
end
end
end
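
A sketch of the cancellation rule described above (a before_* callback returning false halts the chain), using a hypothetical Booking model with an archived column:

  class Booking < ActiveRecord::Base
    before_save :halt_if_archived

    private
      def halt_if_archived
        return false if archived?   # false stops before_save, so the record is never written
      end
  end

  booking = Booking.find(1)
  booking.archived = true
  booking.save   # => false; create_or_update_without_callbacks is never reached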


@ -0,0 +1,268 @@
require 'set'
module ActiveRecord
class Base
class ConnectionSpecification #:nodoc:
attr_reader :config, :adapter_method
def initialize (config, adapter_method)
@config, @adapter_method = config, adapter_method
end
end
# Check for activity after at least +verification_timeout+ seconds.
# Defaults to 0 (always check.)
cattr_accessor :verification_timeout
@@verification_timeout = 0
# The class -> [adapter_method, config] map
@@defined_connections = {}
# The class -> thread id -> adapter cache. (class -> adapter if not allow_concurrency)
@@active_connections = {}
class << self
# Retrieve the connection cache.
def thread_safe_active_connections #:nodoc:
@@active_connections[Thread.current.object_id] ||= {}
end
def single_threaded_active_connections #:nodoc:
@@active_connections
end
# pick up the right active_connection method from @@allow_concurrency
if @@allow_concurrency
alias_method :active_connections, :thread_safe_active_connections
else
alias_method :active_connections, :single_threaded_active_connections
end
# set concurrency support flag (not thread safe, like most of the methods in this file)
def allow_concurrency=(threaded) #:nodoc:
logger.debug "allow_concurrency=#{threaded}" if logger
return if @@allow_concurrency == threaded
clear_all_cached_connections!
@@allow_concurrency = threaded
method_prefix = threaded ? "thread_safe" : "single_threaded"
sing = (class << self; self; end)
[:active_connections, :scoped_methods].each do |method|
sing.send(:alias_method, method, "#{method_prefix}_#{method}")
end
log_connections if logger
end
def active_connection_name #:nodoc:
@active_connection_name ||=
if active_connections[name] || @@defined_connections[name]
name
elsif self == ActiveRecord::Base
nil
else
superclass.active_connection_name
end
end
def clear_active_connection_name #:nodoc:
@active_connection_name = nil
subclasses.each { |klass| klass.clear_active_connection_name }
end
# Returns the connection currently associated with the class. This can
# also be used to "borrow" the connection to do database work unrelated
# to any of the specific Active Records.
def connection
if @active_connection_name && (conn = active_connections[@active_connection_name])
conn
else
# retrieve_connection sets the cache key.
conn = retrieve_connection
active_connections[@active_connection_name] = conn
end
end
# Clears the cache which maps classes to connections.
def clear_active_connections!
clear_cache!(@@active_connections) do |name, conn|
conn.disconnect!
end
end
# Verify active connections.
def verify_active_connections! #:nodoc:
if @@allow_concurrency
remove_stale_cached_threads!(@@active_connections) do |name, conn|
conn.disconnect!
end
end
active_connections.each_value do |connection|
connection.verify!(@@verification_timeout)
end
end
private
def clear_cache!(cache, thread_id = nil, &block)
if cache
if @@allow_concurrency
thread_id ||= Thread.current.object_id
thread_cache, cache = cache, cache[thread_id]
return unless cache
end
cache.each(&block) if block_given?
cache.clear
end
ensure
if thread_cache && @@allow_concurrency
thread_cache.delete(thread_id)
end
end
# Remove stale threads from the cache.
def remove_stale_cached_threads!(cache, &block)
stale = Set.new(cache.keys)
Thread.list.each do |thread|
stale.delete(thread.object_id) if thread.alive?
end
stale.each do |thread_id|
clear_cache!(cache, thread_id, &block)
end
end
def clear_all_cached_connections!
if @@allow_concurrency
@@active_connections.each_value do |connection_hash_for_thread|
connection_hash_for_thread.each_value {|conn| conn.disconnect! }
connection_hash_for_thread.clear
end
else
@@active_connections.each_value {|conn| conn.disconnect! }
end
@@active_connections.clear
end
end
# Returns the connection currently associated with the class. This can
# also be used to "borrow" the connection to do database work that isn't
# easily done without going straight to SQL.
def connection
self.class.connection
end
# Establishes the connection to the database. Accepts a hash as input where
# the :adapter key must be specified with the name of a database adapter (in lower-case).
# Example for regular databases (MySQL, PostgreSQL, etc.):
#
# ActiveRecord::Base.establish_connection(
# :adapter => "mysql",
# :host => "localhost",
# :username => "myuser",
# :password => "mypass",
# :database => "somedatabase"
# )
#
# Example for SQLite database:
#
# ActiveRecord::Base.establish_connection(
# :adapter => "sqlite",
# :database => "path/to/dbfile"
# )
#
# Also accepts keys as strings (for parsing from yaml for example):
# ActiveRecord::Base.establish_connection(
# "adapter" => "sqlite",
# "database" => "path/to/dbfile"
# )
#
# The exceptions AdapterNotSpecified, AdapterNotFound, and ArgumentError
# may be raised on an error.
def self.establish_connection(spec = nil)
case spec
when nil
raise AdapterNotSpecified unless defined? RAILS_ENV
establish_connection(RAILS_ENV)
when ConnectionSpecification
clear_active_connection_name
@active_connection_name = name
@@defined_connections[name] = spec
when Symbol, String
if configuration = configurations[spec.to_s]
establish_connection(configuration)
else
raise AdapterNotSpecified, "#{spec} database is not configured"
end
else
spec = spec.symbolize_keys
unless spec.key?(:adapter) then raise AdapterNotSpecified, "database configuration does not specify adapter" end
adapter_method = "#{spec[:adapter]}_connection"
unless respond_to?(adapter_method) then raise AdapterNotFound, "database configuration specifies nonexistent #{spec[:adapter]} adapter" end
remove_connection
establish_connection(ConnectionSpecification.new(spec, adapter_method))
end
end
# Locate the connection of the nearest super class. This can be an
# active or defined connection: if it is the latter, it will be
# opened and set as the active connection for the class it was defined
# for (not necessarily the current class).
def self.retrieve_connection #:nodoc:
# Name is nil if establish_connection hasn't been called for
# some class along the inheritance chain up to AR::Base yet.
if name = active_connection_name
if conn = active_connections[name]
# Verify the connection.
conn.verify!(@@verification_timeout)
elsif spec = @@defined_connections[name]
# Activate this connection specification.
klass = name.constantize
klass.connection = spec
conn = active_connections[name]
end
end
conn or raise ConnectionNotEstablished
end
# Returns true if a connection that's accessible to this class has already been opened.
def self.connected?
active_connections[active_connection_name] ? true : false
end
# Remove the connection for this class. This will close the active
# connection and the defined connection (if they exist). The result
# can be used as argument for establish_connection, for easy
# re-establishing of the connection.
def self.remove_connection(klass=self)
spec = @@defined_connections[klass.name]
konn = active_connections[klass.name]
@@defined_connections.delete_if { |key, value| value == spec }
active_connections.delete_if { |key, value| value == konn }
konn.disconnect! if konn
spec.config if spec
end
# Set the connection for the class.
def self.connection=(spec) #:nodoc:
if spec.kind_of?(ActiveRecord::ConnectionAdapters::AbstractAdapter)
active_connections[name] = spec
elsif spec.kind_of?(ConnectionSpecification)
self.connection = self.send(spec.adapter_method, spec.config)
elsif spec.nil?
raise ConnectionNotEstablished
else
establish_connection spec
end
end
# connection state logging
def self.log_connections #:nodoc:
if logger
logger.info "Defined connections: #{@@defined_connections.inspect}"
logger.info "Active connections: #{active_connections.inspect}"
logger.info "Active connection name: #{@active_connection_name}"
end
end
end
end
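
A sketch of the connection round trip above, assuming the sqlite3 adapter and its bindings are available:

  require 'rubygems'
  require 'active_record'

  ActiveRecord::Base.establish_connection(
    :adapter  => "sqlite3",
    :database => ":memory:"
  )
  ActiveRecord::Base.connection       # retrieve_connection opens and caches the adapter
  ActiveRecord::Base.connected?       # => true

  spec = ActiveRecord::Base.remove_connection    # disconnects and returns the config hash
  ActiveRecord::Base.establish_connection(spec)  # re-establishes from the saved configuration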


@ -0,0 +1,104 @@
module ActiveRecord
module ConnectionAdapters # :nodoc:
module DatabaseStatements
# Returns an array of record hashes with the column names as keys and
# column values as values.
def select_all(sql, name = nil)
end
# Returns a record hash with the column names as keys and column values
# as values.
def select_one(sql, name = nil)
end
# Returns a single value from a record
def select_value(sql, name = nil)
result = select_one(sql, name)
result.nil? ? nil : result.values.first
end
# Returns an array of the values of the first column in a select:
# select_values("SELECT id FROM companies LIMIT 3") => [1,2,3]
def select_values(sql, name = nil)
result = select_all(sql, name)
result.map{ |v| v.values.first }
end
# Executes the SQL statement in the context of this connection.
# This abstract method raises a NotImplementedError.
def execute(sql, name = nil)
raise NotImplementedError, "execute is an abstract method"
end
# Returns the last auto-generated ID from the affected table.
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil) end
# Executes the update statement and returns the number of rows affected.
def update(sql, name = nil) end
# Executes the delete statement and returns the number of rows affected.
def delete(sql, name = nil) end
# Wrap a block in a transaction. Returns result of block.
def transaction(start_db_transaction = true)
transaction_open = false
begin
if block_given?
if start_db_transaction
begin_db_transaction
transaction_open = true
end
yield
end
rescue Exception => database_transaction_rollback
if transaction_open
transaction_open = false
rollback_db_transaction
end
raise
end
ensure
commit_db_transaction if transaction_open
end
# Begins the transaction (and turns off auto-committing).
def begin_db_transaction() end
# Commits the transaction (and turns on auto-committing).
def commit_db_transaction() end
# Rolls back the transaction (and turns on auto-committing). Must be
# done if the transaction block raises an exception or returns false.
def rollback_db_transaction() end
# Alias for #add_limit_offset!.
def add_limit!(sql, options)
add_limit_offset!(sql, options) if options
end
# Appends +LIMIT+ and +OFFSET+ options to a SQL statement.
# This method *modifies* the +sql+ parameter.
# ===== Examples
# add_limit_offset!('SELECT * FROM suppliers', {:limit => 10, :offset => 50})
# generates
# SELECT * FROM suppliers LIMIT 10 OFFSET 50
def add_limit_offset!(sql, options)
if limit = options[:limit]
sql << " LIMIT #{limit}"
if offset = options[:offset]
sql << " OFFSET #{offset}"
end
end
end
def default_sequence_name(table, column)
nil
end
# Set the sequence to the max value of the table's column.
def reset_sequence!(table, column, sequence = nil)
# Do nothing by default. Implement for PostgreSQL, Oracle, ...
end
end
end
end
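
A sketch of the transaction helper above, assuming an established connection and an accounts(id, balance) table:

  conn = ActiveRecord::Base.connection
  conn.transaction do
    conn.update "UPDATE accounts SET balance = balance - 100 WHERE id = 1"
    conn.update "UPDATE accounts SET balance = balance + 100 WHERE id = 2"
  end
  # if either statement raises, rollback_db_transaction runs and the exception is re-raised;
  # otherwise commit_db_transaction runs from the ensure clause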


@ -0,0 +1,51 @@
module ActiveRecord
module ConnectionAdapters # :nodoc:
module Quoting
# Quotes the column value to help prevent
# {SQL injection attacks}[http://en.wikipedia.org/wiki/SQL_injection].
def quote(value, column = nil)
case value
when String
if column && column.type == :binary && column.class.respond_to?(:string_to_binary)
"'#{quote_string(column.class.string_to_binary(value))}'" # ' (for ruby-mode)
elsif column && [:integer, :float].include?(column.type)
value.to_s
else
"'#{quote_string(value)}'" # ' (for ruby-mode)
end
when NilClass then "NULL"
when TrueClass then (column && column.type == :integer ? '1' : quoted_true)
when FalseClass then (column && column.type == :integer ? '0' : quoted_false)
when Float, Fixnum, Bignum then value.to_s
when Date then "'#{value.to_s}'"
when Time, DateTime then "'#{quoted_date(value)}'"
else "'#{quote_string(value.to_yaml)}'"
end
end
# Quotes a string, escaping any ' (single quote) and \ (backslash)
# characters.
def quote_string(s)
s.gsub(/\\/, '\&\&').gsub(/'/, "''") # ' (for ruby-mode)
end
# Returns a quoted form of the column name. This is highly adapter
# specific.
def quote_column_name(name)
name
end
def quoted_true
"'t'"
end
def quoted_false
"'f'"
end
def quoted_date(value)
value.strftime("%Y-%m-%d %H:%M:%S")
end
end
end
end
View file
@ -0,0 +1,259 @@
require 'parsedate'
module ActiveRecord
module ConnectionAdapters #:nodoc:
# An abstract definition of a column in a table.
class Column
attr_reader :name, :default, :type, :limit, :null, :sql_type
attr_accessor :primary
# Instantiates a new column in the table.
#
# +name+ is the column's name, as in <tt><b>supplier_id</b> int(11)</tt>.
# +default+ is the type-casted default value, such as <tt>sales_stage varchar(20) default <b>'new'</b></tt>.
# +sql_type+ is only used to extract the column's length, if necessary. For example, <tt>company_name varchar(<b>60</b>)</tt>.
# +null+ determines if this column allows +NULL+ values.
def initialize(name, default, sql_type = nil, null = true)
@name, @type, @null = name, simplified_type(sql_type), null
@sql_type = sql_type
# have to do this one separately because type_cast depends on #type
@default = type_cast(default)
@limit = extract_limit(sql_type) unless sql_type.nil?
@primary = nil
@text = [:string, :text].include? @type
@number = [:float, :integer].include? @type
end
def text?
@text
end
def number?
@number
end
# Returns the Ruby class that corresponds to the abstract data type.
def klass
case type
when :integer then Fixnum
when :float then Float
when :datetime then Time
when :date then Date
when :timestamp then Time
when :time then Time
when :text, :string then String
when :binary then String
when :boolean then Object
end
end
# Casts value (which is a String) to an appropriate instance.
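# ===== Examples (illustrative; the sql_type strings are hypothetical)
#   Column.new("age",    nil, "int(11)").type_cast("42")     #=> 42
#   Column.new("active", nil, "boolean").type_cast("t")      #=> true
#   Column.new("name",   nil, "varchar(255)").type_cast(nil) #=> nil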
def type_cast(value)
return nil if value.nil?
case type
when :string then value
when :text then value
when :integer then value.to_i rescue value ? 1 : 0
when :float then value.to_f
when :datetime then self.class.string_to_time(value)
when :timestamp then self.class.string_to_time(value)
when :time then self.class.string_to_dummy_time(value)
when :date then self.class.string_to_date(value)
when :binary then self.class.binary_to_string(value)
when :boolean then self.class.value_to_boolean(value)
else value
end
end
def type_cast_code(var_name)
case type
when :string then nil
when :text then nil
when :integer then "(#{var_name}.to_i rescue #{var_name} ? 1 : 0)"
when :float then "#{var_name}.to_f"
when :datetime then "#{self.class.name}.string_to_time(#{var_name})"
when :timestamp then "#{self.class.name}.string_to_time(#{var_name})"
when :time then "#{self.class.name}.string_to_dummy_time(#{var_name})"
when :date then "#{self.class.name}.string_to_date(#{var_name})"
when :binary then "#{self.class.name}.binary_to_string(#{var_name})"
when :boolean then "#{self.class.name}.value_to_boolean(#{var_name})"
else nil
end
end
# Returns the human-readable name of the column.
#
# ===== Examples
# Column.new('sales_stage', ...).human_name #=> 'Sales stage'
def human_name
Base.human_attribute_name(@name)
end
# Used to convert from Strings to BLOBs
def self.string_to_binary(value)
value
end
# Used to convert from BLOBs to Strings
def self.binary_to_string(value)
value
end
def self.string_to_date(string)
return string unless string.is_a?(String)
date_array = ParseDate.parsedate(string)
# treat 0000-00-00 as nil
Date.new(date_array[0], date_array[1], date_array[2]) rescue nil
end
def self.string_to_time(string)
return string unless string.is_a?(String)
time_array = ParseDate.parsedate(string)[0..5]
# treat 0000-00-00 00:00:00 as nil
Time.send(Base.default_timezone, *time_array) rescue nil
end
def self.string_to_dummy_time(string)
return string unless string.is_a?(String)
time_array = ParseDate.parsedate(string)
# pad the resulting array with dummy date information
time_array[0] = 2000; time_array[1] = 1; time_array[2] = 1;
Time.send(Base.default_timezone, *time_array) rescue nil
end
# convert something to a boolean
def self.value_to_boolean(value)
return value if value==true || value==false
case value.to_s.downcase
when "true", "t", "1" then true
else false
end
end
private
def extract_limit(sql_type)
$1.to_i if sql_type =~ /\((.*)\)/
end
def simplified_type(field_type)
case field_type
when /int/i
:integer
when /float|double|decimal|numeric/i
:float
when /datetime/i
:datetime
when /timestamp/i
:timestamp
when /time/i
:time
when /date/i
:date
when /clob/i, /text/i
:text
when /blob/i, /binary/i
:binary
when /char/i, /string/i
:string
when /boolean/i
:boolean
end
end
end
class IndexDefinition < Struct.new(:table, :name, :unique, :columns) #:nodoc:
end
class ColumnDefinition < Struct.new(:base, :name, :type, :limit, :default, :null) #:nodoc:
def to_sql
column_sql = "#{base.quote_column_name(name)} #{type_to_sql(type.to_sym, limit)}"
add_column_options!(column_sql, :null => null, :default => default)
column_sql
end
alias to_s :to_sql
private
def type_to_sql(name, limit)
base.type_to_sql(name, limit) rescue name
end
def add_column_options!(sql, options)
base.add_column_options!(sql, options.merge(:column => self))
end
end
# Represents a SQL table in an abstract way.
# Columns are stored as ColumnDefinition in the #columns attribute.
class TableDefinition
attr_accessor :columns
def initialize(base)
@columns = []
@base = base
end
# Appends a primary key definition to the table definition.
# Can be called multiple times, but this is probably not a good idea.
def primary_key(name)
column(name, native[:primary_key])
end
# Returns a ColumnDefinition for the column with name +name+.
def [](name)
@columns.find {|column| column.name.to_s == name.to_s}
end
# Instantiates a new column for the table.
# The +type+ parameter must be one of the following values:
# <tt>:primary_key</tt>, <tt>:string</tt>, <tt>:text</tt>,
# <tt>:integer</tt>, <tt>:float</tt>, <tt>:datetime</tt>,
# <tt>:timestamp</tt>, <tt>:time</tt>, <tt>:date</tt>,
# <tt>:binary</tt>, <tt>:boolean</tt>.
#
# Available options are (none of these exists by default):
# * <tt>:limit</tt>:
# Requests a maximum column length (<tt>:string</tt>, <tt>:text</tt>,
# <tt>:binary</tt> or <tt>:integer</tt> columns only)
# * <tt>:default</tt>:
# The column's default value. You cannot explicitly set the default
# value to +NULL+. Simply leave off this option if you want a +NULL+
# default value.
# * <tt>:null</tt>:
# Allows or disallows +NULL+ values in the column. This option could
# have been named <tt>:null_allowed</tt>.
#
# This method returns <tt>self</tt>.
#
# ===== Examples
# # Assuming def is an instance of TableDefinition
# def.column(:granted, :boolean)
# #=> granted BOOLEAN
#
# def.column(:picture, :binary, :limit => 2.megabytes)
# #=> picture BLOB(2097152)
#
# def.column(:sales_stage, :string, :limit => 20, :default => 'new', :null => false)
# #=> sales_stage VARCHAR(20) DEFAULT 'new' NOT NULL
def column(name, type, options = {})
column = self[name] || ColumnDefinition.new(@base, name, type)
column.limit = options[:limit] || native[type.to_sym][:limit] if options[:limit] or native[type.to_sym]
column.default = options[:default]
column.null = options[:null]
@columns << column unless @columns.include? column
self
end
# Returns a String whose contents are the column definitions
# concatenated together. This string can then be pre and appended to
# to generate the final SQL to create the table.
def to_sql
@columns * ', '
end
private
def native
@base.native_database_types
end
end
end
end
View file
@ -0,0 +1,271 @@
module ActiveRecord
module ConnectionAdapters # :nodoc:
module SchemaStatements
# Returns a Hash of mappings from the abstract data types to the native
# database types. See TableDefinition#column for details on the recognized
# abstract data types.
def native_database_types
{}
end
# This is the maximum length a table alias can be
def table_alias_length
255
end
# Truncates a table alias according to the limits of the current adapter.
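# ===== Example (using the default #table_alias_length)
#   table_alias_for("database.table_name") #=> "database_table_name"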
def table_alias_for(table_name)
table_name[0..table_alias_length-1].gsub(/\./, '_')
end
# def tables(name = nil) end
# Returns an array of indexes for the given table.
# def indexes(table_name, name = nil) end
# Returns an array of Column objects for the table specified by +table_name+.
# See the concrete implementation for details on the expected parameter values.
def columns(table_name, name = nil) end
# Creates a new table
# There are two ways to work with #create_table. You can use the block
# form or the regular form, like this:
#
# === Block form
# # create_table() yields a TableDefinition instance
# create_table(:suppliers) do |t|
# t.column :name, :string, :limit => 60
# # Other fields here
# end
#
# === Regular form
# create_table(:suppliers)
# add_column(:suppliers, :name, :string, {:limit => 60})
#
# The +options+ hash can include the following keys:
# [<tt>:id</tt>]
# Set to true or false to add/not add a primary key column
# automatically. Defaults to true.
# [<tt>:primary_key</tt>]
# The name of the primary key, if one is to be added automatically.
# Defaults to +id+.
# [<tt>:options</tt>]
# Any extra options you want appended to the table definition.
# [<tt>:temporary</tt>]
# Make a temporary table.
# [<tt>:force</tt>]
# Set to true or false to drop the table before creating it.
# Defaults to false.
#
# ===== Examples
# ====== Add a backend specific option to the generated SQL (MySQL)
# create_table(:suppliers, :options => 'ENGINE=InnoDB DEFAULT CHARSET=utf8')
# generates:
# CREATE TABLE suppliers (
# id int(11) DEFAULT NULL auto_increment PRIMARY KEY
# ) ENGINE=InnoDB DEFAULT CHARSET=utf8
#
# ====== Rename the primary key column
# create_table(:objects, :primary_key => 'guid') do |t|
# t.column :name, :string, :limit => 80
# end
# generates:
# CREATE TABLE objects (
# guid int(11) DEFAULT NULL auto_increment PRIMARY KEY,
# name varchar(80)
# )
#
# ====== Do not add a primary key column
# create_table(:categories_suppliers, :id => false) do |t|
# t.column :category_id, :integer
# t.column :supplier_id, :integer
# end
# generates:
# CREATE TABLE categories_suppliers_join (
# category_id int,
# supplier_id int
# )
#
# See also TableDefinition#column for details on how to create columns.
def create_table(name, options = {})
table_definition = TableDefinition.new(self)
table_definition.primary_key(options[:primary_key] || "id") unless options[:id] == false
yield table_definition
if options[:force]
drop_table(name) rescue nil
end
create_sql = "CREATE#{' TEMPORARY' if options[:temporary]} TABLE "
create_sql << "#{name} ("
create_sql << table_definition.to_sql
create_sql << ") #{options[:options]}"
execute create_sql
end
# Renames a table.
# ===== Example
# rename_table('octopuses', 'octopi')
def rename_table(name, new_name)
raise NotImplementedError, "rename_table is not implemented"
end
# Drops a table from the database.
def drop_table(name)
execute "DROP TABLE #{name}"
end
# Adds a new column to the named table.
# See TableDefinition#column for details of the options you can use.
def add_column(table_name, column_name, type, options = {})
add_column_sql = "ALTER TABLE #{table_name} ADD #{quote_column_name(column_name)} #{type_to_sql(type, options[:limit])}"
add_column_options!(add_column_sql, options)
execute(add_column_sql)
end
# Removes the column from the table definition.
# ===== Examples
# remove_column(:suppliers, :qualification)
def remove_column(table_name, column_name)
execute "ALTER TABLE #{table_name} DROP #{quote_column_name(column_name)}"
end
# Changes the column's definition according to the new options.
# See TableDefinition#column for details of the options you can use.
# ===== Examples
# change_column(:suppliers, :name, :string, :limit => 80)
# change_column(:accounts, :description, :text)
def change_column(table_name, column_name, type, options = {})
raise NotImplementedError, "change_column is not implemented"
end
# Sets a new default value for a column. If you want to set the default
# value to +NULL+, you are out of luck. You need to
# DatabaseStatements#execute the appropriate SQL statement yourself.
# ===== Examples
# change_column_default(:suppliers, :qualification, 'new')
# change_column_default(:accounts, :authorized, 1)
def change_column_default(table_name, column_name, default)
raise NotImplementedError, "change_column_default is not implemented"
end
# Renames a column.
# ===== Example
# rename_column(:suppliers, :description, :name)
def rename_column(table_name, column_name, new_column_name)
raise NotImplementedError, "rename_column is not implemented"
end
# Adds a new index to the table. +column_name+ can be a single Symbol, or
# an Array of Symbols.
#
# The index will be named after the table and the first column names,
# unless you pass +:name+ as an option.
#
# When creating an index on multiple columns, the first column is used as a name
# for the index. For example, when you specify an index on two columns
# [+:first+, +:last+], the DBMS creates an index for both columns as well as an
# index for the first column +:first+. Using just the first name for this index
# makes sense, because you will never have to create a singular index with this
# name.
#
# ===== Examples
# ====== Creating a simple index
# add_index(:suppliers, :name)
# generates
# CREATE INDEX suppliers_name_index ON suppliers(name)
# ====== Creating a unique index
# add_index(:accounts, [:branch_id, :party_id], :unique => true)
# generates
# CREATE UNIQUE INDEX accounts_branch_id_index ON accounts(branch_id, party_id)
# ====== Creating a named index
# add_index(:accounts, [:branch_id, :party_id], :unique => true, :name => 'by_branch_party')
# generates
# CREATE UNIQUE INDEX by_branch_party ON accounts(branch_id, party_id)
def add_index(table_name, column_name, options = {})
column_names = Array(column_name)
index_name = index_name(table_name, :column => column_names.first)
if Hash === options # legacy support, since this param was a string
index_type = options[:unique] ? "UNIQUE" : ""
index_name = options[:name] || index_name
else
index_type = options
end
quoted_column_names = column_names.map { |e| quote_column_name(e) }.join(", ")
execute "CREATE #{index_type} INDEX #{quote_column_name(index_name)} ON #{table_name} (#{quoted_column_names})"
end
# Remove the given index from the table.
#
# Remove the suppliers_name_index in the suppliers table (legacy support, use the second or third forms).
# remove_index :suppliers, :name
# Remove the index named accounts_branch_id in the accounts table.
# remove_index :accounts, :column => :branch_id
# Remove the index named by_branch_party in the accounts table.
# remove_index :accounts, :name => :by_branch_party
#
# You can remove an index on multiple columns by specifying the first column.
# add_index :accounts, [:username, :password]
# remove_index :accounts, :username
def remove_index(table_name, options = {})
execute "DROP INDEX #{quote_column_name(index_name(table_name, options))} ON #{table_name}"
end
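# Returns the default name for an index, as used by #add_index and
# #remove_index. Illustrative results:
#   index_name(:accounts, :column => :branch_id) #=> "accounts_branch_id_index"
#   index_name(:accounts, :name => "by_branch")  #=> "by_branch"
#   index_name(:accounts, "username")            #=> "accounts_username_index"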
def index_name(table_name, options) #:nodoc:
if Hash === options # legacy support
if options[:column]
"#{table_name}_#{options[:column]}_index"
elsif options[:name]
options[:name]
else
raise ArgumentError, "You must specify the index name"
end
else
"#{table_name}_#{options}_index"
end
end
# Returns a string of <tt>CREATE TABLE</tt> SQL statement(s) for recreating the
# entire structure of the database.
def structure_dump
end
# Should not be called normally, but this operation is non-destructive.
# The migrations module handles this automatically.
def initialize_schema_information
begin
execute "CREATE TABLE #{ActiveRecord::Migrator.schema_info_table_name} (version #{type_to_sql(:integer)})"
execute "INSERT INTO #{ActiveRecord::Migrator.schema_info_table_name} (version) VALUES(0)"
rescue ActiveRecord::StatementInvalid
# Schema has been initialized
end
end
def dump_schema_information #:nodoc:
begin
if (current_schema = ActiveRecord::Migrator.current_version) > 0
return "INSERT INTO #{ActiveRecord::Migrator.schema_info_table_name} (version) VALUES (#{current_schema})"
end
rescue ActiveRecord::StatementInvalid
# No Schema Info
end
end
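# Illustrative sketch, assuming the MySQL adapter's #native_database_types:
#   type_to_sql(:string, 80) #=> "varchar(80)"
#   type_to_sql(:integer)    #=> "int(11)"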
def type_to_sql(type, limit = nil) #:nodoc:
native = native_database_types[type]
limit ||= native[:limit]
column_type_sql = native[:name]
column_type_sql << "(#{limit})" if limit
column_type_sql
end
def add_column_options!(sql, options) #:nodoc:
sql << " DEFAULT #{quote(options[:default], options[:column])}" unless options[:default].nil?
sql << " NOT NULL" if options[:null] == false
end
end
end
end
View file
@ -0,0 +1,153 @@
require 'benchmark'
require 'date'
require 'active_record/connection_adapters/abstract/schema_definitions'
require 'active_record/connection_adapters/abstract/schema_statements'
require 'active_record/connection_adapters/abstract/database_statements'
require 'active_record/connection_adapters/abstract/quoting'
require 'active_record/connection_adapters/abstract/connection_specification'
module ActiveRecord
module ConnectionAdapters # :nodoc:
# All the concrete database adapters follow the interface laid down in this class.
# You can use this interface directly by borrowing the database connection from the Base with
# Base.connection.
#
# Most of the methods in the adapter are useful during migrations. Most
# notably, SchemaStatements#create_table, SchemaStatements#drop_table,
# SchemaStatements#add_index, SchemaStatements#remove_index,
# SchemaStatements#add_column, SchemaStatements#change_column and
# SchemaStatements#remove_column are very useful.
class AbstractAdapter
include Quoting, DatabaseStatements, SchemaStatements
@@row_even = true
def initialize(connection, logger = nil) #:nodoc:
@connection, @logger = connection, logger
@runtime = 0
@last_verification = 0
end
# Returns the human-readable name of the adapter. Use mixed case - one
# can always use downcase if needed.
def adapter_name
'Abstract'
end
# Does this adapter support migrations? Backend specific, as the
# abstract adapter always returns +false+.
def supports_migrations?
false
end
# Does this adapter support using DISTINCT within COUNT? This is +true+
# for all adapters except SQLite.
def supports_count_distinct?
true
end
# Should primary key values be selected from their corresponding
# sequence before the insert statement? If true, next_sequence_value
# is called before each insert to set the record's primary key.
# This is false for all adapters but Firebird.
def prefetch_primary_key?(table_name = nil)
false
end
def reset_runtime #:nodoc:
rt, @runtime = @runtime, 0
rt
end
# CONNECTION MANAGEMENT ====================================
# Is this connection active and ready to perform queries?
def active?
@active != false
end
# Close this connection and open a new one in its place.
def reconnect!
@active = true
end
# Close this connection
def disconnect!
@active = false
end
# Lazily verify this connection, calling +active?+ only if it hasn't
# been called for +timeout+ seconds.
def verify!(timeout)
now = Time.now.to_i
if (now - @last_verification) > timeout
reconnect! unless active?
@last_verification = now
end
end
# Provides access to the underlying database connection. Useful
# when you need to call a proprietary method such as PostgreSQL's
# lo_* methods.
def raw_connection
@connection
end
protected
def log(sql, name)
if block_given?
if @logger and @logger.level <= Logger::INFO
result = nil
seconds = Benchmark.realtime { result = yield }
@runtime += seconds
log_info(sql, name, seconds)
result
else
yield
end
else
log_info(sql, name, 0)
nil
end
rescue Exception => e
# Log message and raise exception.
# Set last_verification to 0, so that the connection gets verified
# upon reentering the request loop
@last_verification = 0
message = "#{e.class.name}: #{e.message}: #{sql}"
log_info(message, name, 0)
raise ActiveRecord::StatementInvalid, message
end
def log_info(sql, name, runtime)
return unless @logger
@logger.debug(
format_log_entry(
"#{name.nil? ? "SQL" : name} (#{sprintf("%f", runtime)})",
sql.gsub(/ +/, " ")
)
)
end
def format_log_entry(message, dump = nil)
if ActiveRecord::Base.colorize_logging
if @@row_even
@@row_even = false
message_color, dump_color = "4;36;1", "0;1"
else
@@row_even = true
message_color, dump_color = "4;35;1", "0"
end
log_entry = " \e[#{message_color}m#{message}\e[0m "
log_entry << "\e[#{dump_color}m%#{String === dump ? 's' : 'p'}\e[0m" % dump if dump
log_entry
else
"%s %s" % [message, dump]
end
end
end
end
end
View file
@ -0,0 +1,238 @@
# Author/Maintainer: Maik Schmidt <contact@maik-schmidt.de>
require 'active_record/connection_adapters/abstract_adapter'
begin
require 'db2/db2cli' unless self.class.const_defined?(:DB2CLI)
require 'active_record/vendor/db2'
module ActiveRecord
class Base
# Establishes a connection to the database that's used by
# all Active Record objects
def self.db2_connection(config) # :nodoc:
config = config.symbolize_keys
usr = config[:username]
pwd = config[:password]
schema = config[:schema]
if config.has_key?(:database)
database = config[:database]
else
raise ArgumentError, 'No database specified. Missing argument: database.'
end
connection = DB2::Connection.new(DB2::Environment.new)
connection.connect(database, usr, pwd)
ConnectionAdapters::DB2Adapter.new(connection, logger, :schema => schema)
end
end
module ConnectionAdapters
# The DB2 adapter works with the C-based CLI driver (http://rubyforge.org/projects/ruby-dbi/)
#
# Options:
#
# * <tt>:username</tt> -- Defaults to nothing
# * <tt>:password</tt> -- Defaults to nothing
# * <tt>:database</tt> -- The name of the database. No default, must be provided.
# * <tt>:schema</tt> -- Database schema to be set initially.
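#
# ===== Example (hypothetical connection settings)
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "db2",
#     :database => "ARUNIT",
#     :username => "arunit",
#     :password => "arunit",
#     :schema   => "rails"
#   )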
class DB2Adapter < AbstractAdapter
def initialize(connection, logger, connection_options)
super(connection, logger)
@connection_options = connection_options
if schema = @connection_options[:schema]
with_statement do |stmt|
stmt.exec_direct("SET SCHEMA=#{schema}")
end
end
end
def select_all(sql, name = nil)
select(sql, name)
end
def select_one(sql, name = nil)
select(sql, name).first
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil)
execute(sql, name = nil)
id_value || last_insert_id
end
def execute(sql, name = nil)
rows_affected = 0
with_statement do |stmt|
log(sql, name) do
stmt.exec_direct(sql)
rows_affected = stmt.row_count
end
end
rows_affected
end
alias_method :update, :execute
alias_method :delete, :execute
def begin_db_transaction
@connection.set_auto_commit_off
end
def commit_db_transaction
@connection.commit
@connection.set_auto_commit_on
end
def rollback_db_transaction
@connection.rollback
@connection.set_auto_commit_on
end
def quote_column_name(column_name)
column_name
end
def adapter_name()
'DB2'
end
def quote_string(string)
string.gsub(/'/, "''") # ' (for ruby-mode)
end
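# Emulates LIMIT/OFFSET by nesting the query inside selects keyed on
# row_number(). Illustrative transformation (hypothetical query; line breaks
# added for readability):
#   add_limit_offset!("SELECT * FROM people", :limit => 10, :offset => 20)
# rewrites the SQL to:
#   SELECT B.* FROM (SELECT A.*, row_number() over () AS internal$rownum
#     FROM (SELECT * FROM people) A ) B
#   WHERE B.internal$rownum > 20 AND B.internal$rownum <= 30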
def add_limit_offset!(sql, options)
if limit = options[:limit]
offset = options[:offset] || 0
# The following trick was added by andrea+rails@webcom.it.
sql.gsub!(/SELECT/i, 'SELECT B.* FROM (SELECT A.*, row_number() over () AS internal$rownum FROM (SELECT')
sql << ") A ) B WHERE B.internal$rownum > #{offset} AND B.internal$rownum <= #{limit + offset}"
end
end
def tables(name = nil)
result = []
schema = @connection_options[:schema] || '%'
with_statement do |stmt|
stmt.tables(schema).each { |t| result << t[2].downcase }
end
result
end
def indexes(table_name, name = nil)
tmp = {}
schema = @connection_options[:schema] || ''
with_statement do |stmt|
stmt.indexes(table_name, schema).each do |t|
next unless t[5]
next if t[4] == 'SYSIBM' # Skip system indexes.
idx_name = t[5].downcase
col_name = t[8].downcase
if tmp.has_key?(idx_name)
tmp[idx_name].columns << col_name
else
is_unique = t[3] == 0
tmp[idx_name] = IndexDefinition.new(table_name, idx_name, is_unique, [col_name])
end
end
end
tmp.values
end
def columns(table_name, name = nil)
result = []
schema = @connection_options[:schema] || '%'
with_statement do |stmt|
stmt.columns(table_name, schema).each do |c|
c_name = c[3].downcase
c_default = c[12] == 'NULL' ? nil : c[12]
c_default.gsub!(/^'(.*)'$/, '\1') if !c_default.nil?
c_type = c[5].downcase
c_type += "(#{c[6]})" if !c[6].nil? && c[6] != ''
result << Column.new(c_name, c_default, c_type, c[17] == 'YES')
end
end
result
end
def native_database_types
{
:primary_key => 'int generated by default as identity (start with 42) primary key',
:string => { :name => 'varchar', :limit => 255 },
:text => { :name => 'clob', :limit => 32768 },
:integer => { :name => 'int' },
:float => { :name => 'float' },
:datetime => { :name => 'timestamp' },
:timestamp => { :name => 'timestamp' },
:time => { :name => 'time' },
:date => { :name => 'date' },
:binary => { :name => 'blob', :limit => 32768 },
:boolean => { :name => 'decimal', :limit => 1 }
}
end
def quoted_true
'1'
end
def quoted_false
'0'
end
def active?
@connection.select_one 'select 1 from ibm.sysdummy1'
true
rescue Exception
false
end
def reconnect!
end
def table_alias_length
128
end
private
def with_statement
stmt = DB2::Statement.new(@connection)
yield stmt
stmt.free
end
def last_insert_id
row = select_one(<<-GETID.strip)
with temp(id) as (values (identity_val_local())) select * from temp
GETID
row['id'].to_i
end
def select(sql, name = nil)
rows = []
with_statement do |stmt|
log(sql, name) do
stmt.exec_direct("#{sql.gsub(/=\s*null/i, 'IS NULL')} with ur")
end
while row = stmt.fetch_as_hash
row.delete('internal$rownum')
rows << row
end
end
rows
end
end
end
end
rescue LoadError
# DB2 driver is unavailable.
module ActiveRecord # :nodoc:
class Base
def self.db2_connection(config) # :nodoc:
# Set up a reasonable error message
raise LoadError, "DB2 Libraries could not be loaded."
end
end
end
end
View file
@ -0,0 +1,414 @@
# Author: Ken Kunz <kennethkunz@gmail.com>
require 'active_record/connection_adapters/abstract_adapter'
module FireRuby # :nodoc: all
class Database
def self.new_from_params(database, host, port, service)
db_string = ""
if host
db_string << host
db_string << "/#{service || port}" if service || port
db_string << ":"
end
db_string << database
new(db_string)
end
end
end
module ActiveRecord
class << Base
def firebird_connection(config) # :nodoc:
require_library_or_gem 'fireruby'
unless defined? FireRuby::SQLType
raise AdapterNotFound,
'The Firebird adapter requires FireRuby version 0.4.0 or greater; you appear ' <<
'to be running an older version -- please update FireRuby (gem install fireruby).'
end
config = config.symbolize_keys
unless config.has_key?(:database)
raise ArgumentError, "No database specified. Missing argument: database."
end
options = config[:charset] ? { CHARACTER_SET => config[:charset] } : {}
connection_params = [config[:username], config[:password], options]
db = FireRuby::Database.new_from_params(*config.values_at(:database, :host, :port, :service))
connection = db.connect(*connection_params)
ConnectionAdapters::FirebirdAdapter.new(connection, logger, connection_params)
end
end
module ConnectionAdapters
class FirebirdColumn < Column # :nodoc:
VARCHAR_MAX_LENGTH = 32_765
BLOB_MAX_LENGTH = 32_767
def initialize(name, domain, type, sub_type, length, precision, scale, default_source, null_flag)
@firebird_type = FireRuby::SQLType.to_base_type(type, sub_type).to_s
super(name.downcase, nil, @firebird_type, !null_flag)
@default = parse_default(default_source) if default_source
@limit = type == 'BLOB' ? BLOB_MAX_LENGTH : length
@domain, @sub_type, @precision, @scale = domain, sub_type, precision, scale
end
def type
if @domain =~ /BOOLEAN/
:boolean
elsif @type == :binary and @sub_type == 1
:text
else
@type
end
end
# Submits a _CAST_ query to the database, casting the default value to the specified SQL type.
# This enables Firebird to provide an actual value when context variables are used as column
# defaults (such as CURRENT_TIMESTAMP).
def default
if @default
sql = "SELECT CAST(#{@default} AS #{column_def}) FROM RDB$DATABASE"
connection = ActiveRecord::Base.active_connections.values.detect { |conn| conn && conn.adapter_name == 'Firebird' }
if connection
type_cast connection.execute(sql).to_a.first['CAST']
else
raise ConnectionNotEstablished, "No Firebird connections established."
end
end
end
def type_cast(value)
if type == :boolean
value == true or value == ActiveRecord::ConnectionAdapters::FirebirdAdapter.boolean_domain[:true]
else
super
end
end
private
def parse_default(default_source)
default_source =~ /^\s*DEFAULT\s+(.*)\s*$/i
return $1 unless $1.upcase == "NULL"
end
def column_def
case @firebird_type
when 'BLOB' then "VARCHAR(#{VARCHAR_MAX_LENGTH})"
when 'CHAR', 'VARCHAR' then "#{@firebird_type}(#{@limit})"
when 'NUMERIC', 'DECIMAL' then "#{@firebird_type}(#{@precision},#{@scale.abs})"
when 'DOUBLE' then "DOUBLE PRECISION"
else @firebird_type
end
end
def simplified_type(field_type)
if field_type == 'TIMESTAMP'
:datetime
else
super
end
end
end
# The Firebird adapter relies on the FireRuby[http://rubyforge.org/projects/fireruby/]
# extension, version 0.4.0 or later (available as a gem or from
# RubyForge[http://rubyforge.org/projects/fireruby/]). FireRuby works with
# Firebird 1.5.x on Linux, OS X and Win32 platforms.
#
# == Usage Notes
#
# === Sequence (Generator) Names
# The Firebird adapter supports the same approach adopted for the Oracle
# adapter. See ActiveRecord::Base#set_sequence_name for more details.
#
# Note that in general there is no need to create a <tt>BEFORE INSERT</tt>
# trigger corresponding to a Firebird sequence generator when using
# ActiveRecord. In other words, you don't have to try to make Firebird
# simulate an <tt>AUTO_INCREMENT</tt> or +IDENTITY+ column. When saving a
# new record, ActiveRecord pre-fetches the next sequence value for the table
# and explicitly includes it in the +INSERT+ statement. (Pre-fetching the
# next primary key value is the only reliable method for the Firebird
# adapter to report back the +id+ after a successful insert.)
#
# === BOOLEAN Domain
# Firebird 1.5 does not provide a native +BOOLEAN+ type. But you can easily
# define a +BOOLEAN+ _domain_ for this purpose, e.g.:
#
# CREATE DOMAIN D_BOOLEAN AS SMALLINT CHECK (VALUE IN (0, 1));
#
# When the Firebird adapter encounters a column that is based on a domain
# that includes "BOOLEAN" in the domain name, it will attempt to treat
# the column as a +BOOLEAN+.
#
# By default, the Firebird adapter will assume that the BOOLEAN domain is
# defined as above. This can be modified if needed. For example, if you
# have a legacy schema with the following +BOOLEAN+ domain defined:
#
# CREATE DOMAIN BOOLEAN AS CHAR(1) CHECK (VALUE IN ('T', 'F'));
#
# ...you can add the following line to your <tt>environment.rb</tt> file:
#
# ActiveRecord::ConnectionAdapters::FirebirdAdapter.boolean_domain = { :true => 'T', :false => 'F' }
#
# === BLOB Elements
# The Firebird adapter currently provides only limited support for +BLOB+
# columns. You cannot currently retrieve or insert a +BLOB+ as an IO stream.
# When selecting a +BLOB+, the entire element is converted into a String.
# When inserting or updating a +BLOB+, the entire value is included in-line
# in the SQL statement, limiting you to values <= 32KB in size.
#
# === Column Name Case Semantics
# Firebird and ActiveRecord have somewhat conflicting case semantics for
# column names.
#
# [*Firebird*]
# The standard practice is to use unquoted column names, which can be
# thought of as case-insensitive. (In fact, Firebird converts them to
# uppercase.) Quoted column names (not typically used) are case-sensitive.
# [*ActiveRecord*]
# Attribute accessors corresponding to column names are case-sensitive.
# The defaults for primary key and inheritance columns are lowercase, and
# in general, people use lowercase attribute names.
#
# In order to map between the differing semantics in a way that conforms
# to common usage for both Firebird and ActiveRecord, uppercase column names
# in Firebird are converted to lowercase attribute names in ActiveRecord,
# and vice-versa. Mixed-case column names retain their case in both
# directions. Lowercase (quoted) Firebird column names are not supported.
# This is similar to the solutions adopted by other adapters.
#
# In general, the best approach is to use unquoted (case-insensitive) column
# names in your Firebird DDL (or if you must quote, use uppercase column
# names). These will correspond to lowercase attributes in ActiveRecord.
#
# For example, a Firebird table based on the following DDL:
#
# CREATE TABLE products (
# id BIGINT NOT NULL PRIMARY KEY,
# "TYPE" VARCHAR(50),
# name VARCHAR(255) );
#
# ...will correspond to an ActiveRecord model class called +Product+ with
# the following attributes: +id+, +type+, +name+.
#
# ==== Quoting <tt>"TYPE"</tt> and other Firebird reserved words:
# In ActiveRecord, the default inheritance column name is +type+. The word
# _type_ is a Firebird reserved word, so it must be quoted in any Firebird
# SQL statements. Because of the case mapping described above, you should
# always reference this column using quoted-uppercase syntax
# (<tt>"TYPE"</tt>) within Firebird DDL or other SQL statements (as in the
# example above). This holds true for any other Firebird reserved words used
# as column names as well.
#
# === Migrations
# The Firebird adapter does not currently support Migrations. I hope to
# add this feature in the near future.
#
# == Connection Options
# The following options are supported by the Firebird adapter. None of the
# options have default values.
#
# <tt>:database</tt>::
# <i>Required option.</i> Specifies one of: (i) a Firebird database alias;
# (ii) the full path of a database file; _or_ (iii) a full Firebird
# connection string. <i>Do not specify <tt>:host</tt>, <tt>:service</tt>
# or <tt>:port</tt> as separate options when using a full connection
# string.</i>
# <tt>:host</tt>::
# Set to <tt>"remote.host.name"</tt> for remote database connections.
# May be omitted for local connections if a full database path is
# specified for <tt>:database</tt>. Some platforms require a value of
# <tt>"localhost"</tt> for local connections when using a Firebird
# database _alias_.
# <tt>:service</tt>::
# Specifies a service name for the connection. Only used if <tt>:host</tt>
# is provided. Required when connecting to a non-standard service.
# <tt>:port</tt>::
# Specifies the connection port. Only used if <tt>:host</tt> is provided
# and <tt>:service</tt> is not. Required when connecting to a non-standard
# port and <tt>:service</tt> is not defined.
# <tt>:username</tt>::
# Specifies the database user. May be omitted or set to +nil+ (together
# with <tt>:password</tt>) to use the underlying operating system user
# credentials on supported platforms.
# <tt>:password</tt>::
# Specifies the database password. Must be provided if <tt>:username</tt>
# is explicitly specified; should be omitted if OS user credentials
# are being used.
# <tt>:charset</tt>::
# Specifies the character set to be used by the connection. Refer to
# Firebird documentation for valid options.
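#
# == Example Configuration (hypothetical values)
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "firebird",
#     :host     => "localhost",
#     :database => "/var/firebird/myapp.fdb",
#     :username => "sysdba",
#     :password => "masterkey",
#     :charset  => "NONE"
#   )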
class FirebirdAdapter < AbstractAdapter
@@boolean_domain = { :true => 1, :false => 0 }
cattr_accessor :boolean_domain
def initialize(connection, logger, connection_params=nil)
super(connection, logger)
@connection_params = connection_params
end
def adapter_name # :nodoc:
'Firebird'
end
# Returns true for Firebird adapter (since Firebird requires primary key
# values to be pre-fetched before insert). See also #next_sequence_value.
def prefetch_primary_key?(table_name = nil)
true
end
def default_sequence_name(table_name, primary_key) # :nodoc:
"#{table_name}_seq"
end
# QUOTING ==================================================
def quote(value, column = nil) # :nodoc:
if [Time, DateTime].include?(value.class)
"CAST('#{value.strftime("%Y-%m-%d %H:%M:%S")}' AS TIMESTAMP)"
else
super
end
end
def quote_string(string) # :nodoc:
string.gsub(/'/, "''")
end
def quote_column_name(column_name) # :nodoc:
%Q("#{ar_to_fb_case(column_name)}")
end
def quoted_true # :nodoc:
quote(boolean_domain[:true])
end
def quoted_false # :nodoc:
quote(boolean_domain[:false])
end
# CONNECTION MANAGEMENT ====================================
def active?
not @connection.closed?
end
def reconnect!
@connection.close
@connection = @connection.database.connect(*@connection_params)
end
# DATABASE STATEMENTS ======================================
def select_all(sql, name = nil) # :nodoc:
select(sql, name)
end
def select_one(sql, name = nil) # :nodoc:
result = select(sql, name)
result.nil? ? nil : result.first
end
def execute(sql, name = nil, &block) # :nodoc:
log(sql, name) do
if @transaction
@connection.execute(sql, @transaction, &block)
else
@connection.execute_immediate(sql, &block)
end
end
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil) # :nodoc:
execute(sql, name)
id_value
end
alias_method :update, :execute
alias_method :delete, :execute
def begin_db_transaction() # :nodoc:
@transaction = @connection.start_transaction
end
def commit_db_transaction() # :nodoc:
@transaction.commit
ensure
@transaction = nil
end
def rollback_db_transaction() # :nodoc:
@transaction.rollback
ensure
@transaction = nil
end
def add_limit_offset!(sql, options) # :nodoc:
if options[:limit]
limit_string = "FIRST #{options[:limit]}"
limit_string << " SKIP #{options[:offset]}" if options[:offset]
sql.sub!(/\A(\s*SELECT\s)/i, '\&' + limit_string + ' ')
end
end
# Returns the next sequence value from a sequence generator. Not generally
# called directly; used by ActiveRecord to get the next primary key value
# when inserting a new database record (see #prefetch_primary_key?).
def next_sequence_value(sequence_name)
FireRuby::Generator.new(sequence_name, @connection).next(1)
end
# SCHEMA STATEMENTS ========================================
def columns(table_name, name = nil) # :nodoc:
sql = <<-END_SQL
SELECT r.rdb$field_name, r.rdb$field_source, f.rdb$field_type, f.rdb$field_sub_type,
f.rdb$field_length, f.rdb$field_precision, f.rdb$field_scale,
COALESCE(r.rdb$default_source, f.rdb$default_source) rdb$default_source,
COALESCE(r.rdb$null_flag, f.rdb$null_flag) rdb$null_flag
FROM rdb$relation_fields r
JOIN rdb$fields f ON r.rdb$field_source = f.rdb$field_name
WHERE r.rdb$relation_name = '#{table_name.to_s.upcase}'
ORDER BY r.rdb$field_position
END_SQL
execute(sql, name).collect do |field|
field_values = field.values.collect do |value|
case value
when String then value.rstrip
when FireRuby::Blob then value.to_s
else value
end
end
FirebirdColumn.new(*field_values)
end
end
private
def select(sql, name = nil)
execute(sql, name).collect do |row|
hashed_row = {}
row.each do |column, value|
value = value.to_s if FireRuby::Blob === value
hashed_row[fb_to_ar_case(column)] = value
end
hashed_row
end
end
# Maps uppercase Firebird column names to lowercase for ActiveRecord;
# mixed-case columns retain their original case.
def fb_to_ar_case(column_name)
column_name =~ /[[:lower:]]/ ? column_name : column_name.downcase
end
# Maps lowercase ActiveRecord column names to uppercase for Firebird;
# mixed-case columns retain their original case.
def ar_to_fb_case(column_name)
column_name =~ /[[:upper:]]/ ? column_name : column_name.upcase
end
end
end
end
View file
@ -0,0 +1,357 @@
require 'active_record/connection_adapters/abstract_adapter'
module ActiveRecord
class Base
# Establishes a connection to the database that's used by all Active Record objects.
def self.mysql_connection(config) # :nodoc:
# Only include the MySQL driver if one hasn't already been loaded
unless defined? Mysql
begin
require_library_or_gem 'mysql'
rescue LoadError => cannot_require_mysql
# Only use the supplied backup Ruby/MySQL driver if no driver is already in place
begin
require 'active_record/vendor/mysql'
rescue LoadError
raise cannot_require_mysql
end
end
end
config = config.symbolize_keys
host = config[:host]
port = config[:port]
socket = config[:socket]
username = config[:username] ? config[:username].to_s : 'root'
password = config[:password].to_s
if config.has_key?(:database)
database = config[:database]
else
raise ArgumentError, "No database specified. Missing argument: database."
end
mysql = Mysql.init
mysql.ssl_set(config[:sslkey], config[:sslcert], config[:sslca], config[:sslcapath], config[:sslcipher]) if config[:sslkey]
ConnectionAdapters::MysqlAdapter.new(mysql, logger, [host, username, password, database, port, socket], config)
end
end
module ConnectionAdapters
class MysqlColumn < Column #:nodoc:
private
def simplified_type(field_type)
return :boolean if MysqlAdapter.emulate_booleans && field_type.downcase.index("tinyint(1)")
return :string if field_type =~ /enum/i
super
end
end
# The MySQL adapter will work with both Ruby/MySQL, which is a Ruby-based MySQL adapter that comes bundled with Active Record, and with
# the faster C-based MySQL/Ruby adapter (available both as a gem and from http://www.tmtm.org/en/mysql/ruby/).
#
# Options:
#
# * <tt>:host</tt> -- Defaults to localhost
# * <tt>:port</tt> -- Defaults to 3306
# * <tt>:socket</tt> -- Defaults to /tmp/mysql.sock
# * <tt>:username</tt> -- Defaults to root
# * <tt>:password</tt> -- Defaults to nothing
# * <tt>:database</tt> -- The name of the database. No default, must be provided.
# * <tt>:sslkey</tt> -- Necessary to use MySQL with an SSL connection
# * <tt>:sslcert</tt> -- Necessary to use MySQL with an SSL connection
# * <tt>:sslcapath</tt> -- Necessary to use MySQL with an SSL connection
# * <tt>:sslcipher</tt> -- Necessary to use MySQL with an SSL connection
#
# By default, the MysqlAdapter will consider all columns of type tinyint(1)
# as boolean. If you wish to disable this emulation (which was the default
# behavior in versions 0.13.1 and earlier) you can add the following line
# to your environment.rb file:
#
# ActiveRecord::ConnectionAdapters::MysqlAdapter.emulate_booleans = false
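#
# A typical configuration (hypothetical credentials and database name):
#
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "mysql",
#     :host     => "localhost",
#     :username => "root",
#     :password => "",
#     :database => "myapp_development",
#     :socket   => "/tmp/mysql.sock"
#   )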
class MysqlAdapter < AbstractAdapter
@@emulate_booleans = true
cattr_accessor :emulate_booleans
LOST_CONNECTION_ERROR_MESSAGES = [
"Server shutdown in progress",
"Broken pipe",
"Lost connection to MySQL server during query",
"MySQL server has gone away"
]
def initialize(connection, logger, connection_options, config)
super(connection, logger)
@connection_options, @config = connection_options, config
@null_values_in_each_hash = Mysql.const_defined?(:VERSION)
connect
end
def adapter_name #:nodoc:
'MySQL'
end
def supports_migrations? #:nodoc:
true
end
def native_database_types #:nodoc:
{
:primary_key => "int(11) DEFAULT NULL auto_increment PRIMARY KEY",
:string => { :name => "varchar", :limit => 255 },
:text => { :name => "text" },
:integer => { :name => "int", :limit => 11 },
:float => { :name => "float" },
:datetime => { :name => "datetime" },
:timestamp => { :name => "datetime" },
:time => { :name => "time" },
:date => { :name => "date" },
:binary => { :name => "blob" },
:boolean => { :name => "tinyint", :limit => 1 }
}
end
# QUOTING ==================================================
def quote(value, column = nil)
if value.kind_of?(String) && column && column.type == :binary && column.class.respond_to?(:string_to_binary)
s = column.class.string_to_binary(value).unpack("H*")[0]
"x'#{s}'"
else
super
end
end
def quote_column_name(name) #:nodoc:
"`#{name}`"
end
def quote_string(string) #:nodoc:
@connection.quote(string)
end
def quoted_true
"1"
end
def quoted_false
"0"
end
# CONNECTION MANAGEMENT ====================================
def active?
if @connection.respond_to?(:stat)
@connection.stat
else
@connection.query 'select 1'
end
# mysql-ruby doesn't raise an exception when stat fails.
if @connection.respond_to?(:errno)
@connection.errno.zero?
else
true
end
rescue Mysql::Error
false
end
def reconnect!
disconnect!
connect
end
def disconnect!
@connection.close rescue nil
end
# DATABASE STATEMENTS ======================================
def select_all(sql, name = nil) #:nodoc:
select(sql, name)
end
def select_one(sql, name = nil) #:nodoc:
result = select(sql, name)
result.nil? ? nil : result.first
end
def execute(sql, name = nil, retries = 2) #:nodoc:
log(sql, name) { @connection.query(sql) }
rescue ActiveRecord::StatementInvalid => exception
if exception.message.split(":").first =~ /Packets out of order/
raise ActiveRecord::StatementInvalid, "'Packets out of order' error was received from the database. Please update your mysql bindings (gem install mysql) and read http://dev.mysql.com/doc/mysql/en/password-hashing.html for more information. If you're on Windows, use the Instant Rails installer to get the updated mysql bindings."
else
raise
end
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil) #:nodoc:
execute(sql, name = nil)
id_value || @connection.insert_id
end
def update(sql, name = nil) #:nodoc:
execute(sql, name)
@connection.affected_rows
end
alias_method :delete, :update #:nodoc:
def begin_db_transaction #:nodoc:
execute "BEGIN"
rescue Exception
# Transactions aren't supported
end
def commit_db_transaction #:nodoc:
execute "COMMIT"
rescue Exception
# Transactions aren't supported
end
def rollback_db_transaction #:nodoc:
execute "ROLLBACK"
rescue Exception
# Transactions aren't supported
end
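# Appends MySQL's "LIMIT limit" or "LIMIT offset, limit" syntax. Illustrative:
#   add_limit_offset!(sql, :limit => 10)                # appends " LIMIT 10"
#   add_limit_offset!(sql, :limit => 10, :offset => 20) # appends " LIMIT 20, 10"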
def add_limit_offset!(sql, options) #:nodoc:
if limit = options[:limit]
unless offset = options[:offset]
sql << " LIMIT #{limit}"
else
sql << " LIMIT #{offset}, #{limit}"
end
end
end
# SCHEMA STATEMENTS ========================================
def structure_dump #:nodoc:
if supports_views?
sql = "SHOW FULL TABLES WHERE Table_type = 'BASE TABLE'"
else
sql = "SHOW TABLES"
end
select_all(sql).inject("") do |structure, table|
table.delete('Table_type')
structure += select_one("SHOW CREATE TABLE #{table.to_a.first.last}")["Create Table"] + ";\n\n"
end
end
def recreate_database(name) #:nodoc:
drop_database(name)
create_database(name)
end
def create_database(name) #:nodoc:
execute "CREATE DATABASE `#{name}`"
end
def drop_database(name) #:nodoc:
execute "DROP DATABASE IF EXISTS `#{name}`"
end
def current_database
select_one("SELECT DATABASE() as db")["db"]
end
def tables(name = nil) #:nodoc:
tables = []
execute("SHOW TABLES", name).each { |field| tables << field[0] }
tables
end
def indexes(table_name, name = nil)#:nodoc:
indexes = []
current_index = nil
execute("SHOW KEYS FROM #{table_name}", name).each do |row|
if current_index != row[2]
next if row[2] == "PRIMARY" # skip the primary key
current_index = row[2]
indexes << IndexDefinition.new(row[0], row[2], row[1] == "0", [])
end
indexes.last.columns << row[4]
end
indexes
end
def columns(table_name, name = nil)#:nodoc:
sql = "SHOW FIELDS FROM #{table_name}"
columns = []
execute(sql, name).each { |field| columns << MysqlColumn.new(field[0], field[4], field[1], field[2] == "YES") }
columns
end
def create_table(name, options = {}) #:nodoc:
super(name, {:options => "ENGINE=InnoDB"}.merge(options))
end
def rename_table(name, new_name)
execute "RENAME TABLE #{name} TO #{new_name}"
end
def change_column_default(table_name, column_name, default) #:nodoc:
current_type = select_one("SHOW COLUMNS FROM #{table_name} LIKE '#{column_name}'")["Type"]
change_column(table_name, column_name, current_type, { :default => default })
end
def change_column(table_name, column_name, type, options = {}) #:nodoc:
options[:default] ||= select_one("SHOW COLUMNS FROM #{table_name} LIKE '#{column_name}'")["Default"]
change_column_sql = "ALTER TABLE #{table_name} CHANGE #{column_name} #{column_name} #{type_to_sql(type, options[:limit])}"
add_column_options!(change_column_sql, options)
execute(change_column_sql)
end
def rename_column(table_name, column_name, new_column_name) #:nodoc:
current_type = select_one("SHOW COLUMNS FROM #{table_name} LIKE '#{column_name}'")["Type"]
execute "ALTER TABLE #{table_name} CHANGE #{column_name} #{new_column_name} #{current_type}"
end
private
def connect
encoding = @config[:encoding]
if encoding
@connection.options(Mysql::SET_CHARSET_NAME, encoding) rescue nil
end
@connection.real_connect(*@connection_options)
execute("SET NAMES '#{encoding}'") if encoding
end
def select(sql, name = nil)
@connection.query_with_result = true
result = execute(sql, name)
rows = []
if @null_values_in_each_hash
result.each_hash { |row| rows << row }
else
all_fields = result.fetch_fields.inject({}) { |fields, f| fields[f.name] = nil; fields }
result.each_hash { |row| rows << all_fields.dup.update(row) }
end
result.free
rows
end
def supports_views?
version[0] >= 5
end
def version
@version ||= @connection.server_info.scan(/^(\d+)\.(\d+)\.(\d+)/).flatten.map { |v| v.to_i }
end
end
end
end
View file
@ -0,0 +1,349 @@
require 'active_record/connection_adapters/abstract_adapter'
module ActiveRecord
class Base
# Establishes a connection to the database that's used by all Active Record objects
def self.openbase_connection(config) # :nodoc:
require_library_or_gem 'openbase' unless self.class.const_defined?(:OpenBase)
config = config.symbolize_keys
host = config[:host]
username = config[:username].to_s
password = config[:password].to_s
if config.has_key?(:database)
database = config[:database]
else
raise ArgumentError, "No database specified. Missing argument: database."
end
oba = ConnectionAdapters::OpenBaseAdapter.new(
OpenBase.new(database, host, username, password), logger
)
oba
end
end
module ConnectionAdapters
class OpenBaseColumn < Column #:nodoc:
private
def simplified_type(field_type)
return :integer if field_type.downcase =~ /long/
return :float if field_type.downcase == "money"
return :binary if field_type.downcase == "object"
super
end
end
# The OpenBase adapter works with the Ruby/Openbase driver by Tetsuya Suzuki.
# http://www.spice-of-life.net/ruby-openbase/ (needs version 0.7.3+)
#
# Options:
#
# * <tt>:host</tt> -- Defaults to localhost
# * <tt>:username</tt> -- Defaults to nothing
# * <tt>:password</tt> -- Defaults to nothing
# * <tt>:database</tt> -- The name of the database. No default, must be provided.
#
# The OpenBase adapter will make use of OpenBase's ability to generate unique ids
# for any column with a unique index applied. Thus, if the value of a primary
# key is not specified at the time an INSERT is performed, the adapter will prefetch
# a unique id for the primary key. This prefetching is also necessary in order
# to return the id after an insert.
#
# Caveat: Operations involving LIMIT and OFFSET do not yet work!
#
# Maintainer: derrickspell@cdmplus.com
class OpenBaseAdapter < AbstractAdapter
def adapter_name
'OpenBase'
end
def native_database_types
{
:primary_key => "integer UNIQUE INDEX DEFAULT _rowid",
:string => { :name => "char", :limit => 4096 },
:text => { :name => "text" },
:integer => { :name => "integer" },
:float => { :name => "float" },
:datetime => { :name => "datetime" },
:timestamp => { :name => "timestamp" },
:time => { :name => "time" },
:date => { :name => "date" },
:binary => { :name => "object" },
:boolean => { :name => "boolean" }
}
end
def supports_migrations?
false
end
def prefetch_primary_key?(table_name = nil)
true
end
def default_sequence_name(table_name, primary_key) # :nodoc:
"#{table_name} #{primary_key}"
end
def next_sequence_value(sequence_name)
ary = sequence_name.split(' ')
if (!ary[1]) then
ary[0] =~ /(\w+)_nonstd_seq/
ary[0] = $1
end
@connection.unique_row_id(ary[0], ary[1])
end
# QUOTING ==================================================
def quote(value, column = nil)
if value.kind_of?(String) && column && column.type == :binary
"'#{@connection.insert_binary(value)}'"
else
super
end
end
def quoted_true
"1"
end
def quoted_false
"0"
end
# DATABASE STATEMENTS ======================================
def add_limit_offset!(sql, options) #:nodoc:
if limit = options[:limit]
unless offset = options[:offset]
sql << " RETURN RESULTS #{limit}"
else
limit = limit + offset
sql << " RETURN RESULTS #{offset} TO #{limit}"
end
end
end
def select_all(sql, name = nil) #:nodoc:
select(sql, name)
end
def select_one(sql, name = nil) #:nodoc:
add_limit_offset!(sql,{:limit => 1})
results = select(sql, name)
results.first if results
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil) #:nodoc:
execute(sql, name)
update_nulls_after_insert(sql, name, pk, id_value, sequence_name)
id_value
end
def execute(sql, name = nil) #:nodoc:
log(sql, name) { @connection.execute(sql) }
end
def update(sql, name = nil) #:nodoc:
execute(sql, name).rows_affected
end
alias_method :delete, :update #:nodoc:
#=begin
def begin_db_transaction #:nodoc:
execute "START TRANSACTION"
rescue Exception
# Transactions aren't supported
end
def commit_db_transaction #:nodoc:
execute "COMMIT"
rescue Exception
# Transactions aren't supported
end
def rollback_db_transaction #:nodoc:
execute "ROLLBACK"
rescue Exception
# Transactions aren't supported
end
#=end
# SCHEMA STATEMENTS ========================================
# Return the list of all tables in the schema search path.
def tables(name = nil) #:nodoc:
tables = @connection.tables
tables.reject! { |t| /\A_SYS_/ === t }
tables
end
def columns(table_name, name = nil) #:nodoc:
sql = "SELECT * FROM _sys_tables "
sql << "WHERE tablename='#{table_name}' AND INDEXOF(fieldname,'_')<>0 "
sql << "ORDER BY columnNumber"
columns = []
select_all(sql, name).each do |row|
columns << OpenBaseColumn.new(row["fieldname"],
default_value(row["defaultvalue"]),
sql_type_name(row["typename"],row["length"]),
row["notnull"]
)
# breakpoint() if row["fieldname"] == "content"
end
columns
end
def indexes(table_name, name = nil)#:nodoc:
sql = "SELECT fieldname, notnull, searchindex, uniqueindex, clusteredindex FROM _sys_tables "
sql << "WHERE tablename='#{table_name}' AND INDEXOF(fieldname,'_')<>0 "
sql << "AND primarykey=0 "
sql << "AND (searchindex=1 OR uniqueindex=1 OR clusteredindex=1) "
sql << "ORDER BY columnNumber"
indexes = []
execute(sql, name).each do |row|
indexes << IndexDefinition.new(table_name,index_name(row),row[3]==1,[row[0]])
end
indexes
end
private
def select(sql, name = nil)
sql = translate_sql(sql)
results = execute(sql, name)
date_cols = []
col_names = []
results.column_infos.each do |info|
col_names << info.name
date_cols << info.name if info.type == "date"
end
rows = []
if ( results.rows_affected )
results.each do |row| # loop through result rows
hashed_row = {}
row.each_index do |index|
hashed_row["#{col_names[index]}"] = row[index] unless col_names[index] == "_rowid"
end
date_cols.each do |name|
unless hashed_row["#{name}"].nil? or hashed_row["#{name}"].empty?
hashed_row["#{name}"] = Date.parse(hashed_row["#{name}"],false).to_s
end
end
rows << hashed_row
end
end
rows
end
def default_value(value)
# Boolean type values
return true if value =~ /true/
return false if value =~ /false/
# Date / Time magic values
return Time.now.to_s if value =~ /^now\(\)/i
# Empty strings should be set to null
return nil if value.empty?
# Otherwise return what we got from OpenBase
# and hope for the best...
return value
end
def sql_type_name(type_name, length)
return "#{type_name}(#{length})" if ( type_name =~ /char/ )
type_name
end
def index_name(row = [])
name = ""
name << "UNIQUE " if row[3]
name << "CLUSTERED " if row[4]
name << "INDEX"
name
end
def translate_sql(sql)
# Change table.* to list of columns in table
while (sql =~ /SELECT.*\s(\w+)\.\*/)
table = $1
cols = columns(table)
if ( cols.size == 0 ) then
# Maybe this is a table alias
sql =~ /FROM(.+?)(?:LEFT|OUTER|JOIN|WHERE|GROUP|HAVING|ORDER|RETURN|$)/
$1 =~ /[\s|,](\w+)\s+#{table}[\s|,]/ # get the tablename for this alias
cols = columns($1)
end
select_columns = []
cols.each do |col|
select_columns << table + '.' + col.name
end
sql.gsub!(table + '.*',select_columns.join(", ")) if select_columns
end
# Change JOIN clause to table list and WHERE condition
while (sql =~ /JOIN/)
sql =~ /((LEFT )?(OUTER )?JOIN (\w+) ON )(.+?)(?:LEFT|OUTER|JOIN|WHERE|GROUP|HAVING|ORDER|RETURN|$)/
join_clause = $1 + $5
is_outer_join = $3
join_table = $4
join_condition = $5
join_condition.gsub!(/=/,"*") if is_outer_join
if (sql =~ /WHERE/)
sql.gsub!(/WHERE/,"WHERE (#{join_condition}) AND")
else
sql.gsub!(join_clause,"#{join_clause} WHERE #{join_condition}")
end
sql =~ /(FROM .+?)(?:LEFT|OUTER|JOIN|WHERE|$)/
from_clause = $1
sql.gsub!(from_clause,"#{from_clause}, #{join_table} ")
sql.gsub!(join_clause,"")
end
# ORDER BY _rowid if no explicit ORDER BY
# This will ensure that find(:first) returns the first inserted row
if (sql !~ /(ORDER BY)|(GROUP BY)/)
if (sql =~ /RETURN RESULTS/)
sql.sub!(/RETURN RESULTS/,"ORDER BY _rowid RETURN RESULTS")
else
sql << " ORDER BY _rowid"
end
end
sql
end
def update_nulls_after_insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil)
sql =~ /INSERT INTO (\w+) \((.*)\) VALUES\s*\((.*)\)/m
table = $1
cols = $2
values = $3
cols = cols.split(',')
values.gsub!(/'[^']*'/,"''")
values.gsub!(/"[^"]*"/,"\"\"")
values = values.split(',')
update_cols = []
values.each_index { |index| update_cols << cols[index] if values[index] =~ /\s*NULL\s*/ }
update_sql = "UPDATE #{table} SET"
update_cols.each { |col| update_sql << " #{col}=NULL," unless col.empty? }
update_sql.chop!()
update_sql << " WHERE #{pk}=#{quote(id_value)}"
execute(update_sql, name + " NULL Correction") if update_cols.size > 0
end
end
end
end


@ -0,0 +1,665 @@
# oracle_adapter.rb -- ActiveRecord adapter for Oracle 8i, 9i, 10g
#
# Original author: Graham Jenkins
#
# Current maintainer: Michael Schoen <schoenm@earthlink.net>
#
#########################################################################
#
# Implementation notes:
# 1. Redefines (safely) a method in ActiveRecord to make it possible to
# implement an autonumbering solution for Oracle.
# 2. The OCI8 driver is patched to properly handle values for LONG and
# TIMESTAMP columns. The driver-author has indicated that a future
# release of the driver will obviate this patch.
# 3. LOB support is implemented through an after_save callback.
# 4. Oracle does not offer native LIMIT and OFFSET options; this
# functionality is mimicked through the use of nested selects
# (see the illustrative example at the end of these notes).
# See http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:127412348064
#
# Do what you want with this code, at your own peril, but if any
# significant portion of my code remains then please acknowledge my
# contribution.
# portions Copyright 2005 Graham Jenkins
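#
# For illustration of note 4 (a sketch, not authoritative): with the
# nested-select rewrite performed by add_limit_offset! below, a paginated
# finder such as
#
#   Post.find(:all, :limit => 10, :offset => 20)   # "posts" is a hypothetical table
#
# has its generated SELECT wrapped roughly as:
#
#   select * from (
#     select raw_sql_.*, rownum raw_rnum_
#     from (SELECT * FROM posts) raw_sql_
#     where rownum <= 30
#   ) where raw_rnum_ > 20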
require 'active_record/connection_adapters/abstract_adapter'
require 'delegate'
begin
require_library_or_gem 'oci8' unless self.class.const_defined? :OCI8
module ActiveRecord
class Base
def self.oracle_connection(config) #:nodoc:
# Use OCI8AutoRecover instead of normal OCI8 driver.
ConnectionAdapters::OracleAdapter.new OCI8AutoRecover.new(config), logger
end
# for backwards-compatibility
def self.oci_connection(config) #:nodoc:
config[:database] = config[:host]
self.oracle_connection(config)
end
# Enable the id column to be bound into the sql later, by the adapter's insert method.
# This is preferable to inserting the hard-coded value here, because the insert method
# needs to know the id value explicitly.
alias :attributes_with_quotes_pre_oracle :attributes_with_quotes
def attributes_with_quotes(include_primary_key = true) #:nodoc:
aq = attributes_with_quotes_pre_oracle(include_primary_key)
if connection.class == ConnectionAdapters::OracleAdapter
aq[self.class.primary_key] = ":id" if include_primary_key && aq[self.class.primary_key].nil?
end
aq
end
# After setting large objects to empty, select the OCI8::LOB
# and write back the data.
after_save :write_lobs
def write_lobs() #:nodoc:
if connection.is_a?(ConnectionAdapters::OracleAdapter)
self.class.columns.select { |c| c.type == :binary }.each { |c|
value = self[c.name]
next if value.nil? || (value == '')
lob = connection.select_one(
"SELECT #{ c.name} FROM #{ self.class.table_name } WHERE #{ self.class.primary_key} = #{quote(id)}",
'Writable Large Object')[c.name]
lob.write value
}
end
end
private :write_lobs
end
module ConnectionAdapters #:nodoc:
class OracleColumn < Column #:nodoc:
attr_reader :sql_type
# overridden to add the concept of scale, required to differentiate
# between integer and float fields
def initialize(name, default, sql_type, limit, scale, null)
@name, @limit, @sql_type, @scale, @null = name, limit, sql_type, scale, null
@type = simplified_type(sql_type)
@default = type_cast(default)
@primary = nil
@text = [:string, :text].include? @type
@number = [:float, :integer].include? @type
end
def type_cast(value)
return nil if value.nil? || value =~ /^\s*null\s*$/i
case type
when :string then value
when :integer then defined?(value.to_i) ? value.to_i : (value ? 1 : 0)
when :float then value.to_f
when :datetime then cast_to_date_or_time(value)
when :time then cast_to_time(value)
else value
end
end
private
def simplified_type(field_type)
case field_type
when /char/i : :string
when /num|float|double|dec|real|int/i : @scale == 0 ? :integer : :float
when /date|time/i : @name =~ /_at$/ ? :time : :datetime
when /clob/i : :text
when /blob/i : :binary
end
end
def cast_to_date_or_time(value)
return value if value.is_a? Date
return nil if value.blank?
guess_date_or_time (value.is_a? Time) ? value : cast_to_time(value)
end
def cast_to_time(value)
return value if value.is_a? Time
time_array = ParseDate.parsedate value
time_array[0] ||= 2000; time_array[1] ||= 1; time_array[2] ||= 1;
Time.send(Base.default_timezone, *time_array) rescue nil
end
def guess_date_or_time(value)
(value.hour == 0 and value.min == 0 and value.sec == 0) ?
Date.new(value.year, value.month, value.day) : value
end
end
# This is an Oracle/OCI adapter for the ActiveRecord persistence
# framework. It relies upon the OCI8 driver, which works with Oracle 8i
# and above. Most recent development has been on Debian Linux against
# a 10g database, ActiveRecord 1.12.1 and OCI8 0.1.13.
# See: http://rubyforge.org/projects/ruby-oci8/
#
# Usage notes:
# * Key generation assumes a "${table_name}_seq" sequence is available
# for all tables; the sequence name can be changed using
# ActiveRecord::Base.set_sequence_name. When using Migrations, these
# sequences are created automatically.
# * Oracle uses DATE or TIMESTAMP datatypes for both dates and times.
# Consequently some hacks are employed to map data back to Date or Time
# in Ruby. If the column_name ends in _time it's created as a Ruby Time.
# Else if the hours/minutes/seconds are 0, I make it a Ruby Date. Else
# it's a Ruby Time. This is a bit nasty - but if you use Duck Typing
# you'll probably not care very much. In 9i and up it's tempting to
# map DATE to Date and TIMESTAMP to Time, but too many databases use
# DATE for both. Timezones and sub-second precision on timestamps are
# not supported.
# * Default values that are functions (such as "SYSDATE") are not
# supported. This is a restriction of the way ActiveRecord supports
# default values.
# * Support for Oracle8 is limited by Rails' use of ANSI join syntax, which
# is supported in Oracle9i and later. You will need to use #finder_sql for
# has_and_belongs_to_many associations to run against Oracle8.
#
# Required parameters:
#
# * <tt>:username</tt>
# * <tt>:password</tt>
# * <tt>:database</tt>
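#
# A minimal connection setup assuming the parameters above (the values shown
# are placeholders, not adapter defaults):
#
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "oracle",
#     :username => "scott",
#     :password => "tiger",
#     :database => "orcl"
#   )
#
# Per the key-generation note above, a model backed by an "employees" table
# expects an "employees_seq" sequence unless it overrides it, e.g.:
#
#   class Employee < ActiveRecord::Base
#     set_sequence_name "emp_seq"   # hypothetical sequence name
#   end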
class OracleAdapter < AbstractAdapter
def adapter_name #:nodoc:
'Oracle'
end
def supports_migrations? #:nodoc:
true
end
def native_database_types #:nodoc:
{
:primary_key => "NUMBER(38) NOT NULL PRIMARY KEY",
:string => { :name => "VARCHAR2", :limit => 255 },
:text => { :name => "CLOB" },
:integer => { :name => "NUMBER", :limit => 38 },
:float => { :name => "NUMBER" },
:datetime => { :name => "DATE" },
:timestamp => { :name => "DATE" },
:time => { :name => "DATE" },
:date => { :name => "DATE" },
:binary => { :name => "BLOB" },
:boolean => { :name => "NUMBER", :limit => 1 }
}
end
def table_alias_length
30
end
# QUOTING ==================================================
#
# see: abstract/quoting.rb
# camelCase column names need to be quoted; not that anyone using Oracle
# would really do this, but handling this case means we pass the test...
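# For example (illustrative):
#   quote_column_name("id")         # => "id"
#   quote_column_name("camelCase")  # => "\"camelCase\""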
def quote_column_name(name) #:nodoc:
name =~ /[A-Z]/ ? "\"#{name}\"" : name
end
def quote_string(string) #:nodoc:
string.gsub(/'/, "''")
end
def quote(value, column = nil) #:nodoc:
if column && column.type == :binary
%Q{empty_#{ column.sql_type rescue 'blob' }()}
else
case value
when String : %Q{'#{quote_string(value)}'}
when NilClass : 'null'
when TrueClass : '1'
when FalseClass : '0'
when Numeric : value.to_s
when Date, Time : %Q{'#{value.strftime("%Y-%m-%d %H:%M:%S")}'}
else %Q{'#{quote_string(value.to_yaml)}'}
end
end
end
# CONNECTION MANAGEMENT ====================================
#
# Returns true if the connection is active.
def active?
# Pings the connection to check if it's still good. Note that an
# #active? method is also available, but that simply returns the
# last known state, which isn't good enough if the connection has
# gone stale since the last use.
@connection.ping
rescue OCIException
false
end
# Reconnects to the database.
def reconnect!
@connection.reset!
rescue OCIException => e
@logger.warn "#{adapter_name} automatic reconnection failed: #{e.message}"
end
# Disconnects from the database.
def disconnect!
@connection.logoff rescue nil
@connection.active = false
end
# DATABASE STATEMENTS ======================================
#
# see: abstract/database_statements.rb
def select_all(sql, name = nil) #:nodoc:
select(sql, name)
end
def select_one(sql, name = nil) #:nodoc:
result = select_all(sql, name)
result.size > 0 ? result.first : nil
end
def execute(sql, name = nil) #:nodoc:
log(sql, name) { @connection.exec sql }
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil) #:nodoc:
if pk.nil? # Who called us? What does the sql look like? No idea!
execute sql, name
elsif id_value # Pre-assigned id
log(sql, name) { @connection.exec sql }
else # Assume the sql contains a bind-variable for the id
id_value = select_one("select #{sequence_name}.nextval id from dual")['id']
log(sql, name) { @connection.exec sql, id_value }
end
id_value
end
alias :update :execute #:nodoc:
alias :delete :execute #:nodoc:
def begin_db_transaction #:nodoc:
@connection.autocommit = false
end
def commit_db_transaction #:nodoc:
@connection.commit
ensure
@connection.autocommit = true
end
def rollback_db_transaction #:nodoc:
@connection.rollback
ensure
@connection.autocommit = true
end
def add_limit_offset!(sql, options) #:nodoc:
offset = options[:offset] || 0
if limit = options[:limit]
sql.replace "select * from (select raw_sql_.*, rownum raw_rnum_ from (#{sql}) raw_sql_ where rownum <= #{offset+limit}) where raw_rnum_ > #{offset}"
elsif offset > 0
sql.replace "select * from (select raw_sql_.*, rownum raw_rnum_ from (#{sql}) raw_sql_) where raw_rnum_ > #{offset}"
end
end
def default_sequence_name(table, column) #:nodoc:
"#{table}_seq"
end
# SCHEMA STATEMENTS ========================================
#
# see: abstract/schema_statements.rb
def current_database #:nodoc:
select_one("select sys_context('userenv','db_name') db from dual")["db"]
end
def tables(name = nil) #:nodoc:
select_all("select lower(table_name) from user_tables").inject([]) do | tabs, t |
tabs << t.to_a.first.last
end
end
def indexes(table_name, name = nil) #:nodoc:
result = select_all(<<-SQL, name)
SELECT lower(i.index_name) as index_name, i.uniqueness, lower(c.column_name) as column_name
FROM user_indexes i, user_ind_columns c
WHERE i.table_name = '#{table_name.to_s.upcase}'
AND c.index_name = i.index_name
AND i.index_name NOT IN (SELECT index_name FROM user_constraints WHERE constraint_type = 'P')
ORDER BY i.index_name, c.column_position
SQL
current_index = nil
indexes = []
result.each do |row|
if current_index != row['index_name']
indexes << IndexDefinition.new(table_name, row['index_name'], row['uniqueness'] == "UNIQUE", [])
current_index = row['index_name']
end
indexes.last.columns << row['column_name']
end
indexes
end
def columns(table_name, name = nil) #:nodoc:
(owner, table_name) = @connection.describe(table_name)
table_cols = %Q{
select column_name, data_type, data_default, nullable,
decode(data_type, 'NUMBER', data_precision,
'VARCHAR2', data_length,
null) as length,
decode(data_type, 'NUMBER', data_scale, null) as scale
from all_tab_columns
where owner = '#{owner}'
and table_name = '#{table_name}'
order by column_id
}
select_all(table_cols, name).map do |row|
if row['data_default']
row['data_default'].sub!(/^(.*?)\s*$/, '\1')
row['data_default'].sub!(/^'(.*)'$/, '\1')
end
OracleColumn.new(
oracle_downcase(row['column_name']),
row['data_default'],
row['data_type'],
(l = row['length']).nil? ? nil : l.to_i,
(s = row['scale']).nil? ? nil : s.to_i,
row['nullable'] == 'Y'
)
end
end
def create_table(name, options = {}) #:nodoc:
super(name, options)
execute "CREATE SEQUENCE #{name}_seq START WITH 10000" unless options[:id] == false
end
def rename_table(name, new_name) #:nodoc:
execute "RENAME #{name} TO #{new_name}"
execute "RENAME #{name}_seq TO #{new_name}_seq" rescue nil
end
def drop_table(name) #:nodoc:
super(name)
execute "DROP SEQUENCE #{name}_seq" rescue nil
end
def remove_index(table_name, options = {}) #:nodoc:
execute "DROP INDEX #{index_name(table_name, options)}"
end
def change_column_default(table_name, column_name, default) #:nodoc:
execute "ALTER TABLE #{table_name} MODIFY #{column_name} DEFAULT #{quote(default)}"
end
def change_column(table_name, column_name, type, options = {}) #:nodoc:
change_column_sql = "ALTER TABLE #{table_name} MODIFY #{column_name} #{type_to_sql(type, options[:limit])}"
add_column_options!(change_column_sql, options)
execute(change_column_sql)
end
def rename_column(table_name, column_name, new_column_name) #:nodoc:
execute "ALTER TABLE #{table_name} RENAME COLUMN #{column_name} to #{new_column_name}"
end
def remove_column(table_name, column_name) #:nodoc:
execute "ALTER TABLE #{table_name} DROP COLUMN #{column_name}"
end
def structure_dump #:nodoc:
s = select_all("select sequence_name from user_sequences").inject("") do |structure, seq|
structure << "create sequence #{seq.to_a.first.last};\n\n"
end
select_all("select table_name from user_tables").inject(s) do |structure, table|
ddl = "create table #{table.to_a.first.last} (\n "
cols = select_all(%Q{
select column_name, data_type, data_length, data_precision, data_scale, data_default, nullable
from user_tab_columns
where table_name = '#{table.to_a.first.last}'
order by column_id
}).map do |row|
col = "#{row['column_name'].downcase} #{row['data_type'].downcase}"
if row['data_type'] =='NUMBER' and !row['data_precision'].nil?
col << "(#{row['data_precision'].to_i}"
col << ",#{row['data_scale'].to_i}" if !row['data_scale'].nil?
col << ')'
elsif row['data_type'].include?('CHAR')
col << "(#{row['data_length'].to_i})"
end
col << " default #{row['data_default']}" if !row['data_default'].nil?
col << ' not null' if row['nullable'] == 'N'
col
end
ddl << cols.join(",\n ")
ddl << ");\n\n"
structure << ddl
end
end
def structure_drop #:nodoc:
s = select_all("select sequence_name from user_sequences").inject("") do |drop, seq|
drop << "drop sequence #{seq.to_a.first.last};\n\n"
end
select_all("select table_name from user_tables").inject(s) do |drop, table|
drop << "drop table #{table.to_a.first.last} cascade constraints;\n\n"
end
end
private
def select(sql, name = nil)
cursor = execute(sql, name)
cols = cursor.get_col_names.map { |x| oracle_downcase(x) }
rows = []
while row = cursor.fetch
hash = Hash.new
cols.each_with_index do |col, i|
hash[col] =
case row[i]
when OCI8::LOB
name == 'Writable Large Object' ? row[i]: row[i].read
when OraDate
(row[i].hour == 0 and row[i].minute == 0 and row[i].second == 0) ?
row[i].to_date : row[i].to_time
else row[i]
end unless col == 'raw_rnum_'
end
rows << hash
end
rows
ensure
cursor.close if cursor
end
# Oracle column names by default are case-insensitive, but treated as upcase;
# for neatness, we'll downcase within Rails. EXCEPT that folks CAN quote
# their column names when creating Oracle tables, which makes them case-sensitive.
# I don't know anybody who does this, but we'll handle the theoretical case of a
# camelCase column name. I imagine other DBs handle this differently, since there's
# a unit test (test_oci) that's currently failing.
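# For example (illustrative):
#   oracle_downcase("LOGIN")      # => "login"
#   oracle_downcase("camelCase")  # => "camelCase"   (left as-is)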
def oracle_downcase(column_name)
column_name =~ /[a-z]/ ? column_name : column_name.downcase
end
end
end
end
class OCI8 #:nodoc:
# This OCI8 patch may no longer be required with the upcoming
# release of version 0.2.
class Cursor #:nodoc:
alias :define_a_column_pre_ar :define_a_column
def define_a_column(i)
case do_ocicall(@ctx) { @parms[i - 1].attrGet(OCI_ATTR_DATA_TYPE) }
when 8 : @stmt.defineByPos(i, String, 65535) # Read LONG values
when 187 : @stmt.defineByPos(i, OraDate) # Read TIMESTAMP values
when 108
if @parms[i - 1].attrGet(OCI_ATTR_TYPE_NAME) == 'XMLTYPE'
@stmt.defineByPos(i, String, 65535)
else
raise 'unsupported datatype'
end
else define_a_column_pre_ar i
end
end
end
# missing constant from oci8 < 0.1.14
OCI_PTYPE_UNK = 0 unless defined?(OCI_PTYPE_UNK)
# Uses the describeAny OCI call to find the target owner and table_name
# indicated by +name+, parsing through synonyms as necessary. Returns
# an array of [owner, table_name].
def describe(name)
@desc ||= @@env.alloc(OCIDescribe)
@desc.attrSet(OCI_ATTR_DESC_PUBLIC, -1) if VERSION >= '0.1.14'
@desc.describeAny(@svc, name.to_s, OCI_PTYPE_UNK)
info = @desc.attrGet(OCI_ATTR_PARAM)
case info.attrGet(OCI_ATTR_PTYPE)
when OCI_PTYPE_TABLE, OCI_PTYPE_VIEW
owner = info.attrGet(OCI_ATTR_OBJ_SCHEMA)
table_name = info.attrGet(OCI_ATTR_OBJ_NAME)
[owner, table_name]
when OCI_PTYPE_SYN
schema = info.attrGet(OCI_ATTR_SCHEMA_NAME)
name = info.attrGet(OCI_ATTR_NAME)
describe(schema + '.' + name)
end
end
end
# The OracleConnectionFactory factors out the code necessary to connect and
# configure an Oracle/OCI connection.
class OracleConnectionFactory #:nodoc:
def new_connection(username, password, database)
conn = OCI8.new username, password, database
conn.exec %q{alter session set nls_date_format = 'YYYY-MM-DD HH24:MI:SS'}
conn.exec %q{alter session set nls_timestamp_format = 'YYYY-MM-DD HH24:MI:SS'} rescue nil
conn.autocommit = true
conn
end
end
# The OCI8AutoRecover class enhances the OCI8 driver with auto-recover and
# reset functionality. If a call to #exec fails, and autocommit is turned on
# (i.e., we're not in the middle of a longer transaction), it will
# automatically reconnect and try again. If autocommit is turned off,
# this would be dangerous (as the earlier part of the implied transaction
# may have failed silently if the connection died) -- so instead the
# connection is marked as dead, to be reconnected on its next use.
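# Automatic retry is off by default; it can be enabled explicitly, e.g.:
#
#   OCI8AutoRecover.auto_retry = true
#
# Only #exec calls made while autocommit is on are retried, as described above.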
class OCI8AutoRecover < DelegateClass(OCI8) #:nodoc:
attr_accessor :active
alias :active? :active
cattr_accessor :auto_retry
class << self
alias :auto_retry? :auto_retry
end
@@auto_retry = false
def initialize(config, factory = OracleConnectionFactory.new)
@active = true
@username, @password, @database = config[:username], config[:password], config[:database]
@factory = factory
@connection = @factory.new_connection @username, @password, @database
super @connection
end
# Checks connection, returns true if active. Note that ping actively
# checks the connection, while #active? simply returns the last
# known state.
def ping
@connection.exec("select 1 from dual") { |r| nil }
@active = true
rescue
@active = false
raise
end
# Resets connection, by logging off and creating a new connection.
def reset!
logoff rescue nil
begin
@connection = @factory.new_connection @username, @password, @database
__setobj__ @connection
@active = true
rescue
@active = false
raise
end
end
# ORA-00028: your session has been killed
# ORA-01012: not logged on
# ORA-03113: end-of-file on communication channel
# ORA-03114: not connected to ORACLE
LOST_CONNECTION_ERROR_CODES = [ 28, 1012, 3113, 3114 ]
# Adds auto-recovery functionality.
#
# See: http://www.jiubao.org/ruby-oci8/api.en.html#label-11
def exec(sql, *bindvars)
should_retry = self.class.auto_retry? && autocommit?
begin
@connection.exec(sql, *bindvars)
rescue OCIException => e
raise unless LOST_CONNECTION_ERROR_CODES.include?(e.code)
@active = false
raise unless should_retry
should_retry = false
reset! rescue nil
retry
end
end
end
rescue LoadError
# OCI8 driver is unavailable.
module ActiveRecord # :nodoc:
class Base
def self.oracle_connection(config) # :nodoc:
# Set up a reasonable error message
raise LoadError, "Oracle/OCI libraries could not be loaded."
end
def self.oci_connection(config) # :nodoc:
# Set up a reasonable error message
raise LoadError, "Oracle/OCI libraries could not be loaded."
end
end
end
end


@ -0,0 +1,507 @@
require 'active_record/connection_adapters/abstract_adapter'
module ActiveRecord
class Base
# Establishes a connection to the database that's used by all Active Record objects
def self.postgresql_connection(config) # :nodoc:
require_library_or_gem 'postgres' unless self.class.const_defined?(:PGconn)
config = config.symbolize_keys
host = config[:host]
port = config[:port] || 5432 unless host.nil?
username = config[:username].to_s
password = config[:password].to_s
min_messages = config[:min_messages]
if config.has_key?(:database)
database = config[:database]
else
raise ArgumentError, "No database specified. Missing argument: database."
end
pga = ConnectionAdapters::PostgreSQLAdapter.new(
PGconn.connect(host, port, "", "", database, username, password), logger, config
)
PGconn.translate_results = false if PGconn.respond_to? :translate_results=
pga.schema_search_path = config[:schema_search_path] || config[:schema_order]
pga
end
end
module ConnectionAdapters
# The PostgreSQL adapter works both with the C-based (http://www.postgresql.jp/interfaces/ruby/) and the Ruby-based
# (available both as a gem and from http://rubyforge.org/frs/?group_id=234&release_id=1145) drivers.
#
# Options:
#
# * <tt>:host</tt> -- Defaults to localhost
# * <tt>:port</tt> -- Defaults to 5432
# * <tt>:username</tt> -- Defaults to nothing
# * <tt>:password</tt> -- Defaults to nothing
# * <tt>:database</tt> -- The name of the database. No default, must be provided.
# * <tt>:schema_search_path</tt> -- An optional schema search path for the connection given as a string of comma-separated schema names. This is backward-compatible with the :schema_order option.
# * <tt>:encoding</tt> -- An optional client encoding that is used in a SET client_encoding TO <encoding> call on the connection.
# * <tt>:min_messages</tt> -- An optional client min messages setting that is used in a SET client_min_messages TO <min_messages> call on the connection.
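#
# A sample configuration using these options (all values are placeholders):
#
#   ActiveRecord::Base.establish_connection(
#     :adapter            => "postgresql",
#     :host               => "localhost",
#     :port               => 5432,
#     :username           => "app",
#     :password           => "secret",
#     :database           => "app_development",
#     :schema_search_path => "public",
#     :encoding           => "UTF8",
#     :min_messages       => "notice"
#   )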
class PostgreSQLAdapter < AbstractAdapter
def adapter_name
'PostgreSQL'
end
def initialize(connection, logger, config = {})
super(connection, logger)
@config = config
configure_connection
end
# Is this connection alive and ready for queries?
def active?
if @connection.respond_to?(:status)
@connection.status == PGconn::CONNECTION_OK
else
@connection.query 'SELECT 1'
true
end
# postgres-pr raises a NoMethodError when querying if no conn is available
rescue PGError, NoMethodError
false
end
# Close then reopen the connection.
def reconnect!
# TODO: postgres-pr doesn't have PGconn#reset.
if @connection.respond_to?(:reset)
@connection.reset
configure_connection
end
end
def disconnect!
# Both postgres and postgres-pr respond to :close
@connection.close rescue nil
end
def native_database_types
{
:primary_key => "serial primary key",
:string => { :name => "character varying", :limit => 255 },
:text => { :name => "text" },
:integer => { :name => "integer" },
:float => { :name => "float" },
:datetime => { :name => "timestamp" },
:timestamp => { :name => "timestamp" },
:time => { :name => "time" },
:date => { :name => "date" },
:binary => { :name => "bytea" },
:boolean => { :name => "boolean" }
}
end
def supports_migrations?
true
end
def table_alias_length
63
end
# QUOTING ==================================================
def quote(value, column = nil)
if value.kind_of?(String) && column && column.type == :binary
"'#{escape_bytea(value)}'"
else
super
end
end
def quote_column_name(name)
%("#{name}")
end
# DATABASE STATEMENTS ======================================
def select_all(sql, name = nil) #:nodoc:
select(sql, name)
end
def select_one(sql, name = nil) #:nodoc:
result = select(sql, name)
result.first if result
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil) #:nodoc:
execute(sql, name)
table = sql.split(" ", 4)[2]
id_value || last_insert_id(table, sequence_name || default_sequence_name(table, pk))
end
def query(sql, name = nil) #:nodoc:
log(sql, name) { @connection.query(sql) }
end
def execute(sql, name = nil) #:nodoc:
log(sql, name) { @connection.exec(sql) }
end
def update(sql, name = nil) #:nodoc:
execute(sql, name).cmdtuples
end
alias_method :delete, :update #:nodoc:
def begin_db_transaction #:nodoc:
execute "BEGIN"
end
def commit_db_transaction #:nodoc:
execute "COMMIT"
end
def rollback_db_transaction #:nodoc:
execute "ROLLBACK"
end
# SCHEMA STATEMENTS ========================================
# Return the list of all tables in the schema search path.
def tables(name = nil) #:nodoc:
schemas = schema_search_path.split(/,/).map { |p| quote(p) }.join(',')
query(<<-SQL, name).map { |row| row[0] }
SELECT tablename
FROM pg_tables
WHERE schemaname IN (#{schemas})
SQL
end
def indexes(table_name, name = nil) #:nodoc:
result = query(<<-SQL, name)
SELECT i.relname, d.indisunique, a.attname
FROM pg_class t, pg_class i, pg_index d, pg_attribute a
WHERE i.relkind = 'i'
AND d.indexrelid = i.oid
AND d.indisprimary = 'f'
AND t.oid = d.indrelid
AND t.relname = '#{table_name}'
AND a.attrelid = t.oid
AND ( d.indkey[0]=a.attnum OR d.indkey[1]=a.attnum
OR d.indkey[2]=a.attnum OR d.indkey[3]=a.attnum
OR d.indkey[4]=a.attnum OR d.indkey[5]=a.attnum
OR d.indkey[6]=a.attnum OR d.indkey[7]=a.attnum
OR d.indkey[8]=a.attnum OR d.indkey[9]=a.attnum )
ORDER BY i.relname
SQL
current_index = nil
indexes = []
result.each do |row|
if current_index != row[0]
indexes << IndexDefinition.new(table_name, row[0], row[1] == "t", [])
current_index = row[0]
end
indexes.last.columns << row[2]
end
indexes
end
def columns(table_name, name = nil) #:nodoc:
column_definitions(table_name).collect do |name, type, default, notnull|
Column.new(name, default_value(default), translate_field_type(type),
notnull == "f")
end
end
# Set the schema search path to a string of comma-separated schema names.
# Names beginning with $ are quoted (e.g. $user => '$user')
# See http://www.postgresql.org/docs/8.0/interactive/ddl-schemas.html
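# For example (schema names are illustrative):
#   connection.schema_search_path = "my_app,public"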
def schema_search_path=(schema_csv) #:nodoc:
if schema_csv
execute "SET search_path TO #{schema_csv}"
@schema_search_path = nil
end
end
def schema_search_path #:nodoc:
@schema_search_path ||= query('SHOW search_path')[0][0]
end
def default_sequence_name(table_name, pk = nil)
default_pk, default_seq = pk_and_sequence_for(table_name)
default_seq || "#{table_name}_#{pk || default_pk || 'id'}_seq"
end
# Resets sequence to the max value of the table's pk if present.
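# For example (table name is illustrative):
#   connection.reset_pk_sequence!('users')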
def reset_pk_sequence!(table, pk = nil, sequence = nil)
unless pk and sequence
default_pk, default_sequence = pk_and_sequence_for(table)
pk ||= default_pk
sequence ||= default_sequence
end
if pk
if sequence
select_value <<-end_sql, 'Reset sequence'
SELECT setval('#{sequence}', (SELECT COALESCE(MAX(#{pk})+(SELECT increment_by FROM #{sequence}), (SELECT min_value FROM #{sequence})) FROM #{table}), false)
end_sql
else
@logger.warn "#{table} has primary key #{pk} with no default sequence" if @logger
end
end
end
# Find a table's primary key and sequence.
def pk_and_sequence_for(table)
# First try looking for a sequence with a dependency on the
# given table's primary key.
result = execute(<<-end_sql, 'PK and serial sequence')[0]
SELECT attr.attname, name.nspname, seq.relname
FROM pg_class seq,
pg_attribute attr,
pg_depend dep,
pg_namespace name,
pg_constraint cons
WHERE seq.oid = dep.objid
AND seq.relnamespace = name.oid
AND seq.relkind = 'S'
AND attr.attrelid = dep.refobjid
AND attr.attnum = dep.refobjsubid
AND attr.attrelid = cons.conrelid
AND attr.attnum = cons.conkey[1]
AND cons.contype = 'p'
AND dep.refobjid = '#{table}'::regclass
end_sql
if result.nil? or result.empty?
# If that fails, try parsing the primary key's default value.
# Support the 7.x and 8.0 nextval('foo'::text) as well as
# the 8.1+ nextval('foo'::regclass).
# TODO: assumes sequence is in same schema as table.
result = execute(<<-end_sql, 'PK and custom sequence')[0]
SELECT attr.attname, name.nspname, split_part(def.adsrc, '\\\'', 2)
FROM pg_class t
JOIN pg_namespace name ON (t.relnamespace = name.oid)
JOIN pg_attribute attr ON (t.oid = attrelid)
JOIN pg_attrdef def ON (adrelid = attrelid AND adnum = attnum)
JOIN pg_constraint cons ON (conrelid = adrelid AND adnum = conkey[1])
WHERE t.oid = '#{table}'::regclass
AND cons.contype = 'p'
AND def.adsrc ~* 'nextval'
end_sql
end
# Check for the existence of '.' in the sequence name, as in public.foo_sequence.
# If it is absent, prepend the current namespace.
result.last['.'] ? [result.first, result.last] : [result.first, "#{result[1]}.#{result[2]}"]
rescue
nil
end
def rename_table(name, new_name)
execute "ALTER TABLE #{name} RENAME TO #{new_name}"
end
def add_column(table_name, column_name, type, options = {})
execute("ALTER TABLE #{table_name} ADD #{column_name} #{type_to_sql(type, options[:limit])}")
execute("ALTER TABLE #{table_name} ALTER #{column_name} SET NOT NULL") if options[:null] == false
change_column_default(table_name, column_name, options[:default]) unless options[:default].nil?
end
def change_column(table_name, column_name, type, options = {}) #:nodoc:
begin
execute "ALTER TABLE #{table_name} ALTER #{column_name} TYPE #{type_to_sql(type, options[:limit])}"
rescue ActiveRecord::StatementInvalid
# This is PG7, so we use a more arcane way of doing it.
begin_db_transaction
add_column(table_name, "#{column_name}_ar_tmp", type, options)
execute "UPDATE #{table_name} SET #{column_name}_ar_tmp = CAST(#{column_name} AS #{type_to_sql(type, options[:limit])})"
remove_column(table_name, column_name)
rename_column(table_name, "#{column_name}_ar_tmp", column_name)
commit_db_transaction
end
change_column_default(table_name, column_name, options[:default]) unless options[:default].nil?
end
def change_column_default(table_name, column_name, default) #:nodoc:
execute "ALTER TABLE #{table_name} ALTER COLUMN #{column_name} SET DEFAULT '#{default}'"
end
def rename_column(table_name, column_name, new_column_name) #:nodoc:
execute "ALTER TABLE #{table_name} RENAME COLUMN #{column_name} TO #{new_column_name}"
end
def remove_index(table_name, options) #:nodoc:
execute "DROP INDEX #{index_name(table_name, options)}"
end
private
BYTEA_COLUMN_TYPE_OID = 17
TIMESTAMPOID = 1114
TIMESTAMPTZOID = 1184
def configure_connection
if @config[:encoding]
execute("SET client_encoding TO '#{@config[:encoding]}'")
end
if @config[:min_messages]
execute("SET client_min_messages TO '#{@config[:min_messages]}'")
end
end
def last_insert_id(table, sequence_name)
Integer(select_value("SELECT currval('#{sequence_name}')"))
end
def select(sql, name = nil)
res = execute(sql, name)
results = res.result
rows = []
if results.length > 0
fields = res.fields
results.each do |row|
hashed_row = {}
row.each_index do |cel_index|
column = row[cel_index]
case res.type(cel_index)
when BYTEA_COLUMN_TYPE_OID
column = unescape_bytea(column)
when TIMESTAMPTZOID, TIMESTAMPOID
column = cast_to_time(column)
end
hashed_row[fields[cel_index]] = column
end
rows << hashed_row
end
end
return rows
end
def escape_bytea(s)
if PGconn.respond_to? :escape_bytea
self.class.send(:define_method, :escape_bytea) do |s|
PGconn.escape_bytea(s) if s
end
else
self.class.send(:define_method, :escape_bytea) do |s|
if s
result = ''
s.each_byte { |c| result << sprintf('\\\\%03o', c) }
result
end
end
end
escape_bytea(s)
end
def unescape_bytea(s)
if PGconn.respond_to? :unescape_bytea
self.class.send(:define_method, :unescape_bytea) do |s|
PGconn.unescape_bytea(s) if s
end
else
self.class.send(:define_method, :unescape_bytea) do |s|
if s
result = ''
i, max = 0, s.size
while i < max
char = s[i]
if char == ?\\
if s[i+1] == ?\\
char = ?\\
i += 1
else
char = s[i+1..i+3].oct
i += 3
end
end
result << char
i += 1
end
result
end
end
end
unescape_bytea(s)
end
# Query a table's column names, default values, and types.
#
# The underlying query is roughly:
# SELECT column.name, column.type, default.value
# FROM column LEFT JOIN default
# ON column.table_id = default.table_id
# AND column.num = default.column_num
# WHERE column.table_id = get_table_id('table_name')
# AND column.num > 0
# AND NOT column.is_dropped
# ORDER BY column.num
#
# If the table name is not prefixed with a schema, the database will
# take the first match from the schema search path.
#
# Query implementation notes:
# - format_type includes the column size constraint, e.g. varchar(50)
# - ::regclass is a function that gives the id for a table name
def column_definitions(table_name)
query <<-end_sql
SELECT a.attname, format_type(a.atttypid, a.atttypmod), d.adsrc, a.attnotnull
FROM pg_attribute a LEFT JOIN pg_attrdef d
ON a.attrelid = d.adrelid AND a.attnum = d.adnum
WHERE a.attrelid = '#{table_name}'::regclass
AND a.attnum > 0 AND NOT a.attisdropped
ORDER BY a.attnum
end_sql
end
# Translate PostgreSQL-specific types into simplified SQL types.
# These are special cases; standard types are handled by
# ConnectionAdapters::Column#simplified_type.
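# For example (illustrative):
#   translate_field_type("timestamp without time zone")  # => 'datetime'
#   translate_field_type("bytea")                        # => 'binary'
#   translate_field_type("integer")                      # => 'integer' (passed through)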
def translate_field_type(field_type)
# Match the beginning of field_type since it may have a size constraint on the end.
case field_type
when /^timestamp/i then 'datetime'
when /^real|^money/i then 'float'
when /^interval/i then 'string'
# geometric types (the line type is currently not implemented in postgresql)
when /^(?:point|lseg|box|"?path"?|polygon|circle)/i then 'string'
when /^bytea/i then 'binary'
else field_type # Pass through standard types.
end
end
def default_value(value)
# Boolean types
return "t" if value =~ /true/i
return "f" if value =~ /false/i
# Char/String/Bytea type values
return $1 if value =~ /^'(.*)'::(bpchar|text|character varying|bytea)$/
# Numeric values
return value if value =~ /^-?[0-9]+(\.[0-9]*)?/
# Fixed dates / times
return $1 if value =~ /^'(.+)'::(date|timestamp)/
# Anything else is blank, some user type, or some function
# and we can't know the value of that, so return nil.
return nil
end
# Only needed for DateTime instances
def cast_to_time(value)
return value unless value.class == DateTime
v = value
time_array = [v.year, v.month, v.day, v.hour, v.min, v.sec]
Time.send(Base.default_timezone, *time_array) rescue nil
end
end
end
end


@ -0,0 +1,371 @@
# Author: Luke Holden <lholden@cablelan.net>
# Updated for SQLite3: Jamis Buck <jamis@37signals.com>
require 'active_record/connection_adapters/abstract_adapter'
module ActiveRecord
class Base
class << self
# sqlite3 adapter reuses sqlite_connection.
def sqlite3_connection(config) # :nodoc:
parse_config!(config)
unless self.class.const_defined?(:SQLite3)
require_library_or_gem(config[:adapter])
end
db = SQLite3::Database.new(
config[:database],
:results_as_hash => true,
:type_translation => false
)
ConnectionAdapters::SQLiteAdapter.new(db, logger)
end
# Establishes a connection to the database that's used by all Active Record objects
def sqlite_connection(config) # :nodoc:
parse_config!(config)
unless self.class.const_defined?(:SQLite)
require_library_or_gem(config[:adapter])
db = SQLite::Database.new(config[:database], 0)
db.show_datatypes = "ON" if !defined? SQLite::Version
db.results_as_hash = true if defined? SQLite::Version
db.type_translation = false
# "Downgrade" deprecated sqlite API
if SQLite.const_defined?(:Version)
ConnectionAdapters::SQLite2Adapter.new(db, logger)
else
ConnectionAdapters::DeprecatedSQLiteAdapter.new(db, logger)
end
end
end
private
def parse_config!(config)
config[:database] ||= config[:dbfile]
# Require database.
unless config[:database]
raise ArgumentError, "No database file specified. Missing argument: database"
end
# Allow database path relative to RAILS_ROOT, but only if
# the database path is not the special path that tells
# SQLite to build a database only in memory.
if Object.const_defined?(:RAILS_ROOT) && ':memory:' != config[:database]
config[:database] = File.expand_path(config[:database], RAILS_ROOT)
end
end
end
end
module ConnectionAdapters #:nodoc:
class SQLiteColumn < Column #:nodoc:
class << self
def string_to_binary(value)
value.gsub(/\0|\%/) do |b|
case b
when "\0" then "%00"
when "%" then "%25"
end
end
end
def binary_to_string(value)
value.gsub(/%00|%25/) do |b|
case b
when "%00" then "\0"
when "%25" then "%"
end
end
end
end
end
# The SQLite adapter works with both the 2.x and 3.x series of SQLite with the sqlite-ruby drivers (available both as gems and
# from http://rubyforge.org/projects/sqlite-ruby/).
#
# Options:
#
# * <tt>:database</tt> -- Path to the database file.
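#
# Sample configurations (paths are illustrative):
#
#   # SQLite 3, file-backed database
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "sqlite3",
#     :database => "db/development.sqlite3"
#   )
#
#   # In-memory database (see parse_config! above)
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "sqlite3",
#     :database => ":memory:"
#   )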
class SQLiteAdapter < AbstractAdapter
def adapter_name #:nodoc:
'SQLite'
end
def supports_migrations? #:nodoc:
true
end
def supports_count_distinct? #:nodoc:
false
end
def native_database_types #:nodoc:
{
:primary_key => "INTEGER PRIMARY KEY NOT NULL",
:string => { :name => "varchar", :limit => 255 },
:text => { :name => "text" },
:integer => { :name => "integer" },
:float => { :name => "float" },
:datetime => { :name => "datetime" },
:timestamp => { :name => "datetime" },
:time => { :name => "datetime" },
:date => { :name => "date" },
:binary => { :name => "blob" },
:boolean => { :name => "boolean" }
}
end
# QUOTING ==================================================
def quote_string(s) #:nodoc:
@connection.class.quote(s)
end
def quote_column_name(name) #:nodoc:
%Q("#{name}")
end
# DATABASE STATEMENTS ======================================
def execute(sql, name = nil) #:nodoc:
catch_schema_changes { log(sql, name) { @connection.execute(sql) } }
end
def update(sql, name = nil) #:nodoc:
execute(sql, name)
@connection.changes
end
def delete(sql, name = nil) #:nodoc:
sql += " WHERE 1=1" unless sql =~ /WHERE/i
execute(sql, name)
@connection.changes
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil) #:nodoc:
execute(sql, name = nil)
id_value || @connection.last_insert_row_id
end
def select_all(sql, name = nil) #:nodoc:
execute(sql, name).map do |row|
record = {}
row.each_key do |key|
if key.is_a?(String)
record[key.sub(/^\w+\./, '')] = row[key]
end
end
record
end
end
def select_one(sql, name = nil) #:nodoc:
result = select_all(sql, name)
result.nil? ? nil : result.first
end
def begin_db_transaction #:nodoc:
catch_schema_changes { @connection.transaction }
end
def commit_db_transaction #:nodoc:
catch_schema_changes { @connection.commit }
end
def rollback_db_transaction #:nodoc:
catch_schema_changes { @connection.rollback }
end
# SCHEMA STATEMENTS ========================================
def tables(name = nil) #:nodoc:
execute("SELECT name FROM sqlite_master WHERE type = 'table'", name).map do |row|
row[0]
end
end
def columns(table_name, name = nil) #:nodoc:
table_structure(table_name).map do |field|
SQLiteColumn.new(field['name'], field['dflt_value'], field['type'], field['notnull'] == "0")
end
end
def indexes(table_name, name = nil) #:nodoc:
execute("PRAGMA index_list(#{table_name})", name).map do |row|
index = IndexDefinition.new(table_name, row['name'])
index.unique = row['unique'] != '0'
index.columns = execute("PRAGMA index_info('#{index.name}')").map { |col| col['name'] }
index
end
end
def primary_key(table_name) #:nodoc:
column = table_structure(table_name).find {|field| field['pk'].to_i == 1}
column ? column['name'] : nil
end
def remove_index(table_name, options={}) #:nodoc:
execute "DROP INDEX #{quote_column_name(index_name(table_name, options))}"
end
def rename_table(name, new_name)
move_table(name, new_name)
end
def add_column(table_name, column_name, type, options = {}) #:nodoc:
alter_table(table_name) do |definition|
definition.column(column_name, type, options)
end
end
def remove_column(table_name, column_name) #:nodoc:
alter_table(table_name) do |definition|
definition.columns.delete(definition[column_name])
end
end
def change_column_default(table_name, column_name, default) #:nodoc:
alter_table(table_name) do |definition|
definition[column_name].default = default
end
end
def change_column(table_name, column_name, type, options = {}) #:nodoc:
alter_table(table_name) do |definition|
definition[column_name].instance_eval do
self.type = type
self.limit = options[:limit] if options[:limit]
self.default = options[:default] if options[:default]
end
end
end
def rename_column(table_name, column_name, new_column_name) #:nodoc:
alter_table(table_name, :rename => {column_name => new_column_name})
end
protected
def table_structure(table_name)
returning structure = execute("PRAGMA table_info(#{table_name})") do
raise ActiveRecord::StatementInvalid if structure.empty?
end
end
def alter_table(table_name, options = {}) #:nodoc:
altered_table_name = "altered_#{table_name}"
caller = lambda {|definition| yield definition if block_given?}
transaction do
move_table(table_name, altered_table_name,
options.merge(:temporary => true))
move_table(altered_table_name, table_name, &caller)
end
end
def move_table(from, to, options = {}, &block) #:nodoc:
copy_table(from, to, options, &block)
drop_table(from)
end
def copy_table(from, to, options = {}) #:nodoc:
create_table(to, options) do |@definition|
columns(from).each do |column|
column_name = options[:rename] ?
(options[:rename][column.name] ||
options[:rename][column.name.to_sym] ||
column.name) : column.name
@definition.column(column_name, column.type,
:limit => column.limit, :default => column.default,
:null => column.null)
end
@definition.primary_key(primary_key(from))
yield @definition if block_given?
end
copy_table_indexes(from, to)
copy_table_contents(from, to,
@definition.columns.map {|column| column.name},
options[:rename] || {})
end
def copy_table_indexes(from, to) #:nodoc:
indexes(from).each do |index|
name = index.name
if to == "altered_#{from}"
name = "temp_#{name}"
elsif from == "altered_#{to}"
name = name[5..-1]
end
opts = { :name => name }
opts[:unique] = true if index.unique
add_index(to, index.columns, opts)
end
end
def copy_table_contents(from, to, columns, rename = {}) #:nodoc:
column_mappings = Hash[*columns.map {|name| [name, name]}.flatten]
rename.inject(column_mappings) {|map, a| map[a.last] = a.first; map}
@connection.execute "SELECT * FROM #{from}" do |row|
sql = "INSERT INTO #{to} VALUES ("
sql << columns.map {|col| quote row[column_mappings[col]]} * ', '
sql << ')'
@connection.execute sql
end
end
def catch_schema_changes
return yield
rescue ActiveRecord::StatementInvalid => exception
if exception.message =~ /database schema has changed/
reconnect!
retry
else
raise
end
end
end
class SQLite2Adapter < SQLiteAdapter # :nodoc:
# SQLite 2 does not support COUNT(DISTINCT) queries:
#
# select COUNT(DISTINCT ArtistID) from CDs;
#
# In order to get the number of artists we execute the following statement
#
# SELECT COUNT(ArtistID) FROM (SELECT DISTINCT ArtistID FROM CDs);
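#
# At the Ruby level, rewrite_count_distinct_queries performs that transformation:
#
#   rewrite_count_distinct_queries("SELECT COUNT(DISTINCT ArtistID) FROM CDs")
#   # => "SELECT COUNT(ArtistID) FROM (SELECT DISTINCT ArtistID FROM CDs)"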
def execute(sql, name = nil) #:nodoc:
super(rewrite_count_distinct_queries(sql), name)
end
def rewrite_count_distinct_queries(sql)
if sql =~ /count\(distinct ([^\)]+)\)( AS \w+)? (.*)/i
distinct_column = $1
distinct_query = $3
column_name = distinct_column.split('.').last
"SELECT COUNT(#{column_name}) FROM (SELECT DISTINCT #{distinct_column} #{distinct_query})"
else
sql
end
end
end
class DeprecatedSQLiteAdapter < SQLite2Adapter # :nodoc:
def insert(sql, name = nil, pk = nil, id_value = nil)
execute(sql, name = nil)
id_value || @connection.last_insert_rowid
end
end
end
end


@ -0,0 +1,563 @@
require 'active_record/connection_adapters/abstract_adapter'
# sqlserver_adapter.rb -- ActiveRecord adapter for Microsoft SQL Server
#
# Author: Joey Gibson <joey@joeygibson.com>
# Date: 10/14/2004
#
# Modifications: DeLynn Berry <delynnb@megastarfinancial.com>
# Date: 3/22/2005
#
# Modifications (ODBC): Mark Imbriaco <mark.imbriaco@pobox.com>
# Date: 6/26/2005
#
# Current maintainer: Ryan Tomayko <rtomayko@gmail.com>
#
# Modifications (Migrations): Tom Ward <tom@popdog.net>
# Date: 27/10/2005
#
module ActiveRecord
class Base
def self.sqlserver_connection(config) #:nodoc:
require_library_or_gem 'dbi' unless self.class.const_defined?(:DBI)
config = config.symbolize_keys
mode = config[:mode] ? config[:mode].to_s.upcase : 'ADO'
username = config[:username] ? config[:username].to_s : 'sa'
password = config[:password] ? config[:password].to_s : ''
autocommit = config.key?(:autocommit) ? config[:autocommit] : true
if mode == "ODBC"
raise ArgumentError, "Missing DSN. Argument ':dsn' must be set in order for this adapter to work." unless config.has_key?(:dsn)
dsn = config[:dsn]
driver_url = "DBI:ODBC:#{dsn}"
else
raise ArgumentError, "Missing Database. Argument ':database' must be set in order for this adapter to work." unless config.has_key?(:database)
database = config[:database]
host = config[:host] ? config[:host].to_s : 'localhost'
driver_url = "DBI:ADO:Provider=SQLOLEDB;Data Source=#{host};Initial Catalog=#{database};User Id=#{username};Password=#{password};"
end
conn = DBI.connect(driver_url, username, password)
conn["AutoCommit"] = autocommit
ConnectionAdapters::SQLServerAdapter.new(conn, logger, [driver_url, username, password])
end
end # class Base
module ConnectionAdapters
class ColumnWithIdentity < Column# :nodoc:
attr_reader :identity, :is_special, :scale
def initialize(name, default, sql_type = nil, is_identity = false, null = true, scale_value = 0)
super(name, default, sql_type, null)
@identity = is_identity
@is_special = sql_type =~ /text|ntext|image/i ? true : false
@scale = scale_value
# SQL Server only supports limits on *char and float types
@limit = nil unless @type == :float or @type == :string
end
def simplified_type(field_type)
case field_type
when /int|bigint|smallint|tinyint/i then :integer
when /float|double|decimal|money|numeric|real|smallmoney/i then @scale == 0 ? :integer : :float
when /datetime|smalldatetime/i then :datetime
when /timestamp/i then :timestamp
when /time/i then :time
when /text|ntext/i then :text
when /binary|image|varbinary/i then :binary
when /char|nchar|nvarchar|string|varchar/i then :string
when /bit/i then :boolean
when /uniqueidentifier/i then :string
end
end
def type_cast(value)
return nil if value.nil? || value =~ /^\s*null\s*$/i
case type
when :string then value
when :integer then value == true || value == false ? value == true ? 1 : 0 : value.to_i
when :float then value.to_f
when :datetime then cast_to_datetime(value)
when :timestamp then cast_to_time(value)
when :time then cast_to_time(value)
when :date then cast_to_datetime(value)
when :boolean then value == true or (value =~ /^t(rue)?$/i) == 0 or value.to_s == '1'
else value
end
end
def cast_to_time(value)
return value if value.is_a?(Time)
time_array = ParseDate.parsedate(value)
time_array[0] ||= 2000
time_array[1] ||= 1
time_array[2] ||= 1
Time.send(Base.default_timezone, *time_array) rescue nil
end
def cast_to_datetime(value)
if value.is_a?(Time)
if value.year != 0 and value.month != 0 and value.day != 0
return value
else
return Time.mktime(2000, 1, 1, value.hour, value.min, value.sec) rescue nil
end
end
return cast_to_time(value) if value.is_a?(Date) or value.is_a?(String) rescue nil
value
end
# These methods will only allow the adapter to insert binary data with a length of 7K or less
# because of a SQL Server statement length policy.
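# For example (illustrative):
#   string_to_binary("\r\n\0")      # => "%00%01%02"
#   binary_to_string("%00%01%02")   # => "\r\n\0"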
def self.string_to_binary(value)
value.gsub(/(\r|\n|\0|\x1a)/) do
case $1
when "\r" then "%00"
when "\n" then "%01"
when "\0" then "%02"
when "\x1a" then "%03"
end
end
end
def self.binary_to_string(value)
value.gsub(/(%00|%01|%02|%03)/) do
case $1
when "%00" then "\r"
when "%01" then "\n"
when "%02\0" then "\0"
when "%03" then "\x1a"
end
end
end
end
# In ADO mode, this adapter will ONLY work on Windows systems,
# since it relies on Win32OLE, which, to my knowledge, is only
# available on Windows.
#
# This mode also relies on the ADO support in the DBI module. If you are using the
# one-click installer of Ruby, then you already have DBI installed, but
# the ADO module is *NOT* installed. You will need to get the latest
# source distribution of Ruby-DBI from http://ruby-dbi.sourceforge.net/
# unzip it, and copy the file
# <tt>src/lib/dbd_ado/ADO.rb</tt>
# to
# <tt>X:/Ruby/lib/ruby/site_ruby/1.8/DBD/ADO/ADO.rb</tt>
# (you will more than likely need to create the ADO directory).
# Once you've installed that file, you are ready to go.
#
# In ODBC mode, the adapter requires the ODBC support in the DBI module which requires
# the Ruby ODBC module. Ruby ODBC 0.996 was used in development and testing,
# and it is available at http://www.ch-werner.de/rubyodbc/
#
# Options:
#
# * <tt>:mode</tt> -- ADO or ODBC. Defaults to ADO.
# * <tt>:username</tt> -- Defaults to sa.
# * <tt>:password</tt> -- Defaults to empty string.
#
# ADO specific options:
#
# * <tt>:host</tt> -- Defaults to localhost.
# * <tt>:database</tt> -- The name of the database. No default, must be provided.
#
# ODBC specific options:
#
# * <tt>:dsn</tt> -- Defaults to nothing.
#
# ADO code tested on Windows 2000 and higher systems,
# running ruby 1.8.2 (2004-07-29) [i386-mswin32], and SQL Server 2000 SP3.
#
# ODBC code tested on a Fedora Core 4 system, running FreeTDS 0.63,
# unixODBC 2.2.11, Ruby ODBC 0.996, Ruby DBI 0.0.23 and Ruby 1.8.2.
# [Linux strongmad 2.6.11-1.1369_FC4 #1 Thu Jun 2 22:55:56 EDT 2005 i686 i686 i386 GNU/Linux]
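#
# Sample configurations (all values are placeholders):
#
#   # ADO mode (Windows only)
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "sqlserver",
#     :mode     => "ADO",
#     :host     => "localhost",
#     :database => "app_development",
#     :username => "sa",
#     :password => ""
#   )
#
#   # ODBC mode
#   ActiveRecord::Base.establish_connection(
#     :adapter  => "sqlserver",
#     :mode     => "ODBC",
#     :dsn      => "app_dsn",
#     :username => "sa",
#     :password => ""
#   )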
class SQLServerAdapter < AbstractAdapter
def initialize(connection, logger, connection_options=nil)
super(connection, logger)
@connection_options = connection_options
end
def native_database_types
{
:primary_key => "int NOT NULL IDENTITY(1, 1) PRIMARY KEY",
:string => { :name => "varchar", :limit => 255 },
:text => { :name => "text" },
:integer => { :name => "int" },
:float => { :name => "float", :limit => 8 },
:datetime => { :name => "datetime" },
:timestamp => { :name => "datetime" },
:time => { :name => "datetime" },
:date => { :name => "datetime" },
:binary => { :name => "image"},
:boolean => { :name => "bit"}
}
end
def adapter_name
'SQLServer'
end
def supports_migrations? #:nodoc:
true
end
# CONNECTION MANAGEMENT ====================================#
# Returns true if the connection is active.
def active?
@connection.execute("SELECT 1") { }
true
rescue DBI::DatabaseError, DBI::InterfaceError
false
end
# Reconnects to the database, returns false if no connection could be made.
def reconnect!
disconnect!
@connection = DBI.connect(*@connection_options)
rescue DBI::DatabaseError => e
@logger.warn "#{adapter_name} reconnection failed: #{e.message}" if @logger
false
end
# Disconnects from the database
def disconnect!
@connection.disconnect rescue nil
end
def select_all(sql, name = nil)
select(sql, name)
end
def select_one(sql, name = nil)
add_limit!(sql, :limit => 1)
result = select(sql, name)
result.nil? ? nil : result.first
end
def columns(table_name, name = nil)
return [] if table_name.blank?
table_name = table_name.to_s if table_name.is_a?(Symbol)
table_name = table_name.split('.')[-1] unless table_name.nil?
sql = "SELECT COLUMN_NAME as ColName, COLUMN_DEFAULT as DefaultValue, DATA_TYPE as ColType, IS_NULLABLE As IsNullable, COL_LENGTH('#{table_name}', COLUMN_NAME) as Length, COLUMNPROPERTY(OBJECT_ID('#{table_name}'), COLUMN_NAME, 'IsIdentity') as IsIdentity, NUMERIC_SCALE as Scale FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = '#{table_name}'"
# Comment out the following line if you don't want the columns SELECT statement logged;
# personally, I think it adds unnecessary bloat to the log.
# If you do comment it out, make sure to un-comment the "result" line that follows.
result = log(sql, name) { @connection.select_all(sql) }
#result = @connection.select_all(sql)
columns = []
result.each do |field|
default = field[:DefaultValue].to_s.gsub!(/[()\']/,"") =~ /null/ ? nil : field[:DefaultValue]
type = "#{field[:ColType]}(#{field[:Length]})"
is_identity = field[:IsIdentity] == 1
is_nullable = field[:IsNullable] == 'YES'
columns << ColumnWithIdentity.new(field[:ColName], default, type, is_identity, is_nullable, field[:Scale])
end
columns
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil)
begin
table_name = get_table_name(sql)
col = get_identity_column(table_name)
ii_enabled = false
if col != nil
if query_contains_identity_column(sql, col)
begin
execute enable_identity_insert(table_name, true)
ii_enabled = true
rescue Exception => e
raise ActiveRecordError, "IDENTITY_INSERT could not be turned ON"
end
end
end
log(sql, name) do
@connection.execute(sql)
id_value || select_one("SELECT @@IDENTITY AS Ident")["Ident"]
end
ensure
if ii_enabled
begin
execute enable_identity_insert(table_name, false)
rescue Exception => e
raise ActiveRecordError, "IDENTITY_INSERT could not be turned OFF"
end
end
end
end
def execute(sql, name = nil)
if sql =~ /^\s*INSERT/i
insert(sql, name)
elsif sql =~ /^\s*UPDATE|^\s*DELETE/i
log(sql, name) do
@connection.execute(sql)
retVal = select_one("SELECT @@ROWCOUNT AS AffectedRows")["AffectedRows"]
end
else
log(sql, name) { @connection.execute(sql) }
end
end
def update(sql, name = nil)
execute(sql, name)
end
alias_method :delete, :update
def begin_db_transaction
@connection["AutoCommit"] = false
rescue Exception => e
@connection["AutoCommit"] = true
end
def commit_db_transaction
@connection.commit
ensure
@connection["AutoCommit"] = true
end
def rollback_db_transaction
@connection.rollback
ensure
@connection["AutoCommit"] = true
end
def quote(value, column = nil)
case value
when String
if column && column.type == :binary && column.class.respond_to?(:string_to_binary)
"'#{quote_string(column.class.string_to_binary(value))}'"
else
"'#{quote_string(value)}'"
end
when NilClass then "NULL"
when TrueClass then '1'
when FalseClass then '0'
when Float, Fixnum, Bignum then value.to_s
when Date then "'#{value.to_s}'"
when Time, DateTime then "'#{value.strftime("%Y-%m-%d %H:%M:%S")}'"
else "'#{quote_string(value.to_yaml)}'"
end
end
def quote_string(string)
string.gsub(/\'/, "''")
end
def quoted_true
"1"
end
def quoted_false
"0"
end
def quote_column_name(name)
"[#{name}]"
end
def add_limit_offset!(sql, options)
if options[:limit] and options[:offset]
total_rows = @connection.select_all("SELECT count(*) as TotalRows from (#{sql.gsub(/\bSELECT\b/i, "SELECT TOP 1000000000")}) tally")[0][:TotalRows].to_i
if (options[:limit] + options[:offset]) >= total_rows
options[:limit] = (total_rows - options[:offset] >= 0) ? (total_rows - options[:offset]) : 0
end
sql.sub!(/^\s*SELECT/i, "SELECT * FROM (SELECT TOP #{options[:limit]} * FROM (SELECT TOP #{options[:limit] + options[:offset]} ")
sql << ") AS tmp1"
if options[:order]
options[:order] = options[:order].split(',').map do |field|
parts = field.split(" ")
tc = parts[0]
if sql =~ /\.\[/ and tc =~ /\./ # if column quoting used in query
tc.gsub!(/\./, '\\.\\[')
tc << '\\]'
end
if sql =~ /#{tc} AS (t\d_r\d\d?)/
parts[0] = $1
end
parts.join(' ')
end.join(', ')
sql << " ORDER BY #{change_order_direction(options[:order])}) AS tmp2 ORDER BY #{options[:order]}"
else
sql << " ) AS tmp2"
end
elsif sql !~ /^\s*SELECT (@@|COUNT\()/i
sql.sub!(/^\s*SELECT([\s]*distinct)?/i) do
"SELECT#{$1} TOP #{options[:limit]}"
end unless options[:limit].nil?
end
end
def recreate_database(name)
drop_database(name)
create_database(name)
end
def drop_database(name)
execute "DROP DATABASE #{name}"
end
def create_database(name)
execute "CREATE DATABASE #{name}"
end
def current_database
@connection.select_one("select DB_NAME()")[0]
end
def tables(name = nil)
execute("SELECT table_name from information_schema.tables WHERE table_type = 'BASE TABLE'", name).inject([]) do |tables, field|
table_name = field[0]
tables << table_name unless table_name == 'dtproperties'
tables
end
end
def indexes(table_name, name = nil)
indexes = []
execute("EXEC sp_helpindex #{table_name}", name).each do |index|
unique = index[1] =~ /unique/
primary = index[1] =~ /primary key/
if !primary
indexes << IndexDefinition.new(table_name, index[0], unique, index[2].split(", "))
end
end
indexes
end
def rename_table(name, new_name)
execute "EXEC sp_rename '#{name}', '#{new_name}'"
end
def remove_column(table_name, column_name)
execute "ALTER TABLE #{table_name} DROP COLUMN #{column_name}"
end
def rename_column(table, column, new_column_name)
execute "EXEC sp_rename '#{table}.#{column}', '#{new_column_name}'"
end
def change_column(table_name, column_name, type, options = {}) #:nodoc:
sql_commands = ["ALTER TABLE #{table_name} ALTER COLUMN #{column_name} #{type_to_sql(type, options[:limit])}"]
if options[:default]
remove_default_constraint(table_name, column_name)
sql_commands << "ALTER TABLE #{table_name} ADD CONSTRAINT DF_#{table_name}_#{column_name} DEFAULT #{options[:default]} FOR #{column_name}"
end
sql_commands.each {|c|
execute(c)
}
end
def remove_column(table_name, column_name)
remove_default_constraint(table_name, column_name)
execute "ALTER TABLE #{table_name} DROP COLUMN #{column_name}"
end
def remove_default_constraint(table_name, column_name)
defaults = select "select def.name from sysobjects def, syscolumns col, sysobjects tab where col.cdefault = def.id and col.name = '#{column_name}' and tab.name = '#{table_name}' and col.id = tab.id"
defaults.each {|constraint|
execute "ALTER TABLE #{table_name} DROP CONSTRAINT #{constraint["name"]}"
}
end
def remove_index(table_name, options = {})
execute "DROP INDEX #{table_name}.#{index_name(table_name, options)}"
end
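# For illustration: type_to_sql(:string, 30) would yield something like "varchar(30)",
# assuming the adapter's native_database_types maps :string to a varchar type; when no
# limit is passed, the default :limit from that map (if any) is applied instead.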
def type_to_sql(type, limit = nil) #:nodoc:
native = native_database_types[type]
# if there's no :limit in the default type definition, assume that type doesn't support limits
limit = limit || native[:limit]
column_type_sql = native[:name]
column_type_sql << "(#{limit})" if limit
column_type_sql
end
private
def select(sql, name = nil)
rows = []
repair_special_columns(sql)
log(sql, name) do
@connection.select_all(sql) do |row|
record = {}
row.column_names.each do |col|
record[col] = row[col]
record[col] = record[col].to_time if record[col].is_a? DBI::Timestamp
end
rows << record
end
end
rows
end
def enable_identity_insert(table_name, enable = true)
if has_identity_column(table_name)
"SET IDENTITY_INSERT #{table_name} #{enable ? 'ON' : 'OFF'}"
end
end
def get_table_name(sql)
if sql =~ /^\s*insert\s+into\s+([^\(\s]+)\s*|^\s*update\s+([^\(\s]+)\s*/i
$1
elsif sql =~ /from\s+([^\(\s]+)\s*/i
$1
else
nil
end
end
def has_identity_column(table_name)
!get_identity_column(table_name).nil?
end
def get_identity_column(table_name)
@table_columns = {} unless @table_columns
@table_columns[table_name] = columns(table_name) if @table_columns[table_name] == nil
@table_columns[table_name].each do |col|
return col.name if col.identity
end
return nil
end
def query_contains_identity_column(sql, col)
sql =~ /\[#{col}\]/
end
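# For illustration: change_order_direction("name ASC, id DESC") returns
# "name DESC, id ASC", letting the limit/offset emulation above walk the result
# set from the opposite end before the outer query restores the requested order.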
def change_order_direction(order)
order.split(",").collect {|fragment|
case fragment
when /\bDESC\b/i then fragment.gsub(/\bDESC\b/i, "ASC")
when /\bASC\b/i then fragment.gsub(/\bASC\b/i, "DESC")
else String.new(fragment).split(',').join(' DESC,') + ' DESC'
end
}.join(",")
end
def get_special_columns(table_name)
special = []
@table_columns ||= {}
@table_columns[table_name] ||= columns(table_name)
@table_columns[table_name].each do |col|
special << col.name if col.is_special
end
special
end
def repair_special_columns(sql)
special_cols = get_special_columns(get_table_name(sql))
for col in special_cols.to_a
sql.gsub!(Regexp.new(" #{col.to_s} = "), " #{col.to_s} LIKE ")
sql.gsub!(/ORDER BY #{col.to_s}/i, '')
end
sql
end
end #class SQLServerAdapter < AbstractAdapter
end #module ConnectionAdapters
end #module ActiveRecord


@ -0,0 +1,684 @@
# sybase_adapter.rb
# Author: John Sheets <dev@metacasa.net>
# Date: 01 Mar 2006
#
# Based on code from Will Sobel (http://dev.rubyonrails.org/ticket/2030)
#
# 17 Mar 2006: Added support for migrations; fixed issues with :boolean columns.
#
require 'active_record/connection_adapters/abstract_adapter'
begin
require 'sybsql'
module ActiveRecord
class Base
# Establishes a connection to the database that's used by all Active Record objects
def self.sybase_connection(config) # :nodoc:
config = config.symbolize_keys
username = config[:username] ? config[:username].to_s : 'sa'
password = config[:password] ? config[:password].to_s : ''
if config.has_key?(:host)
host = config[:host]
else
raise ArgumentError, "No database server name specified. Missing argument: host."
end
if config.has_key?(:database)
database = config[:database]
else
raise ArgumentError, "No database specified. Missing argument: database."
end
ConnectionAdapters::SybaseAdapter.new(
SybSQL.new({'S' => host, 'U' => username, 'P' => password},
ConnectionAdapters::SybaseAdapterContext), database, logger)
end
end # class Base
module ConnectionAdapters
# ActiveRecord connection adapter for Sybase Open Client bindings
# (see http://raa.ruby-lang.org/project/sybase-ctlib).
#
# Options:
#
# * <tt>:host</tt> -- The name of the database server. No default, must be provided.
# * <tt>:database</tt> -- The name of the database. No default, must be provided.
# * <tt>:username</tt> -- Defaults to sa.
# * <tt>:password</tt> -- Defaults to empty string.
#
# Usage Notes:
#
# * The sybase-ctlib bindings do not support the DATE SQL column type; use DATETIME instead.
# * Table and column names are limited to 30 chars in Sybase 12.5
# * :binary columns not yet supported
# * :boolean columns use the BIT SQL type, which does not allow nulls or
# indexes. If a DEFAULT is not specified for ALTER TABLE commands, the
# column will be declared with DEFAULT 0 (false).
#
# Migrations:
#
# The Sybase adapter supports migrations, but for ALTER TABLE commands to
# work, the database must have the database option 'select into' set to
# 'true' with sp_dboption (see below). The sp_helpdb command lists the current
# options for all databases.
#
# 1> use mydb
# 2> go
# 1> master..sp_dboption mydb, "select into", true
# 2> go
# 1> checkpoint
# 2> go
class SybaseAdapter < AbstractAdapter # :nodoc:
class ColumnWithIdentity < Column
attr_reader :identity, :primary
def initialize(name, default, sql_type = nil, nullable = nil, identity = nil, primary = nil)
super(name, default, sql_type, nullable)
@default, @identity, @primary = type_cast(default), identity, primary
end
def simplified_type(field_type)
case field_type
when /int|bigint|smallint|tinyint/i then :integer
when /float|double|decimal|money|numeric|real|smallmoney/i then :float
when /text|ntext/i then :text
when /binary|image|varbinary/i then :binary
when /char|nchar|nvarchar|string|varchar/i then :string
when /bit/i then :boolean
when /datetime|smalldatetime/i then :datetime
else super
end
end
def self.string_to_binary(value)
"0x#{value.unpack("H*")[0]}"
end
def self.binary_to_string(value)
# FIXME: sybase-ctlib uses separate sql method for binary columns.
value
end
end # class ColumnWithIdentity
# Sybase adapter
def initialize(connection, database, logger = nil)
super(connection, logger)
context = connection.context
context.init(logger)
@limit = @offset = 0
unless connection.sql_norow("USE #{database}")
raise "Cannot USE #{database}"
end
end
def native_database_types
{
:primary_key => "numeric(9,0) IDENTITY PRIMARY KEY",
:string => { :name => "varchar", :limit => 255 },
:text => { :name => "text" },
:integer => { :name => "int" },
:float => { :name => "float", :limit => 8 },
:datetime => { :name => "datetime" },
:timestamp => { :name => "timestamp" },
:time => { :name => "time" },
:date => { :name => "datetime" },
:binary => { :name => "image"},
:boolean => { :name => "bit" }
}
end
def adapter_name
'Sybase'
end
def active?
!(@connection.connection.nil? || @connection.connection_dead?)
end
def disconnect!
@connection.close rescue nil
end
def reconnect!
raise "Sybase Connection Adapter does not yet support reconnect!"
# disconnect!
# connect! # Not yet implemented
end
def table_alias_length
30
end
# Check for a limit statement and parse out the limit and
# offset if specified. Remove the limit from the sql statement
# and call select.
def select_all(sql, name = nil)
select(sql, name)
end
# Remove limit clause from statement. This will almost always
# contain LIMIT 1 from the caller. Set the rowcount to 1 before
# calling select.
def select_one(sql, name = nil)
result = select(sql, name)
result.nil? ? nil : result.first
end
def columns(table_name, name = nil)
table_structure(table_name).inject([]) do |columns, column|
name, default, type, nullable, identity, primary = column
columns << ColumnWithIdentity.new(name, default, type, nullable, identity, primary)
columns
end
end
def insert(sql, name = nil, pk = nil, id_value = nil, sequence_name = nil)
begin
table_name = get_table_name(sql)
col = get_identity_column(table_name)
ii_enabled = false
if col != nil
if query_contains_identity_column(sql, col)
begin
execute enable_identity_insert(table_name, true)
ii_enabled = true
rescue Exception => e
raise ActiveRecordError, "IDENTITY_INSERT could not be turned ON"
end
end
end
log(sql, name) do
execute(sql, name)
ident = select_one("SELECT @@IDENTITY AS last_id")["last_id"]
id_value || ident
end
ensure
if ii_enabled
begin
execute enable_identity_insert(table_name, false)
rescue Exception => e
raise ActiveRecordError, "IDENTITY_INSERT could not be turned OFF"
end
end
end
end
def execute(sql, name = nil)
log(sql, name) do
@connection.context.reset
@connection.set_rowcount(@limit || 0)
@limit = @offset = nil
@connection.sql_norow(sql)
if @connection.cmd_fail? or @connection.context.failed?
raise "SQL Command Failed for #{name}: #{sql}\nMessage: #{@connection.context.message}"
end
end
# Return rows affected
@connection.results[0].row_count
end
alias_method :update, :execute
alias_method :delete, :execute
def begin_db_transaction() execute "BEGIN TRAN" end
def commit_db_transaction() execute "COMMIT TRAN" end
def rollback_db_transaction() execute "ROLLBACK TRAN" end
def tables(name = nil)
tables = []
select("select name from sysobjects where type='U'", name).each do |row|
tables << row['name']
end
tables
end
def indexes(table_name, name = nil)
indexes = []
select("exec sp_helpindex #{table_name}", name).each do |index|
unique = index["index_description"] =~ /unique/
primary = index["index_description"] =~ /^clustered/
if !primary
cols = index["index_keys"].split(", ").each { |col| col.strip! }
indexes << IndexDefinition.new(table_name, index["index_name"], unique, cols)
end
end
indexes
end
def quoted_true
"1"
end
def quoted_false
"0"
end
def quote(value, column = nil)
case value
when String
if column && column.type == :binary && column.class.respond_to?(:string_to_binary)
"#{quote_string(column.class.string_to_binary(value))}"
elsif value =~ /^[+-]?[0-9]+$/o
value
else
"'#{quote_string(value)}'"
end
when NilClass then (column && column.type == :boolean) ? '0' : "NULL"
when TrueClass then '1'
when FalseClass then '0'
when Float, Fixnum, Bignum then value.to_s
when Date then "'#{value.to_s}'"
when Time, DateTime then "'#{value.strftime("%Y-%m-%d %H:%M:%S")}'"
else "'#{quote_string(value.to_yaml)}'"
end
end
def quote_column(type, value)
case type
when :boolean
case value
when String then value =~ /^[ty]/o ? 1 : 0
when true then 1
when false then 0
else value.to_i
end
when :integer then value.to_i
when :float then value.to_f
when :text, :string, :enum
case value
when String, Symbol, Fixnum, Float, Bignum, TrueClass, FalseClass
"'#{quote_string(value.to_s)}'"
else
"'#{quote_string(value.to_yaml)}'"
end
when :date, :datetime, :time
case value
when Time, DateTime then "'#{value.strftime("%Y-%m-%d %H:%M:%S")}'"
when Date then "'#{value.to_s}'"
else "'#{quote_string(value)}'"
end
else "'#{quote_string(value.to_yaml)}'"
end
end
def quote_string(s)
s.gsub(/'/, "''") # ' (for ruby-mode)
end
def quote_column_name(name)
"[#{name}]"
end
def add_limit_offset!(sql, options) # :nodoc:
@limit = options[:limit]
@offset = options[:offset]
if !normal_select?
# Use temp table to hack offset with Sybase
sql.sub!(/ FROM /i, ' INTO #artemp FROM ')
elsif zero_limit?
# "SET ROWCOUNT 0" turns off limits, so we have
# to use a cheap trick.
if sql =~ /WHERE/i
sql.sub!(/WHERE/i, 'WHERE 1 = 2 AND ')
elsif sql =~ /ORDER\s+BY/i
sql.sub!(/ORDER\s+BY/i, 'WHERE 1 = 2 ORDER BY')
else
sql << ' WHERE 1 = 2'
end
end
end
def supports_migrations? #:nodoc:
true
end
def rename_table(name, new_name)
execute "EXEC sp_rename '#{name}', '#{new_name}'"
end
def rename_column(table, column, new_column_name)
execute "EXEC sp_rename '#{table}.#{column}', '#{new_column_name}'"
end
def change_column(table_name, column_name, type, options = {}) #:nodoc:
sql_commands = ["ALTER TABLE #{table_name} MODIFY #{column_name} #{type_to_sql(type, options[:limit])}"]
if options[:default]
remove_default_constraint(table_name, column_name)
sql_commands << "ALTER TABLE #{table_name} ADD CONSTRAINT DF_#{table_name}_#{column_name} DEFAULT #{options[:default]} FOR #{column_name}"
end
sql_commands.each { |c| execute(c) }
end
def remove_column(table_name, column_name)
remove_default_constraint(table_name, column_name)
execute "ALTER TABLE #{table_name} DROP #{column_name}"
end
def remove_default_constraint(table_name, column_name)
defaults = select "select def.name from sysobjects def, syscolumns col, sysobjects tab where col.cdefault = def.id and col.name = '#{column_name}' and tab.name = '#{table_name}' and col.id = tab.id"
defaults.each {|constraint|
execute "ALTER TABLE #{table_name} DROP CONSTRAINT #{constraint["name"]}"
}
end
def remove_index(table_name, options = {})
execute "DROP INDEX #{table_name}.#{index_name(table_name, options)}"
end
def add_column_options!(sql, options) #:nodoc:
sql << " DEFAULT #{quote(options[:default], options[:column])}" unless options[:default].nil?
if check_null_for_column?(options[:column], sql)
sql << (options[:null] == false ? " NOT NULL" : " NULL")
end
sql
end
private
def check_null_for_column?(col, sql)
# Sybase columns are NOT NULL by default, so explicitly set NULL
# if :null option is omitted. Disallow NULLs for boolean.
type = col.nil? ? "" : col[:type]
# Ignore :null if a primary key
return false if type =~ /PRIMARY KEY/i
# Ignore :null if a :boolean or BIT column
if (sql =~ /\s+bit(\s+DEFAULT)?/i) || type == :boolean
# If no default clause found on a boolean column, add one.
sql << " DEFAULT 0" if $1.nil?
return false
end
true
end
# Return the last value of the identity global value.
def last_insert_id
@connection.sql("SELECT @@IDENTITY")
unless @connection.cmd_fail?
id = @connection.top_row_result.rows.first.first
if id
id = id.to_i
id = nil if id == 0
end
else
id = nil
end
id
end
def affected_rows(name = nil)
@connection.sql("SELECT @@ROWCOUNT")
unless @connection.cmd_fail?
count = @connection.top_row_result.rows.first.first
count = count.to_i if count
else
0
end
end
def normal_select?
# If limit is not set at all, we can ignore offset;
# If limit *is* set but offset is zero, use normal select
# with simple SET ROWCOUNT. Thus, only use the temp table
# if limit is set and offset > 0.
has_limit = !@limit.nil?
has_offset = !@offset.nil? && @offset > 0
!has_limit || !has_offset
end
def zero_limit?
!@limit.nil? && @limit == 0
end
# Select limit number of rows starting at optional offset.
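# For illustration: with @limit = 10 and @offset = 20 the rewritten query selects
# the top 30 rows INTO #artemp, the first 20 rows are then deleted from that temp
# table, and whatever remains is returned, since Sybase has no native OFFSET.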
def select(sql, name = nil)
@connection.context.reset
log(sql, name) do
if normal_select?
# If limit is not explicitly set, return all results.
@logger.debug "Setting row count to (#{@limit || 'off'})" if @logger
# Run a normal select
@connection.set_rowcount(@limit || 0)
@connection.sql(sql)
else
# Select into a temp table and prune results
@logger.debug "Selecting #{@limit + (@offset || 0)} or fewer rows into #artemp" if @logger
@connection.set_rowcount(@limit + (@offset || 0))
@connection.sql_norow(sql) # Select into temp table
@logger.debug "Deleting #{@offset || 0} or fewer rows from #artemp" if @logger
@connection.set_rowcount(@offset || 0)
@connection.sql_norow("delete from #artemp") # Delete leading rows
@connection.set_rowcount(0)
@connection.sql("select * from #artemp") # Return the rest
end
end
rows = []
if @connection.context.failed? or @connection.cmd_fail?
raise StatementInvalid, "SQL Command Failed for #{name}: #{sql}\nMessage: #{@connection.context.message}"
else
results = @connection.top_row_result
if results && results.rows.length > 0
fields = fixup_column_names(results.columns)
results.rows.each do |row|
hashed_row = {}
row.zip(fields) { |cell, column| hashed_row[column] = cell }
rows << hashed_row
end
end
end
@connection.sql_norow("drop table #artemp") if !normal_select?
@limit = @offset = nil
return rows
end
def enable_identity_insert(table_name, enable = true)
if has_identity_column(table_name)
"SET IDENTITY_INSERT #{table_name} #{enable ? 'ON' : 'OFF'}"
end
end
def get_table_name(sql)
if sql =~ /^\s*insert\s+into\s+([^\(\s]+)\s*|^\s*update\s+([^\(\s]+)\s*/i
$1
elsif sql =~ /from\s+([^\(\s]+)\s*/i
$1
else
nil
end
end
def has_identity_column(table_name)
!get_identity_column(table_name).nil?
end
def get_identity_column(table_name)
@table_columns = {} unless @table_columns
@table_columns[table_name] = columns(table_name) if @table_columns[table_name] == nil
@table_columns[table_name].each do |col|
return col.name if col.identity
end
return nil
end
def query_contains_identity_column(sql, col)
sql =~ /\[#{col}\]/
end
# Remove trailing _ from names.
def fixup_column_names(columns)
columns.map { |column| column.sub(/_$/, '') }
end
def table_structure(table_name)
sql = <<SQLTEXT
SELECT col.name AS name, type.name AS type, col.prec, col.scale, col.length,
col.status, obj.sysstat2, def.text
FROM sysobjects obj, syscolumns col, systypes type, syscomments def
WHERE obj.id = col.id AND col.usertype = type.usertype AND col.cdefault *= def.id
AND obj.type = 'U' AND obj.name = '#{table_name}' ORDER BY col.colid
SQLTEXT
log(sql, "Get Column Info ") do
@connection.set_rowcount(0)
@connection.sql(sql)
end
if @connection.context.failed?
raise "SQL Command for table_structure for #{table_name} failed\nMessage: #{@connection.context.message}"
elsif !@connection.cmd_fail?
columns = []
results = @connection.top_row_result
results.rows.each do |row|
name, type, prec, scale, length, status, sysstat2, default = row
type = normalize_type(type, prec, scale, length)
default_value = nil
name.sub!(/_$/o, '')
if default =~ /DEFAULT\s+(.+)/o
default_value = $1.strip
default_value = default_value[1...-1] if default_value =~ /^['"]/o
end
nullable = (status & 8) == 8
identity = status >= 128
primary = (sysstat2 & 8) == 8
columns << [name, default_value, type, nullable, identity, primary]
end
columns
else
nil
end
end
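# For illustration: normalize_type("varchar", nil, nil, 255) returns "varchar(255)",
# while normalize_type("numeric", 9, 0, nil) collapses to "int(9)" because a numeric
# column with a scale of zero is treated as an integer.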
def normalize_type(field_type, prec, scale, length)
if field_type =~ /numeric/i and (scale.nil? or scale == 0)
type = 'int'
elsif field_type =~ /money/i
type = 'numeric'
else
type = field_type
end
size = ''
if prec
size = "(#{prec})"
elsif length
size = "(#{length})"
end
return type + size
end
def default_value(value)
end
end # class SybaseAdapter
class SybaseAdapterContext < SybSQLContext
DEADLOCK = 1205
attr_reader :message
def init(logger = nil)
@deadlocked = false
@failed = false
@logger = logger
@message = nil
end
def srvmsgCB(con, msg)
# Do not log change of context messages.
if msg['severity'] == 10 or msg['severity'] == 0
return true
end
if msg['msgnumber'] == DEADLOCK
@deadlocked = true
else
@logger.info "SQL Command failed!" if @logger
@failed = true
end
if @logger
@logger.error "** SybSQLContext Server Message: **"
@logger.error " Message number #{msg['msgnumber']} Severity #{msg['severity']} State #{msg['state']} Line #{msg['line']}"
@logger.error " Server #{msg['srvname']}"
@logger.error " Procedure #{msg['proc']}"
@logger.error " Message String: #{msg['text']}"
end
@message = msg['text']
true
end
def deadlocked?
@deadlocked
end
def failed?
@failed
end
def reset
@deadlocked = false
@failed = false
@message = nil
end
def cltmsgCB(con, msg)
return true unless msg.kind_of?(Hash)
return true unless msg["severity"]
if @logger
@logger.error "** SybSQLContext Client-Message: **"
@logger.error " Message number: LAYER=#{msg[ 'layer' ]} ORIGIN=#{msg[ 'origin' ]} SEVERITY=#{msg[ 'severity' ]} NUMBER=#{msg[ 'number' ]}"
@logger.error " Message String: #{msg['msgstring']}"
@logger.error " OS Error: #{msg['osstring']}"
@message = msg['msgstring']
end
@failed = true
# Do not retry: CS_CV_RETRY_FAIL (probably a timeout).
if( msg[ 'severity' ] == "RETRY_FAIL" ) then
@timeout_p = true
return false
end
return true
end
end # class SybaseAdapterContext
end # module ConnectionAdapters
end # module ActiveRecord
# Allow identity inserts for fixtures.
require "active_record/fixtures"
class Fixtures
alias :original_insert_fixtures :insert_fixtures
def insert_fixtures
values.each do |fixture|
allow_identity_inserts table_name, true
@connection.execute "INSERT INTO #{@table_name} (#{fixture.key_list}) VALUES (#{fixture.value_list})", 'Fixture Insert'
allow_identity_inserts table_name, false
end
end
def allow_identity_inserts(table_name, enable)
@connection.execute "SET IDENTITY_INSERT #{table_name} #{enable ? 'ON' : 'OFF'}" rescue nil
end
end
rescue LoadError => cannot_require_sybase
# Couldn't load sybase adapter
end


@ -0,0 +1,90 @@
module ActiveRecord
module Associations # :nodoc:
module ClassMethods
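# For illustration (assuming a model declaring has_many :comments):
# deprecated_collection_count_method(:comments) defines comments_count(force_reload = false),
# which simply delegates to comments.size. The helpers below generate the other legacy
# wrappers (add_comments, remove_comments, has_comments?, find_in_comments, ...) the same way.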
def deprecated_collection_count_method(collection_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def #{collection_name}_count(force_reload = false)
#{collection_name}.reload if force_reload
#{collection_name}.size
end
end_eval
end
def deprecated_add_association_relation(association_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def add_#{association_name}(*items)
#{association_name}.concat(items)
end
end_eval
end
def deprecated_remove_association_relation(association_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def remove_#{association_name}(*items)
#{association_name}.delete(items)
end
end_eval
end
def deprecated_has_collection_method(collection_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def has_#{collection_name}?(force_reload = false)
!#{collection_name}(force_reload).empty?
end
end_eval
end
def deprecated_find_in_collection_method(collection_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def find_in_#{collection_name}(association_id)
#{collection_name}.find(association_id)
end
end_eval
end
def deprecated_find_all_in_collection_method(collection_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def find_all_in_#{collection_name}(runtime_conditions = nil, orderings = nil, limit = nil, joins = nil)
#{collection_name}.find_all(runtime_conditions, orderings, limit, joins)
end
end_eval
end
def deprecated_collection_create_method(collection_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def create_in_#{collection_name}(attributes = {})
#{collection_name}.create(attributes)
end
end_eval
end
def deprecated_collection_build_method(collection_name)# :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def build_to_#{collection_name}(attributes = {})
#{collection_name}.build(attributes)
end
end_eval
end
def deprecated_association_comparison_method(association_name, association_class_name) # :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def #{association_name}?(comparison_object, force_reload = false)
if comparison_object.kind_of?(#{association_class_name})
#{association_name}(force_reload) == comparison_object
else
raise "Comparison object is a #{association_class_name}, should have been \#{comparison_object.class.name}"
end
end
end_eval
end
def deprecated_has_association_method(association_name) # :nodoc:
module_eval <<-"end_eval", __FILE__, __LINE__
def has_#{association_name}?(force_reload = false)
!#{association_name}(force_reload).nil?
end
end_eval
end
end
end
end


@ -0,0 +1,41 @@
module ActiveRecord
class Base
class << self
# This method is deprecated in favor of find with the :conditions option.
#
# Works like find, but the record matching +id+ must also meet the +conditions+.
# +RecordNotFound+ is raised if no record can be found matching the +id+ or meeting the condition.
# Example:
# Person.find_on_conditions 5, "first_name LIKE '%dav%' AND last_name = 'heinemeier'"
def find_on_conditions(ids, conditions) # :nodoc:
find(ids, :conditions => conditions)
end
# This method is deprecated in favor of find(:first, options).
#
# Returns the object for the first record responding to the conditions in +conditions+,
# such as "group = 'master'". If more than one record is returned from the query, it's the first that'll
# be used to create the object. In such cases, it might be beneficial to also specify
# +orderings+, like "income DESC, name", to control exactly which record is to be used. Example:
# Employee.find_first "income > 50000", "income DESC, name"
def find_first(conditions = nil, orderings = nil, joins = nil) # :nodoc:
find(:first, :conditions => conditions, :order => orderings, :joins => joins)
end
# This method is deprecated in favor of find(:all, options).
#
# Returns an array of all the objects that could be instantiated from the associated
# table in the database. The +conditions+ can be used to narrow the selection of objects (WHERE-part),
# such as by "color = 'red'", and arrangement of the selection can be done through +orderings+ (ORDER BY-part),
# such as by "last_name, first_name DESC". A maximum of returned objects and their offset can be specified in
# +limit+ with either just a single integer as the limit or as an array with the first element as the limit,
# the second as the offset. Examples:
# Project.find_all "category = 'accounts'", "last_accessed DESC", 15
# Project.find_all ["category = ?", category_name], "created ASC", [15, 20]
def find_all(conditions = nil, orderings = nil, limit = nil, joins = nil) # :nodoc:
limit, offset = limit.is_a?(Array) ? limit : [ limit, nil ]
find(:all, :conditions => conditions, :order => orderings, :joins => joins, :limit => limit, :offset => offset)
end
end
end
end


@ -0,0 +1,600 @@
require 'erb'
require 'yaml'
require 'csv'
module YAML #:nodoc:
class Omap #:nodoc:
def keys; map { |k, v| k } end
def values; map { |k, v| v } end
end
end
class FixtureClassNotFound < ActiveRecord::ActiveRecordError #:nodoc:
end
# Fixtures are a way of organizing data that you want to test against; in short, sample data. They come in 3 flavours:
#
# 1. YAML fixtures
# 2. CSV fixtures
# 3. Single-file fixtures
#
# = YAML fixtures
#
# This type of fixture is in YAML format and is the preferred default. YAML is a file format that describes data structures
# in a non-verbose, human-readable format. It ships with Ruby 1.8.1+.
#
# Unlike single-file fixtures, YAML fixtures are stored in a single file per model, which are placed in the directory appointed
# by <tt>Test::Unit::TestCase.fixture_path=(path)</tt> (this is automatically configured for Rails, so you can just
# put your files in <your-rails-app>/test/fixtures/). The fixture file ends with the .yml file extension (Rails example:
# "<your-rails-app>/test/fixtures/web_sites.yml"). The format of a YAML fixture file looks like this:
#
# rubyonrails:
# id: 1
# name: Ruby on Rails
# url: http://www.rubyonrails.org
#
# google:
# id: 2
# name: Google
# url: http://www.google.com
#
# This YAML fixture file includes two fixtures. Each YAML fixture (ie. record) is given a name and is followed by an
# indented list of key/value pairs in the "key: value" format. Records are separated by a blank line for your viewing
# pleasure.
#
# Note that YAML fixtures are unordered. If you want ordered fixtures, use the omap YAML type. See http://yaml.org/type/omap.html
# for the specification. You will need ordered fixtures when you have foreign key constraints on keys in the same table.
# This is commonly needed for tree structures. Example:
#
# --- !omap
# - parent:
# id: 1
# parent_id: NULL
# title: Parent
# - child:
# id: 2
# parent_id: 1
# title: Child
#
# = CSV fixtures
#
# Fixtures can also be kept in the Comma Separated Value format. Akin to YAML fixtures, CSV fixtures are stored
# in a single file, but instead end with the .csv file extension (Rails example: "<your-rails-app>/test/fixtures/web_sites.csv")
#
# The format of this type of fixture file is much more compact than the others, but also a little harder to read by us
# humans. The first line of the CSV file is a comma-separated list of field names. The rest of the file is then comprised
# of the actual data (1 per line). Here's an example:
#
# id, name, url
# 1, Ruby On Rails, http://www.rubyonrails.org
# 2, Google, http://www.google.com
#
# Should you have a piece of data with a comma character in it, you can place double quotes around that value. If you
# need to use a double quote character, you must escape it with another double quote.
#
# Another unique attribute of the CSV fixture is that it has *no* fixture name like the other two formats. Instead, the
# fixture names are automatically generated by deriving the class name of the fixture file and adding an incrementing
# number to the end. In our example, the 1st fixture would be called "web_site_1" and the 2nd one would be called
# "web_site_2".
#
# Most databases and spreadsheets support exporting to CSV format, so this is a great format for you to choose if you
# have existing data somewhere already.
#
# = Single-file fixtures
#
# This type of fixture was the original format for Active Record; it has since been deprecated in favor of the YAML and CSV formats.
# Fixtures for this format are created by placing text files in a sub-directory (with the name of the model) to the directory
# appointed by <tt>Test::Unit::TestCase.fixture_path=(path)</tt> (this is automatically configured for Rails, so you can just
# put your files in <your-rails-app>/test/fixtures/<your-model-name>/ -- like <your-rails-app>/test/fixtures/web_sites/ for the WebSite
# model).
#
# Each text file placed in this directory represents a "record". Usually these types of fixtures are named without
# extensions, but if you are on a Windows machine, you might consider adding .txt as the extension. Here's what the
# above example might look like:
#
# web_sites/google
# web_sites/yahoo.txt
# web_sites/ruby-on-rails
#
# The file format of a standard fixture is simple. Each line is a property (or column in db speak) and has the syntax
# of "name => value". Here's an example of the ruby-on-rails fixture above:
#
# id => 1
# name => Ruby on Rails
# url => http://www.rubyonrails.org
#
# = Using Fixtures
#
# Since fixtures are a testing construct, we use them in our unit and functional tests. There are two ways to use the
# fixtures, but first let's take a look at a sample unit test:
#
# require 'web_site'
#
# class WebSiteTest < Test::Unit::TestCase
# def test_web_site_count
# assert_equal 2, WebSite.count
# end
# end
#
# As it stands, unless we pre-load the web_site table in our database with two records, this test will fail. Here's the
# easiest way to add fixtures to the database:
#
# ...
# class WebSiteTest < Test::Unit::TestCase
# fixtures :web_sites # add more by separating the symbols with commas
# ...
#
# By adding a "fixtures" method to the test case and passing it a list of symbols (only one is shown here though), we trigger
# the testing environment to automatically load the appropriate fixtures into the database before each test.
# To ensure consistent data, the environment deletes the fixtures before running the load.
#
# In addition to being available in the database, the fixtures are also loaded into a hash stored in an instance variable
# of the test case. It is named after the symbol... so, in our example, there would be a hash available called
# @web_sites. This is where the "fixture name" comes into play.
#
# On top of that, each record is automatically "found" (using Model.find(id)) and placed in the instance variable of its name.
# So for the YAML fixtures, we'd get @rubyonrails and @google, which could be interrogated using regular Active Record semantics:
#
# # test if the object created from the fixture data has the same attributes as the data itself
# def test_find
# assert_equal @web_sites["rubyonrails"]["name"], @rubyonrails.name
# end
#
# As seen above, the data hash created from the YAML fixtures would have @web_sites["rubyonrails"]["url"] return
# "http://www.rubyonrails.org" and @web_sites["google"]["name"] would return "Google". The same fixtures, but loaded
# from a CSV fixture file, would be accessible via @web_sites["web_site_1"]["name"] == "Ruby on Rails" and have the individual
# fixtures available as instance variables @web_site_1 and @web_site_2.
#
# If you do not wish to use instantiated fixtures (usually for performance reasons) there are two options.
#
# - to completely disable instantiated fixtures:
# self.use_instantiated_fixtures = false
#
# - to keep the fixture instance (@web_sites) available, but do not automatically 'find' each instance:
# self.use_instantiated_fixtures = :no_instances
#
# Even if auto-instantiated fixtures are disabled, you can still access them
# by name via special dynamic methods. Each method has the same name as the
# model, and accepts the name of the fixture to instantiate:
#
# fixtures :web_sites
#
# def test_find
# assert_equal "Ruby on Rails", web_sites(:rubyonrails).name
# end
#
# = Dynamic fixtures with ERb
#
# Sometimes you don't care about the content of the fixtures as much as you care about the volume. In these cases, you can
# mix ERb in with your YAML or CSV fixtures to create a bunch of fixtures for load testing, like:
#
# <% for i in 1..1000 %>
# fix_<%= i %>:
# id: <%= i %>
# name: guy_<%= i %>
# <% end %>
#
# This will create 1000 very simple YAML fixtures.
#
# Using ERb, you can also inject dynamic values into your fixtures with inserts like <%= Date.today.strftime("%Y-%m-%d") %>.
# This is however a feature to be used with some caution. The point of fixtures is that they're stable units of predictable
# sample data. If you feel that you need to inject dynamic values, then perhaps you should reexamine whether your application
# is properly testable. Hence, dynamic values in fixtures are to be considered a code smell.
#
# = Transactional fixtures
#
# TestCases can use begin+rollback to isolate their changes to the database instead of having to delete+insert for every test case.
# They can also turn off auto-instantiation of fixture data since the feature is costly and often unused.
#
# class FooTest < Test::Unit::TestCase
# self.use_transactional_fixtures = true
# self.use_instantiated_fixtures = false
#
# fixtures :foos
#
# def test_godzilla
# assert !Foo.find(:all).empty?
# Foo.destroy_all
# assert Foo.find(:all).empty?
# end
#
# def test_godzilla_aftermath
# assert !Foo.find(:all).empty?
# end
# end
#
# If you preload your test database with all fixture data (probably in the Rakefile task) and use transactional fixtures,
# then you may omit all fixtures declarations in your test cases since all the data's already there and every case rolls back its changes.
#
# In order to use instantiated fixtures with preloaded data, set +self.pre_loaded_fixtures+ to true. This will provide
# access to fixture data for every table that has been loaded through fixtures (depending on the value of +use_instantiated_fixtures+)
#
# When *not* to use transactional fixtures:
# 1. You're testing whether a transaction works correctly. Nested transactions don't commit until all parent transactions commit,
# particularly, the fixtures transaction which is begun in setup and rolled back in teardown. Thus, you won't be able to verify
# the results of your transaction until Active Record supports nested transactions or savepoints (in progress).
# 2. Your database does not support transactions. Every Active Record database supports transactions except MySQL MyISAM.
# Use InnoDB, MaxDB, or NDB instead.
class Fixtures < YAML::Omap
DEFAULT_FILTER_RE = /\.ya?ml$/
def self.instantiate_fixtures(object, table_name, fixtures, load_instances=true)
object.instance_variable_set "@#{table_name.to_s.gsub('.','_')}", fixtures
if load_instances
ActiveRecord::Base.silence do
fixtures.each do |name, fixture|
begin
object.instance_variable_set "@#{name}", fixture.find
rescue FixtureClassNotFound
nil
end
end
end
end
end
def self.instantiate_all_loaded_fixtures(object, load_instances=true)
all_loaded_fixtures.each do |table_name, fixtures|
Fixtures.instantiate_fixtures(object, table_name, fixtures, load_instances)
end
end
cattr_accessor :all_loaded_fixtures
self.all_loaded_fixtures = {}
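# For illustration (a sketch; the directory and table names are hypothetical):
#   Fixtures.create_fixtures("test/fixtures", %w(web_sites users))
# deletes any existing rows in those tables, re-inserts the fixture data inside a
# transaction, resets primary key sequences where supported, and returns the loaded
# Fixtures objects (a single object when only one table is given).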
def self.create_fixtures(fixtures_directory, table_names, class_names = {})
table_names = [table_names].flatten.map { |n| n.to_s }
connection = block_given? ? yield : ActiveRecord::Base.connection
ActiveRecord::Base.silence do
fixtures_map = {}
fixtures = table_names.map do |table_name|
fixtures_map[table_name] = Fixtures.new(connection, File.split(table_name.to_s).last, class_names[table_name.to_sym], File.join(fixtures_directory, table_name.to_s))
end
all_loaded_fixtures.merge! fixtures_map
connection.transaction do
fixtures.reverse.each { |fixture| fixture.delete_existing_fixtures }
fixtures.each { |fixture| fixture.insert_fixtures }
# Cap primary key sequences to max(pk).
if connection.respond_to?(:reset_pk_sequence!)
table_names.each do |table_name|
connection.reset_pk_sequence!(table_name)
end
end
end
return fixtures.size > 1 ? fixtures : fixtures.first
end
end
attr_reader :table_name
def initialize(connection, table_name, class_name, fixture_path, file_filter = DEFAULT_FILTER_RE)
@connection, @table_name, @fixture_path, @file_filter = connection, table_name, fixture_path, file_filter
@class_name = class_name ||
(ActiveRecord::Base.pluralize_table_names ? @table_name.singularize.camelize : @table_name.camelize)
@table_name = ActiveRecord::Base.table_name_prefix + @table_name + ActiveRecord::Base.table_name_suffix
read_fixture_files
end
def delete_existing_fixtures
@connection.delete "DELETE FROM #{@table_name}", 'Fixture Delete'
end
def insert_fixtures
values.each do |fixture|
@connection.execute "INSERT INTO #{@table_name} (#{fixture.key_list}) VALUES (#{fixture.value_list})", 'Fixture Insert'
end
end
private
def read_fixture_files
if File.file?(yaml_file_path)
# YAML fixtures
begin
yaml_string = ""
Dir["#{@fixture_path}/**/*.yml"].select {|f| test(?f,f) }.each do |subfixture_path|
yaml_string << IO.read(subfixture_path)
end
yaml_string << IO.read(yaml_file_path)
if yaml = YAML::load(erb_render(yaml_string))
yaml = yaml.value if yaml.respond_to?(:type_id) and yaml.respond_to?(:value)
yaml.each do |name, data|
self[name] = Fixture.new(data, @class_name)
end
end
rescue Exception=>boom
raise Fixture::FormatError, "a YAML error occurred parsing #{yaml_file_path}. Please note that YAML must be consistently indented using spaces. Tabs are not allowed. Please have a look at http://www.yaml.org/faq.html\nThe exact error was:\n #{boom.class}: #{boom}"
end
elsif File.file?(csv_file_path)
# CSV fixtures
reader = CSV::Reader.create(erb_render(IO.read(csv_file_path)))
header = reader.shift
i = 0
reader.each do |row|
data = {}
row.each_with_index { |cell, j| data[header[j].to_s.strip] = cell.to_s.strip }
self["#{Inflector::underscore(@class_name)}_#{i+=1}"]= Fixture.new(data, @class_name)
end
elsif File.file?(deprecated_yaml_file_path)
raise Fixture::FormatError, ".yml extension required: rename #{deprecated_yaml_file_path} to #{yaml_file_path}"
else
# Standard fixtures
Dir.entries(@fixture_path).each do |file|
path = File.join(@fixture_path, file)
if File.file?(path) and file !~ @file_filter
self[file] = Fixture.new(path, @class_name)
end
end
end
end
def yaml_file_path
"#{@fixture_path}.yml"
end
def deprecated_yaml_file_path
"#{@fixture_path}.yaml"
end
def csv_file_path
@fixture_path + ".csv"
end
def yaml_fixtures_key(path)
File.basename(@fixture_path).split(".").first
end
def erb_render(fixture_content)
ERB.new(fixture_content).result
end
end
class Fixture #:nodoc:
include Enumerable
class FixtureError < StandardError#:nodoc:
end
class FormatError < FixtureError#:nodoc:
end
def initialize(fixture, class_name)
case fixture
when Hash, YAML::Omap
@fixture = fixture
when String
@fixture = read_fixture_file(fixture)
else
raise ArgumentError, "Bad fixture argument #{fixture.inspect}"
end
@class_name = class_name
end
def each
@fixture.each { |item| yield item }
end
def [](key)
@fixture[key]
end
def to_hash
@fixture
end
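# For illustration: with a fixture of { "id" => 1, "name" => "Ruby on Rails" },
# key_list returns the column names quoted by ActiveRecord::Base.connection (e.g.
# [id], [name] under the SQL Server and Sybase adapters above) and value_list
# returns the matching quoted values, ready for Fixtures#insert_fixtures to
# interpolate into an INSERT statement.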
def key_list
columns = @fixture.keys.collect{ |column_name| ActiveRecord::Base.connection.quote_column_name(column_name) }
columns.join(", ")
end
def value_list
@fixture.values.map { |v| ActiveRecord::Base.connection.quote(v).gsub('\\n', "\n").gsub('\\r', "\r") }.join(", ")
end
def find
klass = @class_name.is_a?(Class) ? @class_name : Object.const_get(@class_name) rescue nil
if klass
klass.find(self[klass.primary_key])
else
raise FixtureClassNotFound, "The class #{@class_name.inspect} was not found."
end
end
private
def read_fixture_file(fixture_file_path)
IO.readlines(fixture_file_path).inject({}) do |fixture, line|
# Mercifully skip empty lines.
next if line =~ /^\s*$/
# Use the same regular expression for attributes as Active Record.
unless md = /^\s*([a-zA-Z][-_\w]*)\s*=>\s*(.+)\s*$/.match(line)
raise FormatError, "#{fixture_file_path}: fixture format error at '#{line}'. Expecting 'key => value'."
end
key, value = md.captures
# Disallow duplicate keys to catch typos.
raise FormatError, "#{fixture_file_path}: duplicate '#{key}' in fixture." if fixture[key]
fixture[key] = value.strip
fixture
end
end
end
module Test #:nodoc:
module Unit #:nodoc:
class TestCase #:nodoc:
cattr_accessor :fixture_path
class_inheritable_accessor :fixture_table_names
class_inheritable_accessor :fixture_class_names
class_inheritable_accessor :use_transactional_fixtures
class_inheritable_accessor :use_instantiated_fixtures # true, false, or :no_instances
class_inheritable_accessor :pre_loaded_fixtures
self.fixture_table_names = []
self.use_transactional_fixtures = false
self.use_instantiated_fixtures = true
self.pre_loaded_fixtures = false
self.fixture_class_names = {}
@@already_loaded_fixtures = {}
self.fixture_class_names = {}
def self.set_fixture_class(class_names = {})
self.fixture_class_names = self.fixture_class_names.merge(class_names)
end
def self.fixtures(*table_names)
table_names = table_names.flatten.map { |n| n.to_s }
self.fixture_table_names |= table_names
require_fixture_classes(table_names)
setup_fixture_accessors(table_names)
end
def self.require_fixture_classes(table_names=nil)
(table_names || fixture_table_names).each do |table_name|
file_name = table_name.to_s
file_name = file_name.singularize if ActiveRecord::Base.pluralize_table_names
begin
require file_name
rescue LoadError
# Let's hope the developer has included it himself
end
end
end
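# For illustration: declaring fixtures :web_sites in a test case defines a
# web_sites accessor, so web_sites(:rubyonrails) looks the fixture up in
# @loaded_fixtures, finds the corresponding record, and caches it in
# @fixture_cache for the remainder of the test; passing true forces a reload.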
def self.setup_fixture_accessors(table_names=nil)
(table_names || fixture_table_names).each do |table_name|
table_name = table_name.to_s.tr('.','_')
define_method(table_name) do |fixture, *optionals|
force_reload = optionals.shift
@fixture_cache[table_name] ||= Hash.new
@fixture_cache[table_name][fixture] = nil if force_reload
if @loaded_fixtures[table_name][fixture.to_s]
@fixture_cache[table_name][fixture] ||= @loaded_fixtures[table_name][fixture.to_s].find
else
raise StandardError, "No fixture with name '#{fixture}' found for table '#{table_name}'"
end
end
end
end
def self.uses_transaction(*methods)
@uses_transaction ||= []
@uses_transaction.concat methods.map { |m| m.to_s }
end
def self.uses_transaction?(method)
@uses_transaction && @uses_transaction.include?(method.to_s)
end
def use_transactional_fixtures?
use_transactional_fixtures &&
!self.class.uses_transaction?(method_name)
end
def setup_with_fixtures
if pre_loaded_fixtures && !use_transactional_fixtures
raise RuntimeError, 'pre_loaded_fixtures requires use_transactional_fixtures'
end
@fixture_cache = Hash.new
# Load fixtures once and begin transaction.
if use_transactional_fixtures?
if @@already_loaded_fixtures[self.class]
@loaded_fixtures = @@already_loaded_fixtures[self.class]
else
load_fixtures
@@already_loaded_fixtures[self.class] = @loaded_fixtures
end
ActiveRecord::Base.lock_mutex
ActiveRecord::Base.connection.begin_db_transaction
# Load fixtures for every test.
else
@@already_loaded_fixtures[self.class] = nil
load_fixtures
end
# Instantiate fixtures for every test if requested.
instantiate_fixtures if use_instantiated_fixtures
end
alias_method :setup, :setup_with_fixtures
def teardown_with_fixtures
# Rollback changes.
if use_transactional_fixtures?
ActiveRecord::Base.connection.rollback_db_transaction
ActiveRecord::Base.unlock_mutex
end
ActiveRecord::Base.verify_active_connections!
end
alias_method :teardown, :teardown_with_fixtures
def self.method_added(method)
case method.to_s
when 'setup'
unless method_defined?(:setup_without_fixtures)
alias_method :setup_without_fixtures, :setup
define_method(:setup) do
setup_with_fixtures
setup_without_fixtures
end
end
when 'teardown'
unless method_defined?(:teardown_without_fixtures)
alias_method :teardown_without_fixtures, :teardown
define_method(:teardown) do
teardown_without_fixtures
teardown_with_fixtures
end
end
end
end
private
def load_fixtures
@loaded_fixtures = {}
fixtures = Fixtures.create_fixtures(fixture_path, fixture_table_names, fixture_class_names)
unless fixtures.nil?
if fixtures.instance_of?(Fixtures)
@loaded_fixtures[fixtures.table_name] = fixtures
else
fixtures.each { |f| @loaded_fixtures[f.table_name] = f }
end
end
end
# For pre_loaded_fixtures, only require the classes once (huge speed improvement).
@@required_fixture_classes = false
def instantiate_fixtures
if pre_loaded_fixtures
raise RuntimeError, 'Load fixtures before instantiating them.' if Fixtures.all_loaded_fixtures.empty?
unless @@required_fixture_classes
self.class.require_fixture_classes Fixtures.all_loaded_fixtures.keys
@@required_fixture_classes = true
end
Fixtures.instantiate_all_loaded_fixtures(self, load_instances?)
else
raise RuntimeError, 'Load fixtures before instantiating them.' if @loaded_fixtures.nil?
@loaded_fixtures.each do |table_name, fixtures|
Fixtures.instantiate_fixtures(self, table_name, fixtures, load_instances?)
end
end
end
def load_instances?
use_instantiated_fixtures != :no_instances
end
end
end
end


@ -0,0 +1,79 @@
module ActiveRecord
# Active Records support optimistic locking if the field <tt>lock_version</tt> is present. Each update to the
# record increments the lock_version column and the locking facilities ensure that records instantiated twice
# will let the last one saved raise a StaleObjectError if the first was also updated. Example:
#
# p1 = Person.find(1)
# p2 = Person.find(1)
#
# p1.first_name = "Michael"
# p1.save
#
# p2.first_name = "should fail"
# p2.save # Raises an ActiveRecord::StaleObjectError
#
# You're then responsible for dealing with the conflict by rescuing the exception and either rolling back, merging,
# or otherwise applying the business logic needed to resolve the conflict.
#
# You must ensure that your database schema defaults the lock_version column to 0.
#
# This behavior can be turned off by setting <tt>ActiveRecord::Base.lock_optimistically = false</tt>.
# To override the name of the lock_version column, invoke the <tt>set_locking_column</tt> method.
# This method uses the same syntax as <tt>set_table_name</tt>.
module Locking
def self.append_features(base) #:nodoc:
super
base.class_eval do
alias_method :update_without_lock, :update
alias_method :update, :update_with_lock
end
end
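# Rough sketch of the SQL generated below, assuming a hypothetical people table
# whose row 1 currently has lock_version = 0:
#
#   UPDATE people
#   SET first_name = 'Michael', ..., lock_version = 1
#   WHERE id = 1 AND lock_version = 0
#
# If a competing process saved the same record first, the WHERE clause matches no
# rows, affected_rows is not 1, and StaleObjectError is raised.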
def update_with_lock #:nodoc:
return update_without_lock unless locking_enabled?
lock_col = self.class.locking_column
previous_value = send(lock_col)
send(lock_col + '=', previous_value + 1)
affected_rows = connection.update(<<-end_sql, "#{self.class.name} Update with optimistic locking")
UPDATE #{self.class.table_name}
SET #{quoted_comma_pair_list(connection, attributes_with_quotes(false))}
WHERE #{self.class.primary_key} = #{quote(id)}
AND #{lock_col} = #{quote(previous_value)}
end_sql
unless affected_rows == 1
raise ActiveRecord::StaleObjectError, "Attempted to update a stale object"
end
return true
end
end
class Base
@@lock_optimistically = true
cattr_accessor :lock_optimistically
def locking_enabled? #:nodoc:
lock_optimistically && respond_to?(self.class.locking_column)
end
class << self
def set_locking_column(value = nil, &block)
define_attr_method :locking_column, value, &block
end
def locking_column #:nodoc:
reset_locking_column
end
def reset_locking_column #:nodoc:
default = 'lock_version'
set_locking_column(default)
default
end
end
end
end


@ -0,0 +1,391 @@
module ActiveRecord
class IrreversibleMigration < ActiveRecordError#:nodoc:
end
class DuplicateMigrationVersionError < ActiveRecordError#:nodoc:
def initialize(version)
super("Multiple migrations have the version number #{version}")
end
end
# Migrations can manage the evolution of a schema used by several physical databases. It's a solution
# to the common problem of adding a field to make a new feature work in your local database, but being unsure of how to
# push that change to other developers and to the production server. With migrations, you can describe the transformations
# in self-contained classes that can be checked into version control systems and executed against another database that
# might be one, two, or five versions behind.
#
# Example of a simple migration:
#
# class AddSsl < ActiveRecord::Migration
# def self.up
# add_column :accounts, :ssl_enabled, :boolean, :default => 1
# end
#
# def self.down
# remove_column :accounts, :ssl_enabled
# end
# end
#
# This migration will add a boolean flag to the accounts table and remove it again, if you're backing out of the migration.
# It shows how all migrations have two class methods, +up+ and +down+, that describe the transformations required to implement
# or remove the migration. These methods can use migration-specific methods like add_column and remove_column,
# but may also contain regular Ruby code for generating data needed for the transformations.
#
# Example of a more complex migration that also needs to initialize data:
#
# class AddSystemSettings < ActiveRecord::Migration
# def self.up
# create_table :system_settings do |t|
# t.column :name, :string
# t.column :label, :string
# t.column :value, :text
# t.column :type, :string
# t.column :position, :integer
# end
#
# SystemSetting.create :name => "notice", :label => "Use notice?", :value => 1
# end
#
# def self.down
# drop_table :system_settings
# end
# end
#
# This migration first adds the system_settings table, then creates the very first row in it using the Active Record model
# that relies on the table. It also uses the more advanced create_table syntax where you can specify a complete table schema
# in one block call.
#
# == Available transformations
#
# * <tt>create_table(name, options)</tt> Creates a table called +name+ and makes the table object available to a block
# that can then add columns to it, following the same format as add_column. See example above. The options hash is for
# fragments like "DEFAULT CHARSET=UTF-8" that are appended to the create table definition.
# * <tt>drop_table(name)</tt>: Drops the table called +name+.
# * <tt>rename_table(old_name, new_name)</tt>: Renames the table called +old_name+ to +new_name+.
# * <tt>add_column(table_name, column_name, type, options)</tt>: Adds a new column to the table called +table_name+
# named +column_name+ specified to be one of the following types:
# :string, :text, :integer, :float, :datetime, :timestamp, :time, :date, :binary, :boolean. A default value can be specified
# by passing an +options+ hash like { :default => 11 }.
# * <tt>rename_column(table_name, column_name, new_column_name)</tt>: Renames a column but keeps the type and content.
# * <tt>change_column(table_name, column_name, type, options)</tt>: Changes the column to a different type using the same
# parameters as add_column.
# * <tt>remove_column(table_name, column_name)</tt>: Removes the column named +column_name+ from the table called +table_name+.
# * <tt>add_index(table_name, column_names, index_type, index_name)</tt>: Add a new index with the name of the column, or +index_name+ (if specified) on the column(s). Specify an optional +index_type+ (e.g. UNIQUE).
# * <tt>remove_index(table_name, index_name)</tt>: Remove the index specified by +index_name+.
#
# == Irreversible transformations
#
# Some transformations are destructive in a manner that cannot be reversed. Migrations of that kind should raise
# an <tt>IrreversibleMigration</tt> exception in their +down+ method.
#
# == Running migrations from within Rails
#
# The Rails package has several tools to help create and apply migrations.
#
# To generate a new migration, use <tt>script/generate migration MyNewMigration</tt>
# where MyNewMigration is the name of your migration. The generator will
# create a file <tt>nnn_my_new_migration.rb</tt> in the <tt>db/migrate/</tt>
# directory, where <tt>nnn</tt> is the next largest migration number.
# You may then edit the <tt>self.up</tt> and <tt>self.down</tt> methods of
# MyNewMigration.
#
# To run migrations against the currently configured database, use
# <tt>rake migrate</tt>. This will update the database by running all of the
# pending migrations, creating the <tt>schema_info</tt> table if missing.
#
# To roll the database back to a previous migration version, use
# <tt>rake migrate VERSION=X</tt> where <tt>X</tt> is the version to which
# you wish to downgrade. If any of the migrations throw an
# <tt>IrreversibleMigration</tt> exception, that step will fail and you'll
# have some manual work to do.
#
# == Database support
#
# Migrations are currently supported in MySQL, PostgreSQL, SQLite,
# SQL Server, Sybase, and Oracle (all supported databases except DB2).
#
# == More examples
#
# Not all migrations change the schema. Some just fix the data:
#
# class RemoveEmptyTags < ActiveRecord::Migration
# def self.up
# Tag.find(:all).each { |tag| tag.destroy if tag.pages.empty? }
# end
#
# def self.down
# # not much we can do to restore deleted data
# raise IrreversibleMigration
# end
# end
#
# Others remove columns when they migrate up instead of down:
#
# class RemoveUnnecessaryItemAttributes < ActiveRecord::Migration
# def self.up
# remove_column :items, :incomplete_items_count
# remove_column :items, :completed_items_count
# end
#
# def self.down
# add_column :items, :incomplete_items_count
# add_column :items, :completed_items_count
# end
# end
#
# And sometimes you need to do something in SQL not abstracted directly by migrations:
#
# class MakeJoinUnique < ActiveRecord::Migration
# def self.up
# execute "ALTER TABLE `pages_linked_pages` ADD UNIQUE `page_id_linked_page_id` (`page_id`,`linked_page_id`)"
# end
#
# def self.down
# execute "ALTER TABLE `pages_linked_pages` DROP INDEX `page_id_linked_page_id`"
# end
# end
#
# == Using a model after changing its table
#
# Sometimes you'll want to add a column in a migration and populate it immediately after. In that case, you'll need
# to make a call to Base#reset_column_information in order to ensure that the model has the latest column data from
# after the new column was added. Example:
#
# class AddPeopleSalary < ActiveRecord::Migration
# def self.up
# add_column :people, :salary, :integer
# Person.reset_column_information
# Person.find(:all).each do |p|
# p.salary = SalaryCalculator.compute(p)
# end
# end
# end
#
# == Controlling verbosity
#
# By default, migrations will describe the actions they are taking, writing
# them to the console as they happen, along with benchmarks describing how
# long each step took.
#
# You can quiet them down by setting ActiveRecord::Migration.verbose = false.
#
# You can also insert your own messages and benchmarks by using the #say_with_time
# method:
#
# def self.up
# ...
# say_with_time "Updating salaries..." do
# Person.find(:all).each do |p|
# p.salary = SalaryCalculator.compute(p)
# end
# end
# ...
# end
#
# The phrase "Updating salaries..." would then be printed, along with the
# benchmark for the block when the block completes.
class Migration
@@verbose = true
cattr_accessor :verbose
class << self
def up_using_benchmarks #:nodoc:
migrate(:up)
end
def down_using_benchmarks #:nodoc:
migrate(:down)
end
# Execute this migration in the named direction
def migrate(direction)
return unless respond_to?(direction)
case direction
when :up then announce "migrating"
when :down then announce "reverting"
end
result = nil
time = Benchmark.measure { result = send("real_#{direction}") }
case direction
when :up then announce "migrated (%.4fs)" % time.real; write
when :down then announce "reverted (%.4fs)" % time.real; write
end
result
end
# Because the method added may do an alias_method, it can be invoked
# recursively. We use @ignore_new_methods as a guard to indicate whether
# it is safe for the call to proceed.
def singleton_method_added(sym) #:nodoc:
return if @ignore_new_methods
begin
@ignore_new_methods = true
case sym
when :up, :down
klass = (class << self; self; end)
klass.send(:alias_method, "real_#{sym}", sym)
klass.send(:alias_method, sym, "#{sym}_using_benchmarks")
end
ensure
@ignore_new_methods = false
end
end
def write(text="")
puts(text) if verbose
end
def announce(message)
text = "#{name}: #{message}"
length = [0, 75 - text.length].max
write "== %s %s" % [text, "=" * length]
end
def say(message, subitem=false)
write "#{subitem ? " ->" : "--"} #{message}"
end
def say_with_time(message)
say(message)
result = nil
time = Benchmark.measure { result = yield }
say "%.4fs" % time.real, :subitem
result
end
def suppress_messages
save = verbose
self.verbose = false
yield
ensure
self.verbose = save
end
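# For illustration: a call such as
#   add_column :people, :salary, :integer
# inside self.up has no matching class method here, so it lands in method_missing
# below, is announced through say_with_time, has its first argument run through
# Migrator.proper_table_name (except for execute), and is then forwarded to the
# current connection adapter.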
def method_missing(method, *arguments, &block)
say_with_time "#{method}(#{arguments.map { |a| a.inspect }.join(", ")})" do
arguments[0] = Migrator.proper_table_name(arguments.first) unless arguments.empty? || method == :execute
ActiveRecord::Base.connection.send(method, *arguments, &block)
end
end
end
end
class Migrator#:nodoc:
class << self
def migrate(migrations_path, target_version = nil)
Base.connection.initialize_schema_information
case
when target_version.nil?, current_version < target_version
up(migrations_path, target_version)
when current_version > target_version
down(migrations_path, target_version)
when current_version == target_version
return # You're on the right version
end
end
def up(migrations_path, target_version = nil)
self.new(:up, migrations_path, target_version).migrate
end
def down(migrations_path, target_version = nil)
self.new(:down, migrations_path, target_version).migrate
end
def schema_info_table_name
Base.table_name_prefix + "schema_info" + Base.table_name_suffix
end
def current_version
(Base.connection.select_one("SELECT version FROM #{schema_info_table_name}") || {"version" => 0})["version"].to_i
end
def proper_table_name(name)
# Use the Active Record object's own table_name, or the prefix/suffix from ActiveRecord::Base if name is a symbol or string
name.table_name rescue "#{ActiveRecord::Base.table_name_prefix}#{name}#{ActiveRecord::Base.table_name_suffix}"
end
end
def initialize(direction, migrations_path, target_version = nil)
raise StandardError.new("This database does not yet support migrations") unless Base.connection.supports_migrations?
@direction, @migrations_path, @target_version = direction, migrations_path, target_version
Base.connection.initialize_schema_information
end
def current_version
self.class.current_version
end
def migrate
migration_classes.each do |(version, migration_class)|
Base.logger.info("Reached target version: #{@target_version}") and break if reached_target_version?(version)
next if irrelevant_migration?(version)
Base.logger.info "Migrating to #{migration_class} (#{version})"
migration_class.migrate(@direction)
set_schema_version(version)
end
end
private
def migration_classes
migrations = migration_files.inject([]) do |migrations, migration_file|
load(migration_file)
version, name = migration_version_and_name(migration_file)
assert_unique_migration_version(migrations, version.to_i)
migrations << [ version.to_i, migration_class(name) ]
end
down? ? migrations.sort.reverse : migrations.sort
end
def assert_unique_migration_version(migrations, version)
if !migrations.empty? && migrations.transpose.first.include?(version)
raise DuplicateMigrationVersionError.new(version)
end
end
def migration_files
files = Dir["#{@migrations_path}/[0-9]*_*.rb"].sort_by do |f|
migration_version_and_name(f).first.to_i
end
down? ? files.reverse : files
end
def migration_class(migration_name)
migration_name.camelize.constantize
end
def migration_version_and_name(migration_file)
return *migration_file.scan(/([0-9]+)_([_a-z0-9]*).rb/).first
end
def set_schema_version(version)
Base.connection.update("UPDATE #{self.class.schema_info_table_name} SET version = #{down? ? version.to_i - 1 : version.to_i}")
end
def up?
@direction == :up
end
def down?
@direction == :down
end
def reached_target_version?(version)
(up? && version.to_i - 1 == @target_version) || (down? && version.to_i == @target_version)
end
def irrelevant_migration?(version)
(up? && version.to_i <= current_version) || (down? && version.to_i > current_version)
end
end
end

View file

@ -0,0 +1,139 @@
require 'singleton'
module ActiveRecord
module Observing # :nodoc:
def self.append_features(base)
super
base.extend(ClassMethods)
end
module ClassMethods
# Activates the observers assigned. Examples:
#
# # Calls PersonObserver.instance
# ActiveRecord::Base.observers = :person_observer
#
# # Calls Cacher.instance and GarbageCollector.instance
# ActiveRecord::Base.observers = :cacher, :garbage_collector
#
# # Same as above, just using explicit class references
# ActiveRecord::Base.observers = Cacher, GarbageCollector
def observers=(*observers)
observers = [ observers ].flatten.each do |observer|
observer.is_a?(Symbol) ?
observer.to_s.camelize.constantize.instance :
observer.instance
end
end
end
end
# Observer classes respond to lifecycle callbacks to implement trigger-like
# behavior outside the original class. This is a great way to reduce the
# clutter that normally comes when the model class is burdened with
# functionality that doesn't pertain to the core responsibility of the
# class. Example:
#
# class CommentObserver < ActiveRecord::Observer
# def after_save(comment)
# Notifications.deliver_comment("admin@do.com", "New comment was posted", comment)
# end
# end
#
# This Observer sends an email when a Comment#save is finished.
#
# class ContactObserver < ActiveRecord::Observer
# def after_create(contact)
# contact.logger.info('New contact added!')
# end
#
# def after_destroy(contact)
# contact.logger.warn("Contact with an id of #{contact.id} was destroyed!")
# end
# end
#
# This Observer uses logger to log when specific callbacks are triggered.
#
# == Observing a class that can't be inferred
#
# Observers will by default be mapped to the class with which they share a name. So CommentObserver will
# be tied to observing Comment, ProductManagerObserver to ProductManager, and so on. If you want to name your observer
# differently than the class you're interested in observing, you can use the Observer.observe class method:
#
# class AuditObserver < ActiveRecord::Observer
# observe Account
#
# def after_update(account)
# AuditTrail.new(account, "UPDATED")
# end
# end
#
# If the audit observer needs to watch more than one kind of object, this can be specified with multiple arguments:
#
# class AuditObserver < ActiveRecord::Observer
# observe Account, Balance
#
# def after_update(record)
# AuditTrail.new(record, "UPDATED")
# end
# end
#
# The AuditObserver will now act on both updates to Account and Balance by treating them both as records.
#
# == Available callback methods
#
# The observer can implement callback methods for each of the methods described in the Callbacks module.
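# For instance, an observer might react before a record is destroyed (illustrative sketch):
#
# class CommentObserver < ActiveRecord::Observer
# def before_destroy(comment)
# comment.logger.info("About to destroy comment #{comment.id}")
# end
# end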
#
# == Storing Observers in Rails
#
# If you're using Active Record within Rails, observer classes are usually stored in app/models with the
# naming convention of app/models/audit_observer.rb.
#
# == Configuration
#
# In order to activate an observer, list it in the <tt>config.active_record.observers</tt> configuration setting in your
# <tt>config/environment.rb</tt> file.
#
# config.active_record.observers = :comment_observer, :signup_observer
#
# Observers will not be invoked unless you define these in your application configuration.
#
class Observer
include Singleton
# Observer subclasses should be reloaded by the dispatcher in Rails
# when Dependencies.mechanism = :load.
include Reloadable::Subclasses
# Attaches the observer to the supplied model classes.
def self.observe(*models)
define_method(:observed_class) { models }
end
def initialize
observed_classes = [ observed_class ].flatten
observed_subclasses_class = observed_classes.collect {|c| c.send(:subclasses) }.flatten!
(observed_classes + observed_subclasses_class).each do |klass|
klass.add_observer(self)
klass.send(:define_method, :after_find) unless klass.respond_to?(:after_find)
end
end
def update(callback_method, object) #:nodoc:
send(callback_method, object) if respond_to?(callback_method)
end
private
def observed_class
if self.class.respond_to? "observed_class"
self.class.observed_class
else
Object.const_get(infer_observed_class_name)
end
end
def infer_observed_class_name
self.class.name.scan(/(.*)Observer/)[0][0]
end
end
end

View file

@ -0,0 +1,64 @@
module ActiveRecord
class QueryCache #:nodoc:
def initialize(connection)
@connection = connection
@query_cache = {}
end
def clear_query_cache
@query_cache = {}
end
def select_all(sql, name = nil)
(@query_cache[sql] ||= @connection.select_all(sql, name)).dup
end
def select_one(sql, name = nil)
@query_cache[sql] ||= @connection.select_one(sql, name)
end
def columns(table_name, name = nil)
@query_cache["SHOW FIELDS FROM #{table_name}"] ||= @connection.columns(table_name, name)
end
def insert(sql, name = nil, pk = nil, id_value = nil)
clear_query_cache
@connection.insert(sql, name, pk, id_value)
end
def update(sql, name = nil)
clear_query_cache
@connection.update(sql, name)
end
def delete(sql, name = nil)
clear_query_cache
@connection.delete(sql, name)
end
private
def method_missing(method, *arguments, &proc)
@connection.send(method, *arguments, &proc)
end
end
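# Base.connection= below wraps the adapter in a QueryCache whenever the connection
# specification carries :query_cache. Illustrative database.yml entry (Rails-style
# configuration assumed):
#
# development:
# adapter: mysql
# database: myapp_development
# query_cache: true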
class Base
# Set the connection for the class with caching on
class << self
alias_method :connection_without_query_cache=, :connection=
def connection=(spec)
if spec.is_a?(ConnectionSpecification) and spec.config[:query_cache]
spec = QueryCache.new(self.send(spec.adapter_method, spec.config))
end
self.connection_without_query_cache = spec
end
end
end
class AbstractAdapter #:nodoc:
# Stub method to be able to treat the connection the same whether the query cache has been turned on or not
def clear_query_cache
end
end
end

View file

@ -0,0 +1,204 @@
module ActiveRecord
module Reflection # :nodoc:
def self.included(base)
base.extend(ClassMethods)
end
# Reflection allows you to interrogate Active Record classes and objects about their associations and aggregations.
# This information can, for example, be used in a form builder that takes an Active Record object and creates input
# fields for all of its attributes depending on their type and displays the associations to other objects.
#
# You can find the interface for the AggregateReflection and AssociationReflection classes in the abstract MacroReflection class.
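#
# An illustrative interrogation, assuming an Invoice class with has_many :line_items
# (as in the examples below):
#
# reflection = Invoice.reflect_on_association(:line_items)
# reflection.macro # => :has_many
# reflection.klass # => LineItem
# reflection.table_name # => "line_items"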
module ClassMethods
def create_reflection(macro, name, options, active_record)
case macro
when :has_many, :belongs_to, :has_one, :has_and_belongs_to_many
reflection = AssociationReflection.new(macro, name, options, active_record)
when :composed_of
reflection = AggregateReflection.new(macro, name, options, active_record)
end
write_inheritable_hash :reflections, name => reflection
reflection
end
def reflections
read_inheritable_attribute(:reflections) or write_inheritable_attribute(:reflections, {})
end
# Returns an array of AggregateReflection objects for all the aggregations in the class.
def reflect_on_all_aggregations
reflections.values.select { |reflection| reflection.is_a?(AggregateReflection) }
end
# Returns the AggregateReflection object for the named +aggregation+ (use the symbol). Example:
# Account.reflect_on_aggregation(:balance) # returns the balance AggregateReflection
def reflect_on_aggregation(aggregation)
reflections[aggregation].is_a?(AggregateReflection) ? reflections[aggregation] : nil
end
# Returns an array of AssociationReflection objects for all the associations in the class. If you only want to reflect on a
# certain association type, pass in the symbol (:has_many, :has_one, :belongs_to) for that as the first parameter. Example:
# Account.reflect_on_all_associations # returns an array of all associations
# Account.reflect_on_all_associations(:has_many) # returns an array of all has_many associations
def reflect_on_all_associations(macro = nil)
association_reflections = reflections.values.select { |reflection| reflection.is_a?(AssociationReflection) }
macro ? association_reflections.select { |reflection| reflection.macro == macro } : association_reflections
end
# Returns the AssociationReflection object for the named +association+ (use the symbol). Example:
# Account.reflect_on_association(:owner) # returns the owner AssociationReflection
# Invoice.reflect_on_association(:line_items).macro # returns :has_many
def reflect_on_association(association)
reflections[association].is_a?(AssociationReflection) ? reflections[association] : nil
end
end
# Abstract base class for AggregateReflection and AssociationReflection that describes the interface available for both of
# those classes. Objects of AggregateReflection and AssociationReflection are returned by the Reflection::ClassMethods.
class MacroReflection
attr_reader :active_record
def initialize(macro, name, options, active_record)
@macro, @name, @options, @active_record = macro, name, options, active_record
end
# Returns the name of the aggregation or association, so it would return :balance for "composed_of :balance, :class_name => 'Money'" or
# :clients for "has_many :clients".
def name
@name
end
# Returns the name of the macro, so it would return :composed_of for
# "composed_of :balance, :class_name => 'Money'" or :has_many for "has_many :clients".
def macro
@macro
end
# Returns the hash of options used for the macro, so it would return { :class_name => "Money" } for
# "composed_of :balance, :class_name => 'Money'" or {} for "has_many :clients".
def options
@options
end
# Returns the class for the macro, so "composed_of :balance, :class_name => 'Money'" would return the Money class and
# "has_many :clients" would return the Client class.
def klass() end
def class_name
@class_name ||= name_to_class_name(name.id2name)
end
def ==(other_aggregation)
name == other_aggregation.name && other_aggregation.options && active_record == other_aggregation.active_record
end
end
# Holds all the meta-data about an aggregation as it was specified in the Active Record class.
class AggregateReflection < MacroReflection #:nodoc:
def klass
@klass ||= Object.const_get(options[:class_name] || class_name)
end
private
def name_to_class_name(name)
name.capitalize.gsub(/_(.)/) { |s| $1.capitalize }
end
end
# Holds all the meta-data about an association as it was specified in the Active Record class.
class AssociationReflection < MacroReflection #:nodoc:
def klass
@klass ||= active_record.send(:compute_type, class_name)
end
def table_name
@table_name ||= klass.table_name
end
def primary_key_name
return @primary_key_name if @primary_key_name
case
when macro == :belongs_to
@primary_key_name = options[:foreign_key] || class_name.foreign_key
when options[:as]
@primary_key_name = options[:foreign_key] || "#{options[:as]}_id"
else
@primary_key_name = options[:foreign_key] || active_record.name.foreign_key
end
end
def association_foreign_key
@association_foreign_key ||= @options[:association_foreign_key] || class_name.foreign_key
end
def counter_cache_column
if options[:counter_cache] == true
"#{active_record.name.underscore.pluralize}_count"
elsif options[:counter_cache]
options[:counter_cache]
end
end
def through_reflection
@through_reflection ||= options[:through] ? active_record.reflect_on_association(options[:through]) : false
end
# Gets an array of possible :through source reflection names
#
# [singularized, pluralized]
def source_reflection_names
@source_reflection_names ||= (options[:source] ? [options[:source]] : [name.to_s.singularize, name]).collect { |n| n.to_sym }
end
# Gets the source of the through reflection. It checks both a singularized and pluralized form for :belongs_to or :has_many.
# (The :tag association on Tagging below.)
#
# class Post
# has_many :taggings
# has_many :tags, :through => :taggings
# end
#
# class Tagging
# belongs_to :tag
# end
#
def source_reflection
return nil unless through_reflection
@source_reflection ||= source_reflection_names.collect { |name| through_reflection.klass.reflect_on_association(name) }.compact.first
end
def check_validity!
if options[:through]
if through_reflection.nil?
raise HasManyThroughAssociationNotFoundError.new(self)
end
if source_reflection.nil?
raise HasManyThroughSourceAssociationNotFoundError.new(self)
end
if source_reflection.options[:polymorphic]
raise HasManyThroughAssociationPolymorphicError.new(class_name, self, source_reflection)
end
unless [:belongs_to, :has_many].include?(source_reflection.macro) && source_reflection.options[:through].nil?
raise HasManyThroughSourceAssociationMacroError.new(self)
end
end
end
private
def name_to_class_name(name)
if name =~ /::/
name
else
if options[:class_name]
options[:class_name]
elsif through_reflection # get the class_name of the belongs_to association of the through reflection
source_reflection.class_name
else
class_name = name.to_s.camelize
class_name = class_name.singularize if [ :has_many, :has_and_belongs_to_many ].include?(macro)
class_name
end
end
end
end
end
end

View file

@ -0,0 +1,58 @@
module ActiveRecord
# Allows programmers to programmatically define a schema in a portable
# DSL. This means you can define tables, indexes, etc. without using SQL
# directly, so your applications can more easily support multiple
# databases.
#
# Usage:
#
# ActiveRecord::Schema.define do
# create_table :authors do |t|
# t.column :name, :string, :null => false
# end
#
# add_index :authors, :name, :unique
#
# create_table :posts do |t|
# t.column :author_id, :integer, :null => false
# t.column :subject, :string
# t.column :body, :text
# t.column :private, :boolean, :default => false
# end
#
# add_index :posts, :author_id
# end
#
# ActiveRecord::Schema is only supported by database adapters that also
# support migrations, the two features being very similar.
class Schema < Migration
private_class_method :new
# Eval the given block. All methods available to the current connection
# adapter are available within the block, so you can easily use the
# database definition DSL to build up your schema (#create_table,
# #add_index, etc.).
#
# The +info+ hash is optional, and if given is used to define metadata
# about the current schema (like the schema's version):
#
# ActiveRecord::Schema.define(:version => 15) do
# ...
# end
def self.define(info={}, &block)
instance_eval(&block)
unless info.empty?
initialize_schema_information
cols = columns('schema_info')
info = info.map do |k,v|
v = Base.connection.quote(v, cols.detect { |c| c.name == k.to_s })
"#{k} = #{v}"
end
Base.connection.update "UPDATE #{Migrator.schema_info_table_name} SET #{info.join(", ")}"
end
end
end
end

View file

@ -0,0 +1,121 @@
module ActiveRecord
# This class is used to dump the database schema for some connection to some
# output format (i.e., ActiveRecord::Schema).
class SchemaDumper #:nodoc:
private_class_method :new
# A list of tables which should not be dumped to the schema.
# Acceptable values are strings as well as regexp.
# This setting is only used if ActiveRecord::Base.schema_format == :ruby
cattr_accessor :ignore_tables
@@ignore_tables = []
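# Illustrative use (file path assumed):
#
# File.open("db/schema.rb", "w") do |file|
# ActiveRecord::SchemaDumper.dump(ActiveRecord::Base.connection, file)
# end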
def self.dump(connection=ActiveRecord::Base.connection, stream=STDOUT)
new(connection).dump(stream)
stream
end
def dump(stream)
header(stream)
tables(stream)
trailer(stream)
stream
end
private
def initialize(connection)
@connection = connection
@types = @connection.native_database_types
@info = @connection.select_one("SELECT * FROM schema_info") rescue nil
end
def header(stream)
define_params = @info ? ":version => #{@info['version']}" : ""
stream.puts <<HEADER
# This file is autogenerated. Instead of editing this file, please use the
# migrations feature of ActiveRecord to incrementally modify your database, and
# then regenerate this schema definition.
ActiveRecord::Schema.define(#{define_params}) do
HEADER
end
def trailer(stream)
stream.puts "end"
end
def tables(stream)
@connection.tables.sort.each do |tbl|
next if ["schema_info", ignore_tables].flatten.any? do |ignored|
case ignored
when String: tbl == ignored
when Regexp: tbl =~ ignored
else
raise StandardError, 'ActiveRecord::SchemaDumper.ignore_tables accepts an array of String and / or Regexp values.'
end
end
table(tbl, stream)
end
end
def table(table, stream)
columns = @connection.columns(table)
begin
tbl = StringIO.new
if @connection.respond_to?(:pk_and_sequence_for)
pk, pk_seq = @connection.pk_and_sequence_for(table)
end
pk ||= 'id'
tbl.print " create_table #{table.inspect}"
if columns.detect { |c| c.name == pk }
if pk != 'id'
tbl.print %Q(, :primary_key => "#{pk}")
end
else
tbl.print ", :id => false"
end
tbl.print ", :force => true"
tbl.puts " do |t|"
columns.each do |column|
raise StandardError, "Unknown type '#{column.sql_type}' for column '#{column.name}'" if @types[column.type].nil?
next if column.name == pk
tbl.print " t.column #{column.name.inspect}, #{column.type.inspect}"
tbl.print ", :limit => #{column.limit.inspect}" if column.limit != @types[column.type][:limit]
tbl.print ", :default => #{column.default.inspect}" if !column.default.nil?
tbl.print ", :null => false" if !column.null
tbl.puts
end
tbl.puts " end"
tbl.puts
indexes(table, tbl)
tbl.rewind
stream.print tbl.read
rescue => e
stream.puts "# Could not dump table #{table.inspect} because of following #{e.class}"
stream.puts "# #{e.message}"
stream.puts
end
stream
end
def indexes(table, stream)
indexes = @connection.indexes(table)
indexes.each do |index|
stream.print " add_index #{index.table.inspect}, #{index.columns.inspect}, :name => #{index.name.inspect}"
stream.print ", :unique => true" if index.unique
stream.puts
end
stream.puts unless indexes.empty?
end
end
end

View file

@ -0,0 +1,62 @@
module ActiveRecord
# Active Records will automatically record creation and/or update timestamps of database objects
# if fields of the names created_at/created_on or updated_at/updated_on are present. This module is
# automatically included, so you don't need to do that manually.
#
# This behavior can be turned off by setting <tt>ActiveRecord::Base.record_timestamps = false</tt>.
# This behavior by default uses local time, but can use UTC by setting <tt>ActiveRecord::Base.default_timezone = :utc</tt>
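# For example (illustrative):
#
# ActiveRecord::Base.record_timestamps = false # turn automatic timestamps off
# ActiveRecord::Base.default_timezone = :utc # record times in UTC rather than local time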
module Timestamp
def self.append_features(base) # :nodoc:
super
base.class_eval do
alias_method :create_without_timestamps, :create
alias_method :create, :create_with_timestamps
alias_method :update_without_timestamps, :update
alias_method :update, :update_with_timestamps
end
end
def create_with_timestamps #:nodoc:
if record_timestamps
t = ( self.class.default_timezone == :utc ? Time.now.utc : Time.now )
write_attribute('created_at', t) if respond_to?(:created_at) && created_at.nil?
write_attribute('created_on', t) if respond_to?(:created_on) && created_on.nil?
write_attribute('updated_at', t) if respond_to?(:updated_at)
write_attribute('updated_on', t) if respond_to?(:updated_on)
end
create_without_timestamps
end
def update_with_timestamps #:nodoc:
if record_timestamps
t = ( self.class.default_timezone == :utc ? Time.now.utc : Time.now )
write_attribute('updated_at', t) if respond_to?(:updated_at)
write_attribute('updated_on', t) if respond_to?(:updated_on)
end
update_without_timestamps
end
end
class Base
# Records the creation date and possibly time in created_on (date only) or created_at (date and time) and the update date and possibly
# time in updated_on and updated_at. This only happens if the object responds to either of these messages, which they will do automatically
# if the table has columns of either of these names. This feature is turned on by default.
@@record_timestamps = true
cattr_accessor :record_timestamps
# deprecated: use ActiveRecord::Base.default_timezone instead.
@@timestamps_gmt = false
def self.timestamps_gmt=( gmt ) #:nodoc:
warn "timestamps_gmt= is deprecated. use default_timezone= instead"
self.default_timezone = ( gmt ? :utc : :local )
end
def self.timestamps_gmt #:nodoc:
warn "timestamps_gmt is deprecated. use default_timezone instead"
self.default_timezone == :utc
end
end
end

View file

@ -0,0 +1,129 @@
require 'active_record/vendor/simple.rb'
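# Transaction::Simple also ships a #transaction method of its own; it is removed below,
# presumably so it cannot clash with the transaction support defined in this file when
# objects are extended with Transaction::Simple inside Base.transaction.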
Transaction::Simple.send(:remove_method, :transaction)
require 'thread'
module ActiveRecord
module Transactions # :nodoc:
TRANSACTION_MUTEX = Mutex.new
class TransactionError < ActiveRecordError # :nodoc:
end
def self.append_features(base)
super
base.extend(ClassMethods)
base.class_eval do
alias_method :destroy_without_transactions, :destroy
alias_method :destroy, :destroy_with_transactions
alias_method :save_without_transactions, :save
alias_method :save, :save_with_transactions
end
end
# Transactions are protective blocks where SQL statements are only permanent if they can all succeed as one atomic action.
# The classic example is a transfer between two accounts where you can only have a deposit if the withdrawal succeeded and
# vice versa. Transactions enforce the integrity of the database and guard the data against program errors or database break-downs.
# So basically you should use transaction blocks whenever you have a number of statements that must be executed together or
# not at all. Example:
#
# transaction do
# david.withdrawal(100)
# mary.deposit(100)
# end
#
# This example will only take money from David and give to Mary if neither +withdrawal+ nor +deposit+ raises an exception.
# Exceptions will force a ROLLBACK that returns the database to the state before the transaction was begun. Be aware, though,
# that the objects by default will _not_ have their instance data returned to their pre-transactional state.
#
# == Transactions are not distributed across database connections
#
# A transaction acts on a single database connection. If you have
# multiple class-specific databases, the transaction will not protect
# interaction among them. One workaround is to begin a transaction
# on each class whose models you alter:
#
# Student.transaction do
# Course.transaction do
# course.enroll(student)
# student.units += course.units
# end
# end
#
# This is a poor solution, but full distributed transactions are beyond
# the scope of Active Record.
#
# == Save and destroy are automatically wrapped in a transaction
#
# Both Base#save and Base#destroy come wrapped in a transaction that ensures that whatever you do in validations or callbacks
# will happen under the protected cover of a transaction. So you can use validations to check for values that the transaction
# depends on, or you can raise exceptions in the callbacks to roll back.
#
# == Object-level transactions
#
# You can enable object-level transactions for Active Record objects, though. You do this by naming each of the Active Records
# that you want to enable object-level transactions for, like this:
#
# Account.transaction(david, mary) do
# david.withdrawal(100)
# mary.deposit(100)
# end
#
# If the transaction fails, David and Mary will be returned to their pre-transactional state. No money will have changed hands in
# either the objects or the database.
#
# == Exception handling
#
# Also keep in mind that exceptions raised within a transaction block will be propagated (after triggering the ROLLBACK), so you
# should be ready to catch those in your application code.
#
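# An illustrative rescue, reusing the transfer example above:
#
# begin
# Account.transaction(david, mary) do
# david.withdrawal(100)
# mary.deposit(100)
# end
# rescue => e
# # the ROLLBACK has already been issued; react to the failure here
# end
#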
# Tribute: Object-level transactions are implemented by Transaction::Simple by Austin Ziegler.
module ClassMethods
def transaction(*objects, &block)
previous_handler = trap('TERM') { raise TransactionError, "Transaction aborted" }
lock_mutex
begin
objects.each { |o| o.extend(Transaction::Simple) }
objects.each { |o| o.start_transaction }
result = connection.transaction(Thread.current['start_db_transaction'], &block)
objects.each { |o| o.commit_transaction }
return result
rescue Exception => object_transaction_rollback
objects.each { |o| o.abort_transaction }
raise
ensure
unlock_mutex
trap('TERM', previous_handler)
end
end
def lock_mutex#:nodoc:
Thread.current['open_transactions'] ||= 0
TRANSACTION_MUTEX.lock if Thread.current['open_transactions'] == 0
Thread.current['start_db_transaction'] = (Thread.current['open_transactions'] == 0)
Thread.current['open_transactions'] += 1
end
def unlock_mutex#:nodoc:
Thread.current['open_transactions'] -= 1
TRANSACTION_MUTEX.unlock if Thread.current['open_transactions'] == 0
end
end
def transaction(*objects, &block)
self.class.transaction(*objects, &block)
end
def destroy_with_transactions #:nodoc:
transaction { destroy_without_transactions }
end
def save_with_transactions(perform_validation = true) #:nodoc:
transaction { save_without_transactions(perform_validation) }
end
end
end

View file

@ -0,0 +1,827 @@
module ActiveRecord
# Raised by save! and create! when the record is invalid. Use the
# record method to retrieve the record which did not validate.
# begin
# complex_operation_that_calls_save!_internally
# rescue ActiveRecord::RecordInvalid => invalid
# puts invalid.record.errors
# end
class RecordInvalid < ActiveRecordError #:nodoc:
attr_reader :record
def initialize(record)
@record = record
super("Validation failed: #{@record.errors.full_messages.join(", ")}")
end
end
# Active Record validation is reported to and from this object, which is used by Base#save to
# determine whether the object is in a valid state to be saved. See usage example in Validations.
class Errors
include Enumerable
def initialize(base) # :nodoc:
@base, @errors = base, {}
end
@@default_error_messages = {
:inclusion => "is not included in the list",
:exclusion => "is reserved",
:invalid => "is invalid",
:confirmation => "doesn't match confirmation",
:accepted => "must be accepted",
:empty => "can't be empty",
:blank => "can't be blank",
:too_long => "is too long (maximum is %d characters)",
:too_short => "is too short (minimum is %d characters)",
:wrong_length => "is the wrong length (should be %d characters)",
:taken => "has already been taken",
:not_a_number => "is not a number"
}
# Holds a hash with all the default error messages, such that they can be replaced by your own copy or localizations.
cattr_accessor :default_error_messages
# Adds an error to the base object instead of any particular attribute. This is used
# to report errors that don't tie to any specific attribute, but rather to the object
# as a whole. These error messages don't get prepended with any field name when iterating
# with each_full, so they should be complete sentences.
def add_to_base(msg)
add(:base, msg)
end
# Adds an error message (+msg+) to the +attribute+, which will be returned on a call to <tt>on(attribute)</tt>
# for the same attribute and ensure that this error object returns false when asked if <tt>empty?</tt>. More than one
# error can be added to the same +attribute+ in which case an array will be returned on a call to <tt>on(attribute)</tt>.
# If no +msg+ is supplied, "invalid" is assumed.
def add(attribute, msg = @@default_error_messages[:invalid])
@errors[attribute.to_s] = [] if @errors[attribute.to_s].nil?
@errors[attribute.to_s] << msg
end
# Will add an error message to each of the attributes in +attributes+ that is empty.
def add_on_empty(attributes, msg = @@default_error_messages[:empty])
for attr in [attributes].flatten
value = @base.respond_to?(attr.to_s) ? @base.send(attr.to_s) : @base[attr.to_s]
is_empty = value.respond_to?("empty?") ? value.empty? : false
add(attr, msg) unless !value.nil? && !is_empty
end
end
# Will add an error message to each of the attributes in +attributes+ that is blank (using Object#blank?).
def add_on_blank(attributes, msg = @@default_error_messages[:blank])
for attr in [attributes].flatten
value = @base.respond_to?(attr.to_s) ? @base.send(attr.to_s) : @base[attr.to_s]
add(attr, msg) if value.blank?
end
end
# Will add an error message to each of the attributes in +attributes+ that has a length outside of the passed boundary +range+.
# If the length is above the boundary, the too_long_msg message will be used. If below, the too_short_msg.
def add_on_boundary_breaking(attributes, range, too_long_msg = @@default_error_messages[:too_long], too_short_msg = @@default_error_messages[:too_short])
for attr in [attributes].flatten
value = @base.respond_to?(attr.to_s) ? @base.send(attr.to_s) : @base[attr.to_s]
add(attr, too_short_msg % range.begin) if value && value.length < range.begin
add(attr, too_long_msg % range.end) if value && value.length > range.end
end
end
alias :add_on_boundry_breaking :add_on_boundary_breaking
# Returns true if the specified +attribute+ has errors associated with it.
def invalid?(attribute)
!@errors[attribute.to_s].nil?
end
# * Returns nil, if no errors are associated with the specified +attribute+.
# * Returns the error message, if one error is associated with the specified +attribute+.
# * Returns an array of error messages, if more than one error is associated with the specified +attribute+.
def on(attribute)
if @errors[attribute.to_s].nil?
nil
elsif @errors[attribute.to_s].length == 1
@errors[attribute.to_s].first
else
@errors[attribute.to_s]
end
end
alias :[] :on
# Returns errors assigned to base object through add_to_base according to the normal rules of on(attribute).
def on_base
on(:base)
end
# Yields each attribute and associated message per error added.
def each
@errors.each_key { |attr| @errors[attr].each { |msg| yield attr, msg } }
end
# Yields each full error message added. So Person.errors.add("first_name", "can't be empty") will be returned
# through iteration as "First name can't be empty".
def each_full
full_messages.each { |msg| yield msg }
end
# Returns all the full error messages in an array.
def full_messages
full_messages = []
@errors.each_key do |attr|
@errors[attr].each do |msg|
next if msg.nil?
if attr == "base"
full_messages << msg
else
full_messages << @base.class.human_attribute_name(attr) + " " + msg
end
end
end
return full_messages
end
# Returns true if no errors have been added.
def empty?
return @errors.empty?
end
# Removes all the errors that have been added.
def clear
@errors = {}
end
# Returns the total number of errors added. Two errors added to the same attribute will be counted as such
# with this as well.
def size
error_count = 0
@errors.each_value { |attribute| error_count += attribute.length }
error_count
end
alias_method :count, :size
alias_method :length, :size
end
# Active Records implement validation by overwriting Base#validate (or the variations, +validate_on_create+ and
# +validate_on_update+). Each of these methods can inspect the state of the object, which usually means ensuring
# that a number of attributes have a certain value (such as not empty, within a given range, matching a certain regular expression).
#
# Example:
#
# class Person < ActiveRecord::Base
# protected
# def validate
# errors.add_on_empty %w( first_name last_name )
# errors.add("phone_number", "has invalid format") unless phone_number =~ /[0-9]*/
# end
#
# def validate_on_create # is only run the first time a new object is saved
# unless valid_discount?(membership_discount)
# errors.add("membership_discount", "has expired")
# end
# end
#
# def validate_on_update
# errors.add_to_base("No changes have occurred") if unchanged_attributes?
# end
# end
#
# person = Person.new("first_name" => "David", "phone_number" => "what?")
# person.save # => false (and doesn't do the save)
# person.errors.empty? # => false
# person.errors.count # => 2
# person.errors.on "last_name" # => "can't be empty"
# person.errors.on "phone_number" # => "has invalid format"
# person.errors.each_full { |msg| puts msg }
# # => "Last name can't be empty\n" +
# "Phone number has invalid format"
#
# person.attributes = { "last_name" => "Heinemeier", "phone_number" => "555-555" }
# person.save # => true (and person is now saved in the database)
#
# An +Errors+ object is automatically created for every Active Record.
#
# Please do have a look at ActiveRecord::Validations::ClassMethods for a higher level of validations.
module Validations
VALIDATIONS = %w( validate validate_on_create validate_on_update )
def self.append_features(base) # :nodoc:
super
base.extend ClassMethods
base.class_eval do
alias_method :save_without_validation, :save
alias_method :save, :save_with_validation
alias_method :save_without_validation!, :save!
alias_method :save!, :save_with_validation!
alias_method :update_attribute_without_validation_skipping, :update_attribute
alias_method :update_attribute, :update_attribute_with_validation_skipping
end
end
# All of the following validations are defined in the class scope of the model that you're interested in validating.
# They offer a more declarative way of specifying when the model is valid and when it is not. It is recommended to use
# these over the low-level calls to validate and validate_on_create when possible.
module ClassMethods
DEFAULT_VALIDATION_OPTIONS = {
:on => :save,
:allow_nil => false,
:message => nil
}.freeze
ALL_RANGE_OPTIONS = [ :is, :within, :in, :minimum, :maximum ].freeze
def validate(*methods, &block)
methods << block if block_given?
write_inheritable_set(:validate, methods)
end
def validate_on_create(*methods, &block)
methods << block if block_given?
write_inheritable_set(:validate_on_create, methods)
end
def validate_on_update(*methods, &block)
methods << block if block_given?
write_inheritable_set(:validate_on_update, methods)
end
def condition_block?(condition)
condition.respond_to?("call") && (condition.arity == 1 || condition.arity == -1)
end
# Determine from the given condition (whether a block, procedure, method or string)
# whether or not to validate the record. See #validates_each.
def evaluate_condition(condition, record)
case condition
when Symbol: record.send(condition)
when String: eval(condition, binding)
else
if condition_block?(condition)
condition.call(record)
else
raise(
ActiveRecordError,
"Validations need to be either a symbol, string (to be eval'ed), proc/method, or " +
"class implementing a static validation method"
)
end
end
end
# Validates each attribute against a block.
#
# class Person < ActiveRecord::Base
# validates_each :first_name, :last_name do |record, attr, value|
# record.errors.add attr, 'starts with z.' if value[0] == ?z
# end
# end
#
# Options:
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>allow_nil</tt> - Skip validation if attribute is nil.
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_each(*attrs)
options = attrs.last.is_a?(Hash) ? attrs.pop.symbolize_keys : {}
attrs = attrs.flatten
# Declare the validation.
send(validation_method(options[:on] || :save)) do |record|
# Don't validate when there is an :if condition and that condition is false
unless options[:if] && !evaluate_condition(options[:if], record)
attrs.each do |attr|
value = record.send(attr)
next if value.nil? && options[:allow_nil]
yield record, attr, value
end
end
end
end
# Encapsulates the pattern of wanting to validate a password or email address field with a confirmation. Example:
#
# Model:
# class Person < ActiveRecord::Base
# validates_confirmation_of :user_name, :password
# validates_confirmation_of :email_address, :message => "should match confirmation"
# end
#
# View:
# <%= password_field "person", "password" %>
# <%= password_field "person", "password_confirmation" %>
#
# The person has to already have a password attribute (a column in the people table), but the password_confirmation is virtual.
# It exists only as an in-memory variable for validating the password. This check is performed only if password_confirmation
# is not nil and by default on save.
#
# Configuration options:
# * <tt>message</tt> - A custom error message (default is: "doesn't match confirmation")
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_confirmation_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:confirmation], :on => :save }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
attr_accessor *(attr_names.map { |n| "#{n}_confirmation" })
validates_each(attr_names, configuration) do |record, attr_name, value|
record.errors.add(attr_name, configuration[:message]) unless record.send("#{attr_name}_confirmation").nil? or value == record.send("#{attr_name}_confirmation")
end
end
# Encapsulates the pattern of wanting to validate the acceptance of a terms of service check box (or similar agreement). Example:
#
# class Person < ActiveRecord::Base
# validates_acceptance_of :terms_of_service
# validates_acceptance_of :eula, :message => "must be abided"
# end
#
# The terms_of_service attribute is entirely virtual. No database column is needed. This check is performed only if
# terms_of_service is not nil and by default on save.
#
# Configuration options:
# * <tt>message</tt> - A custom error message (default is: "must be accepted")
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>accept</tt> - Specifies value that is considered accepted. The default value is a string "1", which
# makes it easy to relate to an HTML checkbox.
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_acceptance_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:accepted], :on => :save, :allow_nil => true, :accept => "1" }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
attr_accessor *attr_names
validates_each(attr_names,configuration) do |record, attr_name, value|
record.errors.add(attr_name, configuration[:message]) unless value == configuration[:accept]
end
end
# Validates that the specified attributes are not blank (as defined by Object#blank?). Happens by default on save. Example:
#
# class Person < ActiveRecord::Base
# validates_presence_of :first_name
# end
#
# The first_name attribute must be in the object and it cannot be blank.
#
# Configuration options:
# * <tt>message</tt> - A custom error message (default is: "can't be blank")
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
#
# === Warning
# Validate the presence of the foreign key, not the instance variable itself.
# Do this:
# validates_presence_of :invoice_id
#
# Not this:
# validates_presence_of :invoice
#
# If you validate the presence of the associated object, you will get
# failures on saves when both the parent object and the child object are
# new.
def validates_presence_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:blank], :on => :save }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
# can't use validates_each here, because it cannot cope with nonexistent attributes,
# while errors.add_on_blank can
attr_names.each do |attr_name|
send(validation_method(configuration[:on])) do |record|
unless configuration[:if] and not evaluate_condition(configuration[:if], record)
record.errors.add_on_blank(attr_name,configuration[:message])
end
end
end
end
# Validates that the specified attribute matches the length restrictions supplied. Only one option can be used at a time:
#
# class Person < ActiveRecord::Base
# validates_length_of :first_name, :maximum=>30
# validates_length_of :last_name, :maximum=>30, :message=>"less than %d if you don't mind"
# validates_length_of :fax, :in => 7..32, :allow_nil => true
# validates_length_of :user_name, :within => 6..20, :too_long => "pick a shorter name", :too_short => "pick a longer name"
# validates_length_of :fav_bra_size, :minimum=>1, :too_short=>"please enter at least %d character"
# validates_length_of :smurf_leader, :is=>4, :message=>"papa is spelled with %d characters... don't play me."
# end
#
# Configuration options:
# * <tt>minimum</tt> - The minimum size of the attribute
# * <tt>maximum</tt> - The maximum size of the attribute
# * <tt>is</tt> - The exact size of the attribute
# * <tt>within</tt> - A range specifying the minimum and maximum size of the attribute
# * <tt>in</tt> - A synonym (or alias) for :within
# * <tt>allow_nil</tt> - Attribute may be nil; skip validation.
#
# * <tt>too_long</tt> - The error message if the attribute goes over the maximum (default is: "is too long (maximum is %d characters)")
# * <tt>too_short</tt> - The error message if the attribute goes under the minimum (default is: "is too short (minimum is %d characters)")
# * <tt>wrong_length</tt> - The error message if using the :is method and the attribute is the wrong size (default is: "is the wrong length (should be %d characters)")
# * <tt>message</tt> - The error message to use for a :minimum, :maximum, or :is violation. An alias of the appropriate too_long/too_short/wrong_length message
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_length_of(*attrs)
# Merge given options with defaults.
options = {
:too_long => ActiveRecord::Errors.default_error_messages[:too_long],
:too_short => ActiveRecord::Errors.default_error_messages[:too_short],
:wrong_length => ActiveRecord::Errors.default_error_messages[:wrong_length]
}.merge(DEFAULT_VALIDATION_OPTIONS)
options.update(attrs.pop.symbolize_keys) if attrs.last.is_a?(Hash)
# Ensure that one and only one range option is specified.
range_options = ALL_RANGE_OPTIONS & options.keys
case range_options.size
when 0
raise ArgumentError, 'Range unspecified. Specify the :within, :maximum, :minimum, or :is option.'
when 1
# Valid number of options; do nothing.
else
raise ArgumentError, 'Too many range options specified. Choose only one.'
end
# Get range option and value.
option = range_options.first
option_value = options[range_options.first]
case option
when :within, :in
raise ArgumentError, ":#{option} must be a Range" unless option_value.is_a?(Range)
too_short = options[:too_short] % option_value.begin
too_long = options[:too_long] % option_value.end
validates_each(attrs, options) do |record, attr, value|
if value.nil? or value.split(//).size < option_value.begin
record.errors.add(attr, too_short)
elsif value.split(//).size > option_value.end
record.errors.add(attr, too_long)
end
end
when :is, :minimum, :maximum
raise ArgumentError, ":#{option} must be a nonnegative Integer" unless option_value.is_a?(Integer) and option_value >= 0
# Declare different validations per option.
validity_checks = { :is => "==", :minimum => ">=", :maximum => "<=" }
message_options = { :is => :wrong_length, :minimum => :too_short, :maximum => :too_long }
message = (options[:message] || options[message_options[option]]) % option_value
validates_each(attrs, options) do |record, attr, value|
if value.kind_of?(String)
record.errors.add(attr, message) unless !value.nil? and value.split(//).size.method(validity_checks[option])[option_value]
else
record.errors.add(attr, message) unless !value.nil? and value.size.method(validity_checks[option])[option_value]
end
end
end
end
alias_method :validates_size_of, :validates_length_of
# Validates whether the values of the specified attributes are unique across the system. Useful for making sure that only one user
# can be named "davidhh".
#
# class Person < ActiveRecord::Base
# validates_uniqueness_of :user_name, :scope => :account_id
# end
#
# It can also validate whether the values of the specified attributes are unique based on multiple scope parameters. For example,
# making sure that a teacher can only be on the schedule once per semester for a particular class.
#
# class TeacherSchedule < ActiveRecord::Base
# validates_uniqueness_of :teacher_id, :scope => [:semester_id, :class_id]
# end
#
# When the record is created, a check is performed to make sure that no record exists in the database with the given value for the specified
# attribute (that maps to a column). When the record is updated, the same check is made but disregarding the record itself.
#
# Configuration options:
# * <tt>message</tt> - Specifies a custom error message (default is: "has already been taken")
# * <tt>scope</tt> - One or more columns by which to limit the scope of the uniqueness constraint.
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_uniqueness_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:taken] }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
validates_each(attr_names,configuration) do |record, attr_name, value|
condition_sql = "#{record.class.table_name}.#{attr_name} #{attribute_condition(value)}"
condition_params = [value]
if scope = configuration[:scope]
Array(scope).map do |scope_item|
scope_value = record.send(scope_item)
condition_sql << " AND #{record.class.table_name}.#{scope_item} #{attribute_condition(scope_value)}"
condition_params << scope_value
end
end
unless record.new_record?
condition_sql << " AND #{record.class.table_name}.#{record.class.primary_key} <> ?"
condition_params << record.send(:id)
end
if record.class.find(:first, :conditions => [condition_sql, *condition_params])
record.errors.add(attr_name, configuration[:message])
end
end
end
# Validates whether the value of the specified attribute is of the correct form by matching it against the regular expression
# provided.
#
# class Person < ActiveRecord::Base
# validates_format_of :email, :with => /^([^@\s]+)@((?:[-a-z0-9]+\.)+[a-z]{2,})$/i, :on => :create
# end
#
# A regular expression must be provided or else an exception will be raised.
#
# Configuration options:
# * <tt>message</tt> - A custom error message (default is: "is invalid")
# * <tt>with</tt> - The regular expression used to validate the format with (note: must be supplied!)
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_format_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:invalid], :on => :save, :with => nil }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
raise(ArgumentError, "A regular expression must be supplied as the :with option of the configuration hash") unless configuration[:with].is_a?(Regexp)
validates_each(attr_names, configuration) do |record, attr_name, value|
record.errors.add(attr_name, configuration[:message]) unless value.to_s =~ configuration[:with]
end
end
# Validates whether the value of the specified attribute is available in a particular enumerable object.
#
# class Person < ActiveRecord::Base
# validates_inclusion_of :gender, :in=>%w( m f ), :message=>"woah! what are you then!??!!"
# validates_inclusion_of :age, :in=>0..99
# end
#
# Configuration options:
# * <tt>in</tt> - An enumerable object of available items
# * <tt>message</tt> - Specifies a custom error message (default is: "is not included in the list")
# * <tt>allow_nil</tt> - If set to true, skips this validation if the attribute is null (default is: false)
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_inclusion_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:inclusion], :on => :save }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
enum = configuration[:in] || configuration[:within]
raise(ArgumentError, "An object with the method include? is required must be supplied as the :in option of the configuration hash") unless enum.respond_to?("include?")
validates_each(attr_names, configuration) do |record, attr_name, value|
record.errors.add(attr_name, configuration[:message]) unless enum.include?(value)
end
end
# Validates that the value of the specified attribute is not in a particular enumerable object.
#
# class Person < ActiveRecord::Base
# validates_exclusion_of :username, :in => %w( admin superuser ), :message => "You don't belong here"
# validates_exclusion_of :age, :in => 30..60, :message => "This site is only for under 30 and over 60"
# end
#
# Configuration options:
# * <tt>in</tt> - An enumerable object of items that the value shouldn't be part of
# * <tt>message</tt> - Specifies a custom error message (default is: "is reserved")
# * <tt>allow_nil</tt> - If set to true, skips this validation if the attribute is null (default is: false)
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_exclusion_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:exclusion], :on => :save }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
enum = configuration[:in] || configuration[:within]
raise(ArgumentError, "An object with the method include? is required must be supplied as the :in option of the configuration hash") unless enum.respond_to?("include?")
validates_each(attr_names, configuration) do |record, attr_name, value|
record.errors.add(attr_name, configuration[:message]) if enum.include?(value)
end
end
# Validates whether the associated object or objects are all valid themselves. Works with any kind of association.
#
# class Book < ActiveRecord::Base
# has_many :pages
# belongs_to :library
#
# validates_associated :pages, :library
# end
#
# Warning: If, after the above definition, you then wrote:
#
# class Page < ActiveRecord::Base
# belongs_to :book
#
# validates_associated :book
# end
#
# ...this would specify a circular dependency and cause infinite recursion.
#
# NOTE: This validation will not fail if the association hasn't been assigned. If you want to ensure that the association
# is both present and guaranteed to be valid, you also need to use validates_presence_of.
#
# Configuration options:
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_associated(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:invalid], :on => :save }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
validates_each(attr_names, configuration) do |record, attr_name, value|
record.errors.add(attr_name, configuration[:message]) unless
(value.is_a?(Array) ? value : [value]).all? { |r| r.nil? or r.valid? }
end
end
# Validates whether the value of the specified attribute is numeric by trying to convert it to
# a float with Kernel.Float (if <tt>only_integer</tt> is false) or applying it to the regular expression
# <tt>/^[\+\-]?\d+$/</tt> (if <tt>only_integer</tt> is set to true).
#
# class Person < ActiveRecord::Base
# validates_numericality_of :value, :on => :create
# end
#
# Configuration options:
# * <tt>message</tt> - A custom error message (default is: "is not a number")
# * <tt>on</tt> - Specifies when this validation is active (default is :save, other options :create, :update)
# * <tt>only_integer</tt> - Specifies whether the value has to be an integer, e.g. an integral value (default is false)
# * <tt>allow_nil</tt> - Skip validation if attribute is nil (default is false). Notice that for fixnum and float columns empty strings are converted to nil
# * <tt>if</tt> - Specifies a method, proc or string to call to determine if the validation should
# occur (e.g. :if => :allow_validation, or :if => Proc.new { |user| user.signup_step > 2 }). The
# method, proc or string should return or evaluate to a true or false value.
def validates_numericality_of(*attr_names)
configuration = { :message => ActiveRecord::Errors.default_error_messages[:not_a_number], :on => :save,
:only_integer => false, :allow_nil => false }
configuration.update(attr_names.pop) if attr_names.last.is_a?(Hash)
if configuration[:only_integer]
validates_each(attr_names,configuration) do |record, attr_name,value|
record.errors.add(attr_name, configuration[:message]) unless record.send("#{attr_name}_before_type_cast").to_s =~ /^[+-]?\d+$/
end
else
validates_each(attr_names,configuration) do |record, attr_name,value|
next if configuration[:allow_nil] and record.send("#{attr_name}_before_type_cast").nil?
begin
Kernel.Float(record.send("#{attr_name}_before_type_cast").to_s)
rescue ArgumentError, TypeError
record.errors.add(attr_name, configuration[:message])
end
end
end
end
# Creates an object just like Base.create but calls save! instead of save
# so an exception is raised if the record is invalid.
def create!(attributes = nil)
if attributes.is_a?(Array)
attributes.collect { |attr| create!(attr) }
else
attributes.reverse_merge!(scope(:create)) if scoped?(:create)
object = new(attributes)
object.save!
object
end
end
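# A minimal usage sketch (the Person class and its validation are assumed for illustration):
#
# class Person < ActiveRecord::Base
# validates_presence_of :name
# end
#
# Person.create!(:name => "David") # record is saved and returned
# Person.create!(:name => nil) # raises ActiveRecord::RecordInvalid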
private
def write_inheritable_set(key, methods)
existing_methods = read_inheritable_attribute(key) || []
write_inheritable_attribute(key, methods | existing_methods)
end
def validation_method(on)
case on
when :save then :validate
when :create then :validate_on_create
when :update then :validate_on_update
end
end
end
# The validation process on save can be skipped by passing false. The regular Base#save method is
# replaced with this when the validations module is mixed in, which it is by default.
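#
# For example (an illustrative sketch; the person record is assumed):
#
# person.save # runs validations, then saves; returns false if invalid
# person.save(false) # skips validation entirely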
def save_with_validation(perform_validation = true)
if perform_validation && valid? || !perform_validation
save_without_validation
else
false
end
end
# Attempts to save the record just like Base#save but will raise a RecordInvalid exception instead of returning false
# if the record is not valid.
def save_with_validation!
if valid?
save_without_validation!
else
raise RecordInvalid.new(self)
end
end
# Updates a single attribute and saves the record without going through the normal validation procedure.
# This is especially useful for boolean flags on existing records. The regular +update_attribute+ method
# in Base is replaced with this when the validations module is mixed in, which it is by default.
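#
# For example (an illustrative sketch; the topic record and its +approved+ column are assumed):
#
# topic.update_attribute(:approved, true) # writes the flag and saves without running validations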
def update_attribute_with_validation_skipping(name, value)
send(name.to_s + '=', value)
save(false)
end
# Runs validate and either validate_on_create or validate_on_update (depending on whether the record is new) and returns true if no errors were added, false otherwise.
def valid?
errors.clear
run_validations(:validate)
validate
if new_record?
run_validations(:validate_on_create)
validate_on_create
else
run_validations(:validate_on_update)
validate_on_update
end
errors.empty?
end
# Returns the Errors object that holds all information about attribute error messages.
def errors
@errors ||= Errors.new(self)
end
protected
# Overwrite this method for validation checks on all saves and use Errors.add(field, msg) for invalid attributes.
def validate #:doc:
end
# Overwrite this method for validation checks used only on creation.
def validate_on_create #:doc:
end
# Overwrite this method for validation checks used only on updates.
def validate_on_update # :doc:
end
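# A hedged sketch of overriding one of these hooks (the Person class, its phone_number column
# and the format check are assumed for illustration):
#
# class Person < ActiveRecord::Base
# protected
# def validate
# errors.add(:phone_number, "has invalid format") unless phone_number =~ /\A[0-9 ()+-]*\z/
# end
# end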
private
def run_validations(validation_method)
validations = self.class.read_inheritable_attribute(validation_method.to_sym)
if validations.nil? then return end
validations.each do |validation|
if validation.is_a?(Symbol)
self.send(validation)
elsif validation.is_a?(String)
eval(validation, binding)
elsif validation_block?(validation)
validation.call(self)
elsif validation_class?(validation, validation_method)
validation.send(validation_method, self)
else
raise(
ActiveRecordError,
"Validations need to be either a symbol, string (to be eval'ed), proc/method, or " +
"class implementing a static validation method"
)
end
end
end
def validation_block?(validation)
validation.respond_to?("call") && (validation.arity == 1 || validation.arity == -1)
end
def validation_class?(validation, validation_method)
validation.respond_to?(validation_method)
end
end
end

View file

@ -0,0 +1,362 @@
require 'db2/db2cli.rb'
module DB2
module DB2Util
include DB2CLI
def free() SQLFreeHandle(@handle_type, @handle); end
def handle() @handle; end
def check_rc(rc)
if ![SQL_SUCCESS, SQL_SUCCESS_WITH_INFO, SQL_NO_DATA_FOUND].include?(rc)
rec = 1
msg = ''
loop do
a = SQLGetDiagRec(@handle_type, @handle, rec, 500)
break if a[0] != SQL_SUCCESS
msg << a[3] if !a[3].nil? and a[3] != '' # Create message.
rec += 1
end
raise "DB2 error: #{msg}"
end
end
end
class Environment
include DB2Util
def initialize
@handle_type = SQL_HANDLE_ENV
rc, @handle = SQLAllocHandle(@handle_type, SQL_NULL_HANDLE)
check_rc(rc)
end
def data_sources(buffer_length = 1024)
retval = []
max_buffer_length = buffer_length
a = SQLDataSources(@handle, SQL_FETCH_FIRST, SQL_MAX_DSN_LENGTH + 1, buffer_length)
retval << [a[1], a[3]]
max_buffer_length = [max_buffer_length, a[4]].max
loop do
a = SQLDataSources(@handle, SQL_FETCH_NEXT, SQL_MAX_DSN_LENGTH + 1, buffer_length)
break if a[0] == SQL_NO_DATA_FOUND
retval << [a[1], a[3]]
max_buffer_length = [max_buffer_length, a[4]].max
end
if max_buffer_length > buffer_length
data_sources(max_buffer_length) # retry with a buffer large enough for the longest description
else
retval
end
end
end
class Connection
include DB2Util
def initialize(environment)
@env = environment
@handle_type = SQL_HANDLE_DBC
rc, @handle = SQLAllocHandle(@handle_type, @env.handle)
check_rc(rc)
end
def connect(server_name, user_name = '', auth = '')
check_rc(SQLConnect(@handle, server_name, user_name.to_s, auth.to_s))
end
def set_connect_attr(attr, value)
value += "\0" if value.class == String
check_rc(SQLSetConnectAttr(@handle, attr, value))
end
def set_auto_commit_on
set_connect_attr(SQL_ATTR_AUTOCOMMIT, SQL_AUTOCOMMIT_ON)
end
def set_auto_commit_off
set_connect_attr(SQL_ATTR_AUTOCOMMIT, SQL_AUTOCOMMIT_OFF)
end
def disconnect
check_rc(SQLDisconnect(@handle))
end
def rollback
check_rc(SQLEndTran(@handle_type, @handle, SQL_ROLLBACK))
end
def commit
check_rc(SQLEndTran(@handle_type, @handle, SQL_COMMIT))
end
end
class Statement
include DB2Util
def initialize(connection)
@conn = connection
@handle_type = SQL_HANDLE_STMT
@parms = [] #yun
@sql = '' #yun
@numParms = 0 #yun
@prepared = false #yun
@parmArray = [] #yun. attributes of the parameter markers
rc, @handle = SQLAllocHandle(@handle_type, @conn.handle)
check_rc(rc)
end
def columns(table_name, schema_name = '%')
check_rc(SQLColumns(@handle, '', schema_name.upcase, table_name.upcase, '%'))
fetch_all
end
def tables(schema_name = '%')
check_rc(SQLTables(@handle, '', schema_name.upcase, '%', 'TABLE'))
fetch_all
end
def indexes(table_name, schema_name = '')
check_rc(SQLStatistics(@handle, '', schema_name.upcase, table_name.upcase, SQL_INDEX_ALL, SQL_ENSURE))
fetch_all
end
def prepare(sql)
@sql = sql
check_rc(SQLPrepare(@handle, sql))
rc, @numParms = SQLNumParams(@handle) #number of question marks
check_rc(rc)
#--------------------------------------------------------------------------
# parameter attributes are stored in instance variable @parmArray so that
# they are available when execute method is called.
#--------------------------------------------------------------------------
if @numParms > 0 # get parameter marker attributes
1.upto(@numParms) do |i| # parameter number starts from 1
rc, type, size, decimalDigits = SQLDescribeParam(@handle, i)
check_rc(rc)
@parmArray << Parameter.new(type, size, decimalDigits)
end
end
@prepared = true
self
end
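# A hedged end-to-end sketch of prepare/execute (the data source name, credentials and table
# are assumed for illustration only):
#
# env = DB2::Environment.new
# conn = DB2::Connection.new(env)
# conn.connect('SAMPLE', 'db2user', 'secret')
# stmt = DB2::Statement.new(conn)
# stmt.prepare("select * from people where id = ?")
# stmt.execute(42)
# rows = stmt.fetch_all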
def execute(*parms)
raise "The statement was not prepared" if @prepared == false
if parms.size == 1 and parms[0].class == Array
parms = parms[0]
end
if @numParms != parms.size
raise "Number of parameters supplied does not match with the SQL statement"
end
if @numParms > 0 #need to bind parameters
#--------------------------------------------------------------------
#calling bindParms may not be safe. Look comment below.
#--------------------------------------------------------------------
#bindParms(parms)
valueArray = []
1.upto(@numParms) do |i| # parameter number starts from 1
type = @parmArray[i - 1].type # SQL type recorded at prepare time
size = @parmArray[i - 1].size
decimalDigits = @parmArray[i - 1].decimalDigits
if parms[i - 1].class == String
valueArray << parms[i - 1]
else
valueArray << parms[i - 1].to_s
end
rc = SQLBindParameter(@handle, i, type, size, decimalDigits, valueArray[i - 1])
check_rc(rc)
end
end
check_rc(SQLExecute(@handle))
if @numParms != 0
check_rc(SQLFreeStmt(@handle, SQL_RESET_PARAMS)) # Reset parameters
end
self
end
#-------------------------------------------------------------------------------
# The last argument (value) to SQLBindParameter is a deferred argument, that is,
# it should be available when SQLExecute is called. Even though "value" is
# local to bindParms method, it seems that it is available when SQLExecute
# is called. I am not sure whether it would still work if garbage collection
# is done between bindParms call and SQLExecute call inside the execute method
# above.
#-------------------------------------------------------------------------------
def bindParms(parms) # This is the real thing. It uses SQLBindParms
1.upto(@numParms) do |i| # parameter number starts from 1
rc, dataType, parmSize, decimalDigits = SQLDescribeParam(@handle, i)
check_rc(rc)
if parms[i - 1].class == String
value = parms[i - 1]
else
value = parms[i - 1].to_s
end
rc = SQLBindParameter(@handle, i, dataType, parmSize, decimalDigits, value)
check_rc(rc)
end
end
#------------------------------------------------------------------------------
# bind method does not use DB2's SQLBindParams, but replaces "?" in the
# SQL statement with the value before passing the SQL statement to DB2.
# It is not efficient and can handle only strings since it puts everything in
# quotes.
#------------------------------------------------------------------------------
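# For illustration only (the statement text and argument are assumed):
#
# stmt.bind("insert into t values (?, ??)", ["bob"])
# # => "insert into t values ('bob', ?)"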
def bind(sql, args) #does not use SQLBindParams
arg_index = 0
result = ""
tokens(sql).each do |part|
case part
when '?'
result << "'" + (args[arg_index]) + "'" #put it into quotes
arg_index += 1
when '??'
result << "?"
else
result << part
end
end
if arg_index < args.size
raise "Too many SQL parameters"
elsif arg_index > args.size
raise "Not enough SQL parameters"
end
result
end
## Break the sql string into parts.
#
# This is NOT a full lexer for SQL. It just breaks up the SQL
# string enough so that question marks, double question marks and
# quoted strings are separated. This is used when binding
# arguments to "?" in the SQL string. Note: comments are not
# handled.
#
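# An illustrative example of the resulting split (values are assumed):
#
# tokens("select * from t where a = ? and b = '?'")
# # => ["select * from t where a = ", "?", " and b = ", "'?'"]
#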
def tokens(sql)
toks = sql.scan(/('([^'\\]|''|\\.)*'|"([^"\\]|""|\\.)*"|\?\??|[^'"?]+)/)
toks.collect { |t| t[0] }
end
def exec_direct(sql)
check_rc(SQLExecDirect(@handle, sql))
self
end
def set_cursor_name(name)
check_rc(SQLSetCursorName(@handle, name))
self
end
def get_cursor_name
rc, name = SQLGetCursorName(@handle)
check_rc(rc)
name
end
def row_count
rc, rowcount = SQLRowCount(@handle)
check_rc(rc)
rowcount
end
def num_result_cols
rc, cols = SQLNumResultCols(@handle)
check_rc(rc)
cols
end
def fetch_all
if block_given?
while row = fetch do
yield row
end
else
res = []
while row = fetch do
res << row
end
res
end
end
def fetch
cols = get_col_desc
rc = SQLFetch(@handle)
if rc == SQL_NO_DATA_FOUND
SQLFreeStmt(@handle, SQL_CLOSE) # Close cursor
SQLFreeStmt(@handle, SQL_RESET_PARAMS) # Reset parameters
return nil
end
raise "ERROR" unless rc == SQL_SUCCESS
retval = []
cols.each_with_index do |c, i|
rc, content = SQLGetData(@handle, i + 1, c[1], c[2] + 1) #yun added 1 to c[2]
retval << adjust_content(content)
end
retval
end
def fetch_as_hash
cols = get_col_desc
rc = SQLFetch(@handle)
if rc == SQL_NO_DATA_FOUND
SQLFreeStmt(@handle, SQL_CLOSE) # Close cursor
SQLFreeStmt(@handle, SQL_RESET_PARAMS) # Reset parameters
return nil
end
raise "ERROR" unless rc == SQL_SUCCESS
retval = {}
cols.each_with_index do |c, i|
rc, content = SQLGetData(@handle, i + 1, c[1], c[2] + 1) #yun added 1 to c[2]
retval[c[0]] = adjust_content(content)
end
retval
end
def get_col_desc
rc, nr_cols = SQLNumResultCols(@handle)
cols = (1..nr_cols).collect do |c|
rc, name, bl, type, col_sz = SQLDescribeCol(@handle, c, 1024)
[name.downcase, type, col_sz]
end
end
def adjust_content(c)
case c.class.to_s
when 'DB2CLI::NullClass'
return nil
when 'DB2CLI::Time'
"%02d:%02d:%02d" % [c.hour, c.minute, c.second]
when 'DB2CLI::Date'
"%04d-%02d-%02d" % [c.year, c.month, c.day]
when 'DB2CLI::Timestamp'
"%04d-%02d-%02d %02d:%02d:%02d" % [c.year, c.month, c.day, c.hour, c.minute, c.second]
else
return c
end
end
end
class Parameter
attr_reader :type, :size, :decimalDigits
def initialize(type, size, decimalDigits)
@type, @size, @decimalDigits = type, size, decimalDigits
end
end
end

File diff suppressed because it is too large Load diff

View file

@ -0,0 +1,693 @@
# :title: Transaction::Simple -- Active Object Transaction Support for Ruby
# :main: Transaction::Simple
#
# == Licence
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
#--
# Transaction::Simple
# Simple object transaction support for Ruby
# Version 1.3.0
#
# Copyright (c) 2003 - 2005 Austin Ziegler
#
# $Id: simple.rb,v 1.5 2005/05/05 16:16:49 austin Exp $
#++
# The "Transaction" namespace can be used for additional transaction
# support objects and modules.
module Transaction
# A standard exception for transaction errors.
class TransactionError < StandardError; end
# The TransactionAborted exception is used to indicate when a
# transaction has been aborted in the block form.
class TransactionAborted < Exception; end
# The TransactionCommitted exception is used to indicate when a
# transaction has been committed in the block form.
class TransactionCommitted < Exception; end
te = "Transaction Error: %s"
Messages = {
:bad_debug_object =>
te % "the transaction debug object must respond to #<<.",
:unique_names =>
te % "named transactions must be unique.",
:no_transaction_open =>
te % "no transaction open.",
:cannot_rewind_no_transaction =>
te % "cannot rewind; there is no current transaction.",
:cannot_rewind_named_transaction =>
te % "cannot rewind to transaction %s because it does not exist.",
:cannot_rewind_transaction_before_block =>
te % "cannot rewind a transaction started before the execution block.",
:cannot_abort_no_transaction =>
te % "cannot abort; there is no current transaction.",
:cannot_abort_transaction_before_block =>
te % "cannot abort a transaction started before the execution block.",
:cannot_abort_named_transaction =>
te % "cannot abort nonexistant transaction %s.",
:cannot_commit_no_transaction =>
te % "cannot commit; there is no current transaction.",
:cannot_commit_transaction_before_block =>
te % "cannot commit a transaction started before the execution block.",
:cannot_commit_named_transaction =>
te % "cannot commit nonexistant transaction %s.",
:cannot_start_empty_block_transaction =>
te % "cannot start a block transaction with no objects.",
:cannot_obtain_transaction_lock =>
te % "cannot obtain transaction lock for #%s.",
}
# = Transaction::Simple for Ruby
# Simple object transaction support for Ruby
#
# == Introduction
# Transaction::Simple provides a generic way to add active transaction
# support to objects. The transaction methods added by this module will
# work with most objects, excluding those that cannot be
# <i>Marshal</i>ed (bindings, procedure objects, IO instances, or
# singleton objects).
#
# The transactions supported by Transaction::Simple are not backed
# transactions; they are not associated with any sort of data store.
# They are "live" transactions occurring in memory and in the object
# itself. This is to allow "test" changes to be made to an object
# before making the changes permanent.
#
# Transaction::Simple can handle an "infinite" number of transaction
# levels (limited only by memory). If I open two transactions, commit
# the second, but abort the first, the object will revert to the
# original version.
#
# Transaction::Simple supports "named" transactions, so that multiple
# levels of transactions can be committed, aborted, or rewound by
# referring to the appropriate name of the transaction. Names may be any
# object *except* +nil+. As with Hash keys, String names will be
# duplicated and frozen before using.
#
# Copyright:: Copyright © 2003 - 2005 by Austin Ziegler
# Version:: 1.3.0
# Licence:: MIT-Style
#
# Thanks to David Black for help with the initial concept that led to
# this library.
#
# == Usage
# require 'transaction/simple'
#
# v = "Hello, you." # -> "Hello, you."
# v.extend(Transaction::Simple) # -> "Hello, you."
#
# v.start_transaction # -> ... (a Marshal string)
# v.transaction_open? # -> true
# v.gsub!(/you/, "world") # -> "Hello, world."
#
# v.rewind_transaction # -> "Hello, you."
# v.transaction_open? # -> true
#
# v.gsub!(/you/, "HAL") # -> "Hello, HAL."
# v.abort_transaction # -> "Hello, you."
# v.transaction_open? # -> false
#
# v.start_transaction # -> ... (a Marshal string)
# v.start_transaction # -> ... (a Marshal string)
#
# v.transaction_open? # -> true
# v.gsub!(/you/, "HAL") # -> "Hello, HAL."
#
# v.commit_transaction # -> "Hello, HAL."
# v.transaction_open? # -> true
# v.abort_transaction # -> "Hello, you."
# v.transaction_open? # -> false
#
# == Named Transaction Usage
# v = "Hello, you." # -> "Hello, you."
# v.extend(Transaction::Simple) # -> "Hello, you."
#
# v.start_transaction(:first) # -> ... (a Marshal string)
# v.transaction_open? # -> true
# v.transaction_open?(:first) # -> true
# v.transaction_open?(:second) # -> false
# v.gsub!(/you/, "world") # -> "Hello, world."
#
# v.start_transaction(:second) # -> ... (a Marshal string)
# v.gsub!(/world/, "HAL") # -> "Hello, HAL."
# v.rewind_transaction(:first) # -> "Hello, you."
# v.transaction_open? # -> true
# v.transaction_open?(:first) # -> true
# v.transaction_open?(:second) # -> false
#
# v.gsub!(/you/, "world") # -> "Hello, world."
# v.start_transaction(:second) # -> ... (a Marshal string)
# v.gsub!(/world/, "HAL") # -> "Hello, HAL."
# v.transaction_name # -> :second
# v.abort_transaction(:first) # -> "Hello, you."
# v.transaction_open? # -> false
#
# v.start_transaction(:first) # -> ... (a Marshal string)
# v.gsub!(/you/, "world") # -> "Hello, world."
# v.start_transaction(:second) # -> ... (a Marshal string)
# v.gsub!(/world/, "HAL") # -> "Hello, HAL."
#
# v.commit_transaction(:first) # -> "Hello, HAL."
# v.transaction_open? # -> false
#
# == Block Usage
# v = "Hello, you." # -> "Hello, you."
# Transaction::Simple.start(v) do |tv|
# # v has been extended with Transaction::Simple and an unnamed
# # transaction has been started.
# tv.transaction_open? # -> true
# tv.gsub!(/you/, "world") # -> "Hello, world."
#
# tv.rewind_transaction # -> "Hello, you."
# tv.transaction_open? # -> true
#
# tv.gsub!(/you/, "HAL") # -> "Hello, HAL."
# # The following breaks out of the transaction block after
# # aborting the transaction.
# tv.abort_transaction # -> "Hello, you."
# end
# # v still has Transaction::Simple applied from here on out.
# v.transaction_open? # -> false
#
# Transaction::Simple.start(v) do |tv|
# tv.start_transaction # -> ... (a Marshal string)
#
# tv.transaction_open? # -> true
# tv.gsub!(/you/, "HAL") # -> "Hello, HAL."
#
# # If #commit_transaction were called without having started a
# # second transaction, then it would break out of the transaction
# # block after committing the transaction.
# tv.commit_transaction # -> "Hello, HAL."
# tv.transaction_open? # -> true
# tv.abort_transaction # -> "Hello, you."
# end
# v.transaction_open? # -> false
#
# == Thread Safety
# Threadsafe version of Transaction::Simple and
# Transaction::Simple::Group exist; these are loaded from
# 'transaction/simple/threadsafe' and
# 'transaction/simple/threadsafe/group', respectively, and are
# represented in Ruby code as Transaction::Simple::ThreadSafe and
# Transaction::Simple::ThreadSafe::Group, respectively.
#
# == Contraindications
# While Transaction::Simple is very useful, it has some severe
# limitations that must be understood. Transaction::Simple:
#
# * uses Marshal. Thus, any object which cannot be <i>Marshal</i>ed
# cannot use Transaction::Simple. In my experience, this affects
# singleton objects more often than any other object. It may be that
# Ruby 2.0 will solve this problem.
# * does not manage resources. Resources external to the object and its
# instance variables are not managed at all. However, all instance
# variables and objects "belonging" to those instance variables are
# managed. If there are object reference counts to be handled,
# Transaction::Simple will probably cause problems.
# * is not inherently thread-safe. In the ACID ("atomic, consistent,
# isolated, durable") test, Transaction::Simple provides CD, but it is
# up to the user of Transaction::Simple to provide isolation and
# atomicity. Transactions should be considered "critical sections" in
# multi-threaded applications. If thread safety and atomicity is
# absolutely required, use Transaction::Simple::ThreadSafe, which uses
# a Mutex object to synchronize the accesses on the object during the
# transaction operations.
# * does not necessarily maintain Object#__id__ values on rewind or
# abort. This may change for future versions that will be Ruby 1.8 or
# better *only*. Certain objects that support #replace will maintain
# Object#__id__.
# * Can be a memory hog if you use many levels of transactions on many
# objects.
#
module Simple
TRANSACTION_SIMPLE_VERSION = '1.3.0'
# Sets the Transaction::Simple debug object, which must respond to #<<.
# Debugging will be performed automatically whenever a debug object is
# set.
def self.debug_io=(io)
if io.nil?
@tdi = nil
@debugging = false
else
unless io.respond_to?(:<<)
raise TransactionError, Messages[:bad_debug_object]
end
@tdi = io
@debugging = true
end
end
# Returns +true+ if we are debugging.
def self.debugging?
@debugging
end
# Returns the Transaction::Simple debug object. It must respond to
# #<<.
def self.debug_io
@tdi ||= ""
@tdi
end
# If +name+ is +nil+ (default), then returns +true+ if there is
# currently a transaction open.
#
# If +name+ is specified, then returns +true+ if there is currently a
# transaction that responds to +name+ open.
def transaction_open?(name = nil)
if name.nil?
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "Transaction " <<
"[#{(@__transaction_checkpoint__.nil?) ? 'closed' : 'open'}]\n"
end
return (not @__transaction_checkpoint__.nil?)
else
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "Transaction(#{name.inspect}) " <<
"[#{(@__transaction_checkpoint__.nil?) ? 'closed' : 'open'}]\n"
end
return ((not @__transaction_checkpoint__.nil?) and @__transaction_names__.include?(name))
end
end
# Returns the current name of the transaction. Transactions not
# explicitly named are named +nil+.
def transaction_name
if @__transaction_checkpoint__.nil?
raise TransactionError, Messages[:no_transaction_open]
end
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'|' * @__transaction_level__} " <<
"Transaction Name: #{@__transaction_names__[-1].inspect}\n"
end
if @__transaction_names__[-1].kind_of?(String)
@__transaction_names__[-1].dup
else
@__transaction_names__[-1]
end
end
# Starts a transaction. Stores the current object state. If a
# transaction name is specified, the transaction will be named.
# Transaction names must be unique. Transaction names of +nil+ will be
# treated as unnamed transactions.
def start_transaction(name = nil)
@__transaction_level__ ||= 0
@__transaction_names__ ||= []
if name.nil?
@__transaction_names__ << nil
ss = "" if Transaction::Simple.debugging?
else
if @__transaction_names__.include?(name)
raise TransactionError, Messages[:unique_names]
end
name = name.dup.freeze if name.kind_of?(String)
@__transaction_names__ << name
ss = "(#{name.inspect})" if Transaction::Simple.debugging?
end
@__transaction_level__ += 1
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'>' * @__transaction_level__} " <<
"Start Transaction#{ss}\n"
end
@__transaction_checkpoint__ = Marshal.dump(self)
end
# Rewinds the transaction. If +name+ is specified, then the
# intervening transactions will be aborted and the named transaction
# will be rewound. Otherwise, only the current transaction is rewound.
def rewind_transaction(name = nil)
if @__transaction_checkpoint__.nil?
raise TransactionError, Messages[:cannot_rewind_no_transaction]
end
# Check to see if we are trying to rewind a transaction that is
# outside of the current transaction block.
if @__transaction_block__ and name
nix = @__transaction_names__.index(name) + 1
if nix < @__transaction_block__
raise TransactionError, Messages[:cannot_rewind_transaction_before_block]
end
end
if name.nil?
__rewind_this_transaction
ss = "" if Transaction::Simple.debugging?
else
unless @__transaction_names__.include?(name)
raise TransactionError, Messages[:cannot_rewind_named_transaction] % name.inspect
end
ss = "(#{name})" if Transaction::Simple.debugging?
while @__transaction_names__[-1] != name
@__transaction_checkpoint__ = __rewind_this_transaction
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'|' * @__transaction_level__} " <<
"Rewind Transaction#{ss}\n"
end
@__transaction_level__ -= 1
@__transaction_names__.pop
end
__rewind_this_transaction
end
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'|' * @__transaction_level__} " <<
"Rewind Transaction#{ss}\n"
end
self
end
# Aborts the transaction. Resets the object state to what it was
# before the transaction was started and closes the transaction. If
# +name+ is specified, then the intervening transactions and the named
# transaction will be aborted. Otherwise, only the current transaction
# is aborted.
#
# If the current or named transaction has been started by a block
# (Transaction::Simple.start), then the execution of the block will be
# halted with +break+ +self+.
def abort_transaction(name = nil)
if @__transaction_checkpoint__.nil?
raise TransactionError, Messages[:cannot_abort_no_transaction]
end
# Check to see if we are trying to abort a transaction that is
# outside of the current transaction block. Otherwise, raise
# TransactionAborted if they are the same.
if @__transaction_block__ and name
nix = @__transaction_names__.index(name) + 1
if nix < @__transaction_block__
raise TransactionError, Messages[:cannot_abort_transaction_before_block]
end
raise TransactionAborted if @__transaction_block__ == nix
end
raise TransactionAborted if @__transaction_block__ == @__transaction_level__
if name.nil?
__abort_transaction(name)
else
unless @__transaction_names__.include?(name)
raise TransactionError, Messages[:cannot_abort_named_transaction] % name.inspect
end
__abort_transaction(name) while @__transaction_names__.include?(name)
end
self
end
# If +name+ is +nil+ (default), the current transaction level is
# closed out and the changes are committed.
#
# If +name+ is specified and +name+ is in the list of named
# transactions, then all transactions are closed and committed until
# the named transaction is reached.
def commit_transaction(name = nil)
if @__transaction_checkpoint__.nil?
raise TransactionError, Messages[:cannot_commit_no_transaction]
end
@__transaction_block__ ||= nil
# Check to see if we are trying to commit a transaction that is
# outside of the current transaction block. Otherwise, raise
# TransactionCommitted if they are the same.
if @__transaction_block__ and name
nix = @__transaction_names__.index(name) + 1
if nix < @__transaction_block__
raise TransactionError, Messages[:cannot_commit_transaction_before_block]
end
raise TransactionCommitted if @__transaction_block__ == nix
end
raise TransactionCommitted if @__transaction_block__ == @__transaction_level__
if name.nil?
ss = "" if Transaction::Simple.debugging?
__commit_transaction
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'<' * @__transaction_level__} " <<
"Commit Transaction#{ss}\n"
end
else
unless @__transaction_names__.include?(name)
raise TransactionError, Messages[:cannot_commit_named_transaction] % name.inspect
end
ss = "(#{name})" if Transaction::Simple.debugging?
while @__transaction_names__[-1] != name
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'<' * @__transaction_level__} " <<
"Commit Transaction#{ss}\n"
end
__commit_transaction
end
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'<' * @__transaction_level__} " <<
"Commit Transaction#{ss}\n"
end
__commit_transaction
end
self
end
# Alternative method for calling the transaction methods. An optional
# name can be specified for named transaction support.
#
# #transaction(:start):: #start_transaction
# #transaction(:rewind):: #rewind_transaction
# #transaction(:abort):: #abort_transaction
# #transaction(:commit):: #commit_transaction
# #transaction(:name):: #transaction_name
# #transaction:: #transaction_open?
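#
# For example (a sketch; +v+ is any object already extended with Transaction::Simple):
#
# v.transaction(:start, :first) # same as v.start_transaction(:first)
# v.transaction # same as v.transaction_open?
# v.transaction(:commit, :first) # same as v.commit_transaction(:first)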
def transaction(action = nil, name = nil)
case action
when :start
start_transaction(name)
when :rewind
rewind_transaction(name)
when :abort
abort_transaction(name)
when :commit
commit_transaction(name)
when :name
transaction_name
when nil
transaction_open?(name)
end
end
# Allows specific variables to be excluded from transaction support.
# Must be done after extending the object but before starting the
# first transaction on the object.
#
# vv.transaction_exclusions << "@io"
def transaction_exclusions
@transaction_exclusions ||= []
end
class << self
def __common_start(name, vars, &block)
if vars.empty?
raise TransactionError, Messages[:cannot_start_empty_block_transaction]
end
if block
begin
vlevel = {}
vars.each do |vv|
vv.extend(Transaction::Simple)
vv.start_transaction(name)
vlevel[vv.__id__] = vv.instance_variable_get(:@__transaction_level__)
vv.instance_variable_set(:@__transaction_block__, vlevel[vv.__id__])
end
yield(*vars)
rescue TransactionAborted
vars.each do |vv|
if name.nil? and vv.transaction_open?
loop do
tlevel = vv.instance_variable_get(:@__transaction_level__) || -1
vv.instance_variable_set(:@__transaction_block__, -1)
break if tlevel < vlevel[vv.__id__]
vv.abort_transaction if vv.transaction_open?
end
elsif vv.transaction_open?(name)
vv.instance_variable_set(:@__transaction_block__, -1)
vv.abort_transaction(name)
end
end
rescue TransactionCommitted
nil
ensure
vars.each do |vv|
if name.nil? and vv.transaction_open?
loop do
tlevel = vv.instance_variable_get(:@__transaction_level__) || -1
break if tlevel < vlevel[vv.__id__]
vv.instance_variable_set(:@__transaction_block__, -1)
vv.commit_transaction if vv.transaction_open?
end
elsif vv.transaction_open?(name)
vv.instance_variable_set(:@__transaction_block__, -1)
vv.commit_transaction(name)
end
end
end
else
vars.each do |vv|
vv.extend(Transaction::Simple)
vv.start_transaction(name)
end
end
end
private :__common_start
def start_named(name, *vars, &block)
__common_start(name, vars, &block)
end
def start(*vars, &block)
__common_start(nil, vars, &block)
end
end
def __abort_transaction(name = nil) #:nodoc:
@__transaction_checkpoint__ = __rewind_this_transaction
if name.nil?
ss = "" if Transaction::Simple.debugging?
else
ss = "(#{name.inspect})" if Transaction::Simple.debugging?
end
if Transaction::Simple.debugging?
Transaction::Simple.debug_io << "#{'<' * @__transaction_level__} " <<
"Abort Transaction#{ss}\n"
end
@__transaction_level__ -= 1
@__transaction_names__.pop
if @__transaction_level__ < 1
@__transaction_level__ = 0
@__transaction_names__ = []
end
end
TRANSACTION_CHECKPOINT = "@__transaction_checkpoint__" #:nodoc:
SKIP_TRANSACTION_VARS = [TRANSACTION_CHECKPOINT, "@__transaction_level__"] #:nodoc:
def __rewind_this_transaction #:nodoc:
rr = Marshal.restore(@__transaction_checkpoint__)
begin
self.replace(rr) if respond_to?(:replace)
rescue
nil
end
rr.instance_variables.each do |vv|
next if SKIP_TRANSACTION_VARS.include?(vv)
next if self.transaction_exclusions.include?(vv)
if respond_to?(:instance_variable_get)
instance_variable_set(vv, rr.instance_variable_get(vv))
else
instance_eval(%q|#{vv} = rr.instance_eval("#{vv}")|)
end
end
new_ivar = instance_variables - rr.instance_variables - SKIP_TRANSACTION_VARS
new_ivar.each do |vv|
if respond_to?(:instance_variable_set)
instance_variable_set(vv, nil)
else
instance_eval(%q|#{vv} = nil|)
end
end
if respond_to?(:instance_variable_get)
rr.instance_variable_get(TRANSACTION_CHECKPOINT)
else
rr.instance_eval(TRANSACTION_CHECKPOINT)
end
end
def __commit_transaction #:nodoc:
if respond_to?(:instance_variable_get)
@__transaction_checkpoint__ = Marshal.restore(@__transaction_checkpoint__).instance_variable_get(TRANSACTION_CHECKPOINT)
else
@__transaction_checkpoint__ = Marshal.restore(@__transaction_checkpoint__).instance_eval(TRANSACTION_CHECKPOINT)
end
@__transaction_level__ -= 1
@__transaction_names__.pop
if @__transaction_level__ < 1
@__transaction_level__ = 0
@__transaction_names__ = []
end
end
private :__abort_transaction
private :__rewind_this_transaction
private :__commit_transaction
end
end

View file

@ -0,0 +1,9 @@
module ActiveRecord
module VERSION #:nodoc:
MAJOR = 1
MINOR = 14
TINY = 4
STRING = [MAJOR, MINOR, TINY].join('.')
end
end

View file

@ -0,0 +1,15 @@
require 'yaml'
module ActiveRecord
module Wrappings #:nodoc:
class YamlWrapper < AbstractWrapper #:nodoc:
def wrap(attribute) attribute.to_yaml end
def unwrap(attribute) YAML::load(attribute) end
end
module ClassMethods #:nodoc:
# Wraps the attribute in Yaml encoding
def wrap_in_yaml(*attributes) wrap_with(YamlWrapper, attributes) end
end
end
end

View file

@ -0,0 +1,59 @@
module ActiveRecord
# A plugin framework for wrapping attribute values before they go into the database and unwrapping them after they come out.
# This was intended primarily for YAML wrapping of arrays and hashes, but this behavior is now native in the Base class.
# So for now this framework is lying dormant until a need pops up.
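# Were it revived, usage would presumably look something like this sketch (the Document class
# and its +preferences+ column are assumed; wrap_in_yaml comes from the YAML wrapper file):
#
# class Document < ActiveRecord::Base
# wrap_in_yaml :preferences
# end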
module Wrappings #:nodoc:
module ClassMethods #:nodoc:
def wrap_with(wrapper, *attributes)
[ attributes ].flatten.each { |attribute| wrapper.wrap(attribute) }
end
end
def self.append_features(base)
super
base.extend(ClassMethods)
end
class AbstractWrapper #:nodoc:
def self.wrap(attribute, record_binding) #:nodoc:
%w( before_save after_save after_initialize ).each do |callback|
eval "#{callback} #{name}.new('#{attribute}')", record_binding
end
end
def initialize(attribute) #:nodoc:
@attribute = attribute
end
def save_wrapped_attribute(record) #:nodoc:
if record.attribute_present?(@attribute)
record.send(
"write_attribute",
@attribute,
wrap(record.send("read_attribute", @attribute))
)
end
end
def load_wrapped_attribute(record) #:nodoc:
if record.attribute_present?(@attribute)
record.send(
"write_attribute",
@attribute,
unwrap(record.send("read_attribute", @attribute))
)
end
end
alias_method :before_save, :save_wrapped_attribute #:nodoc:
alias_method :after_save, :load_wrapped_attribute #:nodoc:
alias_method :after_initialize, :after_save #:nodoc:
# Overwrite to implement the logic that'll take the regular attribute and wrap it.
def wrap(attribute) end
# Overwrite to implement the logic that'll take the wrapped attribute and unwrap it.
def unwrap(attribute) end
end
end
end

View file

@ -0,0 +1,55 @@
# The filename begins with "aaa" to ensure this is the first test.
require 'abstract_unit'
class AAACreateTablesTest < Test::Unit::TestCase
self.use_transactional_fixtures = false
def setup
@base_path = "#{File.dirname(__FILE__)}/fixtures/db_definitions"
end
def test_drop_and_create_main_tables
recreate ActiveRecord::Base
assert true
end
def test_load_schema
eval(File.read("#{File.dirname(__FILE__)}/fixtures/db_definitions/schema.rb"))
assert true
end
def test_drop_and_create_courses_table
recreate Course, '2'
assert true
end
private
def recreate(base, suffix = nil)
connection = base.connection
adapter_name = connection.adapter_name.downcase + suffix.to_s
execute_sql_file "#{@base_path}/#{adapter_name}.drop.sql", connection
execute_sql_file "#{@base_path}/#{adapter_name}.sql", connection
end
def execute_sql_file(path, connection)
# OpenBase has a different format for sql files
if current_adapter?(:OpenBaseAdapter) then
File.read(path).split("go").each_with_index do |sql, i|
begin
# OpenBase does not support comments embedded in sql
connection.execute(sql,"SQL statement ##{i}") unless sql.blank?
rescue ActiveRecord::StatementInvalid
#$stderr.puts "warning: #{$!}"
end
end
else
File.read(path).split(';').each_with_index do |sql, i|
begin
connection.execute("\n\n-- statement ##{i}\n#{sql}\n") unless sql.blank?
rescue ActiveRecord::StatementInvalid
#$stderr.puts "warning: #{$!}"
end
end
end
end
end

View file

@ -0,0 +1,67 @@
$:.unshift(File.dirname(__FILE__) + '/../lib')
$:.unshift(File.dirname(__FILE__) + '/../../activesupport/lib')
require 'test/unit'
require 'active_record'
require 'active_record/fixtures'
require 'active_support/binding_of_caller'
require 'active_support/breakpoint'
require 'connection'
QUOTED_TYPE = ActiveRecord::Base.connection.quote_column_name('type') unless Object.const_defined?(:QUOTED_TYPE)
class Test::Unit::TestCase #:nodoc:
self.fixture_path = File.dirname(__FILE__) + "/fixtures/"
self.use_instantiated_fixtures = false
self.use_transactional_fixtures = (ENV['AR_NO_TX_FIXTURES'] != "yes")
def create_fixtures(*table_names, &block)
Fixtures.create_fixtures(File.dirname(__FILE__) + "/fixtures/", table_names, {}, &block)
end
def assert_date_from_db(expected, actual, message = nil)
# SQL Server doesn't have a separate column type just for dates,
# so the time is in the string and incorrectly formatted
if current_adapter?(:SQLServerAdapter)
assert_equal expected.strftime("%Y/%m/%d 00:00:00"), actual.strftime("%Y/%m/%d 00:00:00")
elsif current_adapter?(:SybaseAdapter)
assert_equal expected.to_s, actual.to_date.to_s, message
else
assert_equal expected.to_s, actual.to_s, message
end
end
def assert_queries(num = 1)
ActiveRecord::Base.connection.class.class_eval do
self.query_count = 0
alias_method :execute, :execute_with_query_counting
end
yield
ensure
ActiveRecord::Base.connection.class.class_eval do
alias_method :execute, :execute_without_query_counting
end
assert_equal num, ActiveRecord::Base.connection.query_count, "#{ActiveRecord::Base.connection.query_count} instead of #{num} queries were executed."
end
def assert_no_queries(&block)
assert_queries(0, &block)
end
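# Illustrative usage of the query-counting helpers above (the Topic model and the expected
# count are assumptions, not part of this file):
#
# assert_queries(1) { Topic.find(1) }
# assert_no_queries { topic.title } # topic already loaded, so no SQL is issued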
end
def current_adapter?(type)
ActiveRecord::ConnectionAdapters.const_defined?(type) &&
ActiveRecord::Base.connection.instance_of?(ActiveRecord::ConnectionAdapters.const_get(type))
end
ActiveRecord::Base.connection.class.class_eval do
cattr_accessor :query_count
alias_method :execute_without_query_counting, :execute
def execute_with_query_counting(sql, name = nil)
self.query_count += 1
execute_without_query_counting(sql, name)
end
end
#ActiveRecord::Base.logger = Logger.new(STDOUT)
#ActiveRecord::Base.colorize_logging = false

View file

@ -0,0 +1,31 @@
require 'abstract_unit'
class ActiveSchemaTest < Test::Unit::TestCase
def setup
ActiveRecord::ConnectionAdapters::MysqlAdapter.class_eval do
alias_method :real_execute, :execute
def execute(sql, name = nil) return sql end
end
end
def teardown
ActiveRecord::ConnectionAdapters::MysqlAdapter.send(:alias_method, :execute, :real_execute)
end
def test_drop_table
assert_equal "DROP TABLE people", drop_table(:people)
end
def test_add_column
assert_equal "ALTER TABLE people ADD last_name varchar(255)", add_column(:people, :last_name, :string)
end
def test_add_column_with_limit
assert_equal "ALTER TABLE people ADD key varchar(32)", add_column(:people, :key, :string, :limit => 32)
end
private
def method_missing(method_symbol, *arguments)
ActiveRecord::Base.connection.send(method_symbol, *arguments)
end
end

View file

@ -0,0 +1,85 @@
require 'abstract_unit'
class AdapterTest < Test::Unit::TestCase
def setup
@connection = ActiveRecord::Base.connection
end
def test_tables
if @connection.respond_to?(:tables)
tables = @connection.tables
assert tables.include?("accounts")
assert tables.include?("authors")
assert tables.include?("tasks")
assert tables.include?("topics")
else
warn "#{@connection.class} does not respond to #tables"
end
end
def test_indexes
idx_name = "accounts_idx"
if @connection.respond_to?(:indexes)
indexes = @connection.indexes("accounts")
assert indexes.empty?
@connection.add_index :accounts, :firm_id, :name => idx_name
indexes = @connection.indexes("accounts")
assert_equal "accounts", indexes.first.table
# OpenBase does not have the concept of a named index
# Indexes are merely properties of columns.
assert_equal idx_name, indexes.first.name unless current_adapter?(:OpenBaseAdapter)
assert !indexes.first.unique
assert_equal ["firm_id"], indexes.first.columns
else
warn "#{@connection.class} does not respond to #indexes"
end
ensure
@connection.remove_index(:accounts, :name => idx_name) rescue nil
end
def test_current_database
if @connection.respond_to?(:current_database)
assert_equal ENV['ARUNIT_DB_NAME'] || "activerecord_unittest", @connection.current_database
end
end
def test_table_alias
def @connection.test_table_alias_length() 10; end
class << @connection
alias_method :old_table_alias_length, :table_alias_length
alias_method :table_alias_length, :test_table_alias_length
end
assert_equal 'posts', @connection.table_alias_for('posts')
assert_equal 'posts_comm', @connection.table_alias_for('posts_comments')
assert_equal 'dbo_posts', @connection.table_alias_for('dbo.posts')
class << @connection
alias_method :table_alias_length, :old_table_alias_length
end
end
# test resetting sequences in odd tables in PostgreSQL
if ActiveRecord::Base.connection.respond_to?(:reset_pk_sequence!)
require 'fixtures/movie'
require 'fixtures/subscriber'
def test_reset_empty_table_with_custom_pk
Movie.delete_all
Movie.connection.reset_pk_sequence! 'movies'
assert_equal 1, Movie.create(:name => 'fight club').id
end
def test_reset_table_with_non_integer_pk
Subscriber.delete_all
Subscriber.connection.reset_pk_sequence! 'subscribers'
sub = Subscriber.new(:name => 'robert drake')
sub.id = 'bob drake'
assert_nothing_raised { sub.save! }
end
end
end

View file

@ -0,0 +1,66 @@
require 'abstract_unit'
require 'fixtures/customer'
class AggregationsTest < Test::Unit::TestCase
fixtures :customers
def test_find_single_value_object
assert_equal 50, customers(:david).balance.amount
assert_kind_of Money, customers(:david).balance
assert_equal 300, customers(:david).balance.exchange_to("DKK").amount
end
def test_find_multiple_value_object
assert_equal customers(:david).address_street, customers(:david).address.street
assert(
customers(:david).address.close_to?(Address.new("Different Street", customers(:david).address_city, customers(:david).address_country))
)
end
def test_change_single_value_object
customers(:david).balance = Money.new(100)
customers(:david).save
assert_equal 100, Customer.find(1).balance.amount
end
def test_immutable_value_objects
customers(:david).balance = Money.new(100)
assert_raises(TypeError) { customers(:david).balance.instance_eval { @amount = 20 } }
end
def test_inferred_mapping
assert_equal "35.544623640962634", customers(:david).gps_location.latitude
assert_equal "-105.9309951055148", customers(:david).gps_location.longitude
customers(:david).gps_location = GpsLocation.new("39x-110")
assert_equal "39", customers(:david).gps_location.latitude
assert_equal "-110", customers(:david).gps_location.longitude
customers(:david).save
customers(:david).reload
assert_equal "39", customers(:david).gps_location.latitude
assert_equal "-110", customers(:david).gps_location.longitude
end
def test_reloaded_instance_refreshes_aggregations
assert_equal "35.544623640962634", customers(:david).gps_location.latitude
assert_equal "-105.9309951055148", customers(:david).gps_location.longitude
Customer.update_all("gps_location = '24x113'")
customers(:david).reload
assert_equal '24x113', customers(:david)['gps_location']
assert_equal GpsLocation.new('24x113'), customers(:david).gps_location
end
def test_gps_equality
assert GpsLocation.new('39x110') == GpsLocation.new('39x110')
end
def test_gps_inequality
assert GpsLocation.new('39x110') != GpsLocation.new('39x111')
end
end

8
vendor/rails/activerecord/test/all.sh vendored Executable file
View file

@ -0,0 +1,8 @@
#!/bin/sh
if [ -z "$1" ]; then
echo "Usage: $0 connections/<db_library>" 1>&2
exit 1
fi
ruby -I $1 -e 'Dir.foreach(".") { |file| require file if file =~ /_test.rb$/ }'

View file

@ -0,0 +1,33 @@
require 'abstract_unit'
require "#{File.dirname(__FILE__)}/../lib/active_record/schema"
if ActiveRecord::Base.connection.supports_migrations?
class ActiveRecordSchemaTest < Test::Unit::TestCase
self.use_transactional_fixtures = false
def setup
@connection = ActiveRecord::Base.connection
end
def teardown
@connection.drop_table :fruits rescue nil
end
def test_schema_define
ActiveRecord::Schema.define(:version => 7) do
create_table :fruits do |t|
t.column :color, :string
t.column :fruit_size, :string # NOTE: "size" is reserved in Oracle
t.column :texture, :string
t.column :flavor, :string
end
end
assert_nothing_raised { @connection.select_all "SELECT * FROM fruits" }
assert_nothing_raised { @connection.select_all "SELECT * FROM schema_info" }
assert_equal 7, @connection.select_one("SELECT version FROM schema_info")['version'].to_i
end
end
end

View file

@ -0,0 +1,124 @@
require 'abstract_unit'
require 'fixtures/post'
require 'fixtures/comment'
require 'fixtures/author'
require 'fixtures/category'
require 'fixtures/project'
require 'fixtures/developer'
class AssociationCallbacksTest < Test::Unit::TestCase
fixtures :posts, :authors, :projects, :developers
def setup
@david = authors(:david)
@thinking = posts(:thinking)
@authorless = posts(:authorless)
assert @david.post_log.empty?
end
def test_adding_macro_callbacks
@david.posts_with_callbacks << @thinking
assert_equal ["before_adding#{@thinking.id}", "after_adding#{@thinking.id}"], @david.post_log
@david.posts_with_callbacks << @thinking
assert_equal ["before_adding#{@thinking.id}", "after_adding#{@thinking.id}", "before_adding#{@thinking.id}",
"after_adding#{@thinking.id}"], @david.post_log
end
def test_adding_with_proc_callbacks
@david.posts_with_proc_callbacks << @thinking
assert_equal ["before_adding#{@thinking.id}", "after_adding#{@thinking.id}"], @david.post_log
@david.posts_with_proc_callbacks << @thinking
assert_equal ["before_adding#{@thinking.id}", "after_adding#{@thinking.id}", "before_adding#{@thinking.id}",
"after_adding#{@thinking.id}"], @david.post_log
end
def test_removing_with_macro_callbacks
first_post, second_post = @david.posts_with_callbacks[0, 2]
@david.posts_with_callbacks.delete(first_post)
assert_equal ["before_removing#{first_post.id}", "after_removing#{first_post.id}"], @david.post_log
@david.posts_with_callbacks.delete(second_post)
assert_equal ["before_removing#{first_post.id}", "after_removing#{first_post.id}", "before_removing#{second_post.id}",
"after_removing#{second_post.id}"], @david.post_log
end
def test_removing_with_proc_callbacks
first_post, second_post = @david.posts_with_callbacks[0, 2]
@david.posts_with_proc_callbacks.delete(first_post)
assert_equal ["before_removing#{first_post.id}", "after_removing#{first_post.id}"], @david.post_log
@david.posts_with_proc_callbacks.delete(second_post)
assert_equal ["before_removing#{first_post.id}", "after_removing#{first_post.id}", "before_removing#{second_post.id}",
"after_removing#{second_post.id}"], @david.post_log
end
def test_multiple_callbacks
@david.posts_with_multiple_callbacks << @thinking
assert_equal ["before_adding#{@thinking.id}", "before_adding_proc#{@thinking.id}", "after_adding#{@thinking.id}",
"after_adding_proc#{@thinking.id}"], @david.post_log
@david.posts_with_multiple_callbacks << @thinking
assert_equal ["before_adding#{@thinking.id}", "before_adding_proc#{@thinking.id}", "after_adding#{@thinking.id}",
"after_adding_proc#{@thinking.id}", "before_adding#{@thinking.id}", "before_adding_proc#{@thinking.id}",
"after_adding#{@thinking.id}", "after_adding_proc#{@thinking.id}"], @david.post_log
end
def test_has_and_belongs_to_many_add_callback
david = developers(:david)
ar = projects(:active_record)
assert ar.developers_log.empty?
ar.developers_with_callbacks << david
assert_equal ["before_adding#{david.id}", "after_adding#{david.id}"], ar.developers_log
ar.developers_with_callbacks << david
assert_equal ["before_adding#{david.id}", "after_adding#{david.id}", "before_adding#{david.id}",
"after_adding#{david.id}"], ar.developers_log
end
def test_has_and_belongs_to_many_remove_callback
david = developers(:david)
jamis = developers(:jamis)
activerecord = projects(:active_record)
assert activerecord.developers_log.empty?
activerecord.developers_with_callbacks.delete(david)
assert_equal ["before_removing#{david.id}", "after_removing#{david.id}"], activerecord.developers_log
activerecord.developers_with_callbacks.delete(jamis)
assert_equal ["before_removing#{david.id}", "after_removing#{david.id}", "before_removing#{jamis.id}",
"after_removing#{jamis.id}"], activerecord.developers_log
end
def test_has_and_belongs_to_many_remove_callback_on_clear
activerecord = projects(:active_record)
assert activerecord.developers_log.empty?
if activerecord.developers_with_callbacks.size == 0
activerecord.developers << developers(:david)
activerecord.developers << developers(:jamis)
activerecord.reload
assert activerecord.developers_with_callbacks.size == 2
end
log_array = activerecord.developers_with_callbacks.collect {|d| ["before_removing#{d.id}","after_removing#{d.id}"]}.flatten.sort
assert activerecord.developers_with_callbacks.clear
assert_equal log_array, activerecord.developers_log.sort
end
def test_dont_add_if_before_callback_raises_exception
assert !@david.unchangable_posts.include?(@authorless)
begin
@david.unchangable_posts << @authorless
rescue Exception => e
end
assert @david.post_log.empty?
assert !@david.unchangable_posts.include?(@authorless)
@david.reload
assert !@david.unchangable_posts.include?(@authorless)
end
def test_push_with_attributes
david = developers(:david)
activerecord = projects(:active_record)
assert activerecord.developers_log.empty?
activerecord.developers_with_callbacks.push_with_attributes(david, {})
assert_equal ["before_adding#{david.id}", "after_adding#{david.id}"], activerecord.developers_log
activerecord.developers_with_callbacks.push_with_attributes(david, {})
assert_equal ["before_adding#{david.id}", "after_adding#{david.id}", "before_adding#{david.id}",
"after_adding#{david.id}"], activerecord.developers_log
end
end

View file

@ -0,0 +1,14 @@
require 'abstract_unit'
require 'fixtures/company'
class AssociationInheritanceReloadTest < Test::Unit::TestCase
fixtures :companies
def test_set_attributes
assert_equal ["errors.add_on_empty('name', \"can't be empty\")"], Firm.read_inheritable_attribute("validate"), "Second run"
# ActiveRecord::Base.reset_column_information_and_inheritable_attributes_for_all_subclasses
remove_subclass_of(ActiveRecord::Base)
load 'fixtures/company.rb'
assert_equal ["errors.add_on_empty('name', \"can't be empty\")"], Firm.read_inheritable_attribute("validate"), "Second run"
end
end

View file

@ -0,0 +1,106 @@
require 'abstract_unit'
require 'active_record/acts/list'
require 'fixtures/post'
require 'fixtures/comment'
require 'fixtures/author'
require 'fixtures/category'
require 'fixtures/categorization'
require 'fixtures/mixin'
require 'fixtures/company'
require 'fixtures/topic'
require 'fixtures/reply'
class CascadedEagerLoadingTest < Test::Unit::TestCase
fixtures :authors, :mixins, :companies, :posts, :categorizations, :topics
def test_eager_association_loading_with_cascaded_two_levels
authors = Author.find(:all, :include=>{:posts=>:comments}, :order=>"authors.id")
assert_equal 2, authors.size
assert_equal 5, authors[0].posts.size
assert_equal 1, authors[1].posts.size
assert_equal 9, authors[0].posts.collect{|post| post.comments.size }.inject(0){|sum,i| sum+i}
end
def test_eager_association_loading_with_cascaded_two_levels_and_one_level
authors = Author.find(:all, :include=>[{:posts=>:comments}, :categorizations], :order=>"authors.id")
assert_equal 2, authors.size
assert_equal 5, authors[0].posts.size
assert_equal 1, authors[1].posts.size
assert_equal 9, authors[0].posts.collect{|post| post.comments.size }.inject(0){|sum,i| sum+i}
assert_equal 1, authors[0].categorizations.size
assert_equal 1, authors[1].categorizations.size
end
def test_eager_association_loading_with_cascaded_two_levels_with_two_has_many_associations
authors = Author.find(:all, :include=>{:posts=>[:comments, :categorizations]}, :order=>"authors.id")
assert_equal 2, authors.size
assert_equal 5, authors[0].posts.size
assert_equal 1, authors[1].posts.size
assert_equal 9, authors[0].posts.collect{|post| post.comments.size }.inject(0){|sum,i| sum+i}
end
def test_eager_association_loading_with_cascaded_two_levels_and_self_table_reference
authors = Author.find(:all, :include=>{:posts=>[:comments, :author]}, :order=>"authors.id")
assert_equal 2, authors.size
assert_equal 5, authors[0].posts.size
assert_equal authors(:david).name, authors[0].name
assert_equal [authors(:david).name], authors[0].posts.collect{|post| post.author.name}.uniq
end
def test_eager_association_loading_with_cascaded_two_levels_with_condition
authors = Author.find(:all, :include=>{:posts=>:comments}, :conditions=>"authors.id=1", :order=>"authors.id")
assert_equal 1, authors.size
assert_equal 5, authors[0].posts.size
end
def test_eager_association_loading_with_acts_as_tree
roots = TreeMixin.find(:all, :include=>"children", :conditions=>"mixins.parent_id IS NULL", :order=>"mixins.id")
assert_equal [mixins(:tree_1), mixins(:tree2_1), mixins(:tree3_1)], roots
assert_no_queries do
assert_equal 2, roots[0].children.size
assert_equal 0, roots[1].children.size
assert_equal 0, roots[2].children.size
end
end
def test_eager_association_loading_with_cascaded_three_levels_by_ping_pong
firms = Firm.find(:all, :include=>{:account=>{:firm=>:account}}, :order=>"companies.id")
assert_equal 2, firms.size
assert_equal firms.first.account, firms.first.account.firm.account
assert_equal companies(:first_firm).account, assert_no_queries { firms.first.account.firm.account }
assert_equal companies(:first_firm).account.firm.account, assert_no_queries { firms.first.account.firm.account }
end
def test_eager_association_loading_with_has_many_sti
topics = Topic.find(:all, :include => :replies, :order => 'topics.id')
assert_equal [topics(:first), topics(:second)], topics
assert_no_queries do
assert_equal 1, topics[0].replies.size
assert_equal 0, topics[1].replies.size
end
end
def test_eager_association_loading_with_belongs_to_sti
replies = Reply.find(:all, :include => :topic, :order => 'topics.id')
assert_equal [topics(:second)], replies
assert_equal topics(:first), assert_no_queries { replies.first.topic }
end
def test_eager_association_loading_with_multiple_stis_and_order
author = Author.find(:first, :include => { :posts => [ :special_comments , :very_special_comment ] }, :order => 'authors.name, comments.body, very_special_comments_posts.body', :conditions => 'posts.id = 4')
assert_equal authors(:david), author
assert_no_queries do
author.posts.first.special_comments
author.posts.first.very_special_comment
end
end
def test_eager_association_loading_of_stis_with_multiple_references
authors = Author.find(:all, :include => { :posts => { :special_comments => { :post => [ :special_comments, :very_special_comment ] } } }, :order => 'comments.body, very_special_comments_posts.body', :conditions => 'posts.id = 4')
assert_equal [authors(:david)], authors
assert_no_queries do
authors.first.posts.first.special_comments.first.post.special_comments
authors.first.posts.first.special_comments.first.post.very_special_comment
end
end
end
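# Illustrative sketch (not part of the original file): the :include option
# exercised above accepts a symbol, an array, or a nested hash, e.g.
#
#   Author.find(:all, :include => :posts)                               # one association
#   Author.find(:all, :include => [:posts, :categorizations])           # several associations
#   Author.find(:all, :include => { :posts => [:comments, :author] })   # cascaded two levels
#
# The cascaded forms are what the assert_no_queries blocks above rely on:
# every nested association is loaded up front, so touching it afterwards
# issues no further SQL.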

View file

@ -0,0 +1,37 @@
require 'abstract_unit'
require 'fixtures/post'
require 'fixtures/comment'
require 'fixtures/project'
require 'fixtures/developer'
class AssociationsExtensionsTest < Test::Unit::TestCase
fixtures :projects, :developers, :developers_projects, :comments, :posts
def test_extension_on_has_many
assert_equal comments(:more_greetings), posts(:welcome).comments.find_most_recent
end
def test_extension_on_habtm
assert_equal projects(:action_controller), developers(:david).projects.find_most_recent
end
def test_named_extension_on_habtm
assert_equal projects(:action_controller), developers(:david).projects_extended_by_name.find_most_recent
end
def test_marshalling_extensions
david = developers(:david)
assert_equal projects(:action_controller), david.projects.find_most_recent
david = Marshal.load(Marshal.dump(david))
assert_equal projects(:action_controller), david.projects.find_most_recent
end
def test_marshalling_named_extensions
david = developers(:david)
assert_equal projects(:action_controller), david.projects_extended_by_name.find_most_recent
david = Marshal.load(Marshal.dump(david))
assert_equal projects(:action_controller), david.projects_extended_by_name.find_most_recent
end
end

View file

@ -0,0 +1,359 @@
require 'abstract_unit'
require 'fixtures/post'
require 'fixtures/comment'
require 'fixtures/author'
require 'fixtures/category'
require 'fixtures/company'
require 'fixtures/person'
require 'fixtures/reader'
class EagerAssociationTest < Test::Unit::TestCase
fixtures :posts, :comments, :authors, :categories, :categories_posts,
:companies, :accounts, :tags, :people, :readers
def test_loading_with_one_association
posts = Post.find(:all, :include => :comments)
post = posts.find { |p| p.id == 1 }
assert_equal 2, post.comments.size
assert post.comments.include?(comments(:greetings))
post = Post.find(:first, :include => :comments, :conditions => "posts.title = 'Welcome to the weblog'")
assert_equal 2, post.comments.size
assert post.comments.include?(comments(:greetings))
end
def test_loading_conditions_with_or
posts = authors(:david).posts.find(:all, :include => :comments, :conditions => "comments.body like 'Normal%' OR comments.#{QUOTED_TYPE} = 'SpecialComment'")
assert_nil posts.detect { |p| p.author_id != authors(:david).id },
"expected to find only david's posts"
end
def test_with_ordering
list = Post.find(:all, :include => :comments, :order => "posts.id DESC")
[:eager_other, :sti_habtm, :sti_post_and_comments, :sti_comments,
:authorless, :thinking, :welcome
].each_with_index do |post, index|
assert_equal posts(post), list[index]
end
end
def test_loading_with_multiple_associations
posts = Post.find(:all, :include => [ :comments, :author, :categories ], :order => "posts.id")
assert_equal 2, posts.first.comments.size
assert_equal 2, posts.first.categories.size
assert posts.first.comments.include?(comments(:greetings))
end
def test_loading_from_an_association
posts = authors(:david).posts.find(:all, :include => :comments, :order => "posts.id")
assert_equal 2, posts.first.comments.size
end
def test_loading_with_no_associations
assert_nil Post.find(posts(:authorless).id, :include => :author).author
end
def test_eager_association_loading_with_belongs_to
comments = Comment.find(:all, :include => :post)
assert_equal 10, comments.length
titles = comments.map { |c| c.post.title }
assert titles.include?(posts(:welcome).title)
assert titles.include?(posts(:sti_post_and_comments).title)
end
def test_eager_association_loading_with_belongs_to_and_limit
comments = Comment.find(:all, :include => :post, :limit => 5, :order => 'comments.id')
assert_equal 5, comments.length
assert_equal [1,2,3,5,6], comments.collect { |c| c.id }
end
def test_eager_association_loading_with_belongs_to_and_limit_and_conditions
comments = Comment.find(:all, :include => :post, :conditions => 'post_id = 4', :limit => 3, :order => 'comments.id')
assert_equal 3, comments.length
assert_equal [5,6,7], comments.collect { |c| c.id }
end
def test_eager_association_loading_with_belongs_to_and_limit_and_offset
comments = Comment.find(:all, :include => :post, :limit => 3, :offset => 2, :order => 'comments.id')
assert_equal 3, comments.length
assert_equal [3,5,6], comments.collect { |c| c.id }
end
def test_eager_association_loading_with_belongs_to_and_limit_and_offset_and_conditions
comments = Comment.find(:all, :include => :post, :conditions => 'post_id = 4', :limit => 3, :offset => 1, :order => 'comments.id')
assert_equal 3, comments.length
assert_equal [6,7,8], comments.collect { |c| c.id }
end
def test_eager_association_loading_with_belongs_to_and_limit_and_offset_and_conditions_array
comments = Comment.find(:all, :include => :post, :conditions => ['post_id = ?',4], :limit => 3, :offset => 1, :order => 'comments.id')
assert_equal 3, comments.length
assert_equal [6,7,8], comments.collect { |c| c.id }
end
def test_eager_association_loading_with_belongs_to_and_limit_and_multiple_associations
posts = Post.find(:all, :include => [:author, :very_special_comment], :limit => 1, :order => 'posts.id')
assert_equal 1, posts.length
assert_equal [1], posts.collect { |p| p.id }
end
def test_eager_association_loading_with_belongs_to_and_limit_and_offset_and_multiple_associations
posts = Post.find(:all, :include => [:author, :very_special_comment], :limit => 1, :offset => 1, :order => 'posts.id')
assert_equal 1, posts.length
assert_equal [2], posts.collect { |p| p.id }
end
def test_eager_with_has_many_through
posts_with_comments = people(:michael).posts.find(:all, :include => :comments )
posts_with_author = people(:michael).posts.find(:all, :include => :author )
posts_with_comments_and_author = people(:michael).posts.find(:all, :include => [ :comments, :author ])
assert_equal 2, posts_with_comments.inject(0) { |sum, post| sum += post.comments.size }
assert_equal authors(:david), assert_no_queries { posts_with_author.first.author }
assert_equal authors(:david), assert_no_queries { posts_with_comments_and_author.first.author }
end
def test_eager_with_has_many_and_limit
posts = Post.find(:all, :order => 'posts.id asc', :include => [ :author, :comments ], :limit => 2)
assert_equal 2, posts.size
assert_equal 3, posts.inject(0) { |sum, post| sum += post.comments.size }
end
def test_eager_with_has_many_and_limit_and_conditions
posts = Post.find(:all, :include => [ :author, :comments ], :limit => 2, :conditions => "posts.body = 'hello'", :order => "posts.id")
assert_equal 2, posts.size
assert_equal [4,5], posts.collect { |p| p.id }
end
def test_eager_with_has_many_and_limit_and_conditions_array
posts = Post.find(:all, :include => [ :author, :comments ], :limit => 2, :conditions => [ "posts.body = ?", 'hello' ], :order => "posts.id")
assert_equal 2, posts.size
assert_equal [4,5], posts.collect { |p| p.id }
end
def test_eager_with_has_many_and_limit_and_conditions_array_on_the_eagers
posts = Post.find(:all, :include => [ :author, :comments ], :limit => 2, :conditions => [ "authors.name = ?", 'David' ])
assert_equal 2, posts.size
count = Post.count(:include => [ :author, :comments ], :limit => 2, :conditions => [ "authors.name = ?", 'David' ])
assert_equal count, posts.size
end
def test_eager_with_has_many_and_limit_and_high_offset
posts = Post.find(:all, :include => [ :author, :comments ], :limit => 2, :offset => 10, :conditions => [ "authors.name = ?", 'David' ])
assert_equal 0, posts.size
end
def test_count_eager_with_has_many_and_limit_and_high_offset
posts = Post.count(:all, :include => [ :author, :comments ], :limit => 2, :offset => 10, :conditions => [ "authors.name = ?", 'David' ])
assert_equal 0, posts
end
def test_eager_with_has_many_and_limit_with_no_results
posts = Post.find(:all, :include => [ :author, :comments ], :limit => 2, :conditions => "posts.title = 'magic forest'")
assert_equal 0, posts.size
end
def test_eager_with_has_and_belongs_to_many_and_limit
posts = Post.find(:all, :include => :categories, :order => "posts.id", :limit => 3)
assert_equal 3, posts.size
assert_equal 2, posts[0].categories.size
assert_equal 1, posts[1].categories.size
assert_equal 0, posts[2].categories.size
assert posts[0].categories.include?(categories(:technology))
assert posts[1].categories.include?(categories(:general))
end
def test_eager_with_has_many_and_limit_and_conditions_on_the_eagers
posts = authors(:david).posts.find(:all,
:include => :comments,
:conditions => "comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment'",
:limit => 2
)
assert_equal 2, posts.size
count = Post.count(
:include => [ :comments, :author ],
:conditions => "authors.name = 'David' AND (comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment')",
:limit => 2
)
assert_equal count, posts.size
end
def test_eager_with_has_many_and_limit_and_scoped_conditions_on_the_eagers
posts = nil
Post.with_scope(:find => {
:include => :comments,
:conditions => "comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment'"
}) do
posts = authors(:david).posts.find(:all, :limit => 2)
assert_equal 2, posts.size
end
Post.with_scope(:find => {
:include => [ :comments, :author ],
:conditions => "authors.name = 'David' AND (comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment')"
}) do
count = Post.count(:limit => 2)
assert_equal count, posts.size
end
end
def test_eager_with_has_many_and_limit_and_scoped_and_explicit_conditions_on_the_eagers
Post.with_scope(:find => { :conditions => "1=1" }) do
posts = authors(:david).posts.find(:all,
:include => :comments,
:conditions => "comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment'",
:limit => 2
)
assert_equal 2, posts.size
count = Post.count(
:include => [ :comments, :author ],
:conditions => "authors.name = 'David' AND (comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment')",
:limit => 2
)
assert_equal count, posts.size
end
end
def test_eager_association_loading_with_habtm
posts = Post.find(:all, :include => :categories, :order => "posts.id")
assert_equal 2, posts[0].categories.size
assert_equal 1, posts[1].categories.size
assert_equal 0, posts[2].categories.size
assert posts[0].categories.include?(categories(:technology))
assert posts[1].categories.include?(categories(:general))
end
def test_eager_with_inheritance
posts = SpecialPost.find(:all, :include => [ :comments ])
end
def test_eager_has_one_with_association_inheritance
post = Post.find(4, :include => [ :very_special_comment ])
assert_equal "VerySpecialComment", post.very_special_comment.class.to_s
end
def test_eager_has_many_with_association_inheritance
post = Post.find(4, :include => [ :special_comments ])
post.special_comments.each do |special_comment|
assert_equal "SpecialComment", special_comment.class.to_s
end
end
def test_eager_habtm_with_association_inheritance
post = Post.find(6, :include => [ :special_categories ])
assert_equal 1, post.special_categories.size
post.special_categories.each do |special_category|
assert_equal "SpecialCategory", special_category.class.to_s
end
end
def test_eager_with_has_one_dependent_does_not_destroy_dependent
assert_not_nil companies(:first_firm).account
f = Firm.find(:first, :include => :account,
:conditions => ["companies.name = ?", "37signals"])
assert_not_nil f.account
assert_equal companies(:first_firm, :reload).account, f.account
end
def test_eager_with_invalid_association_reference
assert_raises(ActiveRecord::ConfigurationError, "Association was not found; perhaps you misspelled it? You specified :include => :monkeys") {
post = Post.find(6, :include=> :monkeys )
}
assert_raises(ActiveRecord::ConfigurationError, "Association was not found; perhaps you misspelled it? You specified :include => :monkeys") {
post = Post.find(6, :include=>[ :monkeys ])
}
assert_raises(ActiveRecord::ConfigurationError, "Association was not found; perhaps you misspelled it? You specified :include => :monkeys") {
post = Post.find(6, :include=>[ 'monkeys' ])
}
assert_raises(ActiveRecord::ConfigurationError, "Association was not found; perhaps you misspelled it? You specified :include => :monkeys, :elephants") {
post = Post.find(6, :include=>[ :monkeys, :elephants ])
}
end
def find_all_ordered(className, include=nil)
className.find(:all, :order=>"#{className.table_name}.#{className.primary_key}", :include=>include)
end
def test_eager_with_multiple_associations_with_same_table_has_many_and_habtm
# Eager includes of has many and habtm associations aren't necessarily sorted in the same way
def assert_equal_after_sort(item1, item2, item3 = nil)
assert_equal(item1.sort{|a,b| a.id <=> b.id}, item2.sort{|a,b| a.id <=> b.id})
assert_equal(item3.sort{|a,b| a.id <=> b.id}, item2.sort{|a,b| a.id <=> b.id}) if item3
end
# Test regular association, association with conditions, association with
# STI, and association with conditions assured not to be true
post_types = [:posts, :hello_posts, :special_posts, :nonexistent_posts]
# test both has_many and has_and_belongs_to_many
[Author, Category].each do |className|
d1 = find_all_ordered(className)
# test including all post types at once
d2 = find_all_ordered(className, post_types)
d1.each_index do |i|
assert_equal(d1[i], d2[i])
assert_equal_after_sort(d1[i].posts, d2[i].posts)
post_types[1..-1].each do |post_type|
# test including post_types together
d3 = find_all_ordered(className, [:posts, post_type])
assert_equal(d1[i], d3[i])
assert_equal_after_sort(d1[i].posts, d3[i].posts)
assert_equal_after_sort(d1[i].send(post_type), d2[i].send(post_type), d3[i].send(post_type))
end
end
end
end
def test_eager_with_multiple_associations_with_same_table_has_one
d1 = find_all_ordered(Firm)
d2 = find_all_ordered(Firm, :account)
d1.each_index do |i|
assert_equal(d1[i], d2[i])
assert_equal(d1[i].account, d2[i].account)
end
end
def test_eager_with_multiple_associations_with_same_table_belongs_to
firm_types = [:firm, :firm_with_basic_id, :firm_with_other_name, :firm_with_condition]
d1 = find_all_ordered(Client)
d2 = find_all_ordered(Client, firm_types)
d1.each_index do |i|
assert_equal(d1[i], d2[i])
firm_types.each { |type| assert_equal(d1[i].send(type), d2[i].send(type)) }
end
end
def test_eager_with_valid_association_as_string_not_symbol
assert_nothing_raised { Post.find(:all, :include => 'comments') }
end
def test_preconfigured_includes_with_belongs_to
author = posts(:welcome).author_with_posts
assert_equal 5, author.posts.size
end
def test_preconfigured_includes_with_has_one
comment = posts(:sti_comments).very_special_comment_with_post
assert_equal posts(:sti_comments), comment.post
end
def test_preconfigured_includes_with_has_many
posts = authors(:david).posts_with_comments
one = posts.detect { |p| p.id == 1 }
assert_equal 5, posts.size
assert_equal 2, one.comments.size
end
def test_preconfigured_includes_with_habtm
posts = authors(:david).posts_with_categories
one = posts.detect { |p| p.id == 1 }
assert_equal 5, posts.size
assert_equal 2, one.categories.size
end
def test_preconfigured_includes_with_has_many_and_habtm
posts = authors(:david).posts_with_comments_and_categories
one = posts.detect { |p| p.id == 1 }
assert_equal 5, posts.size
assert_equal 2, one.comments.size
assert_equal 2, one.categories.size
end
end

View file

@ -0,0 +1,370 @@
require 'abstract_unit'
require 'fixtures/tag'
require 'fixtures/tagging'
require 'fixtures/post'
require 'fixtures/comment'
require 'fixtures/author'
require 'fixtures/category'
require 'fixtures/categorization'
class AssociationsJoinModelTest < Test::Unit::TestCase
self.use_transactional_fixtures = false
fixtures :posts, :authors, :categories, :categorizations, :comments, :tags, :taggings, :author_favorites
def test_has_many
assert_equal categories(:general), authors(:david).categories.first
end
def test_has_many_inherited
assert_equal categories(:sti_test), authors(:mary).categories.first
end
def test_inherited_has_many
assert_equal authors(:mary), categories(:sti_test).authors.first
end
def test_polymorphic_has_many
assert_equal taggings(:welcome_general), posts(:welcome).taggings.first
end
def test_polymorphic_has_one
assert_equal taggings(:welcome_general), posts(:welcome).tagging
end
def test_polymorphic_belongs_to
assert_equal posts(:welcome), posts(:welcome).taggings.first.taggable
end
def test_polymorphic_has_many_going_through_join_model
assert_equal tags(:general), tag = posts(:welcome).tags.first
assert_no_queries do
tag.tagging
end
end
def test_count_polymorphic_has_many
assert_equal 1, posts(:welcome).taggings.count
assert_equal 1, posts(:welcome).tags.count
end
def test_polymorphic_has_many_going_through_join_model_with_find
assert_equal tags(:general), tag = posts(:welcome).tags.find(:first)
assert_no_queries do
tag.tagging
end
end
def test_polymorphic_has_many_going_through_join_model_with_include_on_source_reflection
assert_equal tags(:general), tag = posts(:welcome).funky_tags.first
assert_no_queries do
tag.tagging
end
end
def test_polymorphic_has_many_going_through_join_model_with_include_on_source_reflection_with_find
assert_equal tags(:general), tag = posts(:welcome).funky_tags.find(:first)
assert_no_queries do
tag.tagging
end
end
def test_polymorphic_has_many_going_through_join_model_with_disabled_include
assert_equal tags(:general), tag = posts(:welcome).tags.add_joins_and_select.first
assert_queries 1 do
tag.tagging
end
end
def test_polymorphic_has_many_going_through_join_model_with_custom_select_and_joins
assert_equal tags(:general), tag = posts(:welcome).tags.add_joins_and_select.first
tag.author_id
end
def test_polymorphic_has_many_going_through_join_model_with_custom_foreign_key
assert_equal tags(:misc), taggings(:welcome_general).super_tag
assert_equal tags(:misc), posts(:welcome).super_tags.first
end
def test_polymorphic_has_many_create_model_with_inheritance_and_custom_base_class
post = SubStiPost.create :title => 'SubStiPost', :body => 'SubStiPost body'
assert_instance_of SubStiPost, post
tagging = tags(:misc).taggings.create(:taggable => post)
assert_equal "SubStiPost", tagging.taggable_type
end
def test_polymorphic_has_many_going_through_join_model_with_inheritance
assert_equal tags(:general), posts(:thinking).tags.first
end
def test_polymorphic_has_many_going_through_join_model_with_inheritance_with_custom_class_name
assert_equal tags(:general), posts(:thinking).funky_tags.first
end
def test_polymorphic_has_many_create_model_with_inheritance
post = posts(:thinking)
assert_instance_of SpecialPost, post
tagging = tags(:misc).taggings.create(:taggable => post)
assert_equal "Post", tagging.taggable_type
end
def test_polymorphic_has_one_create_model_with_inheritance
tagging = tags(:misc).create_tagging(:taggable => posts(:thinking))
assert_equal "Post", tagging.taggable_type
end
def test_set_polymorphic_has_many
tagging = tags(:misc).taggings.create
posts(:thinking).taggings << tagging
assert_equal "Post", tagging.taggable_type
end
def test_set_polymorphic_has_one
tagging = tags(:misc).taggings.create
posts(:thinking).tagging = tagging
assert_equal "Post", tagging.taggable_type
end
def test_create_polymorphic_has_many_with_scope
old_count = posts(:welcome).taggings.count
tagging = posts(:welcome).taggings.create(:tag => tags(:misc))
assert_equal "Post", tagging.taggable_type
assert_equal old_count+1, posts(:welcome).taggings.count
end
def test_create_polymorphic_has_one_with_scope
old_count = Tagging.count
tagging = posts(:welcome).tagging.create(:tag => tags(:misc))
assert_equal "Post", tagging.taggable_type
assert_equal old_count+1, Tagging.count
end
def test_delete_polymorphic_has_many_with_delete_all
assert_equal 1, posts(:welcome).taggings.count
posts(:welcome).taggings.first.update_attribute :taggable_type, 'PostWithHasManyDeleteAll'
post = find_post_with_dependency(1, :has_many, :taggings, :delete_all)
old_count = Tagging.count
post.destroy
assert_equal old_count-1, Tagging.count
assert_equal 0, posts(:welcome).taggings.count
end
def test_delete_polymorphic_has_many_with_destroy
assert_equal 1, posts(:welcome).taggings.count
posts(:welcome).taggings.first.update_attribute :taggable_type, 'PostWithHasManyDestroy'
post = find_post_with_dependency(1, :has_many, :taggings, :destroy)
old_count = Tagging.count
post.destroy
assert_equal old_count-1, Tagging.count
assert_equal 0, posts(:welcome).taggings.count
end
def test_delete_polymorphic_has_many_with_nullify
assert_equal 1, posts(:welcome).taggings.count
posts(:welcome).taggings.first.update_attribute :taggable_type, 'PostWithHasManyNullify'
post = find_post_with_dependency(1, :has_many, :taggings, :nullify)
old_count = Tagging.count
post.destroy
assert_equal old_count, Tagging.count
assert_equal 0, posts(:welcome).taggings.count
end
def test_delete_polymorphic_has_one_with_destroy
assert posts(:welcome).tagging
posts(:welcome).tagging.update_attribute :taggable_type, 'PostWithHasOneDestroy'
post = find_post_with_dependency(1, :has_one, :tagging, :destroy)
old_count = Tagging.count
post.destroy
assert_equal old_count-1, Tagging.count
assert_nil posts(:welcome).tagging(true)
end
def test_delete_polymorphic_has_one_with_nullify
assert posts(:welcome).tagging
posts(:welcome).tagging.update_attribute :taggable_type, 'PostWithHasOneNullify'
post = find_post_with_dependency(1, :has_one, :tagging, :nullify)
old_count = Tagging.count
post.destroy
assert_equal old_count, Tagging.count
assert_nil posts(:welcome).tagging(true)
end
def test_has_many_with_piggyback
assert_equal "2", categories(:sti_test).authors.first.post_id.to_s
end
def test_include_has_many_through
posts = Post.find(:all, :order => 'posts.id')
posts_with_authors = Post.find(:all, :include => :authors, :order => 'posts.id')
assert_equal posts.length, posts_with_authors.length
posts.length.times do |i|
assert_equal posts[i].authors.length, assert_no_queries { posts_with_authors[i].authors.length }
end
end
def test_include_polymorphic_has_one
post = Post.find_by_id(posts(:welcome).id, :include => :tagging)
tagging = taggings(:welcome_general)
assert_no_queries do
assert_equal tagging, post.tagging
end
end
def test_include_polymorphic_has_many_through
posts = Post.find(:all, :order => 'posts.id')
posts_with_tags = Post.find(:all, :include => :tags, :order => 'posts.id')
assert_equal posts.length, posts_with_tags.length
posts.length.times do |i|
assert_equal posts[i].tags.length, assert_no_queries { posts_with_tags[i].tags.length }
end
end
def test_include_polymorphic_has_many
posts = Post.find(:all, :order => 'posts.id')
posts_with_taggings = Post.find(:all, :include => :taggings, :order => 'posts.id')
assert_equal posts.length, posts_with_taggings.length
posts.length.times do |i|
assert_equal posts[i].taggings.length, assert_no_queries { posts_with_taggings[i].taggings.length }
end
end
def test_has_many_find_all
assert_equal [categories(:general)], authors(:david).categories.find(:all)
end
def test_has_many_find_first
assert_equal categories(:general), authors(:david).categories.find(:first)
end
def test_has_many_find_conditions
assert_equal categories(:general), authors(:david).categories.find(:first, :conditions => "categories.name = 'General'")
assert_equal nil, authors(:david).categories.find(:first, :conditions => "categories.name = 'Technology'")
end
def test_has_many_class_methods_called_by_method_missing
assert_equal categories(:general), authors(:david).categories.find_all_by_name('General').first
# assert_equal nil, authors(:david).categories.find_by_name('Technology')
end
def test_has_many_going_through_join_model_with_custom_foreign_key
assert_equal [], posts(:thinking).authors
assert_equal [authors(:mary)], posts(:authorless).authors
end
def test_belongs_to_polymorphic_with_counter_cache
assert_equal 0, posts(:welcome)[:taggings_count]
tagging = posts(:welcome).taggings.create(:tag => tags(:general))
assert_equal 1, posts(:welcome, :reload)[:taggings_count]
tagging.destroy
assert posts(:welcome, :reload)[:taggings_count].zero?
end
def test_unavailable_through_reflection
assert_raises(ActiveRecord::HasManyThroughAssociationNotFoundError) { authors(:david).nothings }
end
def test_has_many_through_join_model_with_conditions
assert_equal [], posts(:welcome).invalid_taggings
assert_equal [], posts(:welcome).invalid_tags
end
def test_has_many_polymorphic
assert_raises ActiveRecord::HasManyThroughAssociationPolymorphicError do
assert_equal [posts(:welcome), posts(:thinking)], tags(:general).taggables
end
assert_raises ActiveRecord::EagerLoadPolymorphicError do
assert_equal [posts(:welcome), posts(:thinking)], tags(:general).taggings.find(:all, :include => :taggable)
end
end
def test_has_many_through_has_many_find_all
assert_equal comments(:greetings), authors(:david).comments.find(:all, :order => 'comments.id').first
end
def test_has_many_through_has_many_find_all_with_custom_class
assert_equal comments(:greetings), authors(:david).funky_comments.find(:all, :order => 'comments.id').first
end
def test_has_many_through_has_many_find_first
assert_equal comments(:greetings), authors(:david).comments.find(:first, :order => 'comments.id')
end
def test_has_many_through_has_many_find_conditions
assert_equal comments(:does_it_hurt), authors(:david).comments.find(:first, :conditions => "comments.type='SpecialComment'", :order => 'comments.id')
end
def test_has_many_through_has_many_find_by_id
assert_equal comments(:more_greetings), authors(:david).comments.find(2)
end
def test_has_many_through_polymorphic_has_one
assert_raise(ActiveRecord::HasManyThroughSourceAssociationMacroError) { authors(:david).tagging }
end
def test_has_many_through_polymorphic_has_many
assert_equal [taggings(:welcome_general), taggings(:thinking_general)], authors(:david).taggings.uniq.sort_by { |t| t.id }
end
def test_include_has_many_through_polymorphic_has_many
author = Author.find_by_id(authors(:david).id, :include => :taggings)
expected_taggings = [taggings(:welcome_general), taggings(:thinking_general)]
assert_no_queries do
assert_equal expected_taggings, author.taggings.uniq.sort_by { |t| t.id }
end
end
def test_has_many_through_has_many_through
assert_raise(ActiveRecord::HasManyThroughSourceAssociationMacroError) { authors(:david).tags }
end
def test_has_many_through_habtm
assert_raise(ActiveRecord::HasManyThroughSourceAssociationMacroError) { authors(:david).post_categories }
end
def test_eager_load_has_many_through_has_many
author = Author.find :first, :conditions => ['name = ?', 'David'], :include => :comments, :order => 'comments.id'
SpecialComment.new; VerySpecialComment.new
assert_no_queries do
assert_equal [1,2,3,5,6,7,8,9,10], author.comments.collect(&:id)
end
end
def test_eager_belongs_to_and_has_one_not_singularized
assert_nothing_raised do
Author.find(:first, :include => :author_address)
AuthorAddress.find(:first, :include => :author)
end
end
def test_self_referential_has_many_through
assert_equal [authors(:mary)], authors(:david).favorite_authors
assert_equal [], authors(:mary).favorite_authors
end
def test_add_to_self_referential_has_many_through
new_author = Author.create(:name => "Bob")
authors(:david).author_favorites.create :favorite_author => new_author
assert_equal new_author, authors(:david).reload.favorite_authors.first
end
def test_has_many_through_uses_correct_attributes
assert_nil posts(:thinking).tags.find_by_name("General").attributes["tag_id"]
end
private
# create dynamic Post models to allow different dependency options
def find_post_with_dependency(post_id, association, association_name, dependency)
class_name = "PostWith#{association.to_s.classify}#{dependency.to_s.classify}"
Post.find(post_id).update_attribute :type, class_name
klass = Object.const_set(class_name, Class.new(ActiveRecord::Base))
klass.set_table_name 'posts'
klass.send(association, association_name, :as => :taggable, :dependent => dependency)
klass.find(post_id)
end
end
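# Illustrative sketch (not part of the original file): a call such as
# find_post_with_dependency(1, :has_many, :taggings, :destroy) amounts to
# defining, at test time, roughly
#
#   class PostWithHasManyDestroy < ActiveRecord::Base
#     set_table_name 'posts'
#     has_many :taggings, :as => :taggable, :dependent => :destroy
#   end
#
# and reloading post 1 as that class, so each test above can exercise a
# different :dependent option without a statically declared model.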

File diff suppressed because it is too large Load diff

1314
vendor/rails/activerecord/test/base_test.rb vendored Executable file

File diff suppressed because it is too large Load diff

View file

@ -0,0 +1,37 @@
require 'abstract_unit'
require 'fixtures/binary'
class BinaryTest < Test::Unit::TestCase
BINARY_FIXTURE_PATH = File.dirname(__FILE__) + '/fixtures/flowers.jpg'
def setup
Binary.connection.execute 'DELETE FROM binaries'
@data = File.read(BINARY_FIXTURE_PATH).freeze
end
def test_truth
assert true
end
# Without using prepared statements, it makes no sense to test BLOB data with
# SQL Server (statements are limited to 8KB) or with DB2 and Firebird
# (statements are limited to 32KB).
unless %w(SQLServer Sybase DB2 Oracle Firebird).include? ActiveRecord::Base.connection.adapter_name
def test_load_save
bin = Binary.new
bin.data = @data
assert @data == bin.data, 'Newly assigned data differs from original'
bin.save
assert @data == bin.data, 'Data differs from original after save'
db_bin = Binary.find(bin.id)
assert @data == db_bin.data, 'Reloaded data differs from original'
end
end
end

View file

@ -0,0 +1,181 @@
require 'abstract_unit'
require 'fixtures/company'
require 'fixtures/topic'
Company.has_many :accounts
class CalculationsTest < Test::Unit::TestCase
fixtures :companies, :accounts, :topics
def test_should_sum_field
assert_equal 265, Account.sum(:credit_limit)
end
def test_should_average_field
value = Account.average(:credit_limit)
assert_equal 53, value
assert_kind_of Float, value
end
def test_should_get_maximum_of_field
assert_equal 60, Account.maximum(:credit_limit)
end
def test_should_get_minimum_of_field
assert_equal 50, Account.minimum(:credit_limit)
end
def test_should_group_by_field
c = Account.sum(:credit_limit, :group => :firm_id)
[1,6,2].each { |firm_id| assert c.keys.include?(firm_id) }
end
def test_should_group_by_summed_field
c = Account.sum(:credit_limit, :group => :firm_id)
assert_equal 50, c[1]
assert_equal 105, c[6]
assert_equal 60, c[2]
end
def test_should_order_by_grouped_field
c = Account.sum(:credit_limit, :group => :firm_id, :order => "firm_id")
assert_equal [1, 2, 6], c.keys.compact
end
def test_should_order_by_calculation
c = Account.sum(:credit_limit, :group => :firm_id, :order => "sum_credit_limit desc, firm_id")
assert_equal [105, 60, 50, 50], c.keys.collect { |k| c[k] }
assert_equal [6, 2, 1], c.keys.compact
end
def test_should_limit_calculation
c = Account.sum(:credit_limit, :conditions => "firm_id IS NOT NULL",
:group => :firm_id, :order => "firm_id", :limit => 2)
assert_equal [1, 2], c.keys.compact
end
def test_should_limit_calculation_with_offset
c = Account.sum(:credit_limit, :conditions => "firm_id IS NOT NULL",
:group => :firm_id, :order => "firm_id", :limit => 2, :offset => 1)
assert_equal [2, 6], c.keys.compact
end
def test_should_group_by_summed_field_having_condition
c = Account.sum(:credit_limit, :group => :firm_id,
:having => 'sum(credit_limit) > 50')
assert_nil c[1]
assert_equal 105, c[6]
assert_equal 60, c[2]
end
def test_should_group_by_summed_association
c = Account.sum(:credit_limit, :group => :firm)
assert_equal 50, c[companies(:first_firm)]
assert_equal 105, c[companies(:rails_core)]
assert_equal 60, c[companies(:first_client)]
end
def test_should_sum_field_with_conditions
assert_equal 105, Account.sum(:credit_limit, :conditions => 'firm_id = 6')
end
def test_should_group_by_summed_field_with_conditions
c = Account.sum(:credit_limit, :conditions => 'firm_id > 1',
:group => :firm_id)
assert_nil c[1]
assert_equal 105, c[6]
assert_equal 60, c[2]
end
def test_should_group_by_summed_field_with_conditions_and_having
c = Account.sum(:credit_limit, :conditions => 'firm_id > 1',
:group => :firm_id,
:having => 'sum(credit_limit) > 60')
assert_nil c[1]
assert_equal 105, c[6]
assert_nil c[2]
end
def test_should_group_by_fields_with_table_alias
c = Account.sum(:credit_limit, :group => 'accounts.firm_id')
assert_equal 50, c[1]
assert_equal 105, c[6]
assert_equal 60, c[2]
end
def test_should_calculate_with_invalid_field
assert_equal 5, Account.calculate(:count, '*')
assert_equal 5, Account.calculate(:count, :all)
end
def test_should_calculate_grouped_with_invalid_field
c = Account.count(:all, :group => 'accounts.firm_id')
assert_equal 1, c[1]
assert_equal 2, c[6]
assert_equal 1, c[2]
end
def test_should_calculate_grouped_association_with_invalid_field
c = Account.count(:all, :group => :firm)
assert_equal 1, c[companies(:first_firm)]
assert_equal 2, c[companies(:rails_core)]
assert_equal 1, c[companies(:first_client)]
end
def test_should_calculate_grouped_by_function
c = Company.count(:all, :group => 'UPPER(type)')
assert_equal 2, c[nil]
assert_equal 1, c['DEPENDENTFIRM']
assert_equal 3, c['CLIENT']
assert_equal 2, c['FIRM']
end
def test_should_calculate_grouped_by_function_with_table_alias
c = Company.count(:all, :group => 'UPPER(companies.type)')
assert_equal 2, c[nil]
assert_equal 1, c['DEPENDENTFIRM']
assert_equal 3, c['CLIENT']
assert_equal 2, c['FIRM']
end
def test_should_sum_scoped_field
assert_equal 15, companies(:rails_core).companies.sum(:id)
end
def test_should_sum_scoped_field_with_conditions
assert_equal 8, companies(:rails_core).companies.sum(:id, :conditions => 'id > 7')
end
def test_should_group_by_scoped_field
c = companies(:rails_core).companies.sum(:id, :group => :name)
assert_equal 7, c['Leetsoft']
assert_equal 8, c['Jadedpixel']
end
def test_should_group_by_scoped_summed_field_with_conditions_and_having
c = companies(:rails_core).companies.sum(:id, :group => :name,
:having => 'sum(id) > 7')
assert_nil c['Leetsoft']
assert_equal 8, c['Jadedpixel']
end
def test_should_reject_invalid_options
assert_nothing_raised do
[:count, :sum].each do |func|
# empty options are valid
Company.send(:validate_calculation_options, func)
# these options are valid for all calculations
[:select, :conditions, :joins, :order, :group, :having, :distinct].each do |opt|
Company.send(:validate_calculation_options, func, opt => true)
end
end
# :include is only valid on :count
Company.send(:validate_calculation_options, :count, :include => true)
end
assert_raises(ArgumentError) { Company.send(:validate_calculation_options, :sum, :include => :posts) }
assert_raises(ArgumentError) { Company.send(:validate_calculation_options, :sum, :foo => :bar) }
assert_raises(ArgumentError) { Company.send(:validate_calculation_options, :count, :foo => :bar) }
end
end
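# Illustrative sketch (not part of the original file): a grouped, filtered
# calculation such as
#
#   Account.sum(:credit_limit, :group => :firm_id, :having => 'sum(credit_limit) > 50')
#
# is resolved to SQL along the lines of
#
#   SELECT firm_id, SUM(credit_limit) AS sum_credit_limit
#   FROM accounts GROUP BY firm_id HAVING sum(credit_limit) > 50
#
# and comes back as a hash keyed by the grouped value, which is why the tests
# above index the result with c[1], c[2] and c[6] (the alias also explains the
# "sum_credit_limit desc" ordering used in test_should_order_by_calculation).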

View file

@ -0,0 +1,364 @@
require 'abstract_unit'
class CallbackDeveloper < ActiveRecord::Base
set_table_name 'developers'
class << self
def callback_string(callback_method)
"history << [#{callback_method.to_sym.inspect}, :string]"
end
def callback_proc(callback_method)
Proc.new { |model| model.history << [callback_method, :proc] }
end
def define_callback_method(callback_method)
define_method("#{callback_method}_method") do |model|
model.history << [callback_method, :method]
end
end
def callback_object(callback_method)
klass = Class.new
klass.send(:define_method, callback_method) do |model|
model.history << [callback_method, :object]
end
klass.new
end
end
ActiveRecord::Callbacks::CALLBACKS.each do |callback_method|
callback_method_sym = callback_method.to_sym
define_callback_method(callback_method_sym)
send(callback_method, callback_method_sym)
send(callback_method, callback_string(callback_method_sym))
send(callback_method, callback_proc(callback_method_sym))
send(callback_method, callback_object(callback_method_sym))
send(callback_method) { |model| model.history << [callback_method_sym, :block] }
end
def history
@history ||= []
end
# after_initialize and after_find are invoked only if instance methods have been defined.
def after_initialize
end
def after_find
end
end
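# Illustrative sketch (not part of the original file): as the comment above
# notes, after_initialize and after_find only fire when the instance methods
# exist, which is why CallbackDeveloper defines the empty stubs. A hypothetical
# model without them would never run its registered callbacks:
#
#   class SilentDeveloper < ActiveRecord::Base
#     set_table_name 'developers'
#     after_find { |model| puts "after_find ran" }  # registered...
#     # ...but never invoked, because no `def after_find; end` stub is defined
#   end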
class RecursiveCallbackDeveloper < ActiveRecord::Base
set_table_name 'developers'
before_save :on_before_save
after_save :on_after_save
attr_reader :on_before_save_called, :on_after_save_called
def on_before_save
@on_before_save_called ||= 0
@on_before_save_called += 1
save unless @on_before_save_called > 1
end
def on_after_save
@on_after_save_called ||= 0
@on_after_save_called += 1
save unless @on_after_save_called > 1
end
end
class ImmutableDeveloper < ActiveRecord::Base
set_table_name 'developers'
validates_inclusion_of :salary, :in => 50000..200000
before_save :cancel
before_destroy :cancel
def cancelled?
@cancelled == true
end
private
def cancel
@cancelled = true
false
end
end
class ImmutableMethodDeveloper < ActiveRecord::Base
set_table_name 'developers'
validates_inclusion_of :salary, :in => 50000..200000
def cancelled?
@cancelled == true
end
def before_save
@cancelled = true
false
end
def before_destroy
@cancelled = true
false
end
end
class CallbacksTest < Test::Unit::TestCase
fixtures :developers
def test_initialize
david = CallbackDeveloper.new
assert_equal [
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
], david.history
end
def test_find
david = CallbackDeveloper.find(1)
assert_equal [
[ :after_find, :string ],
[ :after_find, :proc ],
[ :after_find, :object ],
[ :after_find, :block ],
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
], david.history
end
def test_new_valid?
david = CallbackDeveloper.new
david.valid?
assert_equal [
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
[ :before_validation, :string ],
[ :before_validation, :proc ],
[ :before_validation, :object ],
[ :before_validation, :block ],
[ :before_validation_on_create, :string ],
[ :before_validation_on_create, :proc ],
[ :before_validation_on_create, :object ],
[ :before_validation_on_create, :block ],
[ :after_validation, :string ],
[ :after_validation, :proc ],
[ :after_validation, :object ],
[ :after_validation, :block ],
[ :after_validation_on_create, :string ],
[ :after_validation_on_create, :proc ],
[ :after_validation_on_create, :object ],
[ :after_validation_on_create, :block ]
], david.history
end
def test_existing_valid?
david = CallbackDeveloper.find(1)
david.valid?
assert_equal [
[ :after_find, :string ],
[ :after_find, :proc ],
[ :after_find, :object ],
[ :after_find, :block ],
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
[ :before_validation, :string ],
[ :before_validation, :proc ],
[ :before_validation, :object ],
[ :before_validation, :block ],
[ :before_validation_on_update, :string ],
[ :before_validation_on_update, :proc ],
[ :before_validation_on_update, :object ],
[ :before_validation_on_update, :block ],
[ :after_validation, :string ],
[ :after_validation, :proc ],
[ :after_validation, :object ],
[ :after_validation, :block ],
[ :after_validation_on_update, :string ],
[ :after_validation_on_update, :proc ],
[ :after_validation_on_update, :object ],
[ :after_validation_on_update, :block ]
], david.history
end
def test_create
david = CallbackDeveloper.create('name' => 'David', 'salary' => 1000000)
assert_equal [
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
[ :before_validation, :string ],
[ :before_validation, :proc ],
[ :before_validation, :object ],
[ :before_validation, :block ],
[ :before_validation_on_create, :string ],
[ :before_validation_on_create, :proc ],
[ :before_validation_on_create, :object ],
[ :before_validation_on_create, :block ],
[ :after_validation, :string ],
[ :after_validation, :proc ],
[ :after_validation, :object ],
[ :after_validation, :block ],
[ :after_validation_on_create, :string ],
[ :after_validation_on_create, :proc ],
[ :after_validation_on_create, :object ],
[ :after_validation_on_create, :block ],
[ :before_save, :string ],
[ :before_save, :proc ],
[ :before_save, :object ],
[ :before_save, :block ],
[ :before_create, :string ],
[ :before_create, :proc ],
[ :before_create, :object ],
[ :before_create, :block ],
[ :after_create, :string ],
[ :after_create, :proc ],
[ :after_create, :object ],
[ :after_create, :block ],
[ :after_save, :string ],
[ :after_save, :proc ],
[ :after_save, :object ],
[ :after_save, :block ]
], david.history
end
def test_save
david = CallbackDeveloper.find(1)
david.save
assert_equal [
[ :after_find, :string ],
[ :after_find, :proc ],
[ :after_find, :object ],
[ :after_find, :block ],
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
[ :before_validation, :string ],
[ :before_validation, :proc ],
[ :before_validation, :object ],
[ :before_validation, :block ],
[ :before_validation_on_update, :string ],
[ :before_validation_on_update, :proc ],
[ :before_validation_on_update, :object ],
[ :before_validation_on_update, :block ],
[ :after_validation, :string ],
[ :after_validation, :proc ],
[ :after_validation, :object ],
[ :after_validation, :block ],
[ :after_validation_on_update, :string ],
[ :after_validation_on_update, :proc ],
[ :after_validation_on_update, :object ],
[ :after_validation_on_update, :block ],
[ :before_save, :string ],
[ :before_save, :proc ],
[ :before_save, :object ],
[ :before_save, :block ],
[ :before_update, :string ],
[ :before_update, :proc ],
[ :before_update, :object ],
[ :before_update, :block ],
[ :after_update, :string ],
[ :after_update, :proc ],
[ :after_update, :object ],
[ :after_update, :block ],
[ :after_save, :string ],
[ :after_save, :proc ],
[ :after_save, :object ],
[ :after_save, :block ]
], david.history
end
def test_destroy
david = CallbackDeveloper.find(1)
david.destroy
assert_equal [
[ :after_find, :string ],
[ :after_find, :proc ],
[ :after_find, :object ],
[ :after_find, :block ],
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
[ :before_destroy, :string ],
[ :before_destroy, :proc ],
[ :before_destroy, :object ],
[ :before_destroy, :block ],
[ :after_destroy, :string ],
[ :after_destroy, :proc ],
[ :after_destroy, :object ],
[ :after_destroy, :block ]
], david.history
end
def test_delete
david = CallbackDeveloper.find(1)
CallbackDeveloper.delete(david.id)
assert_equal [
[ :after_find, :string ],
[ :after_find, :proc ],
[ :after_find, :object ],
[ :after_find, :block ],
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
], david.history
end
def test_before_save_returning_false
david = ImmutableDeveloper.find(1)
assert david.valid?
assert !david.save
assert_raises(ActiveRecord::RecordNotSaved) { david.save! }
david = ImmutableDeveloper.find(1)
david.salary = 10_000_000
assert !david.valid?
assert !david.save
assert_raises(ActiveRecord::RecordInvalid) { david.save! }
end
def test_before_destroy_returning_false
david = ImmutableDeveloper.find(1)
assert !david.destroy
assert_not_nil ImmutableDeveloper.find_by_id(1)
end
def test_zzz_callback_returning_false # must be run last since we modify CallbackDeveloper
david = CallbackDeveloper.find(1)
CallbackDeveloper.before_validation proc { |model| model.history << [:before_validation, :returning_false]; return false }
CallbackDeveloper.before_validation proc { |model| model.history << [:before_validation, :should_never_get_here] }
david.save
assert_equal [
[ :after_find, :string ],
[ :after_find, :proc ],
[ :after_find, :object ],
[ :after_find, :block ],
[ :after_initialize, :string ],
[ :after_initialize, :proc ],
[ :after_initialize, :object ],
[ :after_initialize, :block ],
[ :before_validation, :string ],
[ :before_validation, :proc ],
[ :before_validation, :object ],
[ :before_validation, :block ],
[ :before_validation, :returning_false ]
], david.history
end
end
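# Illustrative sketch (not part of the original file): returning false from any
# before_* callback halts the chain, which is what the ImmutableDeveloper tests
# above pin down:
#
#   david = ImmutableDeveloper.find(1)
#   david.save      # => false, nothing is written
#   david.save!     # raises ActiveRecord::RecordNotSaved
#   david.destroy   # => false, the row stays in the table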

View file

@ -0,0 +1,32 @@
require 'test/unit'
require 'abstract_unit'
require 'active_support/core_ext/class/inheritable_attributes'
class A
include ClassInheritableAttributes
end
class B < A
write_inheritable_array "first", [ :one, :two ]
end
class C < A
write_inheritable_array "first", [ :three ]
end
class D < B
write_inheritable_array "first", [ :four ]
end
class ClassInheritableAttributesTest < Test::Unit::TestCase
def test_first_level
assert_equal [ :one, :two ], B.read_inheritable_attribute("first")
assert_equal [ :three ], C.read_inheritable_attribute("first")
end
def test_second_level
assert_equal [ :one, :two, :four ], D.read_inheritable_attribute("first")
assert_equal [ :one, :two ], B.read_inheritable_attribute("first")
end
end
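# Illustrative sketch (not part of the original file): write_inheritable_array
# appends to whatever the ancestors already recorded without mutating them, so a
# hypothetical further subclass behaves like D above:
#
#   class E < C
#     write_inheritable_array "first", [ :five ]
#   end
#
#   E.read_inheritable_attribute("first")  # => [ :three, :five ]
#   C.read_inheritable_attribute("first")  # still [ :three ]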

View file

@ -0,0 +1,17 @@
require 'abstract_unit'
require 'fixtures/topic'
class TestColumnAlias < Test::Unit::TestCase
fixtures :topics
QUERY = if 'Oracle' == ActiveRecord::Base.connection.adapter_name
'SELECT id AS pk FROM topics WHERE ROWNUM < 2'
else
'SELECT id AS pk FROM topics'
end
def test_column_alias
records = Topic.connection.select_all(QUERY)
assert_equal 'pk', records[0].keys[0]
end
end

View file

@ -0,0 +1,24 @@
print "Using native DB2\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
db1 = 'arunit'
db2 = 'arunit2'
ActiveRecord::Base.establish_connection(
:adapter => "db2",
:host => "localhost",
:username => "arunit",
:password => "arunit",
:database => db1
)
Course.establish_connection(
:adapter => "db2",
:host => "localhost",
:username => "arunit2",
:password => "arunit2",
:database => db2
)

View file

@ -0,0 +1,24 @@
print "Using native Firebird\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
db1 = 'activerecord_unittest'
db2 = 'activerecord_unittest2'
ActiveRecord::Base.establish_connection(
:adapter => "firebird",
:host => "localhost",
:username => "rails",
:password => "rails",
:database => db1
)
Course.establish_connection(
:adapter => "firebird",
:host => "localhost",
:username => "rails",
:password => "rails",
:database => db2
)

View file

@ -0,0 +1,21 @@
print "Using native MySQL\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
db1 = 'activerecord_unittest'
db2 = 'activerecord_unittest2'
ActiveRecord::Base.establish_connection(
:adapter => "mysql",
:username => "rails",
:encoding => "utf8",
:database => db1
)
Course.establish_connection(
:adapter => "mysql",
:username => "rails",
:database => db2
)

View file

@ -0,0 +1,22 @@
print "Using native OpenBase\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
db1 = 'activerecord_unittest'
db2 = 'activerecord_unittest2'
ActiveRecord::Base.establish_connection(
:adapter => "openbase",
:username => "admin",
:password => "",
:database => db1
)
Course.establish_connection(
:adapter => "openbase",
:username => "admin",
:password => "",
:database => db2
)

View file

@ -0,0 +1,23 @@
print "Using Oracle\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new STDOUT
ActiveRecord::Base.logger.level = Logger::WARN
# Set these to your database connection strings
db = ENV['ARUNIT_DB'] || 'activerecord_unittest'
ActiveRecord::Base.establish_connection(
:adapter => 'oracle',
:username => 'arunit',
:password => 'arunit',
:database => db
)
Course.establish_connection(
:adapter => 'oracle',
:username => 'arunit2',
:password => 'arunit2',
:database => db
)

View file

@ -0,0 +1,24 @@
print "Using native PostgreSQL\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
db1 = 'activerecord_unittest'
db2 = 'activerecord_unittest2'
ActiveRecord::Base.establish_connection(
:adapter => "postgresql",
:username => "postgres",
:password => "postgres",
:database => db1,
:min_messages => "warning"
)
Course.establish_connection(
:adapter => "postgresql",
:username => "postgres",
:password => "postgres",
:database => db2,
:min_messages => "warning"
)

View file

@ -0,0 +1,37 @@
print "Using native SQlite\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
class SqliteError < StandardError
end
BASE_DIR = File.expand_path(File.dirname(__FILE__) + '/../../fixtures')
sqlite_test_db = "#{BASE_DIR}/fixture_database.sqlite"
sqlite_test_db2 = "#{BASE_DIR}/fixture_database_2.sqlite"
def make_connection(clazz, db_file, db_definitions_file)
unless File.exist?(db_file)
puts "SQLite database not found at #{db_file}. Rebuilding it."
sqlite_command = %Q{sqlite #{db_file} "create table a (a integer); drop table a;"}
puts "Executing '#{sqlite_command}'"
raise SqliteError.new("Seems that there is no sqlite executable available") unless system(sqlite_command)
clazz.establish_connection(
:adapter => "sqlite",
:database => db_file)
script = File.read("#{BASE_DIR}/db_definitions/#{db_definitions_file}")
# SQLite-Ruby has problems with semi-colon separated commands, so split and execute one at a time
script.split(';').each do |command|
clazz.connection.execute(command) unless command.strip.empty?
end
else
clazz.establish_connection(
:adapter => "sqlite",
:database => db_file)
end
end
make_connection(ActiveRecord::Base, sqlite_test_db, 'sqlite.sql')
make_connection(Course, sqlite_test_db2, 'sqlite2.sql')
load(File.join(BASE_DIR, 'db_definitions', 'schema.rb'))

View file

@ -0,0 +1,37 @@
print "Using native SQLite3\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
class SqliteError < StandardError
end
BASE_DIR = File.expand_path(File.dirname(__FILE__) + '/../../fixtures')
sqlite_test_db = "#{BASE_DIR}/fixture_database.sqlite3"
sqlite_test_db2 = "#{BASE_DIR}/fixture_database_2.sqlite3"
def make_connection(clazz, db_file, db_definitions_file)
unless File.exist?(db_file)
puts "SQLite3 database not found at #{db_file}. Rebuilding it."
sqlite_command = %Q{sqlite3 #{db_file} "create table a (a integer); drop table a;"}
puts "Executing '#{sqlite_command}'"
raise SqliteError.new("Seems that there is no sqlite3 executable available") unless system(sqlite_command)
clazz.establish_connection(
:adapter => "sqlite3",
:database => db_file)
script = File.read("#{BASE_DIR}/db_definitions/#{db_definitions_file}")
# SQLite-Ruby has problems with semi-colon separated commands, so split and execute one at a time
script.split(';').each do |command|
clazz.connection.execute(command) unless command.strip.empty?
end
else
clazz.establish_connection(
:adapter => "sqlite3",
:database => db_file)
end
end
make_connection(ActiveRecord::Base, sqlite_test_db, 'sqlite.sql')
make_connection(Course, sqlite_test_db2, 'sqlite2.sql')
load(File.join(BASE_DIR, 'db_definitions', 'schema.rb'))

View file

@ -0,0 +1,18 @@
print "Using native SQLite3\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
class SqliteError < StandardError
end
def make_connection(clazz, db_definitions_file)
clazz.establish_connection(:adapter => 'sqlite3', :database => ':memory:')
File.read("#{File.dirname(__FILE__)}/../../fixtures/db_definitions/#{db_definitions_file}").split(';').each do |command|
clazz.connection.execute(command) unless command.strip.empty?
end
end
make_connection(ActiveRecord::Base, 'sqlite.sql')
make_connection(Course, 'sqlite2.sql')
load("#{File.dirname(__FILE__)}/../../fixtures/db_definitions/schema.rb"))

View file

@ -0,0 +1,24 @@
print "Using native SQLServer\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
db1 = 'activerecord_unittest'
db2 = 'activerecord_unittest2'
ActiveRecord::Base.establish_connection(
:adapter => "sqlserver",
:host => "localhost",
:username => "sa",
:password => "",
:database => db1
)
Course.establish_connection(
:adapter => "sqlserver",
:host => "localhost",
:username => "sa",
:password => "",
:database => db2
)

View file

@ -0,0 +1,26 @@
print "Using native SQLServer via ODBC\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
dsn1 = 'activerecord_unittest'
dsn2 = 'activerecord_unittest2'
ActiveRecord::Base.establish_connection(
:adapter => "sqlserver",
:mode => "ODBC",
:host => "localhost",
:username => "sa",
:password => "",
:dsn => dsn1
)
Course.establish_connection(
:adapter => "sqlserver",
:mode => "ODBC",
:host => "localhost",
:username => "sa",
:password => "",
:dsn => dsn2
)

View file

@ -0,0 +1,24 @@
print "Using native Sybase Open Client\n"
require_dependency 'fixtures/course'
require 'logger'
ActiveRecord::Base.logger = Logger.new("debug.log")
db1 = 'activerecord_unittest'
db2 = 'activerecord_unittest2'
ActiveRecord::Base.establish_connection(
:adapter => "sybase",
:host => "database_ASE",
:username => "sa",
:password => "",
:database => db1
)
Course.establish_connection(
:adapter => "sybase",
:host => "database_ASE",
:username => "sa",
:password => "",
:database => db2
)

View file

@ -0,0 +1,64 @@
require 'abstract_unit'
class CopyTableTest < Test::Unit::TestCase
fixtures :companies, :comments
def setup
@connection = ActiveRecord::Base.connection
class << @connection
public :copy_table, :table_structure, :indexes
end
end
def test_copy_table(from = 'companies', to = 'companies2', options = {})
assert_nothing_raised {copy_table(from, to, options)}
assert_equal row_count(from), row_count(to)
if block_given?
yield from, to, options
else
assert_equal column_names(from), column_names(to)
end
@connection.drop_table(to) rescue nil
end
def test_copy_table_renaming_column
test_copy_table('companies', 'companies2',
:rename => {'client_of' => 'fan_of'}) do |from, to, options|
assert_equal column_values(from, 'client_of').compact.sort,
column_values(to, 'fan_of').compact.sort
end
end
def test_copy_table_with_index
test_copy_table('comments', 'comments_with_index') do
@connection.add_index('comments_with_index', ['post_id', 'type'])
test_copy_table('comments_with_index', 'comments_with_index2') do
assert_equal table_indexes_without_name('comments_with_index'),
table_indexes_without_name('comments_with_index2')
end
end
end
protected
def copy_table(from, to, options = {})
@connection.copy_table(from, to, {:temporary => true}.merge(options))
end
def column_names(table)
@connection.table_structure(table).map {|column| column['name']}
end
def column_values(table, column)
@connection.select_all("SELECT #{column} FROM #{table}").map {|row| row[column]}
end
def table_indexes_without_name(table)
@connection.indexes(table).map { |index| index.columns }
end
def row_count(table)
@connection.select_one("SELECT COUNT(*) AS count FROM #{table}")['count']
end
end

View file

@ -0,0 +1,16 @@
require 'abstract_unit'
require 'fixtures/default'
class DefaultTest < Test::Unit::TestCase
def test_default_timestamp
default = Default.new
assert_instance_of(Time, default.default_timestamp)
assert_equal(:datetime, default.column_for_attribute(:default_timestamp).type)
# Variance should be small; increase if required -- e.g., if test db is on
# remote host and clocks aren't synchronized.
t1 = Time.new
accepted_variance = 1.0
assert_in_delta(t1.to_f, default.default_timestamp.to_f, accepted_variance)
end
end

View file

@ -0,0 +1,18 @@
require 'abstract_unit'
require 'fixtures/default'
class DefaultsTest < Test::Unit::TestCase
if %w(PostgreSQL).include? ActiveRecord::Base.connection.adapter_name
def test_default_integers
default = Default.new
assert_instance_of(Fixnum, default.positive_integer)
assert_equal(default.positive_integer, 1)
assert_instance_of(Fixnum, default.negative_integer)
assert_equal(default.negative_integer, -1)
end
else
def test_dummy
assert true
end
end
end

View file

@ -0,0 +1,352 @@
require 'abstract_unit'
require 'fixtures/developer'
require 'fixtures/project'
require 'fixtures/company'
require 'fixtures/topic'
require 'fixtures/reply'
# Can't declare new classes inside test methods, so this check runs at load time, before the test cases.
bad_collection_keys = false
begin
class Car < ActiveRecord::Base; has_many :wheels, :name => "wheels"; end
rescue ArgumentError
bad_collection_keys = true
end
raise "ActiveRecord should have barked on bad collection keys" unless bad_collection_keys
class DeprecatedAssociationsTest < Test::Unit::TestCase
fixtures :accounts, :companies, :developers, :projects, :topics,
:developers_projects
def test_has_many_find
assert_equal 2, Firm.find_first.clients.length
end
def test_has_many_orders
assert_equal "Summit", Firm.find_first.clients.first.name
end
def test_has_many_class_name
assert_equal "Microsoft", Firm.find_first.clients_sorted_desc.first.name
end
def test_has_many_foreign_key
assert_equal "Microsoft", Firm.find_first.clients_of_firm.first.name
end
def test_has_many_conditions
assert_equal "Microsoft", Firm.find_first.clients_like_ms.first.name
end
def test_has_many_sql
firm = Firm.find_first
assert_equal "Microsoft", firm.clients_using_sql.first.name
assert_equal 1, firm.clients_using_sql_count
assert_equal 1, Firm.find_first.clients_using_sql_count
end
def test_has_many_counter_sql
assert_equal 1, Firm.find_first.clients_using_counter_sql_count
end
def test_has_many_queries
assert Firm.find_first.has_clients?
firm = Firm.find_first
assert_equal 2, firm.clients_count # tests using class count
firm.clients
assert firm.has_clients?
assert_equal 2, firm.clients_count # tests using collection length
end
def test_has_many_dependence
assert_equal 3, Client.find_all.length
Firm.find_first.destroy
assert_equal 1, Client.find_all.length
end
uses_transaction :test_has_many_dependence_with_transaction_support_on_failure
def test_has_many_dependence_with_transaction_support_on_failure
assert_equal 3, Client.find_all.length
firm = Firm.find_first
clients = firm.clients
clients.last.instance_eval { def before_destroy() raise "Trigger rollback" end }
firm.destroy rescue "do nothing"
assert_equal 3, Client.find_all.length
end
def test_has_one_dependence
num_accounts = Account.count
firm = Firm.find(1)
assert firm.has_account?
firm.destroy
assert_equal num_accounts - 1, Account.count
end
def test_has_one_dependence_with_missing_association
Account.destroy_all
firm = Firm.find(1)
assert !firm.has_account?
firm.destroy
end
def test_belongs_to
assert_equal companies(:first_firm).name, Client.find(3).firm.name
assert Client.find(3).has_firm?, "Microsoft should have a firm"
# assert !Company.find(1).has_firm?, "37signals shouldn't have a firm"
end
def test_belongs_to_with_different_class_name
assert_equal Company.find(1).name, Company.find(3).firm_with_other_name.name
assert Company.find(3).has_firm_with_other_name?, "Microsoft should have a firm"
end
def test_belongs_to_with_condition
assert_equal Company.find(1).name, Company.find(3).firm_with_condition.name
assert Company.find(3).has_firm_with_condition?, "Microsoft should have a firm"
end
def test_belongs_to_equality
assert Company.find(3).firm?(Company.find(1)), "Microsoft should have 37signals as firm"
assert_raises(RuntimeError) { !Company.find(3).firm?(Company.find(3)) } # "Microsoft shouldn't have itself as firm"
end
def test_has_one
assert companies(:first_firm).account?(Account.find(1))
assert_equal Account.find(1).credit_limit, companies(:first_firm).account.credit_limit
assert companies(:first_firm).has_account?, "37signals should have an account"
assert Account.find(1).firm?(companies(:first_firm)), "37signals account should be able to backtrack"
assert Account.find(1).has_firm?, "37signals account should be able to backtrack"
assert !Account.find(2).has_firm?, "Unknown isn't linked"
assert !Account.find(2).firm?(companies(:first_firm)), "Unknown isn't linked"
end
def test_has_many_dependence_on_account
num_accounts = Account.count
companies(:first_firm).destroy
assert_equal num_accounts - 1, Account.count
end
def test_find_in
assert_equal Client.find(2).name, companies(:first_firm).find_in_clients(2).name
assert_raises(ActiveRecord::RecordNotFound) { companies(:first_firm).find_in_clients(6) }
end
def test_force_reload
firm = Firm.new("name" => "A New Firm, Inc")
firm.save
firm.clients.each {|c|} # forcing to load all clients
assert firm.clients.empty?, "New firm shouldn't have client objects"
assert !firm.has_clients?, "New firm shouldn't have clients"
assert_equal 0, firm.clients_count, "New firm should have 0 clients"
client = Client.new("name" => "TheClient.com", "firm_id" => firm.id)
client.save
assert firm.clients.empty?, "New firm should have cached no client objects"
assert !firm.has_clients?, "New firm should have cached a no-clients response"
assert_equal 0, firm.clients_count, "New firm should have cached 0 clients count"
assert !firm.clients(true).empty?, "New firm should have reloaded client objects"
assert firm.has_clients?(true), "New firm should have reloaded with a have-clients response"
assert_equal 1, firm.clients_count(true), "New firm should have reloaded clients count"
end
def test_included_in_collection
assert companies(:first_firm).clients.include?(Client.find(2))
end
def test_build_to_collection
assert_equal 1, companies(:first_firm).clients_of_firm_count
new_client = companies(:first_firm).build_to_clients_of_firm("name" => "Another Client")
assert_equal "Another Client", new_client.name
assert new_client.save
assert new_client.firm?(companies(:first_firm))
assert_equal 2, companies(:first_firm).clients_of_firm_count(true)
end
def test_create_in_collection
assert_equal companies(:first_firm).create_in_clients_of_firm("name" => "Another Client"), companies(:first_firm).clients_of_firm(true).last
end
def test_has_and_belongs_to_many
david = Developer.find(1)
assert david.has_projects?
assert_equal 2, david.projects_count
active_record = Project.find(1)
assert active_record.has_developers?
assert_equal 3, active_record.developers_count
assert active_record.developers.include?(david)
end
def test_has_and_belongs_to_many_removing
david = Developer.find(1)
active_record = Project.find(1)
david.remove_projects(active_record)
assert_equal 1, david.projects_count
assert_equal 2, active_record.developers_count
end
def test_has_and_belongs_to_many_zero
david = Developer.find(1)
david.remove_projects(Project.find_all)
assert_equal 0, david.projects_count
assert !david.has_projects?
end
def test_has_and_belongs_to_many_adding
jamis = Developer.find(2)
action_controller = Project.find(2)
jamis.add_projects(action_controller)
assert_equal 2, jamis.projects_count
assert_equal 2, action_controller.developers_count
end
def test_has_and_belongs_to_many_adding_from_the_project
jamis = Developer.find(2)
action_controller = Project.find(2)
action_controller.add_developers(jamis)
assert_equal 2, jamis.projects_count
assert_equal 2, action_controller.developers_count
end
def test_has_and_belongs_to_many_adding_a_collection
aredridel = Developer.new("name" => "Aredridel")
aredridel.save
aredridel.add_projects([ Project.find(1), Project.find(2) ])
assert_equal 2, aredridel.projects_count
end
def test_belongs_to_counter
topic = Topic.create("title" => "Apple", "content" => "hello world")
assert_equal 0, topic.send(:read_attribute, "replies_count"), "No replies yet"
reply = topic.create_in_replies("title" => "I'm saying no!", "content" => "over here")
assert_equal 1, Topic.find(topic.id).send(:read_attribute, "replies_count"), "First reply created"
reply.destroy
assert_equal 0, Topic.find(topic.id).send(:read_attribute, "replies_count"), "First reply deleted"
end
def test_natural_assignment_of_has_one
apple = Firm.create("name" => "Apple")
citibank = Account.create("credit_limit" => 10)
apple.account = citibank
assert_equal apple.id, citibank.firm_id
end
def test_natural_assignment_of_belongs_to
apple = Firm.create("name" => "Apple")
citibank = Account.create("credit_limit" => 10)
citibank.firm = apple
assert_equal apple.id, citibank.firm_id
end
def test_natural_assignment_of_has_many
apple = Firm.create("name" => "Apple")
natural = Client.create("name" => "Natural Company")
apple.clients << natural
assert_equal apple.id, natural.firm_id
assert_equal Client.find(natural.id), Firm.find(apple.id).clients.find(natural.id)
apple.clients.delete natural
assert_raises(ActiveRecord::RecordNotFound) {
Firm.find(apple.id).clients.find(natural.id)
}
end
def test_natural_adding_of_has_and_belongs_to_many
rails = Project.create("name" => "Rails")
ap = Project.create("name" => "Action Pack")
john = Developer.create("name" => "John")
mike = Developer.create("name" => "Mike")
rails.developers << john
rails.developers << mike
assert_equal Developer.find(john.id), Project.find(rails.id).developers.find(john.id)
assert_equal Developer.find(mike.id), Project.find(rails.id).developers.find(mike.id)
assert_equal Project.find(rails.id), Developer.find(mike.id).projects.find(rails.id)
assert_equal Project.find(rails.id), Developer.find(john.id).projects.find(rails.id)
ap.developers << john
assert_equal Developer.find(john.id), Project.find(ap.id).developers.find(john.id)
assert_equal Project.find(ap.id), Developer.find(john.id).projects.find(ap.id)
ap.developers.delete john
assert_raises(ActiveRecord::RecordNotFound) {
Project.find(ap.id).developers.find(john.id)
}
assert_raises(ActiveRecord::RecordNotFound) {
Developer.find(john.id).projects.find(ap.id)
}
end
def test_storing_in_pstore
require "pstore"
require "tmpdir"
apple = Firm.create("name" => "Apple")
natural = Client.new("name" => "Natural Company")
apple.clients << natural
db = PStore.new(File.join(Dir.tmpdir, "ar-pstore-association-test"))
db.transaction do
db["apple"] = apple
end
db = PStore.new(File.join(Dir.tmpdir, "ar-pstore-association-test"))
db.transaction do
assert_equal "Natural Company", db["apple"].clients.first.name
end
end
def test_has_many_find_all
assert_equal 2, Firm.find_first.find_all_in_clients("#{QUOTED_TYPE} = 'Client'").length
assert_equal 1, Firm.find_first.find_all_in_clients("name = 'Summit'").length
end
def test_has_one
assert companies(:first_firm).account?(Account.find(1))
assert companies(:first_firm).has_account?, "37signals should have an account"
assert Account.find(1).firm?(companies(:first_firm)), "37signals account should be able to backtrack"
assert Account.find(1).has_firm?, "37signals account should be able to backtrack"
assert !Account.find(2).has_firm?, "Unknown isn't linked"
assert !Account.find(2).firm?(companies(:first_firm)), "Unknown isn't linked"
end
def test_has_one_build
firm = Firm.new("name" => "GlobalMegaCorp")
assert firm.save
account = firm.build_account("credit_limit" => 1000)
assert account.save
assert_equal account, firm.account
end
def test_has_one_failing_build_association
firm = Firm.new("name" => "GlobalMegaCorp")
firm.save
account = firm.build_account
assert !account.save
assert_equal "can't be empty", account.errors.on("credit_limit")
end
def test_has_one_create
firm = Firm.new("name" => "GlobalMegaCorp")
firm.save
assert_equal firm.create_account("credit_limit" => 1000), firm.account
end
end

View file

@ -0,0 +1,134 @@
require 'abstract_unit'
require 'fixtures/company'
require 'fixtures/topic'
require 'fixtures/entrant'
require 'fixtures/developer'
class DeprecatedFinderTest < Test::Unit::TestCase
fixtures :companies, :topics, :entrants, :developers
def test_find_all_with_limit
entrants = Entrant.find_all nil, "id ASC", 2
assert_equal(2, entrants.size)
assert_equal(entrants(:first).name, entrants.first.name)
end
def test_find_all_with_prepared_limit_and_offset
entrants = Entrant.find_all nil, "id ASC", [2, 1]
assert_equal(2, entrants.size)
assert_equal(entrants(:second).name, entrants.first.name)
end
def test_find_first
first = Topic.find_first "title = 'The First Topic'"
assert_equal(topics(:first).title, first.title)
end
def test_find_first_failing
first = Topic.find_first "title = 'The First Topic!'"
assert_nil(first)
end
def test_deprecated_find_on_conditions
assert Topic.find_on_conditions(1, ["approved = ?", false])
assert_raises(ActiveRecord::RecordNotFound) { Topic.find_on_conditions(1, ["approved = ?", true]) }
end
def test_condition_interpolation
assert_kind_of Firm, Company.find_first(["name = '%s'", "37signals"])
assert_nil Company.find_first(["name = '%s'", "37signals!"])
assert_nil Company.find_first(["name = '%s'", "37signals!' OR 1=1"])
assert_kind_of Time, Topic.find_first(["id = %d", 1]).written_on
end
def test_bind_variables
assert_kind_of Firm, Company.find_first(["name = ?", "37signals"])
assert_nil Company.find_first(["name = ?", "37signals!"])
assert_nil Company.find_first(["name = ?", "37signals!' OR 1=1"])
assert_kind_of Time, Topic.find_first(["id = ?", 1]).written_on
assert_raises(ActiveRecord::PreparedStatementInvalid) {
Company.find_first(["id=? AND name = ?", 2])
}
assert_raises(ActiveRecord::PreparedStatementInvalid) {
Company.find_first(["id=?", 2, 3, 4])
}
end
def test_bind_variables_with_quotes
Company.create("name" => "37signals' go'es agains")
assert Company.find_first(["name = ?", "37signals' go'es agains"])
end
def test_named_bind_variables_with_quotes
Company.create("name" => "37signals' go'es agains")
assert Company.find_first(["name = :name", {:name => "37signals' go'es agains"}])
end
def test_named_bind_variables
assert_equal '1', bind(':a', :a => 1) # ' ruby-mode
assert_equal '1 1', bind(':a :a', :a => 1) # ' ruby-mode
assert_kind_of Firm, Company.find_first(["name = :name", { :name => "37signals" }])
assert_nil Company.find_first(["name = :name", { :name => "37signals!" }])
assert_nil Company.find_first(["name = :name", { :name => "37signals!' OR 1=1" }])
assert_kind_of Time, Topic.find_first(["id = :id", { :id => 1 }]).written_on
end
def test_count
assert_equal(0, Entrant.count("id > 3"))
assert_equal(1, Entrant.count(["id > ?", 2]))
assert_equal(2, Entrant.count(["id > ?", 1]))
end
def test_count_by_sql
assert_equal(0, Entrant.count_by_sql("SELECT COUNT(*) FROM entrants WHERE id > 3"))
assert_equal(1, Entrant.count_by_sql(["SELECT COUNT(*) FROM entrants WHERE id > ?", 2]))
assert_equal(2, Entrant.count_by_sql(["SELECT COUNT(*) FROM entrants WHERE id > ?", 1]))
end
def test_find_all_with_limit_for_developers
first_five_developers = Developer.find_all nil, 'id ASC', 5
assert_equal 5, first_five_developers.length
assert_equal 'David', first_five_developers.first.name
assert_equal 'fixture_5', first_five_developers.last.name
no_developers = Developer.find_all nil, 'id ASC', 0
assert_equal 0, no_developers.length
assert_equal first_five_developers, Developer.find_all(nil, 'id ASC', [5])
assert_equal no_developers, Developer.find_all(nil, 'id ASC', [0])
end
def test_find_all_with_limit_and_offset
first_three_developers = Developer.find_all nil, 'id ASC', [3, 0]
second_three_developers = Developer.find_all nil, 'id ASC', [3, 3]
last_two_developers = Developer.find_all nil, 'id ASC', [2, 8]
assert_equal 3, first_three_developers.length
assert_equal 3, second_three_developers.length
assert_equal 2, last_two_developers.length
assert_equal 'David', first_three_developers.first.name
assert_equal 'fixture_4', second_three_developers.first.name
assert_equal 'fixture_9', last_two_developers.first.name
end
def test_find_all_by_one_attribute_with_options
topics = Topic.find_all_by_content("Have a nice day", "id DESC")
assert_equal topics(:first), topics.last
topics = Topic.find_all_by_content("Have a nice day", "id DESC")
assert_equal topics(:first), topics.first
end
protected
def bind(statement, *vars)
if vars.first.is_a?(Hash)
ActiveRecord::Base.send(:replace_named_bind_variables, statement, vars.first)
else
ActiveRecord::Base.send(:replace_bind_variables, statement, vars)
end
end
end

Some files were not shown because too many files have changed in this diff