Do you have any code examples? I guess I'm still having trouble understanding how it all links together. The ResultSet methods look like what you use once everything is set up, but I'm still trying to figure out the initialization piece.
For example, let's say we have data pieces as such:
- Oracle: Table_A - Key1, Data_A; Table_B - Key1, Key2, Data_B
- CSV: Table_C - Key2, Data_C
- Fusion: Table_D - Key1, Data_D
So, how would the code look to tie these relationships together? Do all of the interfaces require a DBD module (Fusion doesn't have one)? Do they need an SQL interface (CSV doesn't have one unless you use something like DBD::CSV)? Does Fusion require a DBIx::Class::Storage module to write data to it?
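For the CSV piece, this is roughly what I'm picturing on the plain DBI side (just my own sketch, assuming DBD::CSV; the directory, file name, and lookup value are made up):

use strict;
use warnings;
use DBI;

# Treat a directory of CSV files as a database via DBD::CSV.
my $dbh = DBI->connect('dbi:CSV:', undef, undef, {
    f_dir      => './csv_data',   # hypothetical directory holding Table_C.csv
    f_ext      => '.csv',
    RaiseError => 1,
});

# Table_C: Key2, Data_C
my $rows = $dbh->selectall_arrayref(
    'SELECT Key2, Data_C FROM Table_C WHERE Key2 = ?',
    { Slice => {} },
    42,                           # made-up Key2 value
);

But I'm not clear on whether pointing DBIx::Class at that same dbi:CSV: DSN is the right way to fold it in with the rest.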
I'm also going to be throwing SNMP into the mix, which doesn't have either. I'm thinking of writing a DBD::SNMP one, anyway. But, I need to know if it's going to be useful for this project.
Also, since this would be a web application, I figured Catalyst would also be appropriate. Hopefully, Catalyst::Model::DBIC::Schema is going to link nicely with everything else in DBIx::Class.
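On the Catalyst end, I was picturing one model per backend along these lines (untested; the package name, schema class, and connection details are all placeholders):

package MyApp::Model::OracleDB;
use strict;
use warnings;
use base 'Catalyst::Model::DBIC::Schema';

__PACKAGE__->config(
    schema_class => 'MyApp::Schema::Oracle',    # hypothetical dumped schema class
    connect_info => [ 'dbi:Oracle:DB_Name', 'user', 'password' ],
);

1;

so that controllers could just do $c->model('OracleDB::TableA')->find(...) - assuming one such model per backend is the right shape for this.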
There are a number of things I would need to know about your project to feel comfortable recommending one framework, module, or approach over another, but I'll try and explain DBIx::Class so you can make a decision on how to proceed.
First thing to do is to create the class code which will let you access the data and give you back DBIx::Class objects. In DBIx::Class, a table = a DBIx::Class::ResultSource. So what the commands below do is look at your database schemas and create the code which will let you access each table as a ResultSource. Great, now you have a way to get all of the objects (rows) from each ResultSource (table). Any time you ask the database for some data, you will now get back a DBIx::Class::ResultSet.
Now, in your application you can do whatever you need with each ResultSource - maybe you do some stuff in memory and print a report, maybe you create another database and import all of your data into a unified schema. I'm not sure what your goal is, but hopefully this helps you understand how DBIx::Class may help you.
dbicdump -o dump_directory=./lib -o preserve_case=1 MyApp::MySQLSchema dbi:mysql:DB_Name user password
dbicdump -o dump_directory=./lib -o preserve_case=1 MyApp::OracleSchema dbi:Oracle:DB_Name user password
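Just so you can see what the "initialization piece" ends up looking like, each table comes out of the dump as a Result class roughly like this (the column and relationship details here are guesses based on your Table_A/Table_B description, not actual dumper output):

package MyApp::OracleSchema::Result::TableA;
use strict;
use warnings;
use base 'DBIx::Class::Core';

__PACKAGE__->table('Table_A');
__PACKAGE__->add_columns(
    'Key1'   => { data_type => 'integer' },
    'Data_A' => { data_type => 'varchar2', size => 255 },
);
__PACKAGE__->set_primary_key('Key1');

# one Table_A row relates to many Table_B rows through Key1
__PACKAGE__->has_many(
    'table_bs' => 'MyApp::OracleSchema::Result::TableB',
    { 'foreign.Key1' => 'self.Key1' },
);

1;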
use MyApp::MySQLSchema;
use MyApp::OracleSchema;

our $schema_mysql = MyApp::MySQLSchema->connect( sub {
    # code that returns a DBI handle
} );
our $schema_oracle = MyApp::OracleSchema->connect( sub {
    # code that returns another DBI handle
} );

my $company = $schema_mysql->resultset('Company')->find($company_id);       # get 1 row/object
my $widget  = $company->find_related('widgets', $widget_id);                # get 1 related row/object
my $parts   = $widget->search_related('widget_parts', { in_stock => 1 });   # get all matching related rows/objects

my $r = $widget->new_related('widget_parts', {    # build a related row in memory (inserted below)
    comments      => $comments,
    creation_time => time(),
});

$schema_mysql->txn_do( sub {                      # run the writes inside one transaction
    $r->insert();
    $r->update();
});
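And if the point is to tie the Oracle data to the CSV data through Key2, one straightforward way to glue the two schemas together is plain Perl over the two ResultSets - something like this (the $schema_csv handle and the accessor names are invented for the example):

# index the CSV-backed rows by Key2
my %data_c_for;
my $csv_rs = $schema_csv->resultset('TableC');    # hypothetical CSV-backed schema
while ( my $c = $csv_rs->next ) {
    $data_c_for{ $c->key2 } = $c->data_c;
}

# walk the Oracle rows and attach the matching CSV value
my $b_rs = $schema_oracle->resultset('TableB');
while ( my $b = $b_rs->next ) {
    printf "Key1=%s Key2=%s Data_B=%s Data_C=%s\n",
        $b->key1, $b->key2, $b->data_b,
        $data_c_for{ $b->key2 } // '(no CSV match)';
}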
Yeah, I'm doing some dumps right now; Schema::Loader is working through them. Still looking for options to specify the storage engine or its properties.
It does look like the protocol either needs its own DBD module, or at least some way of loading the data into an in-memory DBI object. I'm probably going to continue down the path of a DBD::SNMP module.
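By "loading the data into an in-memory DBI object" I mean something like this (a rough sketch using Net::SNMP and DBD::SQLite's :memory: database; the host, community, OID, and table layout are placeholders):

use strict;
use warnings;
use DBI;
use Net::SNMP;

# throwaway in-memory database to hold one poll's worth of SNMP data
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '', { RaiseError => 1 });
$dbh->do('CREATE TABLE if_table (host TEXT, if_index INTEGER, if_descr TEXT)');

my ($session, $error) = Net::SNMP->session(
    -hostname  => 'router1',
    -community => 'public',
);
die "SNMP session failed: $error" unless defined $session;

# ifDescr column of the IF-MIB interfaces table
my $result = $session->get_table(-baseoid => '1.3.6.1.2.1.2.2.1.2');

if ($result) {
    my $sth = $dbh->prepare('INSERT INTO if_table VALUES (?, ?, ?)');
    for my $oid (keys %$result) {
        my ($if_index) = $oid =~ /(\d+)$/;    # last sub-identifier is the interface index
        $sth->execute('router1', $if_index, $result->{$oid});
    }
}
$session->close();

The real question is whether a throwaway table like that is something DBIx::Class can treat as just another schema, or whether a proper DBD::SNMP is worth writing.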
So, does DBIx::Class support relationships that cross storage types, or does that require stitching the pieces together yourself?