From Order to Checkout: Improving Workflows through Acq, Cat and Circ
Jackie Wrosch, Systems Librarian, Eastern Michigan University
Before...
• Infrequent bulk imports
• Manual downloads and imports
• “On Order” location for items on order
• “Not Yet Available” location for items in process
• Manual edits for all records
• Manual creation of all item records
Before...Ordering
Selectors order from Gobi → Acq Librarian reviews and exports → Acq Librarian FTPs from Gobi → Acq Librarian imports to Voyager
Before...Cataloging
Items received in Acquisitions → Sent to Cataloging → Locate, create and edit all bib records → Create an item record with “Not Yet Available” location
Before...Processing
Sent to Acquisitions → Labeling → Change location to final destination → Sent to Circulation → Discharged and shelved
Before...
• No way to place a request or hold through the OPAC
• No idea when an item might be available
• Too many errors in changing location to final destination
• Displayed information not always accurate at that moment
Goals – Staff
• Automate downloading and importing records
• Automate repetitive cataloging tasks
• Don’t add any new manual workflows
• Eliminate problematic workflows
Goals – Public
• Clarify where an item is and when it is expected to be available
• Make it easy to request an On Order item
• Make it easy to notify patrons when an item is available
Tools
• Shell and Perl scripts
• OCLC PromptCat
• Voyager Bulkimport
• “On Order” patrons and patron groups
• Voyager Hold functionality
• Gary Strawn’s LocationChanger program
Now...Ordering
Selectors order from Gobi → Acq Librarian reviews and exports → gobi.sh FTPs from Gobi and imports to Voyager (runs M-F, 6am-4pm)
Now...gobi.sh
• Downloads any new files from the Gobi site
• Sorts the records by 245|a
• Splits the records into individual files
• Imports each record individually
• Results in a single PO, bib record and “On Order” MFHD
gobi.sh part 1

#!/bin/ksh

# Define locations and programs needed
LOG=/export/home/voyager/scripts/logs/gobi.log
CFG=/export/home/voyager/scripts/edi.cfg
EDI=/m1/voyager/emichdb/edi
MRC=$EDI/mrc
MRC_P=$EDI/mrc_p
DATE=`/bin/date "+%b %e"`
SPLITMARC=/export/home/voyager/scripts/splitmarc.pl
SORTMARC=/export/home/voyager/scripts/sortmarc.pl

# If it's already running, exit
GOBI=`ps -ef | grep -c gobi.sh`
ps -ef | grep gobi.sh >> $LOG
echo $GOBI >> $LOG
if [ "$GOBI" -gt 3 ]; then
    /bin/date >> $LOG
    echo "gobi.sh already running, exiting..." >> $LOG
    exit
fi
gobi.sh part 2

# Download from Gobi
/usr/local/bin/ncftpget -f $CFG $MRC 8058*.mrc

# Determine which files need processing
/usr/bin/ls $MRC/*.mrc > $MRC/tmp
/usr/bin/sed 's/^.*\/8058/8058/' $MRC/tmp > $MRC/clean
/usr/bin/ls $MRC_P/*.mrc > $MRC_P/tmp
/usr/bin/sed 's/^.*\/8058/8058/' $MRC_P/tmp > $MRC_P/clean
/usr/bin/diff $MRC/clean $MRC_P/clean > $EDI/mrc.todo.tmp
/usr/bin/sed 's/^.* 8058/8058/' $EDI/mrc.todo.tmp > $EDI/mrc.todo.clean
/usr/bin/sed -n '/^8058/p' $EDI/mrc.todo.clean > $EDI/mrc.todo

# Clean up from the previous step
/usr/bin/rm $MRC/tmp
/usr/bin/rm $MRC/clean
/usr/bin/rm $MRC_P/tmp
/usr/bin/rm $MRC_P/clean
/usr/bin/rm $EDI/mrc.todo.tmp
/usr/bin/rm $EDI/mrc.todo.clean
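The ls/diff/sed sequence above computes the to-do list by comparing the downloaded and processed directory listings. The same idea can be sketched more compactly with comm, which prints lines unique to the first of two sorted inputs; a minimal sketch (the function name and directory paths are illustrative, not the ones the script uses):

```shell
#!/bin/sh
# todo_files DOWNLOAD_DIR PROCESSED_DIR
# Prints basenames present in DOWNLOAD_DIR but not yet in PROCESSED_DIR,
# i.e. the files that still need to be imported.
todo_files() {
    ls "$1" | sort > /tmp/dl.$$
    ls "$2" | sort > /tmp/pr.$$
    # comm -23 keeps lines that appear only in the first (sorted) list
    comm -23 /tmp/dl.$$ /tmp/pr.$$
    rm -f /tmp/dl.$$ /tmp/pr.$$
}
```

With mrc/ holding downloads and mrc_p/ holding processed copies, `todo_files mrc mrc_p > mrc.todo` would play the role of the diff pipeline above.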
gobi.sh part 3

# Mark files as processed
/usr/bin/cat $EDI/mrc.todo | while read line
do
    /usr/bin/cp $MRC/$line $MRC_P
done

# Sort and split the files, then bulk import the records
/usr/bin/cat $EDI/mrc.todo | while read line
do
    $SORTMARC $MRC/$line $MRC/$line.sorted
    MRC_COUNT=`$SPLITMARC $MRC/$line.sorted $MRC/$line.sorted`
    i=1
    while [ "$i" -le $MRC_COUNT ]
    do
        /m1/voyager/emichdb/sbin/Pbulkimport -f$MRC/$line.sorted-$i -iYBPsel
        sleep 60
        /usr/bin/rm $MRC/$line.sorted-$i
        i=`expr $i + 1`
    done
    /usr/bin/rm $MRC/$line.sorted
done
sortmarc.pl part 1

#!/usr/local/bin/perl -w
use strict;
use MARC;

my $x = new MARC;
my %inc = %{$x->usmarc_default()};
my ($infile,$outfile) = @ARGV;

# Open the MARC file
$x->openmarc({ file=>$infile, format=>'usmarc', charset=>\%inc, lineterm=>"\n" });

# Output each record to its own file, keyed by the 987|a barcode
my %titles;
while ($x->nextmarc(1)) {
    my @field_987 = $x->getvalue({record=>1, field=>'987', subfield=>'a'});
    my @field_245 = $x->getvalue({record=>1, field=>'245', subfield=>'a'});
    $titles{$field_987[0]} = $field_245[0];
    $x->output({ file=>">$field_987[0]", format=>'usmarc' });
    $x->deletemarc();
}
sortmarc.pl part 2

# Sort the titles, open each individual file, and append
# each record to the new sorted file
foreach my $value (sort { uc($titles{$a}) cmp uc($titles{$b}) } keys %titles) {
    $x->openmarc({ file=>$value, format=>'usmarc', charset=>\%inc, lineterm=>"\n" });
    $x->nextmarc(1);
    $x->output({ file=>">>$outfile", format=>'usmarc' });
    $x->deletemarc();
    unlink $value;
}
splitmarc.pl

#!/usr/local/bin/perl -w
use strict;
use MARC;

my $x = new MARC;
my %inc = %{$x->usmarc_default()};
my ($infile,$outfile) = @ARGV;

# Open the MARC file
$x->openmarc({ file=>$infile, format=>'usmarc', charset=>\%inc, lineterm=>"\n" });

# Output each record to its own numbered file, then print the count
my $count = 0;
while ($x->nextmarc(1)) {
    $count = $count + 1;
    $x->output({ file=>">>$outfile-$count", format=>'usmarc' });
    $x->deletemarc();
}
print $count;
Now...Cataloging and Processing
• Contracted to OCLC PromptCat
• Locate best available bib record
• Barcode all items
• Label items that have a good bib record
• Both firm and approval orders are processed this way
Now...PromptCat
Items received at PromptCat → Locate most bib records → Labeling and barcoding → Items shipped to EMU
Now...Cataloging
Items received in Acquisitions → Sent to Cataloging → Locate, create and edit all bib records → Create an item record with “Not Yet Available” location

Now...Processing
Sent to Acquisitions → Labeling → Change location to final destination → Sent to Circulation → Discharged and shelved
Now...Automated imports
• Receive weekly files
• Preprocess records to eliminate repetitive tasks
• Bulkimport records
• Firm orders overlay our Gobi records – this results in a double MFHD
• Approvals create new records – bibs, MFHDs, items and POs
• New reports for reviewing and processing
Now...Automated imports
Download files → Preprocess records → Bulkimport into Voyager → Review and act on generated reports
Now...download.sh
• Runs M-F at 4am
• Downloads any files from this month or last
download.sh

#!/bin/ksh

# Setting all the variables
VOYDIR=/m1/voyager/emichdb
LOG=/export/home/voyager/scripts/logs/promptcat.log
/bin/date >> $LOG
echo "running promptcatd-download.sh... " >> $LOG
PROMPTCAT=$VOYDIR/PromptCat

# File where the ftp commands used as input for "expect" are put
ftpfile=ftp.exp

# FTP the files using expect
/usr/local/bin/expect $PROMPTCAT/$ftpfile &
pid=$!
/bin/wait $pid
ftp.exp

#!/usr/local/bin/expect ~~

spawn $env(SHELL)

# Login to the PromptCat server
send "ftp edx.oclc.org\n"
sleep 10 ;
send "teye1\n"
sleep 10 ;
send "password\n"

# Download the files
expect "ftp>" {send "lcd /m1/voyager/emichdb/PromptCat/mrc\n"}
expect "ftp>" {send "passive\n"}
expect "ftp>" {send "bin\n"}
expect "ftp>" {send "prompt\n"}
expect "ftp>" {send "cd 'edx.pcat.eye.'\n"}
expect "ftp>" {send "mget RCD.YDX.EYE.D0806*.FIRM\n"}
expect "ftp>" {send "mget RCD.YDX.EYE.D0805*.FIRM\n"}
expect "ftp>" {send "mget RCD.YDX.EYEAP.D0806*.APPR\n"}
expect "ftp>" {send "mget RCD.YDX.EYEAP.D0805*.APPR\n"}
expect "ftp>" {send "bye\n"}
Now...npasswd.sh
• The PromptCat FTP password stops working after 90 days, and passwords can’t be repeated
• Establishes a new password for the PromptCat FTP site
• Creates a new expect file for download.sh
• Runs on the 1st of the month
npasswd.sh part 1

#!/bin/ksh
VOYDIR=/m1/voyager/emichdb
PROMPTCAT=$VOYDIR/PromptCat
LOG=/export/home/voyager/scripts/logs/promptcat.log
USED=$PROMPTCAT/used.txt

# Figure out current and last month so we know which files to download
MONTH=`date +%m`
LASTMONTH=`expr $MONTH - 1`
YEAR=`date +%y`
LASTYEAR=$YEAR
if test "$LASTMONTH" -eq 0
then
    LASTYEAR=`expr $YEAR - 1`
    if test "$LASTYEAR" -lt 10
    then
        LASTYEAR=\0$LASTYEAR
    fi
    LASTMONTH=12
elif test "$LASTMONTH" -lt 10
then
    LASTMONTH=\0$LASTMONTH
fi

# Get the current password
/usr/bin/cat $USED | while read line
do
    OLDPWD=`echo $line`
done
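The month-rollover arithmetic above can be isolated into a small function for clarity; a minimal sketch of the same logic (the function name is illustrative):

```shell
#!/bin/sh
# prev_month YY MM
# Prints the month before YY/MM as a zero-padded YYMM string,
# rolling the year back when the current month is January.
prev_month() {
    year=`expr "$1" + 0`       # strip leading zeros so printf sees decimals
    month=`expr "$2" + 0`
    if [ "$month" -eq 1 ]; then
        month=12
        year=`expr "$year" - 1`
    else
        month=`expr "$month" - 1`
    fi
    printf '%02d%02d\n' "$year" "$month"
}
```

Here `prev_month 08 06` prints 0805, and `prev_month 08 01` rolls back across the year boundary to 0712, matching the LASTYEAR/LASTMONTH handling in the script.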
npasswd.sh part 2

# Create a new password that has not been used before
NEWOK=0
NEWPWD=0
while [ $NEWOK -eq 0 ]
do
    NEWPWD=`/usr/local/bin/mkpasswd -l 8 -C 1 -d 1`
    NEWOK=1
    /usr/bin/cat $USED | while read line
    do
        if test $line = $NEWPWD
        then
            NEWOK=0
        fi
    done
done
echo $NEWPWD >> $USED
npasswd.sh part 3

# Create an expect file for changing the password
PWDEXP=$PROMPTCAT/pwd.exp
/usr/bin/cp $PWDEXP $PWDEXP.old
echo "#!/usr/local/bin/expect ~~" > $PWDEXP
echo "" >> $PWDEXP
echo "spawn \$env(SHELL)" >> $PWDEXP
echo "" >> $PWDEXP
echo "send \"ftp edx.oclc.org\\\n\"" >> $PWDEXP
echo "sleep 10 ;" >> $PWDEXP
echo "send \"teye1\\\n\"" >> $PWDEXP
echo "sleep 10 ;" >> $PWDEXP
echo "send \"$OLDPWD/$NEWPWD/$NEWPWD\\\n\"" >> $PWDEXP
echo "expect \"ftp>\" {send \"bye\\\n\"}" >> $PWDEXP

# Change the password on the server
/usr/local/bin/expect $PWDEXP &
npasswd.sh part 4

# Create the expect file for downloading records
FTPEXP=$PROMPTCAT/ftp.exp
/usr/bin/cp $FTPEXP $FTPEXP.old
echo "#!/usr/local/bin/expect ~~" > $FTPEXP
echo "" >> $FTPEXP
echo "spawn \$env(SHELL)" >> $FTPEXP
echo "" >> $FTPEXP
echo "send \"ftp edx.oclc.org\\\n\"" >> $FTPEXP
echo "sleep 10 ;" >> $FTPEXP
echo "send \"teye1\\\n\"" >> $FTPEXP
echo "sleep 10 ;" >> $FTPEXP
echo "send \"$NEWPWD\\\n\"" >> $FTPEXP
echo "expect \"ftp>\" {send \"lcd /m1/voyager/emichdb/PromptCat/mrc\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"passive\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"bin\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"prompt\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"cd 'edx.pcat.eye.'\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"mget RCD.YDX.EYE.D$YEAR$MONTH*.FIRM\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"mget RCD.YDX.EYE.D$LASTYEAR$LASTMONTH*.FIRM\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"mget RCD.YDX.EYEAP.D$YEAR$MONTH*.APPR\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"mget RCD.YDX.EYEAP.D$LASTYEAR$LASTMONTH*.APPR\\\n\"}" >> $FTPEXP
echo "expect \"ftp>\" {send \"bye\\\n\"}" >> $FTPEXP
Now...loadrecs.sh
• Determines which files need to be pre-processed and imported
• Runs the pre-process script, 650.pl
• Bulk imports the processed records
• Emails the import log to the Cat staff
• Emails any discarded records
• Finds and emails the “On Order” MFHD IDs
loadrecs.sh part 1

#!/bin/ksh

# Define all the people who get reports, and the directories and
# scripts that will be used throughout
MAILX=/usr/bin/mailx
MAIL856="lois.whitehead@emich.edu jwrosch@gmail.com"
MAILBARCODE="jwrosch@gmail.com carol.smallwood@emich.edu"
MAILLOG="whogan@emich.edu carol.smallwood@emich.edu jwrosch@gmail.com"
MAILDISCARD="whogan@emich.edu carol.smallwood@emich.edu jwrosch@gmail.com"
MAILMFHD="jwrosch@gmail.com carol.smallwood@emich.edu"
VOYDIR=/m1/voyager/emichdb
PROMPTCAT=$VOYDIR/PromptCat
MRC=$PROMPTCAT/mrc
MRC_P=$PROMPTCAT/mrc_p
OUTDIR=$PROMPTCAT/out
MRC_FILES=RCD.YDX.*D*.$1
SCRIPTS=/export/home/voyager/scripts/promptcat
PREPROCESS=$SCRIPTS/650.pl
GETMFHDS=$SCRIPTS/mfhds.pl
DISCARDPL=$SCRIPTS/discard.pl
DISCARDWEBDIR=$VOYDIR/webvoyage/html/discards
CHARGE_BARCODE=299000028748
if test "$1" == "FIRM"
then
    CHARGE_BARCODE=299000028746
fi
LOG=/export/home/voyager/scripts/logs/promptcat.log
/bin/date >> $LOG
echo "running promptcat.sh... " >> $LOG
loadrecs.sh part 2

# Compare the files in the downloaded and processed directories
# to determine what needs processing
/usr/bin/ls $MRC/$MRC_FILES > $MRC/tmp
/usr/bin/sed 's/^.*RCD/RCD/' $MRC/tmp > $MRC/clean
/usr/bin/ls $MRC_P/$MRC_FILES > $MRC_P/tmp
/usr/bin/sed 's/^.*RCD/RCD/' $MRC_P/tmp > $MRC_P/clean
/usr/bin/diff $MRC/clean $MRC_P/clean > $PROMPTCAT/mrc.todo.tmp
/usr/bin/sed 's/^.*RCD/RCD/' $PROMPTCAT/mrc.todo.tmp > $PROMPTCAT/mrc.todo.clean
/usr/bin/sed -n '/^RCD/p' $PROMPTCAT/mrc.todo.clean > $PROMPTCAT/mrc.todo
/usr/bin/rm $MRC/tmp
/usr/bin/rm $MRC/clean
/usr/bin/rm $MRC_P/tmp
/usr/bin/rm $MRC_P/clean
/usr/bin/rm $PROMPTCAT/mrc.todo.tmp
/usr/bin/rm $PROMPTCAT/mrc.todo.clean
LOGDIR=$VOYDIR/rpt
LOGTOSEND=$LOGDIR/logimp.tosend
/usr/bin/rm $LOGTOSEND
loadrecs.sh part 3

/usr/bin/cat $PROMPTCAT/mrc.todo | while read line
do
    # Preprocess the files
    OUT856FILE=out856.$line.txt
    BARCODEFILE=barcodes.$line.cap.s
    $PREPROCESS $MRC/$line $OUTDIR/$line $OUTDIR/$OUT856FILE $OUTDIR/$BARCODEFILE $CHARGE_BARCODE

    # Mail 856 and barcode reports
    for USER in `/bin/echo $MAIL856`; do
        $MAILX -s "$OUT856FILE" $USER < $OUTDIR/$OUT856FILE
    done
    for USER in `/bin/echo $MAILBARCODE`; do
        $MAILX -s "$BARCODEFILE" $USER < $OUTDIR/$BARCODEFILE
    done

    # Bulk import the records and move the file to the processed directory
    DATE=`/bin/date "+%Y%m%d.%H%M"`
    /m1/voyager/emichdb/sbin/PromptcatBulkImport -f$OUTDIR/$line -i$2 &
    wait
    echo "Pbulkimport -f$OUTDIR/$line..." >> $LOG
    echo $DATE >> $LOGTOSEND
    /usr/bin/cp $OUTDIR/$line $MRC_P
done
/usr/bin/rm $PROMPTCAT/mrc.todo
loadrecs.sh part 4

/usr/bin/cat $LOGTOSEND | while read line
do
    # Mail the bulkimport log
    LOGIMP=log.imp.$line
    for USER in `/bin/echo $MAILLOG`; do
        $MAILX -s "$LOGIMP" $USER < "$LOGDIR/$LOGIMP"
    done

    # Mail discarded records
    DISCARD=discard.imp.$line
    DISCARDTXT=$DISCARD.txt
    $DISCARDPL $LOGDIR/$DISCARD $LOGDIR/$DISCARDTXT
    print "http://portal.emich.edu/discards/$DISCARD" >> $LOGDIR/$DISCARDTXT
    /usr/bin/cp $LOGDIR/$DISCARD $DISCARDWEBDIR
    for USER in `/bin/echo $MAILDISCARD`; do
        $MAILX -s "$DISCARD" $USER < "$LOGDIR/$DISCARDTXT"
    done

    # Find double MFHDs and mail the report
    if test "$1" == "FIRM"
    then
        /usr/bin/sed -n '/[0-9]\{6,7\} - 100/p' $LOGDIR/$LOGIMP > $LOGDIR/$LOGIMP.new
        /usr/bin/sed 's/ - 100//g' $LOGDIR/$LOGIMP.new > $LOGDIR/$LOGIMP.final
        $GETMFHDS $LOGDIR/$LOGIMP.final
        for USER in `/bin/echo $MAILMFHD`; do
            $MAILX -s "$LOGIMP.final.mfhds" $USER < "$LOGDIR/$LOGIMP.final.mfhds"
        done
    fi
done
Now...650.pl
• Deletes any subjects with 2nd indicator 2, 6 or 7
• Deletes any 856 with 2nd indicator 2
• If there is a 505 and an 856 TOC link, deletes the 856
• Outputs all other 856s to a report
• Deletes any 938s
• Gets incoming barcodes from the 987 and creates an offline charge file
650.pl part 1

#!/usr/local/bin/perl -w
use MARC::Batch;
my ($infile,$out_usmarc,$out_856,$out_barcode,$barcode) = @ARGV;
my $batch = MARC::Batch->new('USMARC', "$infile");

# Open files for processed records, the 856 report and the offline charge file
open(OUT, ">$out_usmarc");
open(OUT_856, ">$out_856");
open(OUT_BCODES, ">$out_barcode");

# Create the first part of the offline charge file
print OUT_BCODES "CAPTURE\n";
print OUT_BCODES "PATRON ".$barcode."\n";
print OUT_BCODES "DUE_DATE 2382-12-31 23:59:00\n";
print OUT_BCODES "BEGIN_CHARGE\n";

while (my $record = $batch->next()) {
    # Remove non-LCSH subjects
    my @fields = $record->field('6..');
    foreach my $field (@fields) {
        if ( $field ) {
            if ( $field->indicator(2) eq '2' || $field->indicator(2) eq '6'
                 || $field->indicator(2) eq '7' ) {
                $record->delete_field($field);
            }
        }
    }
650.pl part 2

    # Create the 856 report; drop TOC links when there is already a 505
    my $field_505 = $record->field('505');
    my $oclcnum = $record->field('035');
    my @fields_856 = $record->field('856');
    foreach my $field_856 (@fields_856) {
        if ( $field_856->indicator(2) eq '2' ) {
            $record->delete_field($field_856);
        } elsif ( $field_856->indicator(2) eq '1') {
            if ( $field_505 && index($field_856->as_string(), "Table of contents") ge 0 ) {
                $record->delete_field($field_856);
            } else {
                print OUT_856 $oclcnum->as_string(), "\n";
                print OUT_856 $field_856->as_string(), "\n";
            }
        }
    }

    # Remove 938 fields
    my @fields_938 = $record->field('938');
    foreach my $field_938 (@fields_938) {
        $record->delete_field($field_938);
    }
650.pl part 3

    # Add the barcode to the offline charge file
    my $barcode = $record->subfield('987', "a");
    if ( $barcode ) {
        print OUT_BCODES "ITEM ".$barcode."\n";
    }

    # Output the processed record
    print OUT $record->as_usmarc();
}

# Finish the offline charge file and close all the output files
print OUT_BCODES "END_CHARGE\n";
close(OUT);
close(OUT_856);
close(OUT_BCODES);
discard.pl

#!/usr/local/bin/perl -w
use MARC::Batch;
my ($infile,$outfile) = @ARGV;

# Open the discard file
my $batch = MARC::Batch->new('USMARC', "$infile");
open(OUT, ">$outfile");

# Put the OCLC number and title into the discard report
while (my $record = $batch->next()) {
    my $oclcnum = $record->field('035');
    my $title = $record->field('245');
    print OUT $oclcnum->as_string(), "\n", $title->as_string(), "\n\n";
}
close(OUT);
mfhds.pl part 1

#!/m1/shared/bin/perl
use DBI;

# Information on connecting to Oracle
$ENV{ORACLE_SID} = "VGER";
$ENV{ORACLE_HOME} = "/oracle/app/oracle/product/10.2.0/db_1";
my $db_name = "xxxdb";
my $username = "ro_xxxdb";
my $password = "ro_xxxdb";

# Put the incoming bib IDs in an array
my ($infile) = @ARGV;
my ($outfile) = $infile.".mfhds";
my @bibids;
my $i = 0;
open(INFILE, $infile);
while( <INFILE> ) {
    $bibids[$i] = $_;
    $bibids[$i++] =~ s/^\s+//;
}
close(INFILE);

mfhds.pl part 2

# Get the On Order MFHD for each incoming record
my $count = 0;
open(OUTFILE, ">$outfile");
my $dbh = DBI->connect('dbi:Oracle:', $username, $password)
    || die "Could not connect: $DBI::errstr";
while ( $count < $i ) {
    my $select = "select $db_name.bib_mfhd.mfhd_id
                  from $db_name.bib_mfhd, $db_name.mfhd_master
                  where $db_name.bib_mfhd.mfhd_id = $db_name.mfhd_master.mfhd_id
                  and $db_name.bib_mfhd.bib_id = $bibids[$count++]
                  and $db_name.mfhd_master.location_id = 65";
    my $sth = $dbh->prepare($select) || die $dbh->errstr;
    $sth->execute || die $dbh->errstr;

    # Output the MFHD_ID
    while( my (@entry) = $sth->fetchrow_array() ) {
        print OUTFILE "$entry[0]\n";
    }
}
close(OUTFILE);
exit(0);
Now...New Cataloging Workflows
• Review the 856 TOC links and add when necessary
• Review any discarded records and re-import
• Run the offline charge files through the Circ client
• Run the “On Order” MFHD IDs through LocationChanger to suppress the records
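The offline charge file that the Circ client consumes, as built by 650.pl above, is a simple line-oriented layout; a sketch that produces the same shape from a patron barcode and a list of item barcodes (the function name and barcodes are illustrative):

```shell
#!/bin/sh
# make_charge_file PATRON_BARCODE ITEM_BARCODE...
# Prints an offline charge file in the layout 650.pl emits:
# a CAPTURE header, the "On Order" patron, a far-future due date,
# then one ITEM line per barcode inside BEGIN_CHARGE/END_CHARGE.
make_charge_file() {
    patron=$1
    shift
    echo "CAPTURE"
    echo "PATRON $patron"
    echo "DUE_DATE 2382-12-31 23:59:00"
    echo "BEGIN_CHARGE"
    for item in "$@"; do
        echo "ITEM $item"
    done
    echo "END_CHARGE"
}
```

For example, `make_charge_file 299000028748 39000000000017` would charge one item to the firm-order “On Order” patron.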