# | ID | Assigned? (YES/NO) | Contact(s) | Institution | Institution Priority | Report | Legacy System Data Element Sources | FOLIO JIRA Issue(s) | Is the report repeated? | Parameters to the report | Requires real-time data? | Purpose | Link to sample | Import/Export? | Notes | A-M Notes | Canned Report | Within-App / Cross-App / Cross-System |
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
2 | ID001 | NO | Anne Highsmith | Texas A&M, UA, Duke, OVGU | P1 (TAMU); P4 (Duke); P1 (UA); P1 (OVGU); P4 (Lehigh) | HathiTrust submission projects | full MARC bibliographic record | REP-5 | No | No | When TAMU participates in a digitization project to submit to HathiTrust, HathiTrust requires that a full MARC record be submitted for each item digitized | generalize the purpose of the report | Export | HathiTrust is a collaborative digitization project in which many institutions participate. In practice, it operates much as a union catalog, hence the need for full MARC records. There is a robust API; info at https://www.hathitrust.org/bib_api. - Mike Winkler | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. See UXPROD-652 for Data Export tool. |
3 | ID002 | NO | Anne Highsmith | Texas A&M, UA, Duke, OVGU | P1 (TAMU); P2 (Duke); P1 (UA); P1 (OVGU); P5 (Lehigh) | Ongoing authority control processing | full MARC bibliographic and authority records | REP-6 | Quarterly | No | Extract records from the database and submit them to the authority control vendor so that bib record headings may be upgraded to RDA standards and authority records are updated to the latest version. Updated MARC records are then reloaded into the database. | Export & Import | This would be both Import and Export. The Import portion is in scope of what we're working on in the Data Import group. The Export portion is not. See UXPROD-652 for Data Export tool. |
4 | ID003 | NO | Sharon Beltaine | Cornell, UA, Duke, OVGU | P1 (Duke); P1 (UA); P1 (CU); P1 (OVGU); P1 (Lehigh but need clarification) | IMPORTS from a number of vendors: DeGruyter; Hart Pub Ebooks; Naxos Music Lib and Jazz; Rand Ebooks; Alexander St Press; Coutts Approval and Patron Driven Print; Gale MARC Record Load; Marcive Load; Skillsoft Bks 24x7 Videos; Serials Solutions Journals and Ebooks; Early Arabic Printed Books; Vendor Order Pickups; LC Cairo Vendor Print Load; Yankee | full MARC bibliographic record; some purchase orders | REP-7 | regularly | no | Add records to Voyager database | need a lot of information about the various vendors | Import | Bibliographic records provided by vendors - M. Winkler; bibliographic data with additional information about vendors, most likely financial transactional type of data - M. Winkler | This is in scope of the Data Import group. See list of record sources at https://wiki.folio.org/display/MM/Sources+of+Batch+Files | cross-system |
5 | ID004 | NO | Sharon Beltaine | Cornell, UA, Duke | P1 (Duke); P1 (UA); P1 (CU); P1 (Lehigh but need clarification) | EXPORTS: OCLC Daily Cataloging Export; Metadata List for RAPID and RAPID extract; WorldShare OCLC Manuscript-Archive Export; CRL Serials Extract; Daily Record Extract to OASIS; Monograph Holdings to Coutts Oasis; HathiTrust Single and Multi-vol Monographs and HT Serials; OCLC WorldShare Daily Deletes | full MARC bibliographic record; metadata | REP-8 | daily / regularly | no | provide required records to external vendors | Export | Reporting holdings to national bibliographic catalogues - M. Winkler | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. See UXPROD-652 for Data Export tool. | cross-system |
6 | ID005 | YES | Sharon Beltaine | Cornell, OVGU | P1 (CU) | Usage Data from electronic content providers | full MARCX bibliographic record | REP-9 | regularly | change since date and time | yes | Provide data for public-facing discovery tools | export; see Heather Shipman | Is this equal to COUNTER reports? | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. | cross-system |
7 | ID006 | YES | Sharon Beltaine | Cornell, Duke, OVGU | P1 (Duke); P1 (CU); P1 (OVGU); P2 (Lehigh) | Invoice Control Report/Electronic Feed | invoice, fund | REP-10 | daily | run every day invoices are approved | Yes | Invoice Control Report: gather financial data in total for all of the invoices approved the day before. Electronic Feed: send data by vendor, by invoice, by account number to the university financial system to be paid (batch job) | Cornell categorizes these two reports as one because they are two versions of the same data | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. This has been identified as a need related to the Invoice module of Acquisitions. | Cornell | Invoice Control Report/Electronic Feed |
8 | ID007 | YES | Claudius Herkt | Hamburg, OVGU | P5 (H); P1 (OVGU); P2 (Lehigh) | COUNTER Reports | | REP-11 | regularly | Provider, report type, report period | no | manage external usage data in one place | none exists so far | import | Not in the current scope of the Data Import group, which is only handling bibliographic and acquisitions data. Presumably, these would load to the ERM? | ERM | The ERM subgroup has a ticket in Jira for support for importing COUNTER and other e-resource data |
9 | ID008 | NO | Rob Pleshar (rpleshar@uchicago.edu) | Chicago | P5 (UC); P2 (Lehigh) | Usage data from electronic content providers combined with payments | HARRASSOWITZ E-Stats/Payment data for the same subscription period in FOLIO | REP-12 | regularly | no | manage external usage data in one place | import | Currently almost all of our usage data is harvested by E-Stats. That includes billing information for those subscriptions already with Harrassowitz, but lacks the payment information for resources acquired from other sources. | Not in the current scope of the Data Import group, which is only handling bibliographic and acquisitions data. Presumably, these would load to the ERM? Any MARC invoice or EDI invoice data IS in the scope of the Data Import group. | yes | ERM/Acq | The ERM subgroup has a ticket in Jira for support for importing COUNTER and other e-resource data. See https://issues.folio.org/browse/UXPROD-576 |
10 | ID009 | YES | Michelle Paolillo, Sharon Beltaine | Cornell | P1 (CU) | bibliographic submission for Google corrections | full item-level MARC bibliographic record | REP-13 | yes | Profile is documented at https://www.hathitrust.org/bib_specifications | yes | When Cornell participates in a digitization project to submit to HathiTrust, HathiTrust requires that a full MARC record be submitted for each item digitized - see https://www.hathitrust.org/bib_data_submission | generalize the purpose of the report | Export | HathiTrust is a collaborative digitization project in which many institutions participate. In practice, it operates much as a union catalog, hence the need for full item-level MARC records according to their spec. | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. | Perl script in LSTools behind "Google" button | cross-system |
11 | ID010 | YES | Michelle Paolillo, Sharon Beltaine | Cornell | P1 (CU) | HathiTrust bibliographic submission for new deposits and for corrections | full item-level MARC bibliographic record from individually specified records | REP-14 | yes | Profile is documented at https://www.hathitrust.org/bib_specifications | yes | The local partner catalog is considered the authoritative record for item deposits to HathiTrust. When there is an error in submitted metadata, practice is to correct the local catalog record, then repackage and resubmit to Zephir (HathiTrust's metadata management system). There are three reports that deliver similar output but use different input and are named differently. | Export | Cornell has three production streams into HathiTrust, and a different routine for packaging each: (1) Google-digitized items, identified by barcode; (2) locally digitized items, identified by barcode; (3) Kirtas/Microsoft-digitized items, deposited to Internet Archive and identified by IA arkID. Barcode-identified items rest in the "coo" namespace and arkID-identified items rest in the "coo1" namespace. | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. | Perl scripts in LSTools behind "Google" button | cross-system |
12 | ID011 | YES | Michelle Paolillo, Sharon Beltaine | Cornell, Duke, OVGU | P1 (Duke); P1 (CU); P1 (OVGU); P1 (Lehigh) | not a report really, but need the ability to write my own reports as needed | ability to query our local instance of the hathifiles and unite this with our catalog data | REP-15 | yes | need access to hathifiles and our cataloging data in one interface so I can write my own reports | yes | I often need to analyze large sets of material for suitability for deposit, and/or duplication of deposit, etc. I do this by writing MS Access queries that use our local hathifiles instance and our catalog data as linked tablespaces in Access, so I can get full information for the volumes, perform analysis, and derive spreadsheets for clients. | Export | Used for analysis for new deposits and for cleaning up problems in previously submitted material - (SMB) reporting functionality requirement? | ??? Not sure how to respond to this one | MS Access database linked to hathifiles and Voyager tablespaces | cross-system |
13 | ID012 | YES | Michelle Paolillo, Sharon Beltaine | Cornell | P2 (CU) | Processing steps for processing the Google candidate list into a picklist | Unite local tables created from files from the Google Return Interface with current information to produce picklists and analysis | REP-16 | yes | needs to be reasonably current | Google uses static catalog extracts that are out of date. To refresh locations to current ones, to deduplicate among locations, and to exclude certain locations, locations must be reconciled and new picklists derived. | Export | Google digitization is not currently active, but may restart if this becomes attractive. | ??? Not sure how to respond to this one |
14 | ID013 | YES | Michelle Paolillo, Sharon Beltaine | Cornell | P1 (CU) | HathiTrust Pre-deposit Display Preview Report | MARC 008 codes for publication begin and end date, publication location, US Government doc; MARC LDR for bib format; BIB_ITEM, DISPLAY_CALL_NUM, author, title, publisher | REP-17 | yes | MARC 008 codes for publication begin and end date, publication location, US Government doc; MARC LDR for bib format | yes | The purpose is to predict the display behavior (full or limited view) of individual items and groups of items in HathiTrust prior to deposit. This report is also used to spot catalog issues and locate rights owners. It helps Cornell understand the amount of effort it will take to display an item in full view in the local library catalog prior to depositing it in the HathiTrust digital library. Overall, the report supports decision making by curators and collection stewards in determining the appropriate destination for collection migrations. | Export | The report mimics an algorithm used by HathiTrust to determine the viewability status of deposited digital materials. For more information, see: https://www.hathitrust.org/bib_rights_determination | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. |
15 | ID014 | YES | Michelle Paolillo, Sharon Beltaine | Cornell, Duke | P4 (Duke); P2 (CU); P2 (Lehigh) | HathiTrust Print Holdings | limited elements from MARC - see https://www.hathitrust.org/print_holdings | REP-18 | yes | profile documented at https://www.hathitrust.org/print_holdings | needs to be reasonably current | Cornell annually delivers three print holdings statements (single-volume monographs, multivolume monographs, and serials). The statements include lost/withdrawn or missing items, as well as current holdings. These statements are the data source on which our annual fee is based, and also form the basis of the mechanism by which we can access preservation copies of withdrawn, lost, or missing items that are in copyright. | We have not yet worked on export from FOLIO, so I don't have anything to add about this (yet); we will work on it after Import is mostly complete. |
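Several rows above (ID001, ID009, ID010) point to the HathiTrust Bibliographic API (https://www.hathitrust.org/bib_api). As a minimal sketch of how an export or reconciliation job might query it, the helper below builds a request URL in the API's documented `/api/volumes/{brief|full}/{idtype}/{id}.json` form; the identifier values in the example are hypothetical, and the actual HTTP fetch is left to the caller.

```python
from urllib.parse import quote

HATHI_BIB_API = "https://catalog.hathitrust.org/api/volumes"

def hathi_bib_url(id_type: str, identifier, level: str = "brief") -> str:
    """Build a HathiTrust Bibliographic API request URL.

    id_type: one of the API's documented identifier types,
             e.g. 'oclc', 'lccn', 'issn', 'isbn', 'htid', 'recordnumber'
    level:   'brief' (holdings/rights only) or 'full' (includes MARC-XML)
    """
    if level not in ("brief", "full"):
        raise ValueError("level must be 'brief' or 'full'")
    # URL-encode the identifier in case it contains reserved characters
    return f"{HATHI_BIB_API}/{level}/{id_type}/{quote(str(identifier))}.json"

# Hypothetical example: look up a volume by OCLC number
url = hathi_bib_url("oclc", 424023)
```

The returned JSON lists the HathiTrust items and rights codes for the record, which is the information a deposit-duplication or display-prediction check (ID011, ID013) would consume.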
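ID011 describes joining a local hathifiles instance to catalog data, today done via linked tablespaces in MS Access, to find material already deposited in HathiTrust. A minimal sketch of the same duplication check in SQLite follows; the table and column names (`hathifiles.oclc`, `catalog.oclc`, etc.) are hypothetical stand-ins, since the real hathifiles and Voyager schemas have many more columns.

```python
import sqlite3

# In-memory stand-in for the linked-table setup described in ID011.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE hathifiles (htid TEXT, oclc TEXT, rights TEXT);
CREATE TABLE catalog    (bib_id TEXT, oclc TEXT, title TEXT);
-- Hypothetical sample rows
INSERT INTO hathifiles VALUES ('coo.123', '424023', 'pd');
INSERT INTO catalog    VALUES ('b1',      '424023', 'Sample title');
INSERT INTO catalog    VALUES ('b2',      '999999', 'Not yet deposited');
""")

# Which catalog records are already in HathiTrust (deposit duplication)?
rows = con.execute("""
    SELECT c.bib_id, c.title, h.htid, h.rights
    FROM catalog c
    JOIN hathifiles h ON h.oclc = c.oclc
""").fetchall()
```

The same query with a `LEFT JOIN ... WHERE h.htid IS NULL` would instead list candidates not yet deposited, the other analysis the row mentions.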