Open Source Integrated Library System

November 14, 2017

Evergreen community blog

T-Shirts! Voting!

Last year we did our first community t-shirt featuring a quote from a community member (pulled from the IRC quotes database).  This year we are doing it again with a new quote.  Please rank your favorites from 1 to 6; the quote with the strongest weight will be used. The shirt will be available for sale at the next International Evergreen Conference, along with the limited remaining stock of the existing “I’m not a cataloger but I know enough MARC to be fun at parties” shirt.

Voting is done here:

https://www.surveymonkey.com/r/DWYZMKT

Only one vote per person but make your opinion known!

by Rogan Hamby at November 14, 2017 08:46 PM

November 07, 2017

Evergreen Indiana

Weekly Update — November 7, 2017

Hacking away at Evergreen


We’re live at the Evergreen International Hack-A-Way right now! This is the second year Evergreen Indiana has been fortunate enough to welcome Evergreen developers from across the US for a three-day hackathon. Bugs, interface updates, and new features are all on the agenda as 20+ specialists collaborate to improve Evergreen!

Welcome Ohio Township Public Library System


Ohio Township Public Library System (OHTWP) migrated into Evergreen Indiana on October 19, 2017! Ohio Township PL is a large class B library with 3 branches located on the Ohio River near Evansville featuring a collection of over 170,000 items!

2017 Tour Update

Tour cards return next week when we’re back in the office!

In case you’re interested in tracking our progress, there is an interactive map (with links to the full sized profile cards) now available: 2017 Tour. We’re wrapping up a very busy season of being offsite and offline while on the road, so thank you for all of your patience as it has taken a bit longer to get back to you than usual.

Support Notice

The Indiana State Library will be closed on Friday, November 10, 2017 in observance of the Veterans’ Day state holiday; INfoExpress services will also be suspended.

by admin at November 07, 2017 05:48 PM

Evergreen community blog

Hack-A-Way 2017 @ Fort Benjamin Harrison

Day 1 of the 2017 Hack-A-Way at Fort Benjamin Harrison is beginning as most things do: with setup!  Many thanks to the Indiana State Library, who have been gracious enough to host us two years in a row at the inn located in Fort Benjamin Harrison, just outside Indianapolis, Indiana.  This is one of ten buildings remaining from the original fort’s construction, and in its time it has served as a command post, a hospital, and more.  In fact, it was a place where soldiers returning from the front in 1918 were treated for both physical and mental health issues, and it was the source of an influenza outbreak in this region of the United States after soldiers brought the disease back from Europe.  This week, however, we promise to push only git commits out of here!


And early birds here are Anna Goben and Jason Boyer from the Indiana State Library doing setup.


by Rogan Hamby at November 07, 2017 12:55 PM

November 03, 2017

Massachusetts Library Network Cooperative

FY17 Was a Year of Growth for MassLNC

During the last fiscal year, MassLNC gained active new development partners, saw the completion of many long-standing development projects, and continued its mission to provide a mutual support network for Massachusetts networks on Evergreen.

The newly-released MassLNC FY17 Annual Report, which covers the time period from July 1, 2016 to June 30, 2017, highlights these and many other accomplishments over the past fiscal year. In addition to accomplishments mentioned above, MassLNC also decided to put funding towards seven new development projects and identified a new strategic development initiative.

The annual report is available at http://masslnc.org/system/files/private/fy17-annual-report-final.pdf.

by Kathy Lussier at November 03, 2017 04:03 PM

October 18, 2017

Evergreen Indiana

Weekly Update — October 18, 2017

2017 Tour Update


In case you’re interested in tracking our progress, there is an interactive map (with links to the full sized profile cards) now available: 2017 Tour. We’re going to be on the road a lot in the next few weeks with visits and migrations, so thank you for your patience if it takes a bit longer to get back to you than usual.

Migration News

On October 19, 2017, Ohio Township Public Library will open their doors as the latest active Evergreen Indiana member! The ISL team is currently on-site prepping staff for the opening. The cataloging freeze has been lifted.

Upcoming Evergreen Indiana Events

October 27, 2017
Fall Cataloging Roundtable and Workday, webinar (10am-Noon) and Indiana State Library (1pm-4pm)
November 7-9, 2017
Evergreen International Hack-A-Way, Fort Benjamin Harrison State Park Inn, Indianapolis, IN

by admin at October 18, 2017 11:51 AM

August 24, 2017

Dan Scott (Coffee|Code) (Evergreen entries)

Our nginx caching proxy setup for Evergreen

A long time ago, I experimented with using nginx as a caching proxy in front of Evergreen but never quite got it to work. Since then, a lot has changed in both nginx and Evergreen, and Bill Erickson figured out how to get nginx to proxy the websockets that Evergreen now needs for its web-based staff client. This spring, as part of my work towards building prototype offline support for the Evergreen catalogue's My Account section, I dug in and figured out some of the final pieces needed to enable nginx to proxy most of the static content that Apache (with its bloated processes) would otherwise have to serve up, and I wrote a configuration generator script for the nginx and Apache pieces. In July, we went live with the configuration.

This post documents what we currently (as of August 2017) are running on our Evergreen 2.12 server with Ubuntu 16.04. If you have any questions about this or our corresponding Apache configuration, please let me know and I'll attempt to answer them!

/etc/nginx/sites-enabled/evergreen.conf

This is the core configuration for the nginx server:

proxy_cache_path /tmp/nginx_cache levels=1:2 keys_zone=my_cache:10m max_size=1g
                 inactive=60m use_temp_path=off;
proxy_cache_key $scheme$http_host$request_uri;

server {
    listen 80;
    server_name clients.concat.ca;

    include /etc/nginx/concat_ssl.conf;
    include /etc/nginx/osrf_sockets.conf;

    location / {
        proxy_pass https://localhost:7443;

        rewrite ^/?$ /updates/manualupdate.html permanent;

        include /etc/nginx/concat_headers.conf;
    }
}
  • The proxy_cache_path directive tells nginx where to store the data it is caching, what kind of directory structure it should create (levels), the name of the shared memory zone to use (keys_zone), the maximum size of the disk cache (max_size), how long to retain a cached copy of the file (inactive), and whether to use the value of the proxy_temp_path directive as a parent directory for the cache.
  • The proxy_cache_key tells nginx to use a combination of the request scheme (typically HTTP or HTTPS), the hostname, and the full request URI (including GET arguments) to store and look up the cached data. Apache's response tells nginx how long the request should be cached: whether it should expire immediately or, as of #1681095 "Extend browser cache-busting support", be cached for a full year for images, JavaScript, and CSS (at least until you run autogen.sh again).
  • We currently include one server directive per hostname that we support, which is quite repetitive. Looking at this with fresh eyes, we should probably simply use something like server_name *.concat.ca to cover all of our hostnames on our domain with a single directive.
  • In this block, we only listen to port 80, which seems odd given that we're an HTTPS-only site. Read on!
  • include /etc/nginx/concat_ssl.conf; keeps all of the TLS-related configuration in one place, including listening to port 443. We'll pry open this file later.
  • include /etc/nginx/osrf_sockets.conf; keeps all of the OpenSRF websockets translator proxy configuration in one place. We'll also pry open this file later.
  • The location / block handles the proxying. At first I was nervous and wanted to proxy the actual hostname instead of localhost to ensure we got the right templates, etc., but it turns out the proxy headers guide the request to the right host. So now I'm relaxed and we simply pass the request on to https://localhost:7443. Be very careful with those trailing slashes!
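For the curious, the wildcard consolidation mentioned above could look something like this. This is a hypothetical sketch, not our deployed configuration; verify the wildcard behaviour against your own hostnames before relying on it:

```nginx
# Hypothetical sketch: one server block with a wildcard server_name
# instead of one nearly identical block per hostname.
server {
    listen 80;
    server_name *.concat.ca;

    include /etc/nginx/concat_ssl.conf;
    include /etc/nginx/osrf_sockets.conf;

    location / {
        proxy_pass https://localhost:7443;

        rewrite ^/?$ /updates/manualupdate.html permanent;

        include /etc/nginx/concat_headers.conf;
    }
}
```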

/etc/nginx/concat_ssl.conf

listen 443 ssl http2;
ssl_certificate /etc/apache2/ssl/server.crt;
ssl_certificate_key /etc/apache2/ssl/server.key;

if ($scheme != "https") {
    return 301 https://$host$request_uri;
}

# generate with openssl dhparam -out dhparams.pem 2048
ssl_dhparam /etc/apache2/dhparams.pem;

# From https://mozilla.github.io/server-side-tls/ssl-config-generator/
ssl_prefer_server_ciphers on;
ssl_session_timeout 1d;
ssl_session_cache shared:SSL:50m;
ssl_session_tickets off;

# intermediate configuration. tweak to your needs.
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers 'ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS';

# HSTS (ngx_http_headers_module is required) (15768000 seconds = 6 months)
add_header Strict-Transport-Security max-age=15768000;

# OCSP Stapling ---
# fetch OCSP records from URL in ssl_certificate and cache them
ssl_stapling on;
ssl_stapling_verify on;

There's a fair bit going on here, but it's almost entirely related to TLS support and a lot of the content comes either from the Mozilla TLS configuration generator or from Certbot's configuration plugin for nginx. Perhaps most interesting is the listen 443 ssl http2; line that enables listening on the standard HTTPS port and also supports HTTP/2 for browsers that support it--effectively a way to use a single connection from a browser to a server to issue many parallel requests for resources, amongst other performance enhancements.

We also force any HTTP request to use an HTTPS connection using the if ($scheme != "https") { block.

/etc/nginx/osrf_sockets.conf

This is extracted from the sample nginx configuration shipped with OpenSRF:

location /osrf-websocket-translator {
    proxy_pass https://localhost:7682;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    # Needed for websockets proxying.
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";

    # Raise the default nginx proxy timeout values to an arbitrarily
    # high value so that we can leverage osrf-websocket-translator's
    # timeout settings.
    proxy_connect_timeout 5m;
    proxy_send_timeout 1h;
    proxy_read_timeout 1h;
}

/etc/nginx/concat_headers.conf

This is not perfectly named; while we do set up the proxy headers in this file, we also include some of the other statements we would otherwise have to repeat inside the server block. Here's what the contents look like:

proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;

proxy_cache my_cache;
proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
proxy_cache_lock on;

rewrite ^/?$ /eg/opac/home permanent;
  • The proxy_set_header directive adds headers to the requests forwarded to Apache, so that Apache can figure out which host was actually requested, accurately log requests (instead of saying everything is coming from localhost), etc. These directives were copied directly from the sample nginx configuration shipped with OpenSRF.
  • proxy_cache tells this server to use the cache we previously named in our keys_zone parameter.
  • proxy_cache_use_stale tells this server to return stale data (if it has a cached copy) if Apache returns an error or a timeout or any of the specified HTTP status codes while trying to fetch a fresh copy.
  • proxy_cache_lock tells this server to, should multiple identical requests for data that needs to be cached or refreshed arrive, only allow a single request to be passed through to Apache and have the other requests wait. This can be one way to avoid the "someone set a book down on a keyboard and caused 100 identical requests in one second" problem.
  • The rewrite simply directs the request for a bare hostname (with or without a trailing slash) to the catalogue home page.
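One debugging aid worth knowing about: nginx exposes its caching decision in the $upstream_cache_status variable. A small addition like the following (not part of our deployed configuration) makes it easy to check from curl or the browser whether a given request was a cache HIT or MISS:

```nginx
# Optional debugging aid: report whether the response came from the
# cache (HIT, MISS, STALE, BYPASS, ...) in a response header.
add_header X-Cache-Status $upstream_cache_status;
```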

by Dan Scott at August 24, 2017 08:00 PM

August 12, 2017

Dan Scott (Coffee|Code) (Evergreen entries)

Enriching catalogue pages in Evergreen with Wikidata

I'm part of the Music in Canada @ 150 Wikimedia project, organizing wiki edit-a-thons across Canada to help improve the presence of Canadian music and musicians in projects like Wikpedia, Wikidata, and Wikimedia Commons. It's going to be awesome, and it's why I invested time in developing and delivering the Wikidata for Librarians presentation at the CAML preconference.

Right now I'm at the Wikimania 2017 conference, because it is being held in Montréal--just down the road from me when you consider it is an international affair. The first two days were almost entirely devoted to a massive hackathon consisting of hundreds of participants with a very welcoming, friendly ambiance. It was inspiring, and I participated in several activities:

  • installing Wikibase--the technical foundation for Wikidata--from scratch
  • an ad-hoc data modelling session with Jan and Stacy Allison-Cassin that resulted in enhancing the periodicals structure on Wikidata

But I also had the itch to revisit and enhance the JavaScript widget that runs in our Evergreen catalogue which delivers on-demand cards of additional metadata about contributors to recorded works. I had originally developed the widget as a proof-of-concept for the potential value to cultural institutions of contributing data to Wikidata--bearing in mind a challenge put to the room at an Evergreen 2017 conference session that asked what tangible value linked open data offers--but it was quite limited:

  • it would only show a card for the first listed contributor to the work
  • it was hastily coded, and thus duplicated code, used shortcuts, and had no comments
  • the user interface was poorly designed
  • it was not explicitly licensed for reuse

So I spent some of my hackathon time (and some extra time stolen from various sessions) fixing those problems--so now, when you look at the catalogue record for a musical recording by the most excellent Canadian band Rush, you will find that each of the contributors to the album has a musical note (♩) which, when clicked, displays a card based on the data returned from Wikidata using a SPARQL query matching the contributor's name (limited in scope to bands and musicians to avoid too many ambiguous results).
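As a rough illustration of the kind of query involved (the widget's actual query differs in detail; the property and class identifiers below are standard Wikidata IDs, but the query shape here is my sketch), a name lookup scoped to musicians and bands might look like:

```sparql
# Illustrative only -- the widget's real query lives in the contributor repo.
SELECT ?entity ?entityLabel ?entityDescription WHERE {
  ?entity rdfs:label "Geddy Lee"@en .
  # Restrict to musicians (by occupation) or musical groups (by class)
  # to avoid ambiguous matches on common names.
  { ?entity wdt:P106 wd:Q639669 . }   # occupation: musician
  UNION
  { ?entity wdt:P31 wd:Q215380 . }    # instance of: musical group
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 5
```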

I'm not done yet: the design is still very basic, but I'm happier about the code quality and it now supports queries for all of the contributors to a given album. It is also licensed for reuse under the GPL version 2 or later license, so as long as you can load the script in your catalogue and tweak a few CSS query selector statements to identify where the script should find contributor names and where it should place the cards, it should theoretically be usable in any catalogue of musical recordings. And with the clear "Edit on Wikidata" link, I hope that it encourages users to jump in and contribute if they find one of their favourite performers lacks (or shows incorrect!) information.

You can find the code on the Evergreen contributor git repository.

by Dan Scott at August 12, 2017 08:00 PM

June 19, 2017

Massachusetts Library Network Cooperative

MassLNC Welcomes Minnesota Group to Evergreen Development Initiative

The Massachusetts Library Network Cooperative (MassLNC) is pleased to welcome two Minnesota libraries to its initiative to fund development in the Evergreen open-source Integrated Library System.

The Lake Agassiz Regional Library (LARL) and Northwest Regional Library (NWRL) consortium in Minnesota is joining the MassLNC Evergreen Development Initiative. The project is an opportunity for libraries and consortia running Evergreen to maximize their resources by pooling funds to sponsor exciting new features and important improvements for the open-source system. The Minnesota libraries are joining seven other Evergreen library organizations in this collaborative project.

During the Initiative’s first year, MassLNC partners decided to fund development projects that improve the way users place holds in the system, allow staff to search report templates by keyword, and provide support for alternate/preferred patron names.

As the partnership enters its second year of work, the group is planning to support development to further improve search in the catalog and to continue work on Evergreen’s new web staff client.

The addition of new partners ensures that more libraries will have a voice in upcoming new features while allowing MassLNC to fund more and bigger projects. More information about the Initiative is available at http://masslnc.org/eg-dev.

by Kathy Lussier at June 19, 2017 08:53 PM

May 22, 2017

Dyrcona's Evergreen Blog

NCIPServer and OpenSRF 2.5+

Thanks to Jason Boyer of Evergreen Indiana, it was brought to my attention that if you use the recommended Apache configuration with NCIPServer, Evergreen, and OpenSRF version 2.5.0 or later, NCIPServer crashes whenever you access its URL with the following error message:

Warning caught during route execution: Use of uninitialized value in scalar assignment at /usr/local/share/perl/5.22.1/OpenSRF/DomainObject/oilsMessage.pm line 246.

The reasons are a bit arcane and technical. The solution, however, is simple. To correct the situation, add the following directive inside the <Location /NCIP/> block in your eg_host.conf:

PerlSetEnv DANCER_ENVIRONMENT "production"

If you're upgrading to Evergreen 2.12 and OpenSRF 2.5, then you need to add the above to your configuration. Even if you are not planning an upgrade in the near future, adding that line will not hurt anything and it will help when you do eventually upgrade. Really, that directive should have been in the recommended configuration from the start, but was ignored owing to an oversight on my part.

The README has been updated to include the addition of the above line for configuration of both Apache 2.2 and Apache 2.4.

by Jason Stephenson (noreply@blogger.com) at May 22, 2017 07:21 PM

April 12, 2017

Equinox

Equinox Open Library Initiative FAQ

Q. What is the Equinox Open Library Initiative?
As of January 2017, Equinox Software is the Equinox Open Library Initiative, a nonprofit organization ensuring that libraries and cultural heritage institutions can implement open source technologies with confidence.

Q. Is Equinox OLI a membership organization? Do I have to pay fees?
No. Equinox OLI believes in the open source ethos of meritocracy and does not think a pay-to-play model fairly represents the diversity of user communities for any given software platform.

Q. Are Equinox’s services changing?
Not at all. Day-to-day, we're still us, and we will continue to provide hosting, support, migrations, consulting, project management, training, and software development for the open source products we support and develop.

Q. Is my contract with Equinox still in force?
Yes. Equinox Open Library Initiative assumed all of Equinox Software’s assets, liabilities, and contracts.

Q. Is this for financial reasons?
Absolutely not; it’s about ethics and service to the community. In fact, we’ve had two of our strongest years to date (2015 and 2016). This change is something we’ve been considering and working towards for several years, and we’re excited that it’s finally a reality.

Q. Then why become a nonprofit?
Equinox was at a point where we knew our ability to grow could stall without significant changes to open up new opportunities and expand our services. The most common way to do this is through a merger or acquisition with another company. Our mission does not mesh well with that of proprietary ILS companies, and there aren’t any compatible nonprofit companies who share our particular vision. By making the change to nonprofit, we are able to satisfy our ethical concerns and use new avenues available to us to expand the company’s reach and offerings to ensure libraries have a viable choice in open source software.

Q. What are the community benefits of this change?
Equinox has been a trusted community partner for a decade; however, there was often some confusion about how Equinox fit into the equation. Being a nonprofit expands our ability to focus the discussion more on the continuation of our communities’ important work with Evergreen, Koha, and all open source software in cultural heritage institutions. As a nonprofit, Equinox will also be able to apply for grant funding to develop new prototypes, improve integration with existing products, and fund infrastructure projects that will keep Evergreen and Koha on the cutting edge.

Q. If I would like to learn more about the Equinox Open Library Initiative, who should I contact?
If you are a current or prospective customer of Equinox, please email sales@equinoxinitiative.org. If you are a member of the press, please email info@equinoxinitiative.org.

April 12, 2017 02:37 AM

Welcome, Andrea!

Equinox is excited to announce the newest member of our team: Andrea Buntz Neiman!  Andrea has filled the position of Project Manager for Software Development and began work this week.  In her new role, she will coordinate with customers, developers, and other stakeholders to make sure everyone stays on the same page about development projects. Andrea received her BA in Music at St. Mary’s College in Maryland before completing her MLS at the University of Maryland College Park.  She worked at the Library of Congress for three years on various special projects in the Music Division and Recorded Sound Section.  She then spent 11 years in public libraries.  Andrea has worked in almost every area of the public library world and has accumulated quite the Summer Reading Shirt collection. In 2008, she helped her library migrate to Evergreen, making it the first (and so far only) public library in Maryland to use an open source ILS.  Grace Dunbar, Equinox Vice President, remarked: “I have been Andrea’s biggest fan ever since she gave an entire conference presentation on the many wonderful attributes of the Item Status screen.  The team at Equinox has always been impressed with her work in the community and we are excited to have her as part of our amazing group.” Andrea’s hobbies include gardening, cooking (and eating), shopping at thrift stores, sewing, and baseball.  She is a lifelong Baltimore Orioles fan and is still nursing her heartbreak over their recent Wild Card loss.  She has every intention of learning to play the bass guitar some day.

April 12, 2017 02:37 AM

October 11, 2016

Galen Charlton

Visualizing the global distribution of Evergreen installations from tarballs

In August I made a map of Koha installations based on geolocation of the IP addresses that retrieved the Koha Debian package. Here’s an equivalent map for Evergreen:

Downloads of Evergreen tarballs in past 52 weeks

Click to get larger image

As with the Koha map, this is based on the last 52 weeks of Apache logs as of the date of this post. I included only complete downloads of Evergreen ILS tarballs and excluded downloads done by web crawlers.  A total of 1,317 downloads from 838 distinct IP addresses met these criteria.
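The filtering was done against the raw Apache logs. A minimal sketch of that kind of pass is below; this is not the actual script, and the log format, tarball pattern, and crawler heuristics are all assumptions:

```python
import re

# Hypothetical sketch (not the author's actual script): count HTTP 200
# Evergreen tarball downloads in an Apache combined log and tally
# distinct client IPs, excluding obvious web crawlers.
LOG_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)
# Crude crawler heuristic, for illustration only.
CRAWLER_HINTS = ("bot", "crawler", "spider", "slurp")

def count_downloads(lines):
    downloads = 0
    ips = set()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        if m.group("status") != "200":          # successful responses only
            continue
        if not re.search(r'Evergreen-ILS-.*\.tar\.gz$', m.group("path")):
            continue                            # Evergreen tarballs only
        if any(h in m.group("agent").lower() for h in CRAWLER_HINTS):
            continue                            # skip crawlers
        downloads += 1
        ips.add(m.group("ip"))
    return downloads, len(ips)
```

A real version would also need to confirm the transferred byte count matches the tarball size to count only complete downloads.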

The interactive version can be found on Plotly.

by Galen Charlton at October 11, 2016 02:04 AM

October 05, 2016

Rogan Hamby

A Partial History of SCLENDS

A few weeks ago Equinox Software published a blog post I wrote about Evergreen in 2009. My first draft and my final draft were very different. Draft by draft I stripped out the history of how SCLENDS started, not because I didn’t want to tell it but because in the larger Evergreen context it wasn’t what I wanted to say. The very fact that some remained though and that I did start with so much tells me something. It is a story I want to tell and while that post wasn’t the place, this is. Why? Honestly during that first year we did a lot of “make it work and fix it later.” Document? If there’s time. It’s easy to be critical of that approach but we had tight deadlines and if it hadn’t been done the way it was it might never have happened. But now I have a little time to write it and want to do so while my memory is clear, at least of the elements that stand out in 2009.

I’m not going to claim this is a complete history. Beyond the fallibility of memory, I doubt I know the whole story, and it’s naturally biased towards the events I was present for. SCLENDS was started by many people: library directors, circ managers, systems librarians, and more. I worked with most of them, but some only tangentially. No single person was present for every conversation, and no one person could know the whole story. And since I’ve admitted that this will be an incomplete telling, I will also offer that I’m going to try to keep it brief. The story begins properly with the development of writing in ancient Mesopotamia and Egypt … just kidding.

In 2008 I was the Systems Librarian in Florence County, South Carolina. The library’s director, Ray McBride, and I had been deeply involved in the process of re-evaluating our technology plan. One thing we were not concerned about was our ILS. We were very happy Horizon users and had assumed that we would upgrade to Horizon 8 when it was released. It had already been delayed, but why would we consider other options? Going out for an RFP is a process to be avoided like an invasive, unnecessary medical procedure. Plus, we were happy with Horizon: it was user friendly, it fit our needs, and it was stable. Sure, it had gotten a little long in the tooth, but the upgrade would give it the refresh it needed.

Then one day I was reading through my daily mail and there was correspondence from SirsiDynix. Horizon 8, Rome, was being canceled. Instead, they would take the modern code base of their other product and merge it with the user friendliness of Horizon, and, like tunes played together, it would be Symphony. It was the kind of over-the-top marketing speak that made it clear they were trying to make users feel positive about news they knew we would be unhappy with. They would have been right about the unhappy part.

Fast forward and we had a meeting. I had compiled a list of possible ILSes we could upgrade to. Polaris was a strong contender. We seriously looked at Symphony, hoping for the potential of an easy migration. There were others we dismissed due to expense or lack of features. There might have been another we considered that I can’t remember now. And I threw Evergreen onto the stack for consideration.

Why did I suggest Evergreen? Florence was an almost pure Windows server environment and this was a radical departure. I didn’t try to convert the Florence environment to Linux despite my preferences because with the staff limitations the library had and applications they had invested in running within a Windows environment, Microsoft made sense. Migrating to a mission critical application on Linux was a big departure. But, when I looked at the growth of open source, what I saw happening in the Evergreen community and my own opinions about the relationship between open source and library philosophies I was of the conviction that we should consider it. Not go to it, just consider it. Frankly, with my time limitations an easy upgrade to Symphony sounded pretty good to me.

We formed a committee of public service staff and administrators. We invited in representatives from companies to talk about their ILSes. Evergreen was open source, so I distributed a fact sheet. We had reps from Polaris and SirsiDynix come in. We talked to other libraries. One library referred to recent updates to Symphony in … unflattering terms and told us they were migrating to Polaris as soon as they could. Others were only slightly kinder. Polaris looked good but didn’t blow us away. A SirsiDynix representative made it clear that migrating to Symphony would not be like an upgrade and that there was Horizon functionality without one-for-one parity in Symphony.

Discussions were lively, but in the end we selected an ILS: Evergreen. At that point Evergreen was at about version 1.2 and rough. As we talked about it, one theme came up again and again: we believed that, whatever shortcomings Evergreen had in mid-2008, it was the right long-term choice for us. We believed that in time it would match and exceed the other options we had to pick from. We also wanted a choice that we felt would last us ten years. I think it was Ray who said later that this would be the last ILS a library would ever need to migrate to. He may well be proven right; only time will tell.

It can be strange what you remember. It was a Thursday afternoon in November that I was having coffee with Ray. We were discussing Evergreen and forming our plans for the migration. One of my concerns was the long term support, especially if I left. We began discussing approaching an external company for support of our servers. That would give me more time to spend in the community and support regardless of staff turnover. As we looked we also began to discuss moving to remote hosting and increasingly liked the idea though it meant moving nearly all technical management external to the library, not something we had traditionally done. However, while we had put a lot of value on internal staff management of technology we also had increasing needs without an increasing budget so going with a remote hosting option made sense.

All of this, especially the budget concerns, was in my head when I threw out another idea. In one sense, this was the start of SCLENDS. What if we invited others to join us in starting a consortium and reducing costs? Ray liked the idea and threw it out to the South Carolina library directors’ listserv. From there I became a peripheral part of the story until January. During that time on the periphery, I was aware that the offer was expressed and interest returned. I was tasked with inviting a vendor who could run servers for us. The clear option was Equinox, having been founded by the original developers and administrators of Evergreen at Georgia PINES. Additionally, they had a lot of experience with startup consortia, so they would understand what we were embarking on.

December passed and January of 2009 arrived. I found myself in the large meeting room at the Florence Library. The interested libraries were arriving. Eleven libraries in total attended that meeting, interested in sharing costs and materials in a new consortium. That meeting brought together not only the directors but systems administrators and circulation managers of the libraries.

Eleven libraries were present and ten of them went on to form SCLENDS. Honestly, that day was a blur of faces and voices. One person whose name I don’t hear mentioned much in connection to SCLENDS is Catherine Buck-Morgan and it should be. Although I don’t know this for fact I suspect she is the one who created the name (had it been left to me I probably would have chosen something tree related). Additionally, she was a critical part of this happening. It may have happened without her involvement, it may not have, I don’t know. I do know it wouldn’t have happened as quickly and the way that it did.

Catherine was the head of IT at the State Library and closely involved with the distribution of LSTA money in the state. I later discovered that she had already written a concept paper for creating a resource sharing consortium in South Carolina. I don't believe her idea was inherently based on open source, but she did cite PINES as an example of what she was thinking of in terms of resource sharing. Her idea hadn't been circulated outside the State Library, but ours dovetailed with it perfectly. She was critical to getting us the LSTA funding to kickstart the migrations.

SCLENDS would quickly move to a self-sufficient model independent of LSTA and State Library money, but those funds paid for the first two years of hosting and many of the migration expenses over two fiscal years, covering our first three waves of libraries. Partial funds also helped one later wave.

Honestly, I thought the idea would be a much tougher sell than it was. Eleven libraries attended that first meeting, and I had imagined half would back out. In the end only one, Greenville County, chose not to join SCLENDS, objecting to sharing their videos with other libraries. Most of these discussions happened in January and early February. Then we got to work. In less than five months, driven in large part by a window of opportunity for grant monies, we went from a first meeting to go-live.

Wave one went live in late May 2009 and consisted of the State Library itself, the Union County Library, and the Beaufort County Library System. I later went to the State Library myself for a tenure as IT Director, where, ironically, I ended up working with the Union County director, Nancy Rosenwald. We had both taken positions there and had offices next to each other. I really enjoyed working with her, both within SCLENDS and at the State Library. She also had good taste in tea. Beaufort had one of the most dramatic go-lives when a construction crew cut their fiber line on the first day. The story the local newspaper printed was essentially “Evergreen Fails” instead of “No Internet at Library.” I understand they later printed a retraction in small print in an obscure text box. Ray McBride, after a stint as a museum director, even took over the library system there, proving that it is a very small world. I discovered that Beaufort had been investigating Evergreen in 2008 as well, though not as far along nor with plans as definite as ours in Florence.

Wave two was in October of 2009 and included Fairfield County, Dorchester County, Chesterfield County, and Calhoun County. I think I fought with Frank Bruno of Dorchester as much as I agreed with him. I remember his staff loved him because he supported them. He passed away last year, and the world is poorer for losing him. Drusilla Carter left Chesterfield for Virginia, where she helped start talks that may have led to their own Evergreen consortium, and eventually landed in Connecticut, where she is part of Bibliomation, another Evergreen consortium. Kristen Simensen is still at the Calhoun County library, fighting the good fight. Sarah McMaster of Fairfield retired right around the same time I left South Carolina, and her last SCLENDS meeting was, I believe, my last one as well. Aside from personally liking Sarah as a person, professionally there isn't a library in the country that would not benefit from having a copy of Sarah on staff.

Finally, wave three went live in December and included my own library, Florence. Shasta Brewer of the York County library became a close co-worker of mine over those months and the leader of the early cataloging discussions. Faith Line of Anderson had previous consortium start-up experience and long continued to be a voice people on the executive board looked to for leadership. I believe it was Faith who suggested creating the working groups to aid in the migration; these eventually became the main functional staff bodies of the consortium. Even when there were later attempts to expand or redefine them, the original ones persisted as the main ones. In Florence, Ray served as chairman of the board during the infancy of the consortium and, after leaving, came back to another SCLENDS library.

And there were others – other staff, other stories and later other libraries which brought yet more staff and stories. SCLENDS grew over the next few years. But those stories belong in other years. I may or may not write about those stories some day but I think they’re better documented so there is probably little need. Did I leave some things out? Sure. The Thanksgiving Day Massacre. The Networked Man Incident. The Impossible Script Mystery. Probably others as well, and they make for fun stories, but aren’t core to the history I think.

– Rogan

by roganhamby at October 05, 2016 01:37 AM

September 29, 2016

Bibliomation

New Libraries Everywhere!

In a matter of just a few weeks, we welcomed two new libraries into the Bibliomation family. The Milford Public Library went live on Evergreen on August 18th, and the Babcock Library in Ashford went live on September 23rd. Both of these migrations were completed with the help of Equinox Software, Inc. We have worked with Equinox on other migration projects in the past, and appreciate their expert guidance. We are thrilled to have Milford Public Library, a former Bibliomation member, back in the fold. The staff brings with them a great deal of enthusiasm, and they have taken to Evergreen very quickly.

Milford Public Library staff

Babcock Library has been through many changes lately. Thanks to the leadership of their interim director, Terry Decker, they weathered losing key staff in the midst of the project, and still managed to go live on their target date.

Babcock Library staff

In November, the Burnham Library in Bridgewater will be the final library to join Bibliomation in 2016. Stay tuned for news of their journey!

by biblio at September 29, 2016 05:22 PM

August 05, 2016

Dyrcona's Evergreen Blog

NCIPServer: better_abstraction branch merged into master

The Evergreen ILS driver for the NCIPServer software reached a milestone with the merge of the better_abstraction working branch into the master NCIPServer repository. (NCIPServer is an NCIP version 2.02 responder for processing ILL transactions.) This merge comes after many months of production use with the Massachusetts Commonwealth Catalog.

While this merge marks the end of development on the better_abstraction branch, it is not the end of the road for NCIPServer development with Evergreen. For one thing, NCIPServer needs documentation. The README is just a placeholder. Jason Boyer of Indiana is working on improvements to the request item response messages. Also, we've given up on any pretense of compatibility with Koha, so there is code to be deleted and examples to be updated. Finally, it would be nice to have an installation method better than just copying files into place and manually editing the configuration.

It is recommended that those using the better_abstraction branch in production switch to the master branch of the main repository.

by Jason Stephenson (noreply@blogger.com) at August 05, 2016 02:15 PM

May 02, 2016

Rogan Hamby

Evergreen Conference 2016 and After Thoughts

Each year I let some time pass after the Evergreen Conference before I try to capture my thoughts about it.  Finding myself in a contemplative mood this evening, I finally decided to do it.

What should I write about?  The NC Cardinal folks did a great job; it's an insane amount of work and they tackled it well.  There were a lot of great presentations.  The hospitality staff running the meeting rooms at the Sheraton were wonderful.  The Resistance was the game of the conference, and I had a great time playing it.  As a member of the response team, I was heartened that I was unneeded.  Honestly, I've been to far larger library events that could take the balanced, relaxed environment and professionalism of the Evergreen conference as a model.  I had a great time meeting new folks at breakfasts and dinners.

My SQL pre-conference workshop went well.  One person told me that I really helped them with things they had struggled with.  Another told me they used the notes from my workshop last year during the entire intervening year.  Being told things like that makes all the work worth it.

My statistics heavy presentation went well and I think I kept everyone awake even though by the end I had created more questions than I had answered.  I showed some clear relationships and likelihood of predictability of data if we can get enough data sets to compare and account for the variables influencing holds.  I think the data also clearly shows the value of sharing materials in a consortium.  I have a dozen thoughts on this that will be their own blog post at some point.

The biggest thing that stands out thinking back on it, though, is the lack of surprises.  In the early days of the Evergreen Conference I never quite felt like I knew what to expect.  Enthusiasm and passion for Evergreen are as strong now as they were at the very first Evergreen Conference, but things have changed.  In the early days of the conference we had presentations about things like “How We Made Evergreen Work For Us.”  I stood at the front of the room giving a few of those myself.  Those are long gone.  The experiences, and the presentations that reflect them, have, for lack of a better term, matured.  So has Evergreen.  So has the community.

We don’t have everything figured out but we’re not trying to figure out if we can manage the challenges either.

This weighs heavily on my thoughts because I saw an article that implied that open source software isn't as mature as proprietary solutions.  I feel like the implicit assumptions were numerous and would take more time than I have here to deconstruct, but again, that might be a good future blog post or article.

Obviously, the perception of non-users of the software and non-members of the community doesn’t sync up with that of those who do use it and are members of the community.  I’m not saying my feelings are universal but upon talking to others I know they are widely shared.  So, why?

I believe Evergreen falls into a common pattern of technologies maturing.  Indeed, open source itself does.  Open source is a development methodology, but it's also a shared platform of technologies that build upon each other in a chaotic way more akin to natural selection than design.  Why can't people who see the adoption and maturation patterns of something like DVD players see that it isn't that different for software?  I don't know.  Much like my consortial data presentation, I feel like I'm leaving this with more questions created than answered, but maybe that's a good sign that I'm on the right path.

by roganhamby at May 02, 2016 12:19 AM

April 02, 2016

MVLC Evergreen

pingest.pl Gets a New Option

If you're using pingest.pl from MVLC's Evergreen utilities repository, you might be interested to know that it got a new option this week:

--pipe
         Read record IDs to reingest from standard input.
         This option conflicts with --start-id and/or --end-id.


This new option allows you to run a custom query to feed record IDs to pingest.pl. For instance, assuming you have a query that returns bibliographic record IDs saved in a text file called query.sql, you could use a command line like the following to ingest the records corresponding to the IDs returned by the query:

    psql -q -t -f query.sql | pingest.pl --pipe

In the absence of the --pipe option, pingest.pl continues to use its internal query to determine what records to ingest.

In case you are new here and don't know what all this record ingestion is about, this is Evergreen-speak for generating the indexes used for search, browse, facets, and record attributes. pingest.pl generates these indexes in parallel by splitting the records up into batches and working on more than one batch at a time. Parallel processing is usually faster than starting with one record and going straight through to the end.
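The parallel batching that pingest.pl does can be sketched with ordinary shell tools. The toy below is only an illustration of the pattern, not the real script: xargs splits a stream of made-up record IDs into fixed-size batches and runs up to two batches at a time.

```shell
# Toy sketch of the batching strategy (not pingest.pl itself):
# split 20 fake record IDs into batches of 5 and hand up to
# 2 batches at a time to a worker (here, just echo).
seq 1 20 | xargs -n 5 -P 2 echo batch:
```

With a real worker in place of echo, each batch would be reingested independently, which is where the speedup over a single straight-through pass comes from.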

by Jason Stephenson (noreply@blogger.com) at April 02, 2016 01:27 AM

February 16, 2016

MVLC Evergreen

pingest.pl Gets an Outside Contribution

Bill Erickson contributed a patch to pingest.pl that adds new command line options and cleans up the handling of the current options.

The command line options as of now are:

--batch-size
Number of records to process per batch.
--max-child
Maximum number of worker processes.
--skip-browse
--skip-attrs
--skip-search
--skip-facets
Skip the selected reingest component.
--start-id
Start processing at this record ID.
--end-id
Stop processing when this record ID is reached.
--max-duration
Stop processing after this many total seconds have passed.
--help
Show the help text and exit.

For those of you who may not know, pingest.pl is useful for reindexing records in your Evergreen database. It can reindex them in parallel thus reducing the time that it takes.

You can find pingest.pl as part of MVLC's Evergreen utilities git repository.
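As a sketch of how these flags might be combined, here is a hypothetical invocation. Every value below is invented for illustration (check the script's --help output for real defaults), and the command is only echoed so nothing is actually reingested.

```shell
# Hypothetical pingest.pl invocation; all values are made up.
# Echo the command line rather than run it, as a dry-run preview.
cmd="perl pingest.pl --batch-size 500 --max-child 4 --skip-browse --start-id 1 --end-id 100000 --max-duration 3600"
echo "$cmd"
```

Dropping the echo would run the reingest for records 1 through 100000, in batches of 500 across 4 workers, skipping the browse component and stopping after an hour.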

by Jason Stephenson (noreply@blogger.com) at February 16, 2016 09:02 PM

January 19, 2016

Bibliomation

Goodbye to Benjamin Shum!


Until January 12th, his last day on the job, Benjamin Shum ran the Evergreen system for Bibliomation for the entire time we have been on Evergreen. Ben was hired in 2009, when Evergreen was just a plan. In 2010, along with Melissa Lefebvre and Kate Sheehan, Ben rolled Evergreen out to a number of development partner libraries, forming their own mini Evergreen network within Bibliomation. Ben learned so much in that first year, enough to migrate the rest of Bibliomation's libraries in May/June of 2011. His ability to communicate complex Evergreen functionality to staff and libraries alike made Evergreen a real pleasure for the rest of us to learn. Ben's work as a core committer in the Evergreen community also helped us stay abreast of all Evergreen developments. His contributions were highly valued. Ben leaves Bibliomation's Evergreen system in the capable hands of his work partner, Melissa Ceraso.

Good luck to Ben in his future endeavors. We will miss him.

by biblio at January 19, 2016 08:58 PM

August 20, 2015

Galen Charlton

Evergreen 2.9: now with fewer zombies

While looking to see what made it into the upcoming 2.9 beta release of Evergreen, I had a suspicion that something unprecedented had happened. I ran some numbers, and it turns out I was right.

Evergreen 2.9 will feature fewer zombies.

Considering that I’m sitting in a hotel room taking a break from Sasquan, the 2015 World Science Fiction Convention, zombies may be an appropriate theme.

But to put it more mundanely, and to reveal the unprecedented bit: more files were deleted in the course of developing Evergreen 2.9 (as compared to the previous stable version) than entirely new files were added.

To reiterate: Evergreen 2.9 will ship with fewer files, even though it includes numerous improvements, including a big chunk of the cataloging section of the web staff client.

Here’s a table counting the number of new files, deleted files, and files that were renamed or moved from the last release in a stable series to the first release in the next series.

From release | To release | Entirely new files | Files deleted | Files renamed
rel_1_6_2_3 | rel_2_0_0 | 1159 | 75 | 145
rel_2_0_12 | rel_2_1_0 | 201 | 75 | 176
rel_2_1_6 | rel_2_2_0 | 519 | 61 | 120
rel_2_2_9 | rel_2_3_0 | 215 | 137 | 2
rel_2_3_12 | rel_2_4_0 | 125 | 30 | 8
rel_2_4_6 | rel_2_5_0 | 143 | 14 | 1
rel_2_5_9 | rel_2_6_0 | 83 | 31 | 4
rel_2_6_7 | rel_2_7_0 | 239 | 51 | 4
rel_2_7_7 | rel_2_8_0 | 84 | 30 | 15
rel_2_8_2 | master | 99 | 277 | 0

The counts were made using git diff --summary --find-renames FROM..TO | awk '{print $1}' | sort | uniq -c and ignoring file mode changes. For example, to get the counts between release 2.8.2 and the master branch as of this post, I did:

$ git diff --summary  --find-renames origin/tags/rel_2_8_2..master|awk '{print $1}'|sort|uniq -c
     99 create
    277 delete
      1 mode

Why am I so excited about this? It means that we’ve made significant progress in getting rid of old code that used to serve a purpose, but no longer does. Dead code may not seem so bad — it just sits there, right? — but like a zombie, it has a way of going after developers’ brains. Want to add a feature or fix a bug? Zombies in the code base can sometimes look like they’re still alive — but time spent fixing bugs in dead code is, of course, wasted. For that matter, time spent double-checking whether a section of code is a zombie or not is time wasted.

Best for the zombies to go away — and kudos to Bill Erickson, Jeff Godin, and Jason Stephenson in particular for removing the remnants of Craftsman, script-based circulation rules, and JSPac from Evergreen 2.9.

by Galen Charlton at August 20, 2015 01:57 AM

April 20, 2015

Codey Kolasinski's Evergreen Work Blog

Customizing Evergreen Receipt Templates

checkout_receipt

Checkout receipt fully tricked out

At the 2014 MassLNC Evergreen Conference, Brian Herzog from NOBLE (North of Boston Library Exchange) and I presented on the topic of Evergreen receipts.  These receipts cover many areas of the library: the patron side with checkout, payment, and items-out receipts, but also the staff side with transit slips, hold slips, and pull lists.  These receipts are HTML layouts with Template Toolkit variables that Evergreen uses to insert relevant information.  For example, a checkout receipt (pictured right) has arbitrary text such as ‘welcome to the library…’ but also a list of the items checked out with a bit of metadata.  This receipt is the most complicated (and prettiest) that I’ve been able to manage with this medium.

The presentation covered topics including:

  • basic HTML tags to format receipts
    • <p>, <br/>, <ul>, <li>, <span>, etc.
  • basic css to add beauty
    • font-family, -size, -align, etc.
  • adding images
  • changing fonts
  • add the patron’s or item’s barcode to the receipt

One of the coolest things that I have discovered along the way is that a patron or item barcode can be printed on the receipt itself.  This feature requires adding a special font to the workstation (attached below) and changing the font family of the text so that the numbers render as a barcode when the system prints the receipt.  For example:

<span style="font-family:'CodabarLarge'; font-size:1.25em; text-align:center;">
%PATRON_BARCODE%
</span>

The patron barcode Template Toolkit variable is invoked with %PATRON_BARCODE% but is modified by the style text of the <span> tag.  The font-family property changes the font to 'CodabarLarge', the name of the font that turns normal Arabic numerals into a Codabar barcode that most library scanners can read.

Brian is especially adept at receipt templates.  He spread the word about a wonderful feature that allows staff members to total the cost of the items checked out to the patron and print it on the receipt itself:

amount saved

Our consortia’s libraries love this feature and a great number of them have added this to their receipts.  Below is a writing sample of mine that documents how to edit receipt templates.

Finally, I’ve attached the full boat of receipts that I created.  All public and staff receipts have a unified design, use the barcode font, and use the ‘amount saved’ widget.

Attached Documents

by ckolasinski at April 20, 2015 08:21 PM

September 27, 2014

Evergreen Open Source ILS - Flickr stream

2014-09-26 17.53.29

Evergreen Open Source ILS posted a photo:

2014-09-26 17.53.29

When Vietnamese fails, eat Irish. Hack-A-Way 2014.

by Evergreen Open Source ILS at September 27, 2014 04:07 PM

2014-09-26 17.21.35

Evergreen Open Source ILS posted a photo:

2014-09-26 17.21.35

Notice that there are a lot of pictures of people just working. Hack-A-Way 2014.

by Evergreen Open Source ILS at September 27, 2014 04:07 PM

January 23, 2014

Moving to Evergreen in Niagara

Macros : Receipt Template Editor Variables

Receipt Template Editor Variables

 

General variables

%LIBRARY%                        Library full name

%SHORTNAME%                  Library Policy Name

%STAFF_FIRSTNAME%         First name of Staff login account

%STAFF_LASTNAME%          Last name of Staff login account

%STAFF_BARCODE%           Barcode of Staff login account

%STAFF_PROFILE%             Profile of Staff login account

%PATRON_FIRSTNAME%      First name of Patron

%PATRON_LASTNAME%       Last name of Patron

%PATRON_BARCODE% or

%patron_barcode%              Patron Barcode

%TODAY%                           Full Date and time in the format: Wed Sep 21 2011 13:20:44 GMT-0400 (Eastern Daylight Time)

%TODAY_TRIM%                  Date and time in a shortened format: 2011-09-21 13:21

%TODAY_m%                      Two digit Month: 09

%TODAY_d%                       Two digit Day: 21

%TODAY_Y%                       Year: 2011

%TODAY_H%                       Hour in 24 hour day: 13

%TODAY_I%                        Hour in 12 hour format: 1

%TODAY_M%                       Minutes of the Hour: 24

%TODAY_D%                       date in standard US format: 09/21/11

%TODAY_F%                       date in International Standard: 2011-09-21

Additional variables for various slips

Hold Slip

%ROUTE_TO%                     It should say Hold Shelf if it is a hold being fulfilled

%item_barcode%                 Item Barcode

%item_title%                       Item Title

%hold_for_msg%                 Hold for Message: this gives the patron’s Name

%PATRON_BARCODE%         Patron’s Barcode

%notify_by_phone%            Phone number listed in the Hold Database.  This may not be the same as what is in the Patron’s record, as they can list another number when placing the hold.

%notify_by_email%             Email listed in the Hold Database.  Same caveat as the phone number.

%request_date%                  The date that the Request was originally placed.

%formatted_note%              Hold Notes (new to 2.1)

Transit Slip

%route_to%                        Library Policy Name that the item is in transit to

%route_to_org_fullname%    Library Full Name that the item is in transit to

%street1%                          Library Street address Line 1 that the item is in transit to.

%street2%                          Library Street address Line 2 that the item is in transit to.

%city_state_zip%                City, State, Zip of Library the Item is in transit to.

%item_barcode%                 Item Barcode

%item_title%                       Item title

%item_author%                   Item Author

 

Hold Transit Slip

%route_to%                        Library Policy Name that the item is in transit to

%route_to_org_fullname%    Library Full Name that the item is in transit to

%street1%                          Library Street address Line 1 that the item is in transit to.

%street2%                          Library Street address Line 2 that the item is in transit to.

%city_state_zip%                City, State, Zip of Library the Item is in transit to.

%item_barcode%                 Item barcode

%item_title%                       Item title

%item_author%                   Item Author

%hold_for_msg%                 Hold for Message: this gives the patron’s Name

%PATRON_BARCODE%         Patron’s Barcode

%notify_by_phone%            Phone number listed in the Hold Database.  This may not be the same as what is in the Patron’s record, as they can list another number when placing the hold.

%notify_by_email%             Email listed in the Hold Database.  Same caveat as the phone number.

%request_date%                  Date that the Request was originally placed

 

Check out

%title%                               Title

%author%                           Author

%barcode%                         Item Barcode

%due_date%                       Due Date in US format with 2.1, International format with 1.6

 

For type: payment

%original_balance%             The original balance the patron owes

%payment_received%          How much was received from the patron

%payment_applied%            How much of the payment was applied

%payment_type%                What type of payment was applied: IE Cash

%voided_balance%              Any Voided balance

%change_given%                 How much change was given

%new_balance%                  The new balance on the account

%note%                              Any notes on the annotated payment

%bill_id%                            The Id for the bill in the Bill database

%payment%                        How much of the payment that was applied was applied to this title

%title%                               Title that the payment was applied to.

%last_billing_type%             The type of bill that was last charged to the patron for this title

%barcode%                         Item barcode

%title%                               title of item

 


by nclibraries at January 23, 2014 02:44 PM

August 08, 2013

BOSS: Bibliomation and Open Source Systems

The blog has a new home

This blog has a new home:
http://biblio.org/blog/

We'll be posting about all things Biblio as well as our work with Evergreen. Check it out!

by Bibliomation HQ Staff (noreply@blogger.com) at August 08, 2013 03:19 PM

May 01, 2013

Tara Robertson

egcon2013: open library ecosystem

egcon2013 design work by Jon Whipple

I just finished chairing the organizing committee for the International Evergreen Conference in Vancouver. It’s been more than a year of planning and a labour of love. From our own evaluation and from participant feedback, we put on a really excellent conference. Now that I’m caught up on sleep, here are some of my thoughts.

Why this was an awesome organizing experience for me

  • great community – the Evergreen community is awesome. People are kind, hardworking and have a DIY get ‘er done kinda attitude. I don’t write code, so can’t make that kind of contribution to the project, but I am good at event planning. While I’m sure I could organize an event for a group of people I didn’t know, it’s easier and more fulfilling to do this for a community of people I care deeply about. One of my first jobs out of library school was doing training and support for the Sitka Evergreen installation in BC. I learned a lot, and this experience helped me get interesting library technology jobs. I feel grateful for the skills I built and to the people who mentored me. On a personal level it feels good to be able to contribute something back to the Evergreen community.
  • great organizing team – This was the second conference that we’ve organized together. I have a lot of respect and admiration for these folks: Anita Cocchia (BCELN), Caroline Daniels (KPU), Mark Ellis (RPL), Mark Jordan (SFU), Paul Joseph (UBC) and Shirley Lew (VCC). While Ben Hyman (BC Libraries Coop) wasn’t on the organizing committee, he did a stellar job of communicating with and buffering us from the Evergreen Oversight Board and the Software Freedom Conservancy. We all work hard and trust each other. I’ve learned a bunch of soft and hard skills from this group. I enjoyed our group dynamic and loved working together. We were comfortable asking questions and challenging each other. There were a bunch of times I felt like, as a group, we came up with a way better decision than any one of us as individuals would have.

Things that didn’t cost anything and added value

  • We had an amazing team of volunteers who did live note taking as well as helping to stream the technical track. These folks were super enthusiastic and committed. The live notes are written documentation of the conference that makes it easier for everyone to write reports afterwards. One of the participants said “The team of note-takers was awesome. It let me focus on how any given session could affect my work, without worrying that I’d miss something important as I chased down random thoughts.” For me they function as a quick summary of a video, and I’ll likely scan the notes of the sessions that I missed to figure out which videos I want to watch. Many thanks to Kimberly Garmoe, Eka Grguric, Mary Jinglewski, Jonathan Kift, Jonathan Schatz, and David Waddell.
  • No-host lunches were a great way to get people outside the building to see a little bit of Vancouver. They also created a structured opportunity to socialise in small groups. From an organizing perspective it wasn’t a lot of work. We created a map of places near the venue with tasty food that can accommodate 8 people, found locals who were willing to lead the groups, and put out signup sheets (7 people plus a leader). We made sure we identified places for vegetarians and gluten-free folks. According to participant feedback, the no-host lunches were a big hit. Also, we had a really tight budget, so this allowed us to provide something for lunch without actually having to pay for it. We did this for the Access conference, but didn’t organize it enough and it was a bit chaotic. With a bit more forethought this time, things went much more smoothly.

Live note taking and no-host lunches are ideas that can be adapted to any kind of conference or event, not just an open source library software event.

This was the first time that the conference proceedings were streamed. It was expensive to pay for AV for the main track, but I think it is important and should be a requirement of future conferences. There were a total of 183 people watching the live stream from the United States, Canada, the Czech Republic, Japan, Mexico, Finland and the UK. Because Mark, Shirley and Ben from the BC Libraries Coop were willing to figure out a DIY streaming solution for the tech track, we were able to do this for next to no money. It was awesome to hear via Twitter from someone watching in Mexico (a CS Masters student who is implementing Evergreen for two university libraries). Thank you to Sam Mills for volunteering to edit the video from the main track and to Mark Jordan for getting it up on the Internet Archive.

by Tara Robertson at May 01, 2013 06:01 PM

April 12, 2013

Evergreen International Conference (2013)

Give us feedback!

The Evergreen 2013 conference has come and gone, and we want to hear what you thought about it so we can help next year’s organizers plan an even better conference experience for you. Tell us what you liked, what you disliked, the best food, and your favourite made-in-BC TV show. Thanks for participating in the conference and visiting our lovely city!

All the best, courtesy of your superstar Evergreen 2013 Organizing Committee:

  • Tara Robertson (Chair), CILS
  • Anita Cocchia, BCELN
  • Caroline Daniels, KPU
  • Mark Ellis, RPL
  • Mark Jordan, SFU
  • Paul Joseph, UBC
  • Shirley Lew, VCC

by pjjoseph at April 12, 2013 09:06 PM

Group outing to the Irish Heather tonight

Feel like getting Irish tonight? Need a pint or two of the best Guinness in Vancouver? Meet up with Sharon and Kevin in the hotel lobby at 6:45pm for an excursion to the best little Irish pub in Vancouver, the Irish Heather.

by pjjoseph at April 12, 2013 08:53 PM

October 10, 2012

BOSS: Bibliomation and Open Source Systems

Evergreen Enhancements - Developers Selected

Bibliomation has reviewed the quote submissions from the Evergreen developers and based on our priorities and the available funds for this fiscal year, we will be working with four different developers on thirteen of our original twenty-eight enhancement requests.

The four Evergreen developers are Thomas Berezansky, Equinox, Catalyst IT Services, and EDOCEO.

To see the full list of enhancements that we will be moving forward with, go to http://biblio.org/2012/10/10/evergreen-enhancements/

If you have any questions about any of our enhancements, you can contact Amy Terlaga at terlaga AT biblio DOT org.

by Bibliomation HQ Staff (noreply@blogger.com) at October 10, 2012 05:14 PM

September 19, 2012

GSoC 2012 (Pranjal Prabhash)

HTTP_Request2

In my last post I talked about getting my code published as a PEAR package. Well, I did talk to them, and they asked me to change my code a bit, add some PHPUnit tests, and organize my directory in a different fashion. But vacations are over and I have been too busy to make those changes. Yes, it’s been a while since my last post, but my first update after GSoC is here. I have changed the way I POST: instead of cURL, I am now using HTTP_Request2 to POST and send headers. As it says on PEAR, HTTP_Request2 “Provides an easy way to perform HTTP requests”; more information is here (http://pear.php.net/package/HTTP_Request2).
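A minimal sketch of what a POST with custom headers looks like using HTTP_Request2, assuming the PEAR package is installed; the endpoint URL, header name, and form parameter here are hypothetical stand-ins, not the actual values used in the library:

```php
<?php
// Assumes the PEAR HTTP_Request2 package is installed (pear install HTTP_Request2).
require_once 'HTTP/Request2.php';

// Hypothetical endpoint and parameter names for illustration only.
$request = new HTTP_Request2(
    'http://example.org/osrf-http-translator',
    HTTP_Request2::METHOD_POST
);

// Headers are set per-request, replacing the curl_setopt() calls used before.
$request->setHeader('X-Example-Service', 'opensrf.math');

// Form fields are added as POST parameters rather than a hand-built body.
$request->addPostParameter('osrf-msg', '{"example": "payload"}');

try {
    $response = $request->send();
    echo $response->getStatus(), "\n";
    echo $response->getBody(), "\n";
} catch (HTTP_Request2_Exception $e) {
    echo 'Request failed: ' . $e->getMessage() . "\n";
}
```

Compared with raw cURL, the request object keeps headers and POST parameters separate, which makes the calling code easier to unit-test with PHPUnit.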

Some PHPUnit tests and a new package name are what I aim to do next.


by pranjal710 at September 19, 2012 05:18 AM

August 15, 2012

GSoC 2012 (Pranjal Prabhash)

openSRF PHP Library

The openSRF-PHP Library is ready and can be used to get responses from an openSRF service using the services and methods present. There are two examples included, which show how to use the library. Documentation has also been added (https://github.com/pranjal710/osrf/blob/master/docs/doc.md). The first part of the documentation describes the functions, classes and their member functions, the parameters passed, and their return values. The second part briefly describes what happens when someone uses the library, i.e. the role of each class/function. My GitHub repo is linked in the GitHub tab.

This library would not be what it is now had my mentor, Lebbeous Fogle-Weekley, not helped me throughout. I thank him for breaking the project into small parts and guiding me whenever I got stuck. There are a lot of things (which might be small for him) that were new to me, and they will surely help me going forward. Thank you, sir, for editing my code, which cleaned it up and made it a lot shorter, and for everything.

The next thing I am looking forward to is getting it published as a PEAR package. My next post will describe the changes I have made to get it published as a PEAR package.


by pranjal710 at August 15, 2012 05:44 PM

August 12, 2012

GSoC 2012 (Sy Duan)

Problems encountered in testing the module


For the last two weeks I have been trying to test my new functions in the module. Compared to the normalize function, which has a test script, they are more difficult to test. I located where they are used: in the bib merge function. So I tried to extract some real use cases from that function, from which I can build test cases for vandelay.add_field and vandelay.strip_field, which are called by vandelay.merge_record_xml.

So I planned to batch import some MARC records and extract real cases where a match requires merging records. But I failed to batch import the MARC records (oca_unicode.mrc and Open Access Titles UOP). I tried a lot of ways, but the import process always crashes after a long wait, with no error recorded in the logs. So I have turned to the community for help.

by Swenyu Duan (noreply@blogger.com) at August 12, 2012 03:12 AM

July 30, 2012

GSoC 2012 (Sy Duan)

New functions in extension c_functions

Last week I implemented two functions, vandelay.add_field and vandelay.strip_field. These are the first functions in the c_functions extension, which handles MARC in C.
These functions rely on libxml2, libxslt and ICU4C.
The extension now has some of the general C functions needed to perform the work of certain Perl functions in a simple way, which will speed up translating the Perl scripts into C.
I pushed the branch to evergreen/working. You can find it at http://git.evergreen-ils.org/?p=working/Evergreen.git;a=shortlog;h=refs/heads/user/dsy/extension_in_c.
I pushed the previous olis_xslt_process function, too.
None of the functions have been tested thoroughly yet; I'm going to test them in the following week.

by Swenyu Duan (noreply@blogger.com) at July 30, 2012 03:01 AM

April 27, 2012

Tara Robertson

Evergreen Unsung Heroes

I was inspired by Chris Cormack’s excellent series of blog posts highlighting awesome people in the Koha community. I wanted to adapt Chris’ idea to the Evergreen community. Here’s the call for submissions from a few months ago.

I have two observations from the last few months. First, people were reluctant to promote themselves and write bios listing all their accomplishments. I shouldn’t have been surprised by this. It was more effective to ask someone’s coworker, colleague or boss to highlight their contributions. I like that our community values humility, but know that most people enjoy being recognized for work that they are proud of. Second, some people felt that the work that they did was insignificant and not worthy of being recognized. Almost all of these people were women who had been nominated by other people in the community. After an email or two all of these people agreed to be profiled.

I’m going to continue this project for the next year. I’m sure the design students at Emily Carr University will do something interesting with this content (ebook? website? deck of playing cards? laser engraved beef jerky?) for the Evergreen 2013 conference.

by Tara Robertson at April 27, 2012 07:47 PM

March 16, 2012

Warren Layton (Libre-arian)

code4lib North Pub Meetup in Ottawa

Do you want to meet up and talk about libraries, library software, and coding?

I’m organizing a small, informal Ottawa-area code4lib North meetup at the end of March.

When: Wednesday March 28, 5-7 PM

Where: Royal Oak downtown at 188 Bank at Gloucester (on the corner across from L’Esplanade Laurier).

The details are also up on the code4lib wiki.

Beginners are very welcome to join!

Let me know if you are interested by e-mailing me at warren.layton@gmail.com so that I can reserve enough seats for us at the Oak.


by W at March 16, 2012 11:11 AM

March 09, 2012

Moving to Evergreen in Niagara

Check Out Receipt Template

In Header Section:
<img src="http://niagaracollege.niagaraevergreen.ca//opac//images/small_logo.jpg">
<br/>
You checked out the following items:<hr/><ol>

In Line Items Section:

<li>%title%<br/>
Barcode: %barcode% <br/>
Due: %due_date%

In Footer Section:
</ol><hr />%SHORTNAME% %TODAY_TRIM%<br/>
You were helped by %STAFF_FIRSTNAME%<br/>
<br/>
Fines: Books/Magazines .25 per day,<br/>
Video/DVD 1.00 per day,<br/>
Equipment 2.00 an hour
<br/>
<br/>
Renew under  My Library Account at <br/>
http://www.niagaracollege.ca/library
<br/><br/>
Or call NCLibraries :  905-735-2211 x7767<br/>
(WC) x7767  (NOTL) x4413 <br/>


by nclibraries at March 09, 2012 04:51 PM

December 06, 2010

Evergreen International Conference (2011)

Call for Presentation Proposal Submissions

The 2011 Evergreen International Conference planning team is pleased to invite you to submit presentation proposals for the conference.

There are three programming tracks, and we hope to have a broad spectrum of programming within each track:  

by Chris Sharp at December 06, 2010 04:13 PM

November 23, 2010

Evergreen International Conference (2011)

Conference Planning Survey - Results

The results from our 2011 Conference Planning survey are in!

Here is a summary:

  • 75 respondents completed the survey; 79 started the survey.

On question #1, What tracks will you likely attend?

by Amy Terlaga at November 23, 2010 02:58 PM

April 01, 2010

Warren Layton (Libre-arian)

SirsiDynix OpenSource Paper

The SirsiDynix paper on open source — the one that caused a stir last fall — seems to have disappeared from WikiLeaks and from Stephen Abram’s related blog post. Fortunately, there’s a copy here, in case anyone is looking for it.


by W at April 01, 2010 07:33 PM