We are just a couple weeks away from the 2021 Evergreen International Online Conference! The conference is taking place Tuesday, May 25th through Thursday, May 27th 2021. A separate preconference event will be held Monday, May 24th and a documentation and developer hackfest is planned for Friday, May 28th. We encourage you to use the conference hashtag #evgils21 in any social media posts.
Registration for preconference & main conference events will be open until Monday, May 17th. Registration for the hackfests, which are free, will remain open through the conference.
There’s a great lineup of sessions planned, so please see the full Conference Schedule and Program Descriptions pages. All preconference and regular conference sessions will be recorded and made available within a few weeks after the conference. Recordings will be posted on the community YouTube page. All sessions except interest groups & roundtables will also have live captioning available.
As a reminder to all attendees, this event is subject to the community Code of Conduct and Photography/Audio/Video Policy. There are designated Responders for this event as well as a form to report Code of Conduct violations.
Last but not least, a huge thanks to our Sponsors and Exhibitors for making this event possible! We are so grateful for your support of both the conference and the larger Evergreen community.
The Evergreen Community is pleased to announce the release of Evergreen 3.7.0. Evergreen is highly-scalable software for libraries that helps library patrons find library materials and helps libraries manage, catalog, and circulate those materials, no matter how large or complex the libraries.
Evergreen 3.7.0 is a major release that includes the following new features of note:
Evergreen admins installing or upgrading to 3.7.0 should be aware of the following:
New service: open-ils.geo. New Perl module dependencies: Geo::Coder::Google, Geo::Coder::OSM, String::KeyboardDistance, and Text::Levenshtein::Damerau::XS.
The release is available on the Evergreen downloads page. Additional information, including a full list of new features, can be found in the release notes.
MassLNC is seeking a candidate for a contract position to serve as an Interim Project Coordinator while the organization seeks a new Executive Director. The Interim Project Coordinator would be responsible for the management of ongoing software development projects to improve the open-source Evergreen Integrated Library System (ILS).
Interested candidates should send a proposal, including an hourly contract rate, their resume and three references to klussier@masslnc.org by the end of the day Friday, November 9, 2018. Questions about this RFP can be sent to klussier@masslnc.org at any time.
Attachment | Size
---|---
Request for Proposals for MassLNC Interim Project Coordinator.pdf | 71.45 KB
MassLNC is seeking quotes for two new Evergreen development projects. One project will allow library users to easily find materials that are closest to them by geographic proximity. The other project will make patron record management easier for circ desk staff by consolidating patron alerts, notes, and messages into one interface.
Requirements for the geographic proximity project are available at http://masslnc.org/node/3404. Requirements for the patron note consolidation project are available at http://masslnc.org/node/3403.
Developers and companies interested in working on one or both of these projects should submit a quote and project description to Kathy Lussier at klussier@masslnc.org by 11:59 p.m. EDT on Monday, July 9, 2018. All work performed for MassLNC should follow our General Guidelines for Submitting Proposals.
These development projects were selected by the MassLNC Development Committee after reviewing activity on the MassLNC Ideas site and prioritizing those ideas. Many thanks to the MassLNC Development Initiative partners for communicating the needs of their libraries, helping to select the ideas, and working on the development requirements.
Yesterday I gave a lightning talk at the Evergreen conference on being wrong. Appropriately, I started out the talk on the wrong foot. I intended to give the talk today, but when I signed up for a slot, I failed to notice that the signup sheet I used was for yesterday. It was a good thing that I had decided to listen to the other lightning talks yesterday, as that way the facilitator was able to find me to tell me that I was up next.
Oops.
When she did that, I initially asked to do it today as I had intended… but changed my mind and decided to charge ahead. Lightning talks are all about serendipity, right?
The talk went something like this: after mentioning my scheduling mix-up, I spoke about how I have been active in the Evergreen project for almost nine years. I’ve worn a variety of project hats over that time, including those of developer, core committer, release manager, member of the Evergreen Oversight Board, chair of the EOB, and so forth. While I am of course proud of the contributions I’ve made, my history with the project also includes being wrong about many things and failing a lot.
I’ve been wrong about coding issues. I’ve been responsible for regressions. I’ve had my share of brown-bag releases. I’ve misunderstood what library staff and patrons were trying to accomplish. I’ve made assumptions about the working conditions and circumstances of users that were very wrong indeed. Some of my bug reports and test plans have not been particularly clear.
Why bring up my wrongness? Prior to the talk, I had been part of a couple conversations about how some folks feel intimidated about writing bug reports or posting to the mailing lists for fear of being judged if their submission was not perfect. Of course, I don’t want people to feel intimidated; the project needs bug reports and contributions from anybody who cares enough about the software to make the effort. By mentioning how I — as somebody who is unquestionably a senior contributor to the project — have been repeatedly wrong, I hoped to humanize people like me: we’re not perfect. Perfection is not a requirement for gaining status in the community as a respected contributor — and that’s a good thing.
I also wanted to give permission for folks to be wrong, in the hopes that doing so might help lower a barrier to participating.
So much for the gist of the lightning talk. People in the audience seemed to enjoy it, and I got a couple nice comments about it, including somebody mentioning how they wished they had heard something like that as they were making their first contributions to the project.
However, I would also like to expand a bit on a couple points.
Permission to be wrong is not something I can grant all by myself. While I can try to model good ways of providing feedback (and get better myself at it; I’ve certainly been wrong many a time about how to do so), it sometimes doesn’t take much for an interaction with a new contributor (or an experienced one!) to become unwelcoming to the point where we lose the contributor forever. This is not a theoretical concern; while I think we have gotten much better over the years, there were certainly times and circumstances where it was very rational to feel intimidated about participating in the project in certain ways for fear of getting dismissive feedback.
Giving ourselves permission to be wrong is a community responsibility; by doing so we can give ourselves permission to improve. However, this can’t be treated as a platitude: it takes effort and thoughtfulness both to ensure that the community is welcoming at all levels, and to ensure that permission to be wrong isn’t accorded only to people who look like me.
One of the things that the conference keynote speaker Crystal Martin asked the community to consider was this: “Lift as you climb.” I challenge senior contributors to the Evergreen project — including myself — to take this to heart. I have benefited a lot by being able to be wrong; we should act to ensure that everybody else in the community can be allowed to be wrong as well.
The eContent Committee will meet next on February 13, 2018 at 10am at the Plainfield-Guilford Township Public Library.
The Executive Committee will meet on February 13, 2018 at 1pm at the Plainfield-Guilford Township Public Library. An executive session will precede the public meeting and is anticipated to last approximately 15 minutes.
Whiting Public Library’s go-live has been scheduled for Wednesday, March 14, 2018. A modified cataloging freeze is anticipated beginning on Friday morning, March 9, 2018.
The 2018 Evergreen International Conference will be held in St. Charles, Missouri, April 30-May 3, 2018. April 30 is the pre-conference. The Evergreen Indiana Executive Committee has authorized a scholarship fund to assist interested staff to attend the conference. The application is now open through February 5, 2018.
The webclient feature review class is almost ready for release. Due to the upgrades and fixes integrated since December, there have been enough changes that it has had to be rewritten a couple of times already. The LEUs for all recorded participants have gone out. I anticipate releasing the recording by the end of the week. Thank you for your patience!
Due to increased interest related to the webclient workflows, the basic series will be offered twice in Q1 2018.
February 5, 2018
Basic Circulation, 9am-12pm, webinar (3 TLEU)
Holds, 1pm-3pm, webinar (2 TLEU)
February 7, 2018
Local Administration and Basic Reporting, 9am-12pm, webinar (3 TLEU)
Basic Cataloging, 1pm-3pm, webinar (2 TLEU)
February 27, 2018
Basic Circulation, 9am-12pm, webinar (3 TLEU)
Holds, 1pm-3pm, webinar (2 TLEU)
February 28, 2018
Local Administration and Basic Reporting, 9am-12pm, webinar (3 TLEU)
Basic Cataloging, 1pm-3pm, webinar (2 TLEU)
March 5, 2018
Advanced Cataloging, Part I, 9am-Noon, webinar. You must register for and attend both parts of the webinar to be eligible for your Cat1 and receive your 6 TLEUs.
March 6, 2018
Advanced Cataloging, Part II, 9am-Noon, webinar. You must register for and attend both parts of the webinar to be eligible for your Cat1 and receive your 6 TLEUs.
April 17, 2018
Fine-Free Libraries – What, Why, and How, 10-11am EST, webinar, Ruth Frasur (HGSTN) and Scott Tracey (WLAFY) presenting.
The final visits of the 2017 library tour by the support team were completed at the end of November. All told, we visited 163 service locations as part of this tour and had a great time getting to know you and your libraries better. An executive report will be released later this spring together with a full album of the visit cards so that we can share more of the great ideas, programs, and partnerships you celebrate. Thank you all so much for sharing your time, ideas, and libraries with us!
In case you’re interested in tracking where we traveled, there is an interactive map (with links to the full sized profile cards) available: 2017 Tour.
The webclient feature review class is still undergoing conversion for asynchronous release. We anticipate its release, together with the live class LEUs, by the end of the month. Thank you for your patience!
A long time ago, I experimented with using nginx as a caching proxy in front of Evergreen but never quite got it to work. Since then, a lot has changed in both nginx and Evergreen, and Bill Erickson figured out how to get nginx to proxy the websockets that Evergreen now needs for its web-based staff client. This spring, as part of my work towards building prototype offline support for the Evergreen catalogue's My Account section, I dug in and started figuring out some of the final pieces that are needed to enable nginx to proxy most of the static content that Apache (with its bloated processes) would otherwise have to serve up, and wrote a configuration generator script for the nginx and Apache pieces. And in July, we went live with the configuration.
This post documents what we currently (as of August 2017) are running on our Evergreen 2.12 server with Ubuntu 16.04. If you have any questions about this or our corresponding Apache configuration, please let me know and I'll attempt to answer them!
This is the core configuration for the nginx server:
proxy_cache_path /tmp/nginx_cache levels=1:2 keys_zone=my_cache:10m
                 max_size=1g inactive=60m use_temp_path=off;
proxy_cache_key $scheme$http_host$request_uri;

server {
    listen 80;
    server_name clients.concat.ca;
    include /etc/nginx/concat_ssl.conf;
    include /etc/nginx/osrf_sockets.conf;
    location / {
        proxy_pass https://localhost:7443;
        rewrite ^/?$ /updates/manualupdate.html permanent;
        include /etc/nginx/concat_headers.conf;
    }
}
The proxy_cache_path directive tells nginx where to store the data it is caching, what kind of directory structure it should create (levels), the name of the shared memory zone to use (keys_zone), the maximum size of the disk cache (max_size), how long to retain a cached copy of the file (inactive), and whether to use the value of the proxy_temp_path directive as a parent directory for the cache.
The proxy_cache_key directive tells nginx to use a combination of the request scheme (typically HTTP or HTTPS), the hostname, and the full request URI (including GET arguments) to store and look up the cached data. Apache's response tells nginx how long the request should be cached: whether it should expire immediately or, as of #1681095 "Extend browser cache-busting support", be cached for a full year for images, JavaScript, and CSS (at least until you run autogen.sh again).
There is one server directive per hostname that we support, which is quite repetitive. Looking at this with fresh eyes, we should probably simply use something like server_name *.concat.ca to cover all of our hostnames on our domain with a single directive.
include /etc/nginx/concat_ssl.conf; keeps all of the TLS-related configuration in one place, including listening to port 443. We'll pry open this file later.
include /etc/nginx/osrf_sockets.conf; keeps all of the OpenSRF websockets translator proxy configuration in one place. We'll also pry open this file later.
The location / block handles the proxying. At first I was nervous and wanted to proxy the actual hostname instead of localhost to ensure we got the right templates, etc., but it turns out the proxy headers guide the request to the right host. So now I'm relaxed and we simply pass the request on to https://localhost:443. Be very careful with those trailing slashes!
listen 443 ssl http2;
ssl_certificate /etc/apache2/ssl/server.crt;
ssl_certificate_key /etc/apache2/ssl/server.key;

if ($scheme != "https") {
    return 301 https://$host$request_uri;
}

# generate with openssl dhparam -out dhparams.pem 2048
ssl_dhparam /etc/apache2/dhparams.pem;

# From https://mozilla.github.io/server-side-tls/ssl-config-generator/
ssl_prefer_server_ciphers on;
ssl_session_timeout 1d;
ssl_session_cache shared:SSL:50m;
ssl_session_tickets off;

# intermediate configuration. tweak to your needs.
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers 'ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS';

# HSTS (ngx_http_headers_module is required) (15768000 seconds = 6 months)
add_header Strict-Transport-Security max-age=15768000;

# OCSP Stapling ---
# fetch OCSP records from URL in ssl_certificate and cache them
ssl_stapling on;
ssl_stapling_verify on;
There's a fair bit going on here, but it's almost entirely related to TLS
support and a lot of the content comes either from the
Mozilla TLS configuration generator
or from Certbot's configuration plugin for nginx. Perhaps most interesting
is the listen 443 ssl http2;
line that enables listening on
the standard HTTPS port and also supports HTTP/2
for browsers that support it--effectively a way to use a single connection
from a browser to a server to issue many parallel requests for resources,
amongst other performance enhancements.
We also force any HTTP request to use an HTTPS connection using the if ($scheme != "https") block.
This is extracted from the sample nginx configuration shipped with OpenSRF:
location /osrf-websocket-translator {
    proxy_pass https://localhost:7682;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    # Needed for websockets proxying.
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";

    # Raise the default nginx proxy timeout values to an arbitrarily
    # high value so that we can leverage osrf-websocket-translator's
    # timeout settings.
    proxy_connect_timeout 5m;
    proxy_send_timeout 1h;
    proxy_read_timeout 1h;
}
This file is not perfectly named; while we do set up the proxy headers in it, we also include some of the other statements we would otherwise have to repeat inside the server block. Here's what the contents look like:
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;

proxy_cache my_cache;
proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
proxy_cache_lock on;

rewrite ^/?$ /eg/opac/home permanent;
The proxy_set_header directives add headers to the requests forwarded to Apache, so that Apache can figure out which host was actually requested, accurately log requests (instead of saying everything is coming from localhost), etc. These directives were copied directly from the sample nginx configuration shipped with OpenSRF.
The proxy_cache directive tells this server to use the cache we previously named in our keys_zone parameter.
The proxy_cache_use_stale directive tells this server to return stale data (if it has a cached copy) if Apache returns an error, a timeout, or any of the specified HTTP status codes while trying to fetch a fresh copy.
The proxy_cache_lock directive tells this server, should multiple identical requests arrive for data that needs to be cached or refreshed, to allow only a single request through to Apache and make the other requests wait. This can be one way to avoid the "someone set a book down on a keyboard and caused 100 identical requests in one second" problem.
Finally, the rewrite directive simply directs a request for the bare hostname (with or without a trailing slash) to the catalogue home page.
I'm part of the Music in Canada @ 150 Wikimedia project, organizing wiki edit-a-thons across Canada to help improve the presence of Canadian music and musicians in projects like Wikipedia, Wikidata, and Wikimedia Commons. It's going to be awesome, and it's why I invested time in developing and delivering the Wikidata for Librarians presentation at the CAML preconference.
Right now I'm at the Wikimania 2017 conference, because it is being held in Montréal--just down the road from me when you consider it is an international affair. The first two days were almost entirely devoted to a massive hackathon consisting of hundreds of participants with a very welcoming, friendly ambiance. It was inspiring, and I participated in several activities:
But I also had the itch to revisit and enhance the JavaScript widget that runs in our Evergreen catalogue which delivers on-demand cards of additional metadata about contributors to recorded works. I had originally developed the widget as a proof-of-concept for the potential value to cultural institutions of contributing data to Wikidata--bearing in mind a challenge put to the room at an Evergreen 2017 conference session that asked what tangible value linked open data offers--but it was quite limited:
So I spent some of my hackathon time (and some extra time stolen from various sessions) fixing those problems--so now, when you look at the catalogue record for a musical recording by the most excellent Canadian band Rush, you will find that each of the contributors to the album has a musical note (♩) which, when clicked, displays a card based on the data returned from Wikidata using a SPARQL query matching the contributor's name (limited in scope to bands and musicians to avoid too many ambiguous results).
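The post doesn't include the query itself, but a Wikidata SPARQL lookup in the same spirit can be sketched in Python. The property paths and QIDs here (Q5 for humans, Q639669 for musicians, Q215380 for musical groups) are my own illustrative choices, not necessarily what the widget actually uses:

```python
def contributor_query(name: str) -> str:
    """Build a SPARQL query for the Wikidata endpoint that looks up a
    contributor by exact English label, restricted to humans with a
    musical occupation or to musical groups, to reduce ambiguous matches.
    QIDs assumed: Q5 = human, Q639669 = musician, Q215380 = musical group.
    """
    return f'''
SELECT ?item ?itemLabel ?itemDescription WHERE {{
  ?item rdfs:label "{name}"@en .
  {{ ?item wdt:P31 wd:Q5 ; wdt:P106/wdt:P279* wd:Q639669 . }}
  UNION
  {{ ?item wdt:P31/wdt:P279* wd:Q215380 . }}
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en" . }}
}}
'''
```

The resulting query text would be sent to the public endpoint at https://query.wikidata.org/sparql with an Accept header of application/sparql-results+json, and the bindings in the JSON response would populate the card.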
I'm not done yet: the design is still very basic, but I'm happier about the code quality and it now supports queries for all of the contributors to a given album. It is also licensed for reuse under the GPL version 2 or later license, so as long as you can load the script in your catalogue and tweak a few CSS query selector statements to identify where the script should find contributor names and where it should place the cards, it should theoretically be usable in any catalogue of musical recordings. And with the clear "Edit on Wikidata" link, I hope that it encourages users to jump in and contribute if they find one of their favourite performers lacks (or shows incorrect!) information.
You can find the code on the Evergreen contributor git repository.
In August I made a map of Koha installations based on geolocation of the IP addresses that retrieved the Koha Debian package. Here’s an equivalent map for Evergreen:
As with the Koha map, this is based on the last 52 weeks of Apache logs as of the date of this post. I included only complete downloads of Evergreen ILS tarballs and excluded downloads done by web crawlers. A total of 1,317 downloads from 838 distinct IP addresses met these criteria.
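The filtering step can be sketched in Python. The log lines are assumed to be in Apache's combined format, and the download path and tarball filename pattern here are my guesses rather than the actual site layout:

```python
import re

# A completed (HTTP 200) download of an Evergreen tarball from an
# Apache combined-format access log. The path and filename pattern
# are illustrative assumptions, not the actual download site layout.
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"GET (?P<path>\S*Evergreen-ILS-[\d.]+\.tar\.gz) HTTP/[\d.]+" 200 '
)
# Crude user-agent heuristic for excluding web crawlers.
CRAWLER_RE = re.compile(r'(bot|crawler|spider)', re.IGNORECASE)

def count_downloads(log_lines):
    """Return (total_downloads, distinct_ips) for tarball downloads,
    skipping lines whose user-agent looks like a web crawler."""
    ips = set()
    total = 0
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m or CRAWLER_RE.search(line):
            continue
        total += 1
        ips.add(m.group('ip'))
    return total, len(ips)
```

The distinct IPs would then be run through an IP geolocation database to get the coordinates that feed the map.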
The interactive version can be found on Plotly.
In a matter of just a few weeks, we welcomed two new libraries into the Bibliomation family. The Milford Public Library went live on Evergreen on August 18th, and the Babcock Library in Ashford went live on September 23rd. Both of these migrations were completed with the help of Equinox Software, Inc. We have worked with Equinox on other migration projects in the past, and appreciate their expert guidance. We are thrilled to have Milford Public Library, a former Bibliomation member, back in the fold. The staff brings with them a great deal of enthusiasm, and they have taken to Evergreen very quickly.
Babcock Library has been through many changes lately. Thanks to the leadership of their interim director, Terry Decker, they weathered losing key staff in the midst of the project, and still managed to go live on their target date.
In November, the Burnham Library in Bridgewater will be the final library to join Bibliomation in 2016. Stay tuned for news of their journey!
January 19, 2016
Until January 12th, his last day on the job, Benjamin Shum had run the Evergreen system for Bibliomation the entire time we have been on Evergreen. Ben was hired in 2009, when Evergreen was just a plan. In 2010, along with Melissa Lefebvre and Kate Sheehan, Ben rolled Evergreen out to a number of development partner libraries, joining them together in their own mini-Evergreen network within Bibliomation. Ben learned so much in that first year, enough to migrate the rest of Bibliomation’s libraries in May/June of 2011. His ability to communicate complex Evergreen functionality to staff and libraries alike made Evergreen a real pleasure for the rest of us to learn. Ben’s work as a core committer in the Evergreen community also helped us stay abreast of all Evergreen developments. His contributions were valued highly. Ben leaves Bibliomation’s Evergreen system in the capable hands of his work partner, Melissa Ceraso.
Good luck to Ben in his future endeavors. We will miss him.
Evergreen Open Source ILS posted a photo:
Notice that there are a lot of pictures of people just working. Hack-A-Way 2014.
Receipt Template Editor Variables
General variables
%LIBRARY% Library full name
%SHORTNAME% Library Policy Name
%STAFF_FIRSTNAME% First name of Staff login account
%STAFF_LASTNAME% Last name of Staff login account
%STAFF_BARCODE% Barcode of Staff login account
%STAFF_PROFILE% Profile of Staff login account
%PATRON_FIRSTNAME% First name of Patron
%PATRON_LASTNAME% Last name of Patron
%PATRON_BARCODE% or %patron_barcode% Patron Barcode
%TODAY% Full Date and time in the format: Wed Sep 21 2011 13:20:44 GMT-0400 (Eastern Daylight Time)
%TODAY_TRIM% Date and time in a shortened format: 2011-09-21 13:21
%TODAY_m% Two digit Month: 09
%TODAY_d% Two digit Day: 21
%TODAY_Y% Year: 2011
%TODAY_H% Hour in 24 hour day: 13
%TODAY_I% Hour in 12 hour format: 1
%TODAY_M% Minutes of the Hour: 24
%TODAY_D% date in standard US format: 09/21/11
%TODAY_F% date in International Standard: 2011-09-21
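These date variables map closely onto strftime-style formats. As a rough illustration (the variable-to-format mapping below is my approximation, not Evergreen's actual implementation), a template filler might look like:

```python
from datetime import datetime

# Approximate strftime equivalents for the %TODAY_*% receipt variables.
# This mapping is illustrative only, not Evergreen's actual code.
TODAY_FORMATS = {
    '%TODAY_TRIM%': '%Y-%m-%d %H:%M',  # 2011-09-21 13:21
    '%TODAY_m%': '%m',                 # 09
    '%TODAY_d%': '%d',                 # 21
    '%TODAY_Y%': '%Y',                 # 2011
    '%TODAY_H%': '%H',                 # 13
    '%TODAY_D%': '%m/%d/%y',           # 09/21/11
    '%TODAY_F%': '%Y-%m-%d',           # 2011-09-21
}

def fill_today(template: str, now: datetime) -> str:
    """Replace each %TODAY_*% variable in a receipt template string."""
    for var, fmt in TODAY_FORMATS.items():
        template = template.replace(var, now.strftime(fmt))
    return template

print(fill_today('Date: %TODAY_F% (%TODAY_D%)', datetime(2011, 9, 21, 13, 21)))
# prints "Date: 2011-09-21 (09/21/11)"
```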
Additional variables for various slips
Hold Slip
%ROUTE_TO% It should say Hold Shelf if it is a hold being fulfilled
%item_barcode% Item Barcode
%item_title% Item Title
%hold_for_msg% Hold for Message: this gives the patron’s Name
%PATRON_BARCODE% Patron’s Barcode
%notify_by_phone% Phone number listed in the Hold Database. This may not be the same as what is in the Patron’s record, as they can list another number when placing the hold.
%notify_by_email% Email listed in the Hold Database. The same caveat as the phone number applies.
%request_date% The date that the Request was originally placed.
%formatted_note% Hold Notes (new to 2.1)
Transit Slip
%route_to% Library Policy Name that the item is in transit to
%route_to_org_fullname% Library Full Name that the item is in transit to
%street1% Library Street address Line 1 that the item is in transit to.
%street2% Library Street address Line 2 that the item is in transit to.
%city_state_zip% City, State, Zip of Library the Item is in transit to.
%item_barcode% Item Barcode
%item_title% Item title
%item_author% Item Author
Hold Transit Slip
%route_to% Library Policy Name that the item is in transit to
%route_to_org_fullname% Library Full Name that the item is in transit to
%street1% Library Street address Line 1 that the item is in transit to.
%street2% Library Street address Line 2 that the item is in transit to.
%city_state_zip% City, State, Zip of Library the Item is in transit to.
%hold_for_msg% Hold for Message: this gives the patron’s Name
%PATRON_BARCODE% Patron’s Barcode
%notify_by_phone% Phone number listed in the Hold Database. This may not be the same as what is in the Patron’s record, as they can list another number when placing the hold.
%notify_by_email% Email listed in the Hold Database. The same caveat as the phone number applies.
%request_date% Date that the Request was originally placed
Payment Receipt
%original_balance% The original balance the patron owes
%payment_received% How much was received from the patron
%payment_applied% How much of the payment was applied
%payment_type% What type of payment was applied, e.g. Cash
%voided_balance% Any Voided balance
%change_given% How much change was given
%new_balance% The new balance on the account
%note% Any notes on the annotated payment
%bill_id% The Id for the bill in the Bill database
%payment% How much of the payment that was applied was applied to this title
%title% Title that the payment was applied to.
%last_billing_type% The type of bill that was last charged to the patron for this title
%barcode% Item barcode
%title% title of item
The Evergreen 2013 conference has come and gone, and we want to hear what you thought about it so we can help next year’s organizers plan an even better conference experience for you. Tell us what you liked, what you disliked, the best food, and your favourite made-in-BC TV show. Thanks for participating in the conference and visiting our lovely city!
All the best, courtesy of your superstar Evergreen 2013 Organizing Committee:
Feel like getting Irish tonight? Need a pint or two of the best Guinness in Vancouver? Meet up with Sharon and Kevin in the hotel lobby at 6:45pm for an excursion to the best little Irish pub in Vancouver, the Irish Heather.
In my last post I talked about getting my code published as a PEAR package. Well, I did talk to them, and they asked me to change my code a bit, add some PHPUnit tests, and organize my directory in a different fashion. But vacations are over and I have been too busy to make those changes. Yes, it's been long since my last post, but my first update after GSoC is here. I have changed the way I used to POST using cURL. Now I am using HTTP_Request2 to POST and send headers. As it says on PEAR, HTTP_Request2 "provides an easy way to perform HTTP requests"; more information is available at http://pear.php.net/package/HTTP_Request2.
Some PHPUnit tests and a new package name are what I aim to do next.
The openSRF-PHP library is ready and can be used to get responses from an OpenSRF service using the services and methods present. There are two examples included, which show how to use the library. Documentation has also been added (https://github.com/pranjal710/osrf/blob/master/docs/doc.md). The first part of the documentation describes the functions, classes, and their member functions, the parameters passed, and their return values. The second part briefly describes what happens when someone uses the library, i.e. the role of each class/function. My GitHub repo is mentioned in the github tab.
This library would not have been what it is now had my mentor Lebbeous Fogle-Weekley not helped me throughout. I thank him for breaking the project into small parts and guiding me whenever I got stuck. There are a lot of things (which might be small for him) that were new to me, and they will surely help me going forward. Thank you, Sir, for editing my code, which cleaned it up and made it a lot shorter, and for everything.
The next thing I am looking forward to is getting it published as a PEAR package. My next post will describe the changes I make to get it published.
In Header Section:
<img src="http://niagaracollege.niagaraevergreen.ca//opac//images/small_logo.jpg">
<br/>
You checked out the following items:<hr/><ol>
In Line Items Section:
<li>%title%<br/>
Barcode: %barcode% <br/>
Due: %due_date%
In Footer Section:
</ol><hr />%SHORTNAME% %TODAY_TRIM%<br/>
You were helped by %STAFF_FIRSTNAME%<br/>
<br/>
Fines: Books/Magazines .25 per day,<br/>
Video/DVD 1.00 per day,<br/>
Equipment 2.00 an hour
<br/>
<br/>
Renew under My Library Account at <br/>
http://www.niagaracollege.ca/library
<br/><br/>
Or call NCLibraries : 905-735-2211 x7767<br/>
(WC) x7767 (NOTL) x4413 <br/>
The 2011 Evergreen International Conference planning team is pleased to invite you to submit presentation proposals for the conference.
There are three programming tracks, and we hope to have a broad spectrum of programming within each track:
The results from our 2011 Conference Planning survey are in!
Here is a summary:
On question #1, What tracks will you likely attend?