I consider this issue extremely important; I just commented on it to the SPARQL WG: http://lists.w3.org/Archives/Public/public-rdf-dawg-comments/2011Jul/0014.html
I would suggest adding custom functions instead: that makes it much more obvious that this is a non-standard extension.
The danger (as we have seen recently) of extending the range of datatypes that a standard function can accept is that it leads to different behaviour across implementations. This causes a lot of confusion for users and leads them to ask questions like "Why does my standard SPARQL query give different results on XXX's database, when compared to YYY's database?"
Also, be careful with the XSD date/time/duration datatypes. They are notoriously difficult to deal with: they have an ambiguous sort order, and values that are conceptually the same can have different representations.
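To illustrate the pitfall, here is a small Python sketch (the literals are made up for illustration): two xsd:dateTime lexical forms can denote the same instant, and a timezone-less value has no well-defined order against a zoned one.

```python
from datetime import datetime, timedelta, timezone

# Two xsd:dateTime literals that denote the same instant
# but have different lexical representations:
a = datetime(2012, 1, 1, 0, 0, 0, tzinfo=timezone.utc)  # "2012-01-01T00:00:00Z"
b = datetime(2011, 12, 31, 19, 0, 0,
             tzinfo=timezone(timedelta(hours=-5)))      # "2011-12-31T19:00:00-05:00"
print(a == b)  # True: same value, different representation

# A zone-less literal such as "2012-01-01T00:00:00" cannot be ordered
# against a zoned one without choosing an implicit timezone; Python
# refuses the comparison outright, and XSD leaves it partially ordered.
c = datetime(2012, 1, 1, 0, 0, 0)
try:
    c < a
except TypeError as e:
    print("incomparable:", e)
```

Different stores resolve exactly this ambiguity differently, which is why results diverge.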
I agree that supporting this through custom functions is the way to go. Rescheduling to 2.5.2 (for now).
I would like to propose two basic functions, implemented as standard SPARQL custom functions.
fn:DATE() is the equivalent of NOW(), but instead of an xsd:dateTime it returns only an xsd:date.
fn:DATE_ADD(xsd:dateTime/xsd:date, interval) is the function responsible for date/time arithmetic. The implementation is based on / inspired by SQL's date functions (http://www.w3schools.com/sql/func_date_add.asp):
fn:DATE_ADD("2012-12-31T10:59:01", "INTERVAL 1 DAY")
The current implementation supports these interval units: MILLISECOND, SECOND, MINUTE, HOUR, DAY, WEEK, MONTH, YEAR.
The function also permits subtracting from dates, by using a negative integer:
fn:DATE_ADD("2012-12-31T10:59:01", "INTERVAL -1 DAY")
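A minimal Python sketch of the semantics described above (date_add is a hypothetical helper, not the actual Sesame implementation). Note that MILLISECOND through WEEK are fixed-length and trivial, while MONTH and YEAR are calendar-dependent and need day clamping:

```python
from datetime import datetime, timedelta
import calendar

def date_add(value: datetime, amount: int, unit: str) -> datetime:
    """Add `amount` units to `value`; negative amounts subtract."""
    unit = unit.upper()
    # Fixed-length units map directly onto a timedelta.
    fixed = {
        "MILLISECOND": timedelta(milliseconds=1),
        "SECOND": timedelta(seconds=1),
        "MINUTE": timedelta(minutes=1),
        "HOUR": timedelta(hours=1),
        "DAY": timedelta(days=1),
        "WEEK": timedelta(weeks=1),
    }
    if unit in fixed:
        return value + amount * fixed[unit]
    # MONTH and YEAR vary in length: clamp the day to the target
    # month's length (e.g. Jan 31 + 1 MONTH -> Feb 28/29).
    if unit == "YEAR":
        amount *= 12
        unit = "MONTH"
    if unit == "MONTH":
        months = value.year * 12 + (value.month - 1) + amount
        year, month = divmod(months, 12)
        day = min(value.day, calendar.monthrange(year, month + 1)[1])
        return value.replace(year=year, month=month + 1, day=day)
    raise ValueError("unsupported interval unit: " + unit)

# Negative amounts subtract, matching "INTERVAL -1 DAY":
print(date_add(datetime(2012, 12, 31, 10, 59, 1), -1, "DAY"))
# 2012-12-30 10:59:01
```

The day-clamping behaviour for MONTH/YEAR is one of the details worth pinning down explicitly in the function's documentation, since SQL dialects disagree on it.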
I could improve these or add new functions to help with Sesame development.