Unfiltered and Slick

People sometimes ask for example “crud” webapps for Unfiltered. I’ve never made one because a) it’s boring and b) most people use Unfiltered to make APIs. But we had Jan Christopher Vogt for our September meetup and I was reminded how much I liked Slick (or how much I liked ScalaQuery, as it was called when I last used it).

Making a Slick app would not be boring. For even more interestingness and less future-irrelevance I made it touch friendly. Looks like this:

Breeds

It’s a database for you to enter dog breeds and famous dogs of that breed. Super useful. If you want to update and delete, you have to write the code for that. Ruff ruff.

The unfiltered-slick.g8 template is pretty simple. I adopted the DB setup code (and ASCII art!) from Chris’s ny-scala demo project.

If you want to try it out, install giter8 and run this:

g8 unfiltered/unfiltered-slick

My favorite thing in there is this little function that makes a directive for loading data into a Breed instance:

// A directive: load the Breed for this id, or short-circuit the
// for-expression with a 404 response.
def breedOfId(id: Int)(implicit session: Session) =
  getOrElse(
    Breeds.ofId(id).firstOption,
    NotFound ~> ResponseString("Breed not found")
  )
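
For context, the Breeds.ofId query it calls might look something like this in Slick 2.0. This is a sketch from memory rather than the template’s actual code, with made-up table and column names:

import scala.slick.driver.H2Driver.simple._

case class Breed(id: Option[Int], name: String)

class BreedsTable(tag: Tag) extends Table[Breed](tag, "BREEDS") {
  def id   = column[Int]("ID", O.PrimaryKey, O.AutoInc)
  def name = column[String]("NAME", O.NotNull)
  def *    = (id.?, name) <> (Breed.tupled, Breed.unapply)
}

object Breeds {
  val table = TableQuery[BreedsTable]
  // query for a single breed by primary key; firstOption runs it
  // against the implicit session and yields an Option[Breed]
  def ofId(id: Int) = table.filter(_.id === id)
}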

This is built with Slick 2.0.0-M2. It uses H2 in-memory and I haven’t tested it with anything else. If I didn’t do something the slickest way, send a pull request.

Untupled

Unfiltered validation has been on my to-do list for as long as there’s been an Unfiltered. I tried a few different approaches, none of which were good enough to move it to-done.

First I tried doing it with extractors, the golden hammer of Unfiltered.

Requests are commonly accompanied by named parameters, either in the query string of the URL or in the body of a POST. Unfiltered supports access to these parameters with the Params extractor.

Extracting Params
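
For instance, a typed extractor in that style might look like the sketch below. This isn’t code from any real app; the plan and the “age” parameter are made up to show the shape:

import unfiltered.request._
import unfiltered.response._

// Reusable typed extractor: take the first "age" value, parse it as an Int.
object Age extends Params.Extract("age", Params.first ~> Params.int)

val plan = unfiltered.filter.Planify {
  case GET(Path("/dogs") & Params(Age(age))) =>
    ResponseString("dogs aged " + age)
  // A missing or non-numeric "age" simply fails to match, so the
  // request falls through, typically ending in a 404.
}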

The results were unsatisfying. If you want typed results, and you probably do, the only thing you can do with unacceptable input is refuse to match it, typically responding with a 404.

I never expected the extractor approach to work out, but the extractors are easy to build and easy to use — for mediocre results. My grander scheme was to support parameter validation such that it would be easy to do these things:

  1. build your own reusable validators.
  2. build your own validators inline.
  3. respond to multiple unacceptable parameters with multiple error messages.

And I tried for a while to do this by hacking for-expressions. It was difficult to build, and difficult to use. I didn’t use it much myself and periodically forgot how it worked. Something kept me from ever documenting it. Common sense?

Thanks Norway!

Later, Jon-Anders Teigen contributed directives to Unfiltered. Directives use for-expressions to validate requests in a straightforward way. Missing a header we require? Respond with the appropriate status code. You can also use them for routing, by orElse-ing through non-fatal failures. What you couldn’t do was accumulate errors, my requirement #3.
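
To give the flavor, here is a rough sketch of the directives style. It’s my reconstruction rather than anything copied from the docs, and the plan wiring is approximate; the interesting part is the for-expression:

import unfiltered.request._
import unfiltered.response._
import unfiltered.directives._, Directives._

// A reusable directive: succeed quietly when the request carries the given
// Content-Type, otherwise fail the whole for-expression with a 415.
def contentType(tpe: String) =
  when { case RequestContentType(`tpe`) => } orElse UnsupportedMediaType

// Directives flat-map left to right; the first failure short-circuits
// and its response function is what the client sees.
def intent = Directive.Intent.Path {
  case "/dogs" => for {
    _   <- POST
    _   <- contentType("application/json")
    req <- request[Any]
  } yield Ok ~> ResponseString(Body.string(req))
}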

I had a week with no internet in the Adirondacks, and it seemed as good a time as any to face my old nemesis, parameter validation for the people.

Thinking clearly

The first thing I did was to build into directives a syntax for interpreting parameters into any type and producing errors when interpretation fails.

This took some time to get right, mostly choosing what to call things and crafting an API for both explicit and implicit use. Define your own implicit interpreters to produce your own types and error responses, then you can collect data like a lovesick NSA officer.

val result = for {
  device <- data.as.Required[Device] named "udid"
} yield device.location

This could report failure in different ways according to your own interpreter: the udid parameter is missing, isn’t in the right format, isn’t in the database, the database is down, and so on.
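
The interpreters are supplied implicitly. For a required Int parameter, they might be defined roughly like this; it’s a sketch of the shape, and the exact method names could differ in a given milestone:

import unfiltered.directives._, Directives._
import unfiltered.response._

// What to say when a named parameter is missing entirely.
implicit def required[T] = data.Requiring[T].fail { name =>
  BadRequest ~> ResponseString(name + " is missing\n")
}

// How to read a string value as an Int, and what to say when it won't parse.
implicit val asInt = data.as.Int.fail { (name, value) =>
  BadRequest ~> ResponseString("'" + value + "' is not a valid int for " + name)
}

// With these in scope: data.as.Required[Int] named "count"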

Snake in the grass

After this I decided to tackle my dreaded requirement #3. This time I wouldn’t abuse for-expressions or change the way directives fundamentally work. When directives are flat-mapped together, the result is a mapping of all their successes or the first failure.

So how do we produce multiple responses for multiple failures? I settled on the idea of combining multiple directives into one, which would itself produce a mapping of all their successes or a combination of all their failures. This combined directive would then, typically, be flat-mapped to other directives in a for-expression.

The combination step is easy enough to express, even if it was a little tricky to implement.

scala> (data.as.Required[String] named "a") &
  (data.as.Required[Int] named "b") 
res1: unfiltered.directives.Directive[
  Any,
  unfiltered.directives.JoiningResponseFunction[String,Any],
  (String, Int)] = <function1>

The first type parameter of Directive has to do with the underlying request, the second is the joined error response type, and the third is the success type — a tuple of the two directives’ success types.

The joining method & produces a tuple so that successes preserve all their type information. We might use it in a for-expression like so:

(a, b) <- (data.as.Required[String] named "a") &
  (data.as.Required[Int] named "b") 

But what happens if there’s more than one independent parameter?

scala> (data.as.Required[String] named "a") &
  (data.as.Required[Int] named "b") &
  (data.as.Required[BigInt] named "c")
res2: unfiltered.directives.Directive[
  Any,
  unfiltered.directives.JoiningResponseFunction[String,Any],
  ((String, Int), BigInt)] = <function1>

Oh dear — our tuples are nested. To assign the values now, we would have to nest the pattern in exactly the same way.

((a, b), c) <- (data.as.Required[String] named "a") &
  (data.as.Required[Int] named "b") &
  (data.as.Required[BigInt] named "c")

This could get rather confusing, especially when we want to add or remove a parameter later.

Nesting

To understand why the tuples are nested, think about what the & method does. For d1 & d2, it produces a new directive where the success values are a tupled pair. It does this always, according to its return type.

Now consider the case of three directives: d1 & d2 & d3. We could write it without infix notation: d1.&(d2).&(d3). It’s clearer still with parentheses grouping the infix operations in their normal order of evaluation: ((d1 & d2) & d3). With repeated applications of & we’ll necessarily produce typed, nested pairs. You can see a simpler example with the standard library’s tuple constructor:

scala> 1 -> 2 -> 3
res3: ((Int, Int), Int) = ((1,2),3)

So this makes sense, even if we don’t like it. We’d rather access the results as if the structure were flat. A Seq would allow that, but we’d lose the component types. Another approach would be to apply a single joining function across all the directives we want to combine:

(a, b, c) <- &(
  data.as.Required[String] named "a",
  data.as.Required[Int] named "b",
  data.as.Required[BigInt] named "c"
)

This looks pretty nice, but it comes at a high cost: the function & would have to be defined specifically for 2 arguments, 3 arguments, and so on up to 22. And if somebody wanted to use it for 23 independent parameters, too bad. You’ll see that kind of code in some libraries including the standard library; I think it’s usually generated. But I really didn’t want to add it to Unfiltered if I didn’t have to.

And I didn’t have to.

In Scala, common data types are built into the standard library instead of the language. So we can do things like this without List being special, or known to the compiler at all.

scala> val a :: b :: c :: Nil = 1 :: 2 :: 3 :: Nil
a: Int = 1
b: Int = 2
c: Int = 3

In order to support rich user-defined interfaces comparable to language-level features in other languages, the Scala language itself has features like infix notation that are surprising to the newcomer. This is all Scala 101, but on occasion I’m still impressed by the possibilities that basic design decision gives to the programmer.

Let’s try our list example again, with grouping parentheses.

scala> val (a :: (b :: (c :: Nil))) = (1 :: (2 :: (3 :: Nil)))
a: Int = 1
b: Int = 2
c: Int = 3

Because of the right-associativity of methods ending with a colon, the nesting is reversed, but you can probably see that a solution to flattening our nested tuples is getting closer.

The standard library provides a :: case class as a helper to the :: method of List, and like all case classes it has a companion extractor object. The above relies on infix notation for the extractor; the constructor style makes it a little more plain.

scala> val ::(a, ::(b, ::(c, Nil))) = (1 :: (2 :: (3 :: Nil)))
a: Int = 1
b: Int = 2
c: Int = 3

That’s just how you expect case class extraction to work. On the right-hand side, :: is a method call on a list but its definition constructs the same case class. See for yourself. So actually, it’s more like the standard library provides the method as a helper to the case class.

That’s great for lists and we’d like to do the same for a pair, but we can’t use the same case class technique since Tuple2 is itself a case class. Not to worry, even though case classes are a language feature, the extractor functionality they use is available to any object. (It was a short holiday for this golden hammer.) We’ll call ours & since we want it to partner with the & method of Directive.

scala> object & {
     |   def unapply[A,B](tup: Tuple2[A,B]) = Some(tup)
     | }
defined module $amp

Let’s try it out with modest constructor notation first.

scala> val &(a,b) = (1,2)
a: Int = 1
b: Int = 2

Infix?

scala> val a & b = (1,2)
a: Int = 1
b: Int = 2

Nesting???

scala> val a & b & c = ((1,2),3)
a: Int = 1
b: Int = 2
c: Int = 3

Sweet!!!

Nested

Now we can assign arbitrarily many nested success values from combined directives in a simple flat statement.

a & b & c <- (data.as.Required[String] named "a") &
  (data.as.Required[Int] named "b") &
  (data.as.Required[BigInt] named "c")

And that is basically how it works. One caveat is that Unfiltered already had a & extractor object for pattern matching on requests, but I was able to overload the unapply method without issue — once again, I’m rescued by the soundness of Scala’s core features.

You might wonder why I didn’t name it ->. And indeed, I could have.

scala> object -> {
     | def unapply[A,B](tup: Tuple2[A,B]) = Some(tup)
     | }
defined module $minus$greater

scala> val a -> b -> c = 1 -> 2.0 -> "3"
a: Int = 1
b: Double = 2.0
c: String = 3

Isn’t that pretty? I don’t know why this isn’t defined in the standard library already, perhaps it’s been discussed before. I think it would be useful, and I would have used that here instead of defining my own extractor. It would promote the pattern of nesting tuples, rather than defining 22 methods while taking a shot of vodka after each one.

(I am also aware, vaguely, that I have wandered onto ground inhabited by HLists. Don’t shoot! I come in peace.)

In any case, an object -> is not in the standard library, and I don’t want to invite the possibility of a collision should it be added later. Also, if nested tuples and -> extractors were already a common pattern, it would be one thing; since they are not yet, I don’t want to have to explain to people why & is used to join directives while a -> extractor is used to split them.

All told, I’m very pleased with the resulting API for parameter directives, and hope people will get good use out of it. Directives are now fully documented and Unfiltered parameter validation is finally done.




Scala in 2007 - 2013

One of the highlights of this year’s Scala Days, for me, was Shadaj Laddad’s talk on Fun Programming in Scala. His unabashed love of programming reminds me why I started to learn it myself, at a later age when I had saved up enough money to buy a C++ development environment. Today anyone can download compilers for free, for any programming language. To be honest, I’m jealous of kids who now get to learn Scala as a first language.

I was surprised and proud to see how enthusiastically Shadaj was using giter8 to efficiently create new projects. When I made giter8 I was mostly inspired by the automation, efficiency, and evident pleasure that Ruby programmers took in automating mundane tasks like making a new CRUD website.

I’d tried to do the same with Java tool-chains in the past, mostly Maven’s archetypes, but found them to be over-architected, and under-designed for creation and maintenance. To eliminate grunt work for the end user you had to do about 10x the grunt work as an author, not exactly a formula for success in the unmarket of free software.

Sprung

The other cool thing about giter8’s appearance in Shadaj’s talk is it lifted my spirits a bit from what I’d been hearing about the morning’s keynote, which I hadn’t seen. Apparently Rod Johnson, Spring Framework author and recent dependency injected to the Typesafe board, had used Dispatch (my first Scala software project and one of my proudest creations) as an example of what not to do in Scala.

I wasn’t going to watch the talk myself or write about it. It was the same old criticism of an old version of Dispatch, a programming style argument that never interested me and was ultimately easier to leave behind.

But after the videos were posted, inevitably, more of my colleagues saw the talk, and talked about the talk. Friends spoke up for me, and for Dispatch. I started to imagine that the criticism was very harsh, that it was personal, that it was something I should worry about. So yesterday, I watched Scala in 2018.

Johnson doesn’t speak in the sneering tone I’d imagined. There’s no emotion, no blood at all. His criticism of Dispatch isn’t cutting, it’s very general. Dispatch and libraries like it, he informs the audience, simply should not exist.

I have to say I was surprised when finally watching the video just how flatly Johnson ignores the fact that Dispatch was completely rewritten over a year ago. There is no point in listing the differences; they are many and self-evident. I invite you to watch the "Dispatch" portion of the talk with the Dispatch home page open in another window, and see for yourself.

I’ve given a few talks and I know the work that goes into every beat. It’s time consuming; many hours of work go into a one-hour talk of any quality. So it’s astonishing to me that a fundamental error in one of the longer chunks of the talk, and which is used to support one of its primary themes, survived Johnson’s own review of his notes. Whether Dispatch is, still, a wrapper for Apache HttpClient, with dozens of symbols that cause right-thinking enterprise developers to blush politely — these are facts you can check in ten seconds.

Further, you might expect a secondary review by someone more familiar with the Scala community, when a keynote puts its critical focus on that community. It’s how you avoid these kinds of blunders, and how you later avoid having to say, “stop talking about my blunders and instead please discuss my important message.” (We’ll get to that.)

But for the record, if anybody is weirdly still interested in the full story of the Dispatch rewrite that I completed early last year, I wrote a series of posts about it. If not, that’s great! Neither am I.

The road to perdition

Johnson makes his thesis plain. In five years he hopes Scala is a leading programming language for traditional enterprise apps. He doesn’t have the same hope for startups. Or for front-end (web?) programming. This stated goal is the motivation for his criticism of some Scala libraries and of the ways that some people participate in the Scala community. He projects a series of beliefs and behaviors onto the community using the trusty myth list formula.

One must imagine all the hard core Scala programmers champing at the bit, or champing like that doctor zombie in the B wing of the W.H.O., to debate these “myths”. But before tearing through the glass, it’s important to ask yourself: do I want to be where Rod Johnson wants me to want to be in five years? Do I want Scala to be less popular in startups, more popular in traditional enterprise settings?

For me, debating how to get there would be like arguing over the fastest route to the gulag. If that’s where this bus is headed I don’t particularly care how it gets there, I just want off.

Luckily, that’s not where the bus is headed, at all. Johnson greatly overreaches in discounting the use of Scala in startups, allowing only that some startups may experience a “Twitter scenario” where they are successful and have to scale.

Like many things in this talk that is purportedly about the next five years, the “Twitter scenario” is from five years ago. Today there’s no shortage of startups using Scala from the beginning, in New York and elsewhere. You don’t have to look any further than the list of sponsors for Scala Days to see some of them. There’s no reason to think, and less reason to hope, that Scala’s popularity in startups won’t continue.

Aside from that, if you value a healthy open source community for its own sake, you might want that as a target. If you value projects that do something creative, cool, beautiful, or clever, you might want to list that. If you want Scala to be used as a teaching language in public schools, you might set it as a goal for 2018.

In other words, you might want some things for Scala that are outside your immediate career interests.

Alternative mythologies

Suffice to say that my hopes for Scala are very different from Johnson’s, and so would produce a very different set of guidelines. Or perhaps, none at all: it’s your day off, your computer, your electricity — do what you want.

I’m not going through the whole values exercise here, so I’ll end with this thought: when I escaped Java five years ago it wasn’t entirely, or even mostly, about the language. I was stifled and utterly uninspired by the Java Community — the same one Johnson puts up as a stretch goal for Scala programmers.

I don’t feel there is a Java community so much as there is a Java audience, in the sway of one “thought leader” after another, chasing one magic bullet after another, always one enterprise software purchase away from the ultimate salvation of not having to program.

But there are many programming language communities where individual freedom of expression is prized, where experimenters are praised for their successes and their failures, where we all thank our lucky stars we live in an era where programming exists and we can do all we want for free. For me Scala is one of these languages.

Having seen a few other talks that same morning, I don’t worry about our future in the slightest.




Participatory Conferencing

We’re less than a week from the Northeast Scala Symposium, and the few remaining RSVP spots (out of three hundred this year) are dwindling.

The symposium was born as a gathering of the Scala meetups of New York, Boston, and Philadelphia. We simply took the planning and budgeting processes of our meetups and scaled them up. Which is to say, we didn’t do much planning or budgeting at all. We held the first nescala in one of our usual meetup spaces and we selected speakers with a voting process open to everyone who RSVP’d. This not only seemed fair, it saved us the task of trying to anticipate everyone else’s preferences.

This shooting-from-the-hip style of organizing has worked well for us organizers and, as far as I can tell, for the attendees. Or rather, participants. Everyone at the gathering—not just the esteemed speakers—must contribute for it to be special. Otherwise we could all stay at home and watch the same dudes talking on YouTube.

Each year nescala has grown in size more or less on its own. We haven’t made growth a priority, but it is gratifying and it’s got to be good for the Scala community that a regional conference is growing.

The greater the number of participants in nescala, the more people who have to be registered, fed, cleaned up after, and—well—taken care of. I would love it if we could do this as a group, but you have to remember that this is a conference by and for programmers. Nobody is going to cut short a conversation about the pros and cons of iteratees for something as mundane as disposing of paper plates with pizza crusts on them.

Meetup organizers excel at throwing away paper plates, but as the conference grows there just aren’t enough of us to keep up. So we’ve done the only rational thing and hired people to do this for us. And it works out: as the conference grows we get $50 from more people and we also get more offers of sponsorship.

But there is one side effect of growth I’m constantly on guard against: commoditization of the experience. The more paid staff there are helping make the day a success, the more people will confuse the organizers with paid staff. The more they will see a customer-service relationship where there is none. The less they will be active participants in the symposium, and the more they may taint the experience of others—including, if I may be selfish, the organizers.

We don’t make money off of nescala and have no reason to. For a living we write software, like you. For a hobby we bring people together, and the payback is in seeing others participate actively in the conference. Watching people help each other. It’s as simple as that.

See you on Friday.




Configrity 0.10.2

Thanks to sbt-scalashim, Configrity is now backward compatible with Scala 2.8.1 and 2.8.2.

This is how a software community builds itself.




Scala services at Tumblr (but you knew that already)

flatMap Oslo (May 15, 16) is Norway’s first Scala conference, and the first anywhere to have such exciting light fixtures. (Good speaker lineup, too.)

