
- 18 Months With Scala Building a Driver For MongoDB

Casbah (MongoDB + Scala)

http://github.com/mongodb/casbah

Brendan McAdams
brendan@10gen.com

@rit

Scala Days 2011


June 3, 2011

- 18 Months With Scala Building a Driver For MongoDB


"class Insanity extends Object {}": Getting Started

With Scala
"This s%#! doesn't work the way I want it to": Where
Casbah Came From ... (mongo-scala-wrappers)
"Do You Know What I Am Saying?": Pimping Java,
Syntactic Sugar and Internal DSLs
"Rock The Casbah": Silly Names and 1.0 Releases
They Got Me A Freaking Pony: New Job, New
Problems
"Eating Your Own Dog Food Isn't The Same As Making
It Palatable": 1.1 becomes 2.0, Real Users

class Insanity extends Object {}


Getting Started With Scala

2009: Learning Year

Deeper Python; started to get lambdas, FP
Learned C#; loved the "better Java" with lots of Pythonic functional stuff
Data processing tools: Disco, Hadoop, OpenCL / CUDA, R, etc.
New database technologies (NoSQL): Cassandra, Redis, CouchDB, Riak, MongoDB

October 2009 ...

Put together NY NoSQL Conference (100+ ppl)
Job imploded
New job, new to Scala

October 2010 ...

Joined 10gen
Full-time MongoDB developer; Hadoop integration and general Scala support as a significant portion of my job
Casbah &

class Insanity extends Object {}


Getting Started With Scala

Big Problems, New Tools Needed

For much of it, Java wasn't the answer
Scala was a brilliant tool for solving the problems
Had read Wampler / Payne, but not written code
Impulse Control Problem or Good Gut Feeling?
Akka was a huge part ... #legendofklang
Custom formulas, DSLs and other tools
Began fiddling with MongoDB tools for an interstitial caching layer

"This s%#! doesn't work the way I want it to"

mongo-scala-wrappers Is Born
Learned MongoDB from Python
Dynamic language with flexible syntax; dynamic database with flexible schemas
Tooling for MongoDB + Scala was limited or unsuited. Mostly focused on ODM. None of what I loved about Scala or MongoDB was possible together.
Java Driver ... no Scala sugar or tricks
scamongo (pre-Lift): ODM (ORM-ey) or JSON tools
mongo-scala-driver: a little syntactic sugar but mostly ODM; didn't get it

MongoDB from Python



doc = {
"name": {
"first": "Brendan",
"last": "McAdams"
},
"email": "brendan@10gen.com",
"twitter": "@rit",
"age": 31,
"interests": ["scala", "python", "akka", "mongodb"]
}
age = doc['age']
type(age) # <type 'int'>
doc['interests'][1] # 'python'
type(doc['interests']) # <type 'list'>

MongoDB from Java


(or Scala, pre-Casbah)


val b = BasicDBObjectBuilder.start()
b.add("name", new BasicDBObject("first", "Brendan").append("last", "McAdams"))
b.add("email", "brendan@10gen.com")
b.add("twitter", "@rit")
b.add("age", 31)
val interests = new BasicDBList()
interests.add("scala")
interests.add("python")
interests.add("akka")
interests.add("mongodb")
b.add("interests", interests)
val doc = b.get()
val age = doc("age") // AnyRef = 31
doc("interests")(1)
/* error: AnyRef does not take parameters
doc("interests")(1)
*/
doc("interests").asInstanceOf[BasicDBList](1) // java.lang.Object = python

Type Safety and Compilation Shouldn't Necessitate Syntactic Suicide

There's absolutely nothing wrong with that syntax... for Java.
Scala is expressive, fluid and beautiful; so is (IMHO) MongoDB.
My goal: teach Scala to be as close to Python / the Mongo Shell as possible
Self-imposed limitation: don't reinvent the wheel. The Java Driver's network layer, BSON encoding, etc. work great. Just add syntactic sugar!
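The sugar rides on the "pimp my library" pattern: wrap the Java driver's DBObject in an enrichment class that an implicit conversion supplies on demand. The following is a minimal, REPL-style sketch of the idea; the class and method names are my own illustration, not Casbah's actual internals.

import com.mongodb.{ BasicDBObject, DBObject }

// Hypothetical enrichment class; illustrative only.
class RichDBObject(underlying: DBObject) {
  // Typed, Option-returning access instead of raw AnyRef and nulls
  def getAs[A](key: String): Option[A] =
    Option(underlying.get(key)).map(_.asInstanceOf[A])

  // Map-like mutation sugar
  def +=(kv: (String, Any)): DBObject = {
    underlying.put(kv._1, kv._2.asInstanceOf[AnyRef])
    underlying
  }
}

// The conversion the compiler inserts whenever a DBObject lacks a method
implicit def enrichDBObject(dbo: DBObject): RichDBObject = new RichDBObject(dbo)

val doc: DBObject = new BasicDBObject("age", 31)
doc += ("twitter" -> "@rit")      // reads like a mutable Scala Map
val age = doc.getAs[Int]("age")   // Option[Int] = Some(31)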

Today ...

val doc = MongoDBObject(
  "name"      -> MongoDBObject("first" -> "Brendan", "last" -> "McAdams"),
  "email"     -> "brendan@10gen.com",
  "twitter"   -> "@rit",
  "age"       -> 31,
  "interests" -> Seq("scala", "python", "akka", "mongodb")
)
// Full support also for Scala 2.8 Collections' Factory / Builders

val age = doc.getAs[Int]("age") // Option[Int] = Some(31)

val interests = doc.as[Seq[_]]("interests") // Seq[java.lang.String] = List(scala, python, akka, mongodb)

interests(2) // akka

// Experimental Dynamic support in 2.9 lets you do doc.age.typed[Int]
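The "Factory / Builders" line above means documents can also be assembled the way a Scala 2.8 collection is. A hedged sketch from memory (I believe the factory is MongoDBObject.newBuilder; treat the exact names as approximate):

val b = MongoDBObject.newBuilder
b += "email"   -> "brendan@10gen.com"
b += "twitter" -> "@rit"
b += "age"     -> 31
val doc2 = b.result   // a plain DBObject, assembled like a Scala collection builder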

Today ...
"Chained core operators" should {
"Function correctly" in {
val ltGt = "foo" $gte 15 $lt 35.2 $ne 16
log.debug("LTGT: %s", ltGt)
ltGt must beEqualTo(MongoDBObject("foo" ->
}
"Function correctly with deeper nesting e.g.
val ltGt = "foo" $not { _ $gte 15 $lt 35.2
log.debug("LTGT: %s", ltGt)
ltGt must beEqualTo(MongoDBObject("foo" ->
"$ne" -> 16))))
}
}

MongoDBObject("$gte" -> 15, "$lt" -> 35.2, "$ne" -> 16)))


$not" in {
$ne 16 }
MongoDBObject("$not" -> MongoDBObject("$gte" -> 15, "$lt" -> 35.2,

"with Long" in {
val neLong = "foo" $lte 10L
neLong must haveEntry("foo.$lte" -> 10L)
}
"with Short" in {
val neShort = "foo" $lte java.lang.Short.parseShort("10")
neShort must haveEntry("foo.$lte" -> java.lang.Short.parseShort("10"))
}
"with JDKDate" in {
val neJDKDate = "foo" $lte testDate
neJDKDate must haveEntry("foo.$lte" -> testDate)
}
"with JodaDT" in {
RegisterJodaTimeConversionHelpers()
val neJodaDT = "foo" $lte new org.joda.time.DateTime(testDate.getTime)
neJodaDT must haveEntry("foo.$lte" -> new org.joda.time.DateTime(testDate.getTime))
}
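For context (my own usage example, not from the slides): the point of these operators is that the expression builds an ordinary query DBObject you can hand straight to a collection.

import com.mongodb.casbah.Imports._   // the standard Casbah import, as I recall

val people = MongoConnection()("test")("people")
// "age" $gte 21 $lt 65 builds the same kind of DBObject the specs above assert on
for (doc <- people.find("age" $gte 21 $lt 65))
  println(doc)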

... But it took ~18 months to get there

Feb. 12, 2010: Initial Open Source Release (0.1). No tests.
  - Initial import
    Compiles, reflects the working code currently in Novus trunk but does not have full documentation, or tests yet.
    NOT FOR PUBLIC CONSUMPTION - USE AT YOUR OWN RISK.
  * Release 0.1 - May or may not blow your system up...
    - Updated headers, scaladoc/javadoc documentation, etc.
    - Next step: Written docs with examples, test classes

July 17, 2010: Release 1.0.
  New collaborator/contributor Max Afonov (@max4f)

January 03, 2011: Release 2.0.
  Refactoring & stupidity cleanups.

Do You Know What I Am Saying?


Pimping Java, Syntactic Sugar, &
Internal DSLs

It took a while to get things right

val doc = (
  "name"      -> ("first" -> "Brendan", "last" -> "McAdams"),
  "email"     -> "brendan@10gen.com",
  "twitter"   -> "@rit",
  "age"       -> 31,
  "interests" -> ("scala", "python", "akka", "mongodb")
) // Tuple conversion looked nice but BOY was it problematic

// Syntax for getAs, etc. came much, much later
val age = doc("age") // AnyRef = 31

doc("interests")(1)
/* error: AnyRef does not take parameters
   doc("interests")(1)
*/

doc("interests").asInstanceOf[BasicDBList](1) // java.lang.Object = python

So very many things wrong with this code

/**
 * Hacky mildly absurd method for converting a <code>Product</code> (Example being any <code>Tuple</code>) to
 * a Mongo <code>DBObject</code> on the fly to minimize spaghetti code from long builds of Maps or DBObjects.
 *
 * Intended to facilitate fluid code but may be dangerous.
 *
 * SNIP
 */
implicit def productToMongoDBObject(p: Product): DBObject = {
  val builder = BasicDBObjectBuilder.start
  val arityRange = 0.until(p.productArity)
  //println("Converting Product P %s with an Arity range of %s to a MongoDB Object".format(p, arityRange))
  for (i <- arityRange) {
    val x = p.productElement(i)
    //println("\tI: %s X: %s".format(i, x))
    if (x.isInstanceOf[Tuple2[_, _]]) {
      val t = x.asInstanceOf[Tuple2[String, Any]]
      //println("\t\tT: %s".format(t))
      builder.add(t._1, t._2)
    } else if (p.productArity == 2 && p.productElement(0).isInstanceOf[String]) {
      // backup plan if it's a one entry tuple, the outer wrapper gets stripped
      val t = p.asInstanceOf[Tuple2[String, Any]]
      builder.add(t._1, t._2)
      return builder.get
    } else {
      throw new IllegalArgumentException("Products to convert to DBObject must contain Tuple2's.")
    }
  }
  builder.get
}
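To illustrate why the conversion was problematic (my own example, not from the slides): every Tuple is a Product, so the implicit happily fires on values that are not key/value data at all, and the mistake only surfaces at runtime.

// With productToMongoDBObject in scope, both of these compile:
val ok: DBObject  = ("name" -> "Brendan", "age" -> 31)   // what was intended
val bad: DBObject = (1, 2, 3)                            // also a Product ...
// ... but the second one only blows up when it runs:
// java.lang.IllegalArgumentException: Products to convert to DBObject must contain Tuple2's.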

In a Strongly Typed Language Like Java, Square Pegs Do Not Fit Into Round Holes

In Loosely/Dynamically Typed Languages (Perl, Python, Ruby, etc.) Round Holes Can Be Convinced to Accept Square Pegs

Scala gives us a lathe instead ...

Implicits, type classes, abstract & parameterized types (Scala's variant, esp. with covariance/contravariance annotations), and structural types (aka sort-of duck typing) are incredible
For third-party libraries, though...
  Am I helping my users or hurting them?
  Have I accounted for all the use cases?
  Do I have any idea what the f$%k I'm doing?

But the rule is not "Measure once, cut twice" ...

(Shamelessly swiped from https://jepeachey.wordpress.com/2010/07/19/measure-twice-cut-once/)

- Manifest fun can protect you from a lot of compile-time stupidity (so can @tailrec!), but when you're doing runtime serialization it may not be enough.
- Type classes let you create type-safe (or quasi-type-safe) methods but still let your users add on to them. Important in a serialization arch where users can define custom class ser/deser.
- Manifests vs. Type Classes

But the rule is not "Measure once, cut twice" ...
...then again, I'm no carpenter

Much of what I love about Scala are compile-time checks, and they don't keep you from misunderstanding things, hurting your users or just plain screwing up.

Fun with Type Inference, aka "Oops, I screwed the explicit annotators" (see the sketch after this list)

Know and understand the fancy features, but also know when to use them.

The difference between a junior and a senior programmer is that the senior knows when not to write code.

Know when not to use them, and when one choice is better than another.
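A small sketch of the kind of inference gotcha meant above (a hypothetical helper object, not Casbah code): leave off an explicit return type on a public member and the compiler infers whatever it sees, so a later "harmless" edit silently changes the API your users compiled against.

object QueryHelpers {
  // Inferred result type: the compiler picks the most specific type it can.
  // As written this returns Map[String, Int] ...
  def defaults = Map("limit" -> 10, "skip" -> 0)

  // ... but add one more entry and the inferred type silently widens to
  // Map[String, Any], breaking callers who relied on Int values:
  // def defaults = Map("limit" -> 10, "skip" -> 0, "hint" -> "_id")

  // The fix: annotate public signatures explicitly.
  def defaultsExplicit: Map[String, Int] = Map("limit" -> 10, "skip" -> 0)
}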


Learning New Features (however hard/undocumented) always made me a better Scala developer

How do you have a quasi-type-safe (compile-time valid type enforcement) Query DSL in a language/engine where users can define serialization of arbitrary custom types?
aka "This code sucks"

trait LessThanEqualOp extends QueryOperator {
  def $lte(target: String) = op("$lte", target)
  def $lte(target: java.util.Date) = op("$lte", target)
  def $lte(target: AnyVal) = op("$lte", target)
  def $lte(target: DBObject) = op("$lte", target)
  def $lte(target: Map[String, Any]) = op("$lte", target.asDBObject)
}

Question led to me admitting my code sucked.
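Why the AnyVal overload above counts as "sucks" (my own illustration): AnyVal covers every primitive, so the DSL accepts comparisons that make no sense and nothing stops them at compile time.

// With the AnyVal overload in scope, all of these compile,
// even though $lte on a Boolean or a Char is nonsense for a query:
"age"    $lte 21      // sensible
"active" $lte true    // compiles anyway (Boolean <: AnyVal)
"grade"  $lte 'c'     // compiles anyway (Char <: AnyVal)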

Type Classes for Safe Configurability


trait LessThanEqualOp extends QueryOperator {
private val oper = "$lte"
def $lte(target: String) = op(oper, target)
def $lte(target: DBObject) = op(oper, target)
def $lte(target: Array[_]) = op(oper, target.toList)
def $lte(target: Tuple1[_]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple2[_, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple3[_, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple4[_, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple5[_, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple6[_, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple7[_, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple8[_, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple9[_, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple10[_, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple11[_, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple12[_, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple13[_, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple14[_, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple15[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple16[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple17[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple18[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple19[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple20[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple21[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Tuple22[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _]) = op(oper, target.productIterator.toList)
def $lte(target: Iterable[_]) = op(oper, target.toList)
def $lte[T: ValidDateOrNumericType](target: T) = op(oper, target)
}

Type Classes for Safe Configurability


/**
 * User configurable filters!
 */
trait ValidDateOrNumericTypeHolder extends ValidDateTypeHolder with ValidNumericTypeHolder {
  implicit object JDKDateDoNOk extends JDKDateOk with ValidDateOrNumericType[java.util.Date]
  implicit object JodaDateTimeDoNOk extends JDKDateOk with ValidDateOrNumericType[org.joda.time.DateTime]
  implicit object BigIntDoNOk extends BigIntOk with ValidDateOrNumericType[BigInt]
  implicit object IntDoNOk extends IntOk with ValidDateOrNumericType[Int]
  implicit object ShortDoNOk extends ShortOk with ValidDateOrNumericType[Short]
  implicit object ByteDoNOk extends ByteOk with ValidDateOrNumericType[Byte]
  implicit object LongDoNOk extends LongOk with ValidDateOrNumericType[Long]
  implicit object FloatDoNOk extends FloatOk with ValidDateOrNumericType[Float]
  implicit object BigDecimalDoNOk extends BigDecimalOk with ValidDateOrNumericType[BigDecimal]
  implicit object DoubleDoNOk extends DoubleOk with ValidDateOrNumericType[Double]
}
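The payoff of the type-class approach, as a hedged sketch (ValidDateOrNumericType comes from the slide above and is assumed here to be a plain marker trait, as its instances suggest; the custom type and its instance are hypothetical): a user can teach the DSL to accept their own "date-like" type without touching the library.

// A user-defined wrapper the driver knows nothing about ...
case class UnixTimestamp(seconds: Long)

// ... becomes usable with $lte by supplying a type class instance.
// (You would also need to register a BSON conversion helper so the value
// actually serializes; that part is omitted here.)
implicit object UnixTimestampOk extends ValidDateOrNumericType[UnixTimestamp]

val q = "created_at" $lte UnixTimestamp(1307059200L) // now compiles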

Figuring out Manifests was hard ... but valuable to emulate type safety

def $type[A: BSONType: Manifest] =
  if (manifest[A] <:< manifest[Double])
op(oper, BSON.NUMBER)
else if (manifest[A] <:< manifest[String])
op(oper, BSON.STRING)
else if (manifest[A] <:< manifest[BasicDBList] ||
manifest[A] <:< manifest[BasicBSONList])
op(oper, BSON.ARRAY)
else if (manifest[A] <:< manifest[BSONObject] ||
manifest[A] <:< manifest[DBObject])
op(oper, BSON.OBJECT)
else if (manifest[A] <:< manifest[ObjectId])
op(oper, BSON.OID)
else if (manifest[A] <:< manifest[Boolean])
op(oper, BSON.BOOLEAN)
else if (manifest[A] <:< manifest[java.sql.Timestamp])
op(oper, BSON.TIMESTAMP)
else if (manifest[A] <:< manifest[java.util.Date] ||
manifest[A] <:< manifest[org.joda.time.DateTime])
op(oper, BSON.DATE)
else if (manifest[A] <:< manifest[Option[Nothing]])
op(oper, BSON.NULL)
else if (manifest[A] <:< manifest[Regex])
op(oper, BSON.REGEX)
else if (manifest[A] <:< manifest[Symbol])
op(oper, BSON.SYMBOL)
else if (manifest[A] <:< manifest[Int])
op(oper, BSON.NUMBER_INT)
else if (manifest[A] <:< manifest[Long])
op(oper, BSON.NUMBER_LONG)
else if (manifest[A].erasure.isArray &&
manifest[A] <:< manifest[Array[Byte]])
op(oper, BSON.BINARY)
else
throw new IllegalArgumentException("Invalid BSON Type '%s' for matching".format(manifest.erasure))

* Manifests worked *GREAT* for solving this weird dynamic typing problem

Figuring out Manifests was hard, but valuable to emulate type safety

/**
* I had used Type classes elsewhere, but when I posted the preceding
* manifest code as an example of cool stuff to show @ ScalaDays,
* Jon-Anders Teigen (@jteigen) sent me a gist with a better way.
* Type Classes for this!
*/
def $type[A](implicit bsonType: BSONType[A]) = op(oper, bsonType.operator)

/**
 * That's now it for the $type support; it uses a few type class definitions as
 * well to match the BSON types.
 */
implicit object BSONDouble extends BSONType[Double](BSON.NUMBER)
implicit object BSONString extends BSONType[String](BSON.STRING)
implicit object BSONObject extends BSONType[BSONObject](BSON.OBJECT)
implicit object DBObject extends BSONType[DBObject](BSON.OBJECT)
implicit object DBList extends BSONType[BasicDBList](BSON.ARRAY)
implicit object BSONDBList extends BSONType[BasicBSONList](BSON.ARRAY)
implicit object BSONBinary extends BSONType[Array[Byte]](BSON.BINARY)
implicit object BSONObjectId extends BSONType[ObjectId](BSON.OID)
implicit object BSONBoolean extends BSONType[Boolean](BSON.BOOLEAN)
implicit object BSONJDKDate extends BSONType[java.util.Date](BSON.DATE)
implicit object BSONJodaDateTime extends BSONType[org.joda.time.DateTime](BSON.DATE)
implicit object BSONNull extends BSONType[Option[Nothing]](BSON.NULL)
implicit object BSONRegex extends BSONType[Regex](BSON.REGEX)
implicit object BSONSymbol extends BSONType[Symbol](BSON.SYMBOL)
implicit object BSON32BitInt extends BSONType[Int](BSON.NUMBER_INT)
implicit object BSON64BitInt extends BSONType[Long](BSON.NUMBER_LONG)
implicit object BSONSQLTimestamp extends BSONType[java.sql.Timestamp](BSON.TIMESTAMP)
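A usage sketch (my recollection of the DSL, so treat the exact call syntax as approximate): with the implicit objects above in scope, only types that have a BSONType instance are accepted, and everything else is rejected at compile time.

"qty".$type[Double]          // fine: BSONDouble is in scope
"qty".$type[java.util.Date]  // fine: BSONJDKDate is in scope
// "qty".$type[java.io.File] // does not compile: no implicit BSONType[java.io.File]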

Rock The Casbah

Silly Names and 1.0 Releases

June 01, 2010: The death of mongo-scala-wrappers
  - Changed package to com.novus.casbah.mongodb
    Rolled Scala version to 2.8.0.RC3 and SBT version to 0.7.4
    Updated dependency libraries as appropriate for 2.8rc3
    Cleaned up package declarations in code
    Rolled module version to 1.0-SNAPSHOT as the next release goal is to be complete at a 1.0

"Casbah" was inspired mostly randomly by The Clash

"Rock The Casbah": Silly Names and 1.0 Releases
1.1 began on a mission of modularisation and functionality expansion
  casbah-mapper borne unto git, never released mainstream, and ultimately reimagined as Salat (Russian word for salad)
  salat-avro (@rubbish)
  @coda's Jerkson project using some of the ScalaSig code utils from Salat

Growth far beyond just being a wrapper

Then again, a lot is in a name... Arguments against renaming later.
Randomness of radio inspiration
Salat - the strength of an ecosystem (being used for not just Mongo ...)

They Got Me a Freaking Pony!

Pro Tip: Working with the kind of people who will *actually* buy you a pony is highly recommended

Not featured in the PDF version: Sparkly transition effect

Eating Your Own Dog Food Isn't The Same As Making It Palatable

There's a difference between "fixing bugs in production" and "shipping libraries to users"
Eating my own dog food was great, but in many ways it made me complacent
In many cases I initially only implemented the MongoDB features I was using...
... In others, only the way I was using them.

Eating Your Own Dog Food Isn't The Same As Making It Palatable

for (i <- 1 to )
  println("Tests. Matter.")

15 years as a developer taught me this:

Tests seem like a really good idea...
I'm tired of fixing my broken crap in production

Eating Your Own Dog Food Isn't The Same As Making It Palatable

15 years of reality tempered "nice to have" with "shut up and code":
  "I wish I had time to actually write tests and learn to write good tests"
  <Boss> "Just put it in production and fix it later, we don't have time to wait"
Let's face it: these aren't excuses but, in many cases, reality. Ship code or flip burgers.

Eating Your Own Dog Food Isn't The Same As Making It Palatable

If you plan to ship code to users, eating your own dog food is NEVER ENOUGH*
Take the time to learn how to write good tests and GOOD DATA
I am head over heels in love with the tools in Scala:
  ScalaTest (I don't use it anymore, but still amazing)
  Specs / Specs2: alien technology for breaking my code
  ScalaCheck - haven't learned it yet, but does fuzzing, etc.
Differentiate between integration tests and unit tests
But *use* integration tests with conditional skips, and WRITE THEM (a sketch follows below).

* Assuming, of course, you care about code quality and/or your users

Tons and tons of bugs found as I moved to specs2 , that had lurked
under the surface for time immemorial
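A minimal sketch of "integration tests with conditional skips" (my own example; mongoAvailable and the connection check are hypothetical): Specs2 can skip a whole specification when the backing service isn't reachable, so the suite still passes on a laptop with no mongod running.

import org.specs2.mutable.Specification

class CollectionIntegrationSpec extends Specification {
  // Hypothetical availability check; real code might attempt an actual driver connection.
  def mongoAvailable: Boolean =
    try { new java.net.Socket("localhost", 27017).close(); true }
    catch { case _: Throwable => false }

  // Skip every example in this spec if MongoDB isn't reachable.
  args(skipAll = !mongoAvailable)

  "A live collection" should {
    "round-trip a document" in {
      // ... insert + findOne against a real mongod goes here ...
      success
    }
  }
}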

Eating Your Own Dog Food Isn't The Same As Making It Palatable

Some of why I didn't test Casbah as well early on is that I couldn't easily test the values as MongoDB saw them.
With the move to Specs2, it was much stricter and I was inspired to write custom matchers to do the job; provided for users too! (Tests all the way down...)
Tests are much cleaner and I feel more confident about them; able to achieve higher coverage
Higher coverage definitely translates into fewer bugs users find in their production apps


Yak Shaving Becomes Yak Nairing


trait CasbahSpecification extends Specification with DBObjectMatchers with Logging {
/** SNIP */
}
trait DBObjectMatchers extends DBObjectBaseMatchers
trait DBObjectBaseMatchers extends Logging {
protected def someField(map: Expectable[Option[DBObject]], k: String) = if (k.indexOf('.') < 0) {
map.value.getOrElse(MongoDBObject.empty).getAs[AnyRef](k)
} else {
map.value.getOrElse(MongoDBObject.empty).expand[AnyRef](k)
}
protected def field(map: Expectable[DBObject], k: String) = if (k.indexOf('.') < 0) {
map.value.getAs[AnyRef](k)
} else {
map.value.expand[AnyRef](k)
}
protected def listField(map: Expectable[DBObject], k: String) = if (k.indexOf('.') < 0) {
map.value.getAs[Seq[Any]](k)
} else {
map.value.expand[Seq[Any]](k)
}
def beDBObject: Matcher[AnyRef] = ((_: AnyRef).isInstanceOf[DBObject], " is a DBObject", " is not a DBObject")
def haveSomeField(k: String) = new Matcher[Option[DBObject]] {
def apply[S <: Option[DBObject]](map: Expectable[S]) = {
result(someField(map, k).isDefined, map.description + " has the key " + k, map.description + " doesn't have the key " + k, map)
}
}
/** matches if dbObject.contains(k) */
def haveField(k: String) = new Matcher[DBObject] {
def apply[S <: DBObject](map: Expectable[S]) = {
result(field(map, k).isDefined, map.description + " has the key " + k, map.description + " doesn't have the key " + k, map)
}
}
/** matches if a Some(map) contains a pair (key, value) == (k, v)
* Will expand out dot notation for matching.
**/
def haveSomeEntry[V](p: (String, V)) = new Matcher[Option[DBObject]] {
def apply[S <: Option[DBObject]](map: Expectable[S]) = {


Yak Shaving Becomes Yak Nairing


/** matches if dbObject.contains(k) */
def haveField(k: String) = new Matcher[DBObject] {
def apply[S <: DBObject](map: Expectable[S]) = {
result(field(map, k).isDefined, map.description + " has the key " + k, map.description + " doesn't have the key " + k, map)
}
}
/** matches if a Some(map) contains a pair (key, value) == (k, v)
* Will expand out dot notation for matching.
**/
def haveSomeEntry[V](p: (String, V)) = new Matcher[Option[DBObject]] {
def apply[S <: Option[DBObject]](map: Expectable[S]) = {
result(someField(map, p._1).exists(_ == p._2), // match only the value
map.description + " has the pair " + p, map.description + " doesn't have the pair " + p, map)
}
}
/** Special version of "HaveEntry" that expects a list and then uses
* "hasSameElements" on it.
*/
def haveListEntry(k: String, l: => Traversable[Any]) = new Matcher[DBObject] {
def apply[S <: DBObject](map: Expectable[S]) = {
val objL = listField(map, k).getOrElse(Seq.empty[Any]).toSeq
val _l = l.toSeq
result(objL.sameElements(_l), // match only the value
map.description + " has the pair " + k,
map.description + " doesn't have the pair " + k,
map)
}
}
/** matches if map contains a pair (key, value) == (k, v)
* Will expand out dot notation for matching.
**/
def haveEntry[V](p: (String, V)) = new Matcher[DBObject] {
def apply[S <: DBObject](map: Expectable[S]) = {
result(field(map, p._1).exists(_.equals(p._2)), // match only the value
map.description + " has the pair " + p,
map.description + "[" + field(map, p._1) + "] doesn't have the pair " + p + "[" + p._2 + "]",
map)
}
}
/** matches if Some(map) contains all the specified pairs
* can expand dot notation to match specific sub-keys */
def haveSomeEntries[V](pairs: (String, V)*) = new Matcher[Option[DBObject]] {
def apply[S <: Option[DBObject]](map: Expectable[S]) = {
result(pairs.forall(pair => someField(map, pair._1).exists(_ == pair._2) /* match only the value */ ),
map.description + " has the pairs " + pairs.mkString(", "), map.description + " doesn't have the pairs " + pairs.mkString(", "), map)
}
}
/** matches if map contains all the specified pairs


epilogue

Casbah lives on and will continue to evolve, but it also has a younger brother/cousin
Hammersmith: purely asynchronous, purely Scala, and a distillation of ~2 years of MongoDB knowledge
The only Java is the BSON serialization; still no excuse for reinventing the wheel
Netty for now, but will probably end up as pure NIO
NOT (contrary to popular panic/confusion) a replacement for Casbah

Focused more on framework support than userspace

Will likely offer optional synchronous and asynchronous hammersmith modules for casbah-core, with the Java driver as casbah-core-classic

Working on sharing as much code as possible between Hammersmith & Casbah for MongoDBObject, etc.

Porting casbah-query to target Hammersmith (as well as Lift)

epilogue

def iterateSimpleCursor(conn: MongoConnection) = {
var x = 0
conn("bookstore").find("inventory")(Document.empty, Document.empty)((cursor: Cursor) => {
for (doc <- cursor) {
x += 1
}
})
x must eventually (be_==(336))
}
def iterateComplexCursor(conn: MongoConnection) = {
var x = 0
conn("bookstore").find("inventory")(Document.empty, Document.empty)((cursor: Cursor) => {
def next(op: Cursor.IterState): Cursor.IterCmd = op match {
case Cursor.Entry(doc) => {
x += 1
if (x < 100) Cursor.Next(next) else Cursor.Done
}
case Cursor.Empty => {
if (x < 100) Cursor.NextBatch(next) else Cursor.Done
}
case Cursor.EOF => {
Cursor.Done
}
}
Cursor.iterate(cursor)(next)
})
x must eventually(5, 5.seconds) (be_==(100))
}

- Iteratees, No blocking calls, Callbacks and no Java wrappers or


wackiness
- Thanks to @prasinous, @jdegoes, @etorreborre who shared ideas,
code and beat me about the head and neck at times for stupidity as I
evolved this.


epilogue

def insertWithSafeImplicitWriteConcern(conn: MongoConnection) = {
val mongo = conn("testHammersmith")("test_insert")
implicit val safeWrite = WriteConcern.Safe
mongo.dropCollection()(success => {
log.info("Dropped collection... Success? " + success)
})
var id: Option[AnyRef] = null
var ok: Option[Boolean] = None
val handler = RequestFutures.write((result: Either[Throwable, (Option[AnyRef], WriteResult)]) => {
result match {
case Right((oid, wr)) => {
ok = Some(true)
id = oid
}
case Left(t) => {
ok = Some(false)
log.error(t, "Command Failed.")
}
}
})
mongo.insert(Document("foo" -> "bar", "bar" -> "baz"))(handler)
ok must eventually { beSome(true) }
id must not (beNull.eventually)
// TODO - Implement 'count'
var doc: BSONDocument = null
mongo.findOne(Document("foo" -> "bar"))((_doc: BSONDocument) => {
doc = _doc
})
doc must not (beNull.eventually)
doc must eventually (havePairs("foo" -> "bar", "bar" -> "baz"))
}


download at mongodb.org
github.com/mongodb/casbah

We're Hiring!
( http://10gen.jobscore.com/jobs/10gen/list )

slides: http://speakerdeck.com/u/bwmcadams/p/scala-days-2011
brendan@10gen.com
twitter: @rit

International Scala Hackathon (Scalathon) Coming!


July 16-17, 2011 @ UPenn (Philadelphia, PA USA)
http://scalathon.org

conferences, appearances, and meetups


http://www.10gen.com/events

Facebook | Twitter | LinkedIn

http://bit.ly/mongoH

@mongodb

http://linkd.in/joinmongo

- Goal of Scalathon: to get Scala developers contributing to the language, its tools, and its libraries.
* Attending:
- Almost full but if response/interest continues, may be possible to open more seats soon!

Acknowledgements
Because this genuinely was never my effort alone, these people contributed
patches, suggestions, bugs, ideas or were just early users...

Michael Merwitz (@michaelmerwitz), who suggested/encouraged we give MongoDB a try (Python/IronPython/C# project, pre-Novus/Casbah)

Gregg Carrier (@greggcarrier)


Gerolf Seitz (@gseitz)
Phil Wills (@philwills)
Max Afonov (@max4f)
Marcello Basta-Forte (@marcello3d)
Rose Toomey (@prasinous)

(Her contributions are probably largest in many ways)

Novus Partners, esp. Basil (CEO), for incubating and cultivating Casbah and allowing it (and me) to grow beyond his company.

10gen ... Eliot, Dwight and every single other person (my coworkers) for creating MongoDB, cultivating an amazing community and creating a fantastic place to work.

Not in any way to diminish anyone I forgot. Nothing personal, I just have a short attention span and memory!
