Scala Generalized Type Constraints

Generalized type constraints, namely <:<, =:= and the now-deprecated <%<, also known as type relation operators (or whatever you want to call them), are not operators but identifiers. It's quite confusing for newcomers to distinguish them from operators; well, they are simply identifiers that happen to look esoteric.

This is just a plain Scala feature: non-alphanumeric symbols can act as legal identifiers, just like the + method.
More specifically, these are type constructors. But before we inspect their implementations, let's first consider their usage.

Usage

You want to implement a generic container for every type, but you also want to add a special method that applies only to a Special type. (Note: this is different from the @specialized annotation, which deals with the JVM's primitive types; here Special is just a plain old Scala type.)

class Container[A](value: A) {
  def diff[A <: Int](b: Int) = value - b
}

// BOOM
// error: value - is not a member of type parameter A
// def diff[A <: Int](b: Int) = value - b

Why? The type bound A <: Int does not help. A was defined at the class declaration; inside the class body, the Scala compiler requires every type bound to be consistent with A's definition. Here, A has no bound, so it is bounded only by Any, not Int.

Instead of setting a type bound, a method may ask for some kind of specific, ad-hoc "evidence" of a type.

scala> class Container[A](value: A) {
  // other generic methods for A
  /* blah blah */

  // specialized method for Int
  def addIt(implicit evidence: A =:= Int) = 123 + value
}
defined class Container

scala> (new Container(123)).addIt
res11: Int = 246

scala> (new Container("123")).addIt
<console>:10: error: could not find implicit value for parameter evidence: =:=[java.lang.String,Int]

Cool. evidence is an implicit provided by Scala's Predef. And A =:= Int is just a type, like Map[Int, String], written infix thanks to Scala's syntactic sugar.

Scala does not impose the type constraint until the specific method is called, so addIt does not violate A's definition. Still, given the implicit evidence, the compiler can infer that value in addIt can be used as an Int.

As stated before, type constraints are ad-hoc, so they can achieve more precise type inference than type bounds. (To be fair, this is the power of implicits.)

def foo[A, B <: A](a: A, b: B) = (a,b)

scala> foo(1, List(1,2,3))
res1: (Any, List[Int]) = (1,List(1, 2, 3))

1 is clearly an Int, but why does the compiler infer Any? The B <: A bound requires the first argument's type to be a supertype of the second's; A is inferred as the most general type covering both Int and List[Int], which is Any.

<:< comes to the rescue.

def bar[A,B](a: A, b: B)(implicit ev: B <:< A) = (a,b)

scala> bar(1,List(1,2,3))
<console>:9: error: Cannot prove that List[Int] <:< Int.

Because generalized type constraints do not interfere with inference, A is inferred as Int here. Only then does the compiler look for evidence of <:<[List[Int], Int], and fail.
(Actually, implicits can feed type information back into inference; see type-level programming's HList and the Scala collection library's CanBuildFrom.)

Also, implicit conversions do not affect <:<:

scala> def foo[B, A <: B](a: A, b: B) = print("OK")

scala> class A; class B

scala> implicit def a2b(a: A) = new B

scala> foo(new A, new B) // implicit conversion!
OK

scala> def bar[A, B](a: A, b: B)(implicit ev: A <:< B) = print("OK")

scala> bar(new A, new B) // does not work
<console>:17: error: Cannot prove that A <:< B.

Implementation

Actually, =:= is just a type constructor in Scala, somewhat like Map[A, B]; that is, =:= is defined like

class =:=[A, B]

So inside implicitly's brackets, Int =:= Int is just a type. A =:= B is the infix form of type parameterization for a non-alphanumeric identifier; it is equivalent to =:=[A, B].

So one can define implicits for =:=, which the compiler can then find:

implicit def EqualTypeEvidence[A]: =:=[A, A] = new =:=[A, A]

So, when implicitly[A =:= B] is compiled, the compiler tries to find the correct implicit evidence.

If and only if A and B are the same type, say Int, can the compiler find an =:=[Int, Int], produced by the implicit method EqualTypeEvidence[Int].

More compelling is <:<, the conformance evidence; it leverages Scala's variance annotations:

class <:<[-A, +B]
implicit def Conformance[A]: <:<[A, A] = new <:<[A, A]

Consider what happens when a String <:< java.io.Serializable is needed: the compiler tries to find an instance of <:<[String, Serializable], but it can only construct instances of the form <:<[A, A], such as <:<[String, String] (or, alternatively, <:<[Serializable, Serializable]). Given the variance annotations of <:<, however, since String is the very type String, and String is a subtype of Serializable while B sits in covariant position (or, from the other direction, since Serializable is a supertype of String while A sits in contravariant position, and Serializable is the very type Serializable),

<:<[String, String] is a subtype of <:<[String, Serializable]. So the compiler finds the correct implicit instance as evidence that String is a subtype of Serializable, by the principle of subtype substitution (Liskov).

Similarly, we can define conversion evidence and contra-conformance:

// Conversion evidence
class <%<[A <% B, B]
implicit def Conversion[A <% B, B] = new <%<[A, B]

// Contra-conformance
class >:>[+A, -B]
implicit def Contra[A] = new >:>[A, A]

Magic, right?
The actual implementation uses the singleton pattern, so it is more efficient; for this illustrative post, a sloppy implementation is just fine :).

Reference:
http://hongjiang.info/scala-type-contraints-and-specialized-methods/
http://apocalisp.wordpress.com/2010/07/17/type-level-programming-in-scala-part-6d-hlist%C2%A0zipunzip/

play framework with scalate and activerecord

WRYYYYYYYYYYYYYYYY

Dio Brando on Scala crazy dependencies

Scala works like CSS selectors in that every successor overrides its predecessor.

You will have to work like a detective to figure out the correct recipe for managing a huge casserole of hodgepodge.

To achieve a working Play configuration with scalate and activerecord, you need the following.

In build.sbt:

scalaVersion := "2.10.3"

libraryDependencies ++= Seq(
  jdbc,
  "org.scalatra.scalate" %% "scalate-core" % "1.7.0",
  "com.github.aselab" %% "scala-activerecord" % "0.2.3",
  "com.github.aselab" %% "scala-activerecord-play2" % "0.2.3",
  "com.h2database" % "h2" % "1.3.170"
)

Several notes:

  1. Currently scala-activerecord only supports Scala 2.10.3.
  2. scalate must be 1.7.0+ for better Scala 2.10 support, but the current stable version is 1.6.0.

Then, in the root path of the Play project, create a new file at app/lib/ScalateIntegration.scala:


package controllers

import play.api._
import http.{Writeable, ContentTypeOf, ContentTypes}
import mvc.Codec
import play.api.Play.current
import org.fusesource.scalate.layout.DefaultLayoutStrategy
import collection.JavaConversions._

object Scalate {

  import org.fusesource.scalate._
  import org.fusesource.scalate.util._

  var format = Play.configuration.getString("scalate.format") match {
    case Some(configuredFormat) => configuredFormat
    case _ => "scaml"
  }

  lazy val scalateEngine = {
    val engine = new TemplateEngine
    engine.resourceLoader = new FileResourceLoader(Some(Play.getFile("app/views")))
    engine.layoutStrategy = new DefaultLayoutStrategy(engine, "app/views/layouts/default." + format)
    engine.classpath = "tmp/classes"
    engine.workingDirectory = Play.getFile("tmp")
    engine.combinedClassPath = true
    engine.classLoader = Play.classloader
    engine
  }

  def apply(template: String) = Template(template)

  case class Template(name: String) {

    def render(args: java.util.Map[String, Any]) = {
      ScalateContent {
        scalateEngine.layout(name, args.map {
          case (k, v) => k -> v
        }.toMap)
      }
    }

  }

  case class ScalateContent(val cont: String)

  implicit def writeableOf_ScalateContent(implicit codec: Codec): Writeable[ScalateContent] = {
    Writeable[ScalateContent]((scalate: ScalateContent) => codec.encode(scalate.cont))
  }

  implicit def contentTypeOf_ScalateContent(implicit codec: Codec): ContentTypeOf[ScalateContent] = {
    ContentTypeOf[ScalateContent](Some(ContentTypes.HTML))
  }
}

Again, this only works on scalate 1.6.

Finally, according to activerecord's docs, load the plugin in conf/play.plugins:

9999:com.github.aselab.activerecord.ActiveRecordPlugin

To configure the database, in conf/application.conf:

# Database configuration
# ~~~~~
#

# Scala ActiveRecord configurations
db.activerecord.driver=org.h2.Driver
db.activerecord.url="jdbc:h2:mem:play"
db.activerecord.user="sa"
db.activerecord.password=""

# Schema definition class
activerecord.schema=models.Tables

And in app/models/person.scala

package models

import com.github.aselab.activerecord._
import com.github.aselab.activerecord.dsl._

case class Person(@Required name: String) extends ActiveRecord
object Person extends ActiveRecordCompanion[Person] with PlayFormSupport[Person]

In app/models/tables.scala:

package models

import com.github.aselab.activerecord._
import com.github.aselab.activerecord.dsl._

object Tables extends ActiveRecordTables with PlaySupport {
val models = table[Person]
}

And finally, you can try this in the console:

activator console

import models._
import play.core.StaticApplication

new StaticApplication(new java.io.File("."))

Person("f@ck").save
Person.findBy("name", "f@ck")

WTF have I done? I just typed mechanically what I had ferreted out on Google and StackOverflow.
WTF is play.core.StaticApplication, which gets just one confusing page in the docs?
Speciously tantalizing is the magical code under which complicated dependencies lurk.

Reference: PlayframeWork Quick Tip

How to set up a scaloid project from scratch

TL;DR; It’s hard to set up android development environment without the aid of IDE. And it’s harder for scaloid.

  1. Download standalone SDK

  2. Download Android SDK Tools / SDK Platform-tools / SDK Build-tools
    NB: f@ck the GFW; I got around that bastard by modifying /etc/hosts.

  3. Install sbt, reference for GFW: here

  4. android create project --target <target-id> --name scaloidApp --path <path>/scaloidApp --activity MainActivity --package com.example.scaloidapp

  5. Create a directory named project within your project and add the file project/plugins.sbt; in it, add the following line:
    addSbtPlugin("com.hanhuy.sbt" % "android-sdk-plugin" % "1.2.20")

  6. Create project/build.properties and add the following line (newer versions may be used instead):

sbt.version=0.12.4

  7. Create build.sbt in the root directory (example). Remember to import android.Keys._:
import android.Keys._

android.Plugin.androidBuild

name := "scaloidApp"

scalaVersion := "2.11.0"

platformTarget in Android := "android-20"

libraryDependencies += "org.scaloid" %% "scaloid" % "3.4-10"

UPDATE: the scaloid-android-plugin fixed the build =.=

magic HTML.js

source: pixiv

HTML.js is a library full of syntactic sugar. It dynamically extends HTML elements so that their methods mirror their child nodes. For example, code like HTML.body.header.hgroup.h1 uses method chaining to mirror the structure of the DOM.

ES5's Object.defineProperty and MutationObserver conjure up the magic. HTML.js provides an eponymous HTML API object initialized by an internal node method, which adds all tag methods to its argument object. All tag methods are defined through Object.defineProperty with the get option, so they behave like getters: every time the user accesses these attributes, the tag methods return HTMLified elements ready to be chained (HTMLified elements are normal HTMLElements extended by the same internal node method).

To keep tag methods responsive to DOM manipulation, HTML.js opts for a MutationObserver that keeps an eye on the root element. Once elements change, the MutationObserver detects the change and notifies HTML.js to refresh the methods of the corresponding elements.

However, the syntactic sweetheart fails to hide some design deficits and practical problems in this library. Getter methods shut out legacy browsers, which still hold about 10% market share. MutationObserver itself is not horribly slow, but registering a watchdog on the root is almost certainly a performance killer for massive DOM manipulations.

But the most notorious code smell comes from yet another place, a pure design decision: functions that return either an element or an array. This is surely one of the sloppiest practices in a dynamic language. In a statically typed language such functions could only have return type Any, which is uninformative and burdens users with the risk of casting results. Indeed, the author mentions this on the homepage and tries to defend the API design with the excuse of conditional contexts in which users can avoid the quandary. But a good library should be as carefree as possible; providing an API that returns one single element is probably better than leaving users to guarantee an element's uniqueness. Ad-hoc polymorphism is determined by function arguments, not return types.

HTML’s api reminds me of the keyword null. Admittedly it is theoretically feasible to entrust programmers with the role of checking uniqueness/existence. But why ぬるぽ is still one of the most prominent haunting apparition in our code?

At least, like that Laotan pickled-cabbage beef noodles

source: unknown

Under the prevailing 「東山覇権」, the word "bitch" (碧池) has become practically a staple moe point. The overwhelming rule of the twin-tailed, blonde, blue-eyed, mahjong-playing Japanese returnee has topped the enkō rankings for years, making the mahjong legion's position unshakable. The moment a coquettish shot appears in the anime, the otaku in front of the screen all take the hint. And in a certain annual Nico ranking, when the grand BGM plays as Rogge opens the champion Ako-chan's name, you can feel the regal aura: as overbearing as a full screen of big red danmaku.

In the doujinshi world, too, the hegemon's throne towers out of reach. 《清纯奔放的木静酱》 is a decent full-color short, but its story doesn't amount to much and cannot shake the throne. Yet one work stands as a one-of-a-kind classic of the enkō genre, overawing even the reign of queen 新子瞳: the legendary C79 book that moved all the anons of 2ch, 《せめて、あの雪のように》.

This work marks the author Fujimaru (藤丸)'s tenth year in the doujin scene. Fujimaru keeps a low profile and is hardly a household name, but his art is quite good, with handsome, clean lines, and a full-color book is quite practical too; a gentleman worth praising.

Yet this tenth-anniversary work is not known for its practicality, and that is exactly what sets it apart: it makes the book not just a commemorative work for its author, but a milestone for the whole enjo-kōsai genre. Beyond that, fans have produced a same-titled novel and MADs, and the work's classic lines and paneling have seeped into the anonymous boards. That is how stunning 《雪》 is.

Artistically, 《雪》 is a work built on negative space; you could even say it hands the reader thirty blank pages. Instead of depicting, in heavy ink and mixed fluids, the bed-sheet story of a JK and a middle-aged man, the work merely shows the heroine Yuki taking a client's hand as the hero looks on. All that remains is a light sketch of the faint, unripe feelings between the leads. It is an "ero work without ero scenes": not a single obscene stroke, with the story and its accompanying action unfolding entirely inside the reader's head. And precisely because of this, viewers who came for the art style end up squeezing their fluids from the lower body into the tear glands.

如果在这里做人物分析的话绝对属于多余,甚至是对作品的不敬。这部作品是经典,女主是经典的碧池妖精白无垢天使,男主是经典的迟钝缩卵。一个没有意外展开的故事,但确实是经典的故事,也正是一次、又一次碾碎读者心理防线的故事。前几页是Sucubus的开朗笑颜,下几页就是女主的无奈和夙愿。这笔触下的人物味道,不是死甜的糖也不是突兀的苦味,而是银勺轻舀咖啡,在口中化开甜味苦味醇香几个层次,在后味中品咖啡豆是如何烘焙烧制的——作品留白了女主雪的背景,却又提供了她的苦恼,让读者自然地在脑内补完。

Thus the portrayal of the heroine 白山雪 is especially classic; every one of her lines is classic:
— I'm so envious of Tsukimura-san. How nice. I want to fall in love too.
— Hanaoka-kun has no interest in me; that's why I like him. Really like him.
— All kinds of people, all kinds of thoughts: I understand them all, I was taught them all. (The English version's "I was made to know" is exactly the right use of the passive voice for translating such ambiguous language.)
— Secondhand goods like me wouldn't sell even at a clearance sale.
These strangely familiar sentences are practically archetypes buried in humanity's collective unconscious, dug out and compressed into a few dozen pages to batter the reader's spirit. This is what a classic is: unchanging through the ages, yet ever new.

Like countless thin books, 《雪》 gives the viewer a full dose of sage time. The difference is that 《雪》 is not confined to a brief, shallow refractory period; it shocks and interrogates the deeper layers of human nature. In debates over enjo-kōsai everywhere, you always see it opposed from a distant moral high ground with notions of "clean" and "dirty," seizing that high ground with "just imagine whether society could stay clean if this caught on" before any argument is made. Yet 《雪》 shows the other side: the helplessness and hardship in a girl's growth and life, which force her into premature worldliness. Contempt for enjo-kōsai may look righteous at the macro level, but at the micro level it is heartless. Perhaps we should not forget: the enjo-kōsai that supposedly pollutes society in people's imagination is, in reality, manufactured by a sordid society.

Of course, the work never leaves the territory of school romance, and the contrast between monetary transaction and pure campus love is one of its draws, well suited to otaku whose mental development (especially regarding the opposite sex) remains (stuck) at middle-school level. The heroine's unknown whereabouts at the end, leaving her as the hero's memory, also provides plenty of room for fantasy. An ending like that mirrors the nature of manga: any character can only ever be a shadow in the viewer's spiritual world. 《雪》 has indeed dug a channel through which readers can fantasize a river of love.

《せめて、あの雪のように》 is a heartrending book. Its influence has outgrown the work itself, evolving into the set phrase "secondhand goods like me wouldn't want XXX either," which adds yet another flavor to the manga; on second thought, perhaps yet another use. Whenever I see a young couple being intimate, a tangled melancholy is hard to avoid; or when I see a middle-aged man holding a pleated skirt's hand on the grass, mixed feelings well up all at once. But revisit that classic shot and those classic lines, and the heart instantly feels it: this sour kick!
Ah, this sour kick! At least, like that Laotan pickled-cabbage beef noodles.

About the author: Fujimaru (藤丸), manga artist and original illustrator, formerly known as F.blue. He once formed a circle with ひらめ and 綾瀬, but one of the three walked out around C75 (having gotten hooked on an online game much like Mabinogi). Fujimaru changed his name around then and has worked alone since. His works are few but strongly original; his craft is well honed, his handful of derivative works highly faithful to their sources, with orthodox paneling, tidy linework, and plain plots: practically a traditional manga artist.

Scala Symbol Soup Salvage

source: yande.re

No few pages could explain or exhaust the absurd, bizarre, creepy, daunting, esoteric, flabbergasting type system of Scala.
Here are some aspects that learning Scala by example does not cover.

Digest:

http://twitter.github.io/scala_school/advanced-types.html

Context Bound and Implicitly:

def sort[A : Ordering] => def sort[A](implicit x: Ordering[A])
def implicitly[T](implicit e: T) = e
http://stackoverflow.com/questions/3855595/what-is-the-scala-identifier-implicitly

Type Bound:

=:= <:< <%<
http://stackoverflow.com/questions/3427345/what-do-and-mean-in-scala-2-8-and-where-are-they-documented

Existential Type:

def foo(x: Array[_]) => def foo(x: Array[T] forSome {type T})
http://www.drmaciver.com/2008/03/existential-types-in-scala/

Self Type

class BarUsingFooable { self: Fooable => ... } // yet-to-be Fooable; binds this to the name self
https://coderwall.com/p/t_rapw

Structural Type

def Quack(duck: {def quack: Unit}) // inline anonymous type, like duck typing
http://java.dzone.com/articles/duck-typing-scala-structural

Abstract Type member

trait Job {type A; def get: A}
http://docs.scala-lang.org/tutorials/tour/abstract-types.html

Type Level Programming:

http://apocalisp.wordpress.com/2010/06/08/type-level-programming-in-scala/

Miscellaneous:

http://stackoverflow.com/questions/1025181/hidden-features-of-scala

Last but not least:

Site significantly serving Scala symbol soup salvage saves spiritually severed souls

QuickNote Descriptor in Python

TL;DR: A descriptor in Python is just a getter/setter.

Three ways to implement descriptor:

  1. Use a class that implements the magic methods __get__, __set__, __delete__
  2. Use the builtin function property(fget=None, fset=None, fdel=None, doc=None)
  3. Use the decorator form of property: @property def attr(): ..., @attr.setter, @attr.deleter

Make sure all three approaches are applied at class level rather than instance level.
(That is, write my_attr = property(...) under the class MyClass(object): statement,
not self.my_attr = property(...) in __init__.)

Because Python’s MRO is:

  1. find attr in instance.__dict__ (instance level)
  2. if not found, find attr in instance.__class__.__dict__. If found, try return attr.__get__, otherwise return attr (class level)
  3. if not found, repeat step 2 on instance.__class__.__base__ until attr found or no base class found
  4. if not found, try returning computed attr if __getattr__ is defined

So the descriptor magic is done at class level
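To see the class-level rule in action, here is a minimal sketch (the temperature names are just illustrative):

```python
class Fahrenheit(object):
    """A data descriptor deriving fahrenheit from the instance's celsius."""
    def __get__(self, instance, owner):
        if instance is None:       # accessed on the class itself
            return self
        return instance.celsius * 9.0 / 5 + 32

    def __set__(self, instance, value):
        instance.celsius = (value - 32) * 5.0 / 9

class Temperature(object):
    fahrenheit = Fahrenheit()      # defined at class level, as required

    def __init__(self, celsius):
        self.celsius = celsius

t = Temperature(100.0)
print(t.fahrenheit)   # 212.0 -- lookup finds the descriptor on the class
t.fahrenheit = 32.0   # routed to Fahrenheit.__set__
print(t.celsius)      # 0.0
```

Had fahrenheit been assigned on the instance inside __init__, the protocol would never fire; it would be stored as a plain attribute.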

Pitfall:

Because a descriptor dwells at class level, if you use the descriptor-class implementation, all instances may end up sharing one common variable.

To fix that:

  1. Hide one variable on the instance itself
  2. Use a dictionary in the descriptor class to store per-instance info

The second solution needs the instances to be hashable (plain object instances are, by default).
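A sketch of the second fix; a weakref.WeakKeyDictionary keyed by instance keeps per-instance values without also leaking memory (the Point class is just illustrative):

```python
import weakref

class PerInstance(object):
    """Descriptor storing one value per instance instead of per class."""
    def __init__(self, default=None):
        self.default = default
        # instance -> value; instances must be hashable, and entries
        # disappear automatically when the instance is garbage-collected
        self.data = weakref.WeakKeyDictionary()

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return self.data.get(instance, self.default)

    def __set__(self, instance, value):
        self.data[instance] = value

class Point(object):
    x = PerInstance(0)

a, b = Point(), Point()
a.x = 10
print(a.x, b.x)   # 10 0 -- no shared state between instances
```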

Personal view: to preserve Python's "one obvious way to do it," Pythonistas just introduced much more complexity.

Reference:
http://www.ibm.com/developerworks/library/os-pythondescriptors/
http://nbviewer.ipython.org/urls/gist.github.com/ChrisBeaumont/5758381/raw/descriptor_writeup.ipynb
http://docs.python.org/2/howto/descriptor.html

Breadth First Search without Queue

source: かみやまねき

TL;DR: The most intuitive implementation of BFS uses a queue.

Just for fun, I wondered whether BFS can be implemented without a queue.
While DFS is easily done with recursion, a naive BFS implementation just incurs loop mayhem.

The first solution that jumped into my mind was to add a depth parameter to the BFS function.
The search function visits only nodes whose depth equals the parameter and skips the rest.
I am not alone
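That depth-parameter idea is essentially iterative deepening; here is a sketch, reusing the post's Node shape (a value plus a children list, with [None] marking leaves):

```python
from collections import namedtuple

Node = namedtuple('Node', ['value', 'children'])

def visit_level(node, depth, out):
    """Collect values of nodes exactly `depth` levels below `node`."""
    if node is None:
        return
    if depth == 0:
        out.append(node.value)
        return
    for child in node.children:
        visit_level(child, depth - 1, out)

def bfs_no_queue(root):
    """Yield values in breadth-first order: re-walk the tree once per level."""
    depth = 0
    while True:
        frontier = []
        visit_level(root, depth, frontier)
        if not frontier:          # no nodes at this depth: tree exhausted
            return
        for value in frontier:
            yield value
        depth += 1

# the same example tree as the post's main()
six, five, four, three = (Node(v, [None]) for v in (6, 5, 4, 3))
two, one = Node(2, [five, six]), Node(1, [three, four])
zero = Node(0, [one, two])
print(list(bfs_no_queue(zero)))   # [0, 1, 2, 3, 4, 5, 6]
```

This trades time for space: each level re-walks the tree from the root, the classic iterative-deepening trade-off.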

Generators are sweeping the NodeJS community (admittedly an exaggeration). I'm quite obsessed with the generator's power of suspension. Here's my try in Python.

'''
    implicit BFS, without queue or depth
    incurs infinite circulation on loop graph
    implement for fun (Really need a queue)
'''
def bfs_gen(node):
    '''
    generator function on a node, yield an iterator of frontier nodes
    the first call yield an iterator contains the node itself
    each subsequent call yields an iterator that is composed of children's gen
    try line chains children's yields, and thus accumulates frontier nodes
    By recursion, the root node's bfs_gen yields all frontier nodes on the graph
    frontier is maintained on callstack, by the suspension feature of generators
    '''
    try:
        yield iter((node.value, ))
        # :( still needs a list to maintain generators
        children = [bfs_gen(n) for n in node.children]
    except AttributeError:
        return  # not a node: end this generator (don't raise StopIteration, per PEP 479)

    chain = 0xDEADBABE
    while chain != ():
        chain = ()
        for child in children:
            try:
                chain = (e for it in (chain, next(child)) for e in it)
            except StopIteration:
                continue
        yield chain

def bfs(root, pred):
    '''
    make generator on root node
    call next method to yield frontier nodes
    apply predicate function on it, if any matches, return True
    else continues to yield more frontier
    '''
    gens = bfs_gen(root)
    try:
        while not any(pred(v) for v in next(gens)):
            continue
        return True
    except StopIteration:
        return False

def main():
    '''
    3 4 5 6
     1   2
       0
    '''
    from collections import namedtuple

    node = namedtuple('Node', ['value', 'children'])
    six   = node(6, [None])
    five  = node(5, [None])
    four  = node(4, [None])
    three = node(3, [None])
    two   = node(2, [five, six])
    one   = node(1, [three, four])
    zero  = node(0, [one, two])

    def demo(value):
        '''dumb log'''
        print(value)
        return False

    # bfs(zero, demo)
    for frontier in bfs_gen(zero):
        for value in frontier:
            demo(value)

if __name__ == '__main__':
    main()

Wow, much ugly, very obscure. But an old post on a certain unsung site gives an interesting solution.

def bfs(root):
    yield root
    for node in bfs(root):
        for child in node.children:
            yield child

Tada, done! Let's peruse this piece of code.
We want a generator that yields a sequence of nodes; this can be done by induction.

  1. Given the root of a tree, clearly the first node of the sequence is the root.
  2. Because the next level of frontier nodes consists of the children of the current level's frontier nodes, construct a second traversal of the same sequence that lags one level behind.
  3. Yield each child given its parent.

Doing this recursively makes a BFS generator.

The tricky part is the second step: for a normal function this would loop forever. But a generator is expanded lazily; execution jumps out of bfs, eschewing the looping condition. Suspending the code also implicitly stores the search state, with the call stack morphing into a queue. Of course, on a general graph the visited nodes would also need to be maintained in a set (not checked here), and a termination check is absent as well.

Interestingly, adapting BFS into DFS only requires swapping two lines, just as one substitutes a stack for the queue in the classical implementation.

def dfs(root):
    yield root
    for node in root.children:
        for child in dfs(node):
            yield child

Whatever fun generators bring, BFS/DFS should probably never be done like this. Storing information on the call stack does not save space; static languages like C/C++ lack such a generator feature; and, notably, function-call overhead and slow generators in most scripting languages make developers shy away from such a winding trick.
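To see the corecursive version actually run, here is a self-contained sketch with itertools.islice standing in for the absent termination check (the tree shape is assumed):

```python
import itertools
from collections import namedtuple

Node = namedtuple('Node', ['value', 'children'])

def bfs(root):
    """Corecursive BFS: a second traversal of the same sequence, one level behind."""
    yield root
    for node in bfs(root):
        for child in node.children:
            yield child

leaf = lambda v: Node(v, [])
tree = Node(0, [Node(1, [leaf(3), leaf(4)]),
                Node(2, [leaf(5), leaf(6)])])

# islice bounds the otherwise never-terminating generator
values = [n.value for n in itertools.islice(bfs(tree), 7)]
print(values)   # [0, 1, 2, 3, 4, 5, 6]
```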

Requests is a good lib

source: pixiv

Web scraping is a common task for scripting languages like Python.
Yet the standard library provides twisted utilities to cope with this simple task; urllib[23$] ruined my Python newbie days.
To get cookie support, you have to import cookielib, create a new cookie jar via the factory method, build an opener from urllib2, and finally use the forged opener to make a request.
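For the record, that boilerplate looks roughly like this with the Python 3 standard library (cookielib became http.cookiejar, urllib2 became urllib.request); building the opener needs no network access:

```python
import http.cookiejar
import urllib.request

# 1. create a cookie jar
jar = http.cookiejar.CookieJar()
# 2. build an opener that funnels every request through the jar
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
# 3. requests made via `opener` now send and record cookies
# opener.open("http://example.com")   # a real network call, left commented out

# With requests, the same intent collapses to:
#   session = requests.Session()
#   session.get("http://example.com")
print(len(list(jar)))   # 0 -- the jar starts empty
```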

Bloated boilerplate does not seem Pythonic at all. We want a simple scraping utility that provides a concise API of HTTP verbs and, preferably, automatic session management. Requests is the library that comes to the rescue.

The structure of Requests:

  • model
  • session
  • adapter
  • api
  • etc…

model irons response and request parameters into unified objects. Both Request and Response support generator style and file style. Magic methods give Requests its Pythonic syntax. The dirtiest work (say, encoding) lives here.

session acts as the controller in an MVC design. It combines the adapter (which sends requests), authentication, and cookie sessions. Requests' streamlined API stems from an interface that mocks a real browser.

adapter is a wrapper over urllib3, supporting connection-pool management, keep-alive requests, and proxies.

api is just an alias for: 1. open a session, 2. prepare a request, 3. send the request. All other modules are helpers that do things like encoding and making auth tokens.

The design philosophy of Requests is: simplicity over functionality; yet it still grants users plenty of features.
Specific as the library's aim is, the underlying philosophy applies much more widely: jQuery's almighty $, Chrome's omnipotent omnibox, Google's preeminent search engine, and so on. Simplicity rocks; usability counts.
