Explaining Scala SAM type

For better syntax and user friendliness (and a lot more), Java 8 introduces SAM types, Single Abstract Method types, as it starts to embrace the functional programming world.

Prior to Java 8, Java already had a somewhat bulky and leaky kind of closure: anonymous inner classes. For example, starting a worker thread in Java usually requires the following trivial but bloated statements, without a lexical this:

// from android developer guide
public void onClick(View v) {
    new Thread(new Runnable() {
        public void run() {
            Bitmap b = loadImageFromNetwork("http://www.example.org/image.gif");
            mImageView.setImage(b);
        }
    }).start();
}

Two classes for one method. The introduction of SAM types greatly reduces Java's syntactic overhead:


public void onClick(View v) {
    new Thread(() -> {
        mImage.setImage(loadImageFromNetwork("/image.gif"));
    }).start();
}

I'm not explaining Java 8's new features here; Scala users have long relished the conciseness and expressiveness of functional programming.

So why would a Scala user care about SAM types in Java? I would say interoperation with Java and performance are the main concerns here.

Java 8 introduces the java.util.function.Function type, which is widely used in libraries like Stream. Sadly, that Function is not Scala's. The Scala compiler just frowns at you when you pass a native Scala function to Stream:

import java.util.Arrays
Arrays.asList(1,2,3).stream.map((i: Int) => i * 2)

// <console>: error: type mismatch;
// found : Int => Int
// required: java.util.function.Function[_ >: Int, _]

Side note: because the function parameter is in a contravariant position, Function[_ >: Int, _] has a lower bound Int rather than an upper bound. That is, the function passed as an argument must accept supertypes of Int.
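The same contravariance rule shows up in Scala itself; a minimal sketch:

```scala
// Function1 is contravariant in its parameter: Any => Int is a subtype
// of Int => Int, so a function accepting Any can stand in where a
// function on Int is expected.
val f: Any => Int = _.hashCode
val g: Int => Int = f // compiles thanks to contravariance
```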

One could manually provide an implicit conversion to transform Scala function types into Java function types. However, implementing such a conversion is no fun: the implementation is either not generic enough or requires mechanical code duplication (another alternative is advanced macro generation). Compiler support is more ideal, not only because it generates more efficient bytecode, but also because it precludes incompatibilities across different implementations.
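For illustration, a minimal sketch of such a hand-written bridge (the Interop object and asJava name are invented here). It also shows why the approach doesn't scale: one such definition is needed per arity and per java.util.function interface.

```scala
import java.util.function.{Function => JFunction}
import scala.language.implicitConversions

object Interop {
  // one-off bridge from a Scala function to java.util.function.Function;
  // every other arity and functional interface would need its own copy
  implicit def asJava[A, B](f: A => B): JFunction[A, B] =
    new JFunction[A, B] { def apply(a: A): B = f(a) }
}
```

With this in scope, a value like Interop.asJava((i: Int) => i * 2) can be handed to Java APIs expecting a Function, but only for this single shape of function.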

SAM types are enabled by the -Xexperimental flag in Scala 2.11.x. Scala 2.11.5 supports them noticeably better: SAMs gained eta-expansion (one can use a method from another class as a SAM), overloading (overloaded functions and methods can also accept functions as SAMs) and existential type support.

Basic usage of SAM is quite simple: if a trait or abstract class has exactly one abstract method, then a function whose parameter and return types match that abstract method can be converted into the trait or abstract class.

trait Flyable {
  // exactly one abstract method
  def fly(miles: Int): Unit
  // optional concrete members
  val name = "Unidentified Flyable Object"
}

// to reference the SAM instance itself,
// create a named, self-referencing lambda expression
val ufo: Flyable = (m: Int) => println(s"${ufo.name} flies $m miles!")
ufo.fly(123)
// Unidentified Flyable Object flies 123 miles!

Easy peasy. So for the Stream example: if the compiler has the -Xexperimental flag, Scala will automatically convert the Scala function to Java's Function, granting Scala users a seamless experience with the library.

Usually you don't need SAM in Scala: the language already has first-class generic function types, eta-expansion and a lot more. SAM reduces readability just as implicit conversions do. One can always use a type alias to give a function type a more understandable name instead of reaching for SAM. And SAM types cannot be pattern matched, at least for now.
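The type-alias alternative is a one-liner (the OnClick name is made up for this sketch):

```scala
// a readable name for a plain Scala function type; no SAM trait needed
type OnClick = String => Unit

def register(handler: OnClick): Unit = handler("clicked")
```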

However, interoperating with Java requires SAM. Self-referencing SAMs give you additional flexibility in API design. SAM also generates more efficient bytecode, since SAM has a native bytecode counterpart. Using anonymous classes for event handlers or callbacks can be as pleasant in Scala as in Java.

Anyway, adding a feature is easy; adding a feature that interacts cleanly with every edge case is hard. Scala already has bunches of features (variance, higher-kinded types, type-level programming, continuations), so whether SAM will gain popularity is still an open question.

Rant on TypeScript type guard


The TypeScript team recently released TypeScript 1.4, adding a new feature called union types, intended for better incorporation of native JavaScript idioms. Type guards, the natural dyad of union types, also enter the TypeScript world. But sadly, Microsoft chose a bizarre way to introduce type guards, as the manual says:

TypeScript now understands these conditions and will change type inference accordingly when used in an if block.

It accepts, and only accepts, conditional statements like if (typeof x === 'string') as type guards. TypeScript now creates types like number | string, meaning a type that is either number or string. Users can further refine the type by comparing the value typeof yields, as in the example below:

function createCustomer(name: { firstName: string; lastName: string } | string) {
  if (typeof name === "string") {
    // Because of the typeof check in the if, we know name has type string
    return { fullName: name };
  }
  else {
    // Since it's not a string, we know name has
    // type { firstName: string; lastName: string }
    return { fullName: name.firstName + " " + name.lastName };
  }
}

// Both customers have type { fullName: string }
var customer = createCustomer("John Smith");
var customer2 = createCustomer({ firstName: "Samuel", lastName: "Jones" });

I would rather say this is a bad idea, because:

  1. it intermixes type-level constructs with value-level constructs.
  2. for a language as complex and flexible as JavaScript, Microsoft's approach cannot handle the many other expressions that bear on types.

Value-level constructs are expressions or statements dealing with values, e.g. assignment and comparison. typeof and instanceof in JavaScript are value-level constructs because they produce boolean values, which can be assigned to variables or compared with other values. Value-level constructs do imply types (say, creating a new object of a specific type), but they do not explicitly manipulate the types of expressions; there is no type casting JavaScript can do. Type-level constructs, on the other hand, deal with types: for example, type annotations and generics.

Doubling typeof as a type guard blurs the demarcation between type level and value level, and naturally reduces a program's readability (a somewhat subjective claim, though). A variable can, without any distinct syntax, change its type within a conditional block. if branching is ubiquitous in TypeScript programs, from hello-world toys to cathedral-like projects, and it's quite hard to find the one "type switch" for a union type among all the irrelevant ifs. One also has to take care to call the correct methods on the same variable in different branches. So TypeScript's type guard introduces a new type scope, distinct from both lexical scope and function scope. It also burdens the compiler, which now has to check whether the condition inside every if is a type guard.

What's worse: a type guard is a value-level construct, so it can interact with every other language construct, but Microsoft does not intend to support that. None of the following functions compiles in TypeScript 1.4.1, though all of them ought to run correctly as plain JavaScript if they compiled.

function testNot(x: string|number|Function) {
  var isNonNum = typeof x !== 'number'
  if (isNonNum) return x.length
}

function testReturn(x: string|number) {
  if (typeof x === 'number') return;
  return x.length
}

function testThrow(x: string|number) {
  if (typeof x === 'number') throw new Error('error type')
  return x.length
}

function testFor(xs: (string|number)[]) {
  for (var i = 0, x = xs[i]; typeof x === 'string'; i++) {
    console.log(x.length)
  }
}

function testWhile(xs: (string|number)[]) {
  var i = 0
  while (typeof xs[i] === 'string') {
    console.log(xs[i].length)
    i++
  }
}

function testFilter(xs: (string|number)[]) {
  xs.filter((x) => typeof x === 'string').map((x) => x.length)
}

Indeed, TypeScript is not the first to mix type level and value level. Language constructs like pattern matching do it too (and usually introduce type-inference bugs; see the Scala bug tracker). But at least pattern matching is a specialized syntax that does not interact much with other syntax. The type guard is, well, too ubiquitous to be good.
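For contrast, the createCustomer example can be written with Scala's pattern matching, where the refinement happens in a dedicated syntax rather than inside an ordinary if (Either and the FullName case class are stand-ins for the union type in this sketch):

```scala
// the string-or-record union modeled as Either; each branch sees
// the refined type explicitly, via a specialized match syntax
case class FullName(firstName: String, lastName: String)

def createCustomer(name: Either[FullName, String]): String = name match {
  case Right(full)                 => full
  case Left(FullName(first, last)) => first + " " + last
}
```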

Great refinement lies in the vulgar (大雅雅于俗)

"You ask how this can be: when the heart is far away, the place of itself grows remote." (Tao Yuanming)

Let me tell you: Yagami-sensei's LoveLive! books are the flagship works straddling the two worlds of pay-to-win mobile games and thin doujin volumes.

Do not be fooled by each book being a slim thirty-odd pages: every page embodies sensei's insight into the pay-to-win mobile game business, namely that the essence of all microtransactions is no different from "wicked women milking money". Thin as they are, sensei's books draw a bird's-eye view of the ACG industry, and indeed of modern forms of cultural consumption.

If you read the LL books merely in the spirit of idle titillation, all you will see are characters one-sidedly extracting the protagonist's bodily fluids. Only the advanced reader who has entered the sage state can see the painstaking intent sensei expresses in them. Yagami-sensei's whole work is a metaphor: from the syringe to the thermometer, everything alludes to the operating tactics of pay-to-win mobile games. They all serve one simple, clear purpose: to use every trick available to squeeze the last drop of gold from the player's wallet.

The protagonist of the books is drawn as a passive fluid machine, which is essentially the same move as the freemium business model's casting of the consumer as a consumption machine. The idols keep stimulating the player with all manner of tools to produce pleasure, and in that process the player forms an association between paying and achievement; in sensei's books this appears as a masochistic plot that ties abuse to pleasure.

The notions of "idol" and "striving" that the operators build up in plot and promotion are really a "distortion" of the player's desire to consume, making that desire acceptable both to the player and to the player's social environment. The gland-stimulating scenes in the books allude to the game's stimulation of the player's urge to spend (physiologically, stimulation of the ventral tegmental area). Yet the enduring quality of Yagami-sensei's pen lies exactly here: his strokes clear away the hypocritical disguises one by one and incisively depict the money-milking nature of pay-to-win mobile games. Nowhere is this clearer than in sensei's SC60 work "soldier money game".

To dismiss the LL books as vulgar moe fluff is precisely to miss Yagami Shuuichi-sensei's intent. Only by pulling back from the thin book itself, up to the height of the "meta-usuihon", can one savor the finesse of the LL books. Yagami-sensei's works look utterly vulgar, yet their flavor is utterly refined. To reflect on modern consumption inside a vulgar story is exactly what an artist in contemporary society ought to do.

For a people's artist, single-mindedly chasing formal "refinement" only opens a distance from the audience far greater than "one step away". As the saying goes: lesser refinement refines the form, middling refinement refines the meaning, and the greatest refinement refines the vulgar. Yagami-sensei's works must be called great vulgarity and great refinement in one, a body of work for connoisseur and layman alike.

typelevel-html

Type-level programming is a technique that exploits the type system to represent information and logic, to the extent of the language's limits.

Since values are encoded in variables' types, type-level code drives the compiler to validate logic or even determine the program's output. All validation and computation are conducted statically at compile time, so the greatest benefit of type-level programming is its safety and reliability.

As a rule of thumb, the more dynamic code is, the more flexibly it can compute. Type-level programming requires all logic to be encoded in the source code. It is hard to cram all logic into the type system, as handling whimsical input from external sources is either impossible or reduces the source code to an unwieldy state machine. So the niche of type-level programming is usually encoding, pickling or parsing.

But there is a field where code is statically written: GUIs. HTML templates are hard-coded in source. Type-level programming can serve as linting and validation when editing HTML, especially when authoring web components. A piece of HTML can be encoded as an ordinary object, with a type denoting its structure. Once the structure of the HTML is fixed, the output of JavaScript and CSS can be determined as well.
This is even more helpful when one wants to build components. A tab-container must have a tab-pane as a child, and a tab-pane must live within a tab-container. The current approach to constraining HTML structure is to encode the requirements in JavaScript and check them at runtime. For example, Angular uses require: '^parentDirective' to express the constraint and enable directive communication. If the component is programmatically constructed, a type annotation is a natural way to express the constraint (as in Angular 2.0's query<ChildDirective>). We can go further in a language with a full-blown type system.

trait Tag
class Concat[A <: Tag, B <: Tag](a: A, b: B) extends Tag
trait NestTag[A <: Tag] extends Tag {
  type Child = A
}
trait Inline extends Tag
trait Block extends Tag
case class div[T <: Tag](t: T = null) extends NestTag[T] with Block
case class p[T <: Tag](t: T = null) extends NestTag[T] with Block
case class a[T <: Inline](t: T = null) extends NestTag[T] with Inline

implicit class PlusTag[A <: Tag](a: A) {
  def +[B <: Tag](b: B) = new Concat(a, b)
}

class Contains[A <: Tag, C[_ <: Tag] <: NestTag[_], T[_ <: Tag] <: NestTag[_]]

case class jQ[A <: Tag, C[_ <: Tag] <: NestTag[_]](c: C[A]) {
  def has[T[_ <: Tag] <: NestTag[_]](implicit ev: Contains[A, C, T]) = true
}

implicit def htmlEq[A <: Tag, C[_ <: Tag] <: NestTag[_], T[_ <: Tag] <: NestTag[_]](implicit ev: C[A] =:= T[A]) =
  new Contains[A, C, T]
implicit def recurEq[A <: Tag, B[_ <: Tag] <: NestTag[_], C[_ <: Tag] <: NestTag[_], T[_ <: Tag] <: NestTag[_]]
  (implicit ev: Contains[A, B, T]) = new Contains[B[A], C, T]

val ele = div(
  p(
    a()
  )
)
val r = jQ(ele).has[p]
println(r)

The code above is just a demo. Every HTML element has a type that denotes its structure, and one can tell whether a tag is in an HTML element by calling jQ(ele).has[Tag]. (Note: ele is a value-level variable and Tag is a type-level constructor.) An inline element cannot contain a block element, because an inline element's child must be a subtype of Inline.

Programmatic markup has several benefits:

  1. no switching between script and template
  2. static type checking
  3. component dependency requirements
  4. component communication
  5. subtyping, inheritance… Classical OOP features
  6. relatively clean layout (though not as concise as Jade/Slim)

The biggest problem is, well, that type-level templates are strongly constrained by the host language. Dynamic languages simply cannot have them. Users of classical statically typed languages without type inference cannot afford the verbosity of deeply nested types. And the languages capable of type-level programming each take different approaches to it.

After all, type-level programming is too crazy…, at least for daily business logic.

Scala Generalized Type constraints

Generalized type constraints, also known as <:<, <%< (deprecated, though) and =:=, also called type relation operators or whatever you want, are not operators but identifiers. Newcomers find them hard to tell apart from operators, but they are, well…, identifiers, and not that esoteric.

This is just plain Scala: non-alphanumeric symbols can be legal identifiers, just like the + method.
More specifically, these three are type constructors. But before we inspect their implementations, let's first consider their usage.
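That symbolic names are ordinary identifiers, and that any two-parameter type constructor can be written infix, can be checked directly (Rel and Meters are invented for this sketch):

```scala
// a two-parameter type constructor; A Rel B is just infix for Rel[A, B]
class Rel[A, B]
type Prefix = Rel[Int, String]
type Infix  = Int Rel String // the same type, written infix

// symbolic method names are plain identifiers too, like +
class Meters(val n: Int) {
  def +(that: Meters) = new Meters(n + that.n)
}
```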

Usage

Say you want to implement a generic container for every type, but you also want to add a special method that applies only to a Special type. (Notice: this is different from the @specialized annotation, which deals with the JVM's primitive types; here Special is just a plain old Scala type.)

class Container[A](value: A) {
  def diff[A <: Int](b: A) = value - b
}

// BOOM
// error: value - is not a member of type parameter A
// def diff[A <: Int](b: A) = value - b

Why doesn't the type bound A <: Int work? Because A was already defined at the class declaration; the A in diff[A <: Int] is a new type parameter that merely shadows it. The value field still has the outer A, which is bounded only by Any, not Int, so it has no - method.

Instead of setting a type bound, a method can ask for a specific, ad-hoc piece of "evidence" for a type.

scala> class Container[A](value: A) {
         // other generic methods for A
         /* blah blah */

         // specialized method for Int
         def addIt(implicit evidence: A =:= Int) = 123 + value
       }
defined class Container

scala> (new Container(123)).addIt
res11: Int = 246

scala> (new Container("123")).addIt
<console>:10: error: could not find implicit value for parameter evidence: =:=[java.lang.String,Int]

Cool. The evidence is an implicit provided by Scala's Predef. And A =:= Int is just a type like Map[Int, String], written infix thanks to Scala's syntactic sugar.

Scala does not impose the type constraint until the specific method is called, so addIt does not violate A's definition. Still, given the implicit evidence, the compiler can infer that value in addIt is an Int.

As stated before, type constraints are ad-hoc, so they can achieve more precise type inference than type bounds. (To be fair, this is the power of implicits.)

def foo[A, B <: A](a: A, b: B) = (a,b)

scala> foo(1, List(1,2,3))
res1: (Any, List[Int]) = (1,List(1, 2, 3))

1 is clearly an Int, but why does the compiler infer it as Any? The bound B <: A requires the first argument's type to be a supertype of the second's, so A is inferred as the most general type covering both Int and List[Int]: Any.

<:< comes to help.

def bar[A,B](a: A, b: B)(implicit ev: B <:< A) = (a,b)

scala> bar(1,List(1,2,3))
<console>:9: error: Cannot prove that List[Int] <:< Int.

Because generalized type constraints do not interfere with inference, A is Int here. Only then does the compiler look for evidence of <:<[List[Int], Int], and fail.
(Actually, implicits can feed type information back into inference; see type-level programming's HList and the Scala collection library's CanBuildFrom.)

Implicit conversions do not satisfy <:< either:

scala> def foo[B, A<:B] (a:A,b:B) = print("OK")

scala> class A; class B;

scala> implicit def a2b(a:A) = new B

scala> foo(new A, new B) // implicit conversion!
OK

scala> def bar[A,B](a:A,b:B)(implicit ev: A<:<B) = print("OK")

scala> bar(new A, new B) // does not work
<console>:17: error: Cannot prove that A <:< B.

Implementation

Actually, =:= is just a type constructor in Scala. It is somewhat like Map[A, B]; that is, =:= is defined like:

class =:=[A, B]

So inside implicitly's brackets, Int =:= Int is just a type. A =:= B is the infix form of type parameterization for a non-alphanumeric identifier; it is equivalent to =:=[A, B].

So one can define implicits for =:= that the compiler can find:

implicit def EqualTypeEvidence[A]: =:=[A, A] = new =:=[A, A]

So when implicitly[A =:= B] is compiled, the compiler tries to find the correct implicit evidence. If and only if A and B are the same type, say Int, can the compiler find =:=[Int, Int], produced by the implicit function EqualTypeEvidence[Int].
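The mechanism can be replayed with a self-contained replica (named Same here so it does not clash with Predef's =:=):

```scala
// invariant two-parameter type constructor, like =:=
class Same[A, B]
// the only way to obtain a Same is with both arguments equal
implicit def sameEvidence[A]: Same[A, A] = new Same[A, A]

// resolves only when both type arguments are the very same type
def proveSame[A, B](implicit ev: Same[A, B]): Boolean = true
```

proveSame[Int, Int] compiles, while proveSame[Int, String] is rejected with a missing-implicit error.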

More compelling is <:<, the conformance evidence. It leverages Scala's variance annotations:

class <:<[-A, +B]
implicit def Conformance[A]: <:<[A, A] = new <:<[A, A]

Consider what happens when a String <:< java.io.Serializable is needed. The compiler tries to find an instance of <:<[String, java.io.Serializable], but the only instances available have the shape <:<[A, A], such as <:<[String, String] (or, alternatively, <:<[Serializable, Serializable]). The variance annotations of <:< save the day: since String is the very type String, and String is a subtype of Serializable with B in covariant position (or, from the other direction, since Serializable is a supertype of String with A in contravariant position, and Serializable is the very type Serializable), <:<[String, String] is a subtype of <:<[String, Serializable]. So the compiler finds the correct implicit instance as evidence that String is a subtype of Serializable, by the principle of subtype substitution (Liskov).
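The variance argument can likewise be checked with a replica (Conf here stands in for <:<):

```scala
// contravariant in A, covariant in B, exactly like Predef's <:<
class Conf[-A, +B]
implicit def conformance[A]: Conf[A, A] = new Conf[A, A]

// resolves whenever A is a subtype of B
def proveSub[A, B](implicit ev: Conf[A, B]): Boolean = true
```

proveSub[String, java.io.Serializable] compiles because Conf[String, String] is a subtype of Conf[String, java.io.Serializable].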

Similarly we can define

// Conversion evidence
class <%<[A, B]
implicit def Conversion[A <% B, B] = new <%<[A, B]

// Contra-conformance
class >:>[+A, -B]
implicit def Contra[A] = new >:>[A, A]

Magic, right?
The actual implementation uses a singleton instance so it is more efficient; for this illustrative post, a sloppy implementation is just fine :).

Reference:
http://hongjiang.info/scala-type-contraints-and-specialized-methods/
http://apocalisp.wordpress.com/2010/07/17/type-level-programming-in-scala-part-6d-hlist%C2%A0zipunzip/

play framework with scalate and activerecord

WRYYYYYYYYYYYYYYYY

Dio Brando on Scala crazy dependencies

Scala works like CSS selectors in that every successor overrides its predecessor.

You will have to work as a detective to figure out the correct recipe to manage a huge casserole of hodgepodge.

A working Play configuration with scalate and activerecord needs the following.

In build.sbt:

scalaVersion := "2.10.3"

libraryDependencies ++= Seq(
  jdbc,
  "org.scalatra.scalate" %% "scalate-core" % "1.7.0",
  "com.github.aselab" %% "scala-activerecord" % "0.2.3",
  "com.github.aselab" %% "scala-activerecord-play2" % "0.2.3",
  "com.h2database" % "h2" % "1.3.170"
)

Several notes:

  1. Currently scala-activerecord only supports Scala 2.10.3.
  2. scalate must be 1.7.0+ for better Scala 2.10 support, but the current stable version is 1.6.0.

Then, in the root of the Play project, create a new file at app/lib/ScalateIntegration.scala:


package controllers

import play.api._
import http.{Writeable, ContentTypeOf, ContentTypes}
import mvc.Codec
import play.api.Play.current
import org.fusesource.scalate.layout.DefaultLayoutStrategy
import collection.JavaConversions._

object Scalate {

  import org.fusesource.scalate._
  import org.fusesource.scalate.util._

  var format = Play.configuration.getString("scalate.format") match {
    case Some(configuredFormat) => configuredFormat
    case _ => "scaml"
  }

  lazy val scalateEngine = {
    val engine = new TemplateEngine
    engine.resourceLoader = new FileResourceLoader(Some(Play.getFile("app/views")))
    engine.layoutStrategy = new DefaultLayoutStrategy(engine, "app/views/layouts/default." + format)
    engine.classpath = "tmp/classes"
    engine.workingDirectory = Play.getFile("tmp")
    engine.combinedClassPath = true
    engine.classLoader = Play.classloader
    engine
  }

  def apply(template: String) = Template(template)

  case class Template(name: String) {

    def render(args: java.util.Map[String, Any]) = {
      ScalateContent {
        scalateEngine.layout(name, args.map {
          case (k, v) => k -> v
        }.toMap)
      }
    }

  }

  case class ScalateContent(val cont: String)

  implicit def writeableOf_ScalateContent(implicit codec: Codec): Writeable[ScalateContent] = {
    Writeable[ScalateContent]((scalate: ScalateContent) => codec.encode(scalate.cont))
  }

  implicit def contentTypeOf_ScalateContent(implicit codec: Codec): ContentTypeOf[ScalateContent] = {
    ContentTypeOf[ScalateContent](Some(ContentTypes.HTML))
  }
}

Again, this only works on scalate 1.6.

Finally, following activerecord's docs: to load the plugin, add to conf/play.plugins:

9999:com.github.aselab.activerecord.ActiveRecordPlugin

To configure the database, in conf/application.conf:

# Database configuration
# ~~~~~
#

# Scala ActiveRecord configurations
db.activerecord.driver=org.h2.Driver
db.activerecord.url="jdbc:h2:mem:play"
db.activerecord.user="sa"
db.activerecord.password=""

# Schema definition class
activerecord.schema=models.Tables

And in app/models/person.scala

package models

import com.github.aselab.activerecord._
import com.github.aselab.activerecord.dsl._

case class Person(@Required name: String) extends ActiveRecord
object Person extends ActiveRecordCompanion[Person] with PlayFormSupport[Person]

And in app/models/tables.scala:

package models

import com.github.aselab.activerecord._
import com.github.aselab.activerecord.dsl._

object Tables extends ActiveRecordTables with PlaySupport {
  val models = table[Person]
}

And finally you can try it in the console:

activator console

import models._
import play.core.StaticApplication

new StaticApplication(new java.io.File("."))

Person("f@ck").save
Person.findBy("name", "f@ck")

WTF have I done? I just typed, mechanically, what I had ferreted out on Google and StackOverflow.
WTF is play.core.StaticApplication, which is just one confusing page in the docs?
Speciously tantalizing is the magical code under which lurk complicated dependencies.

Reference: Play Framework Quick Tip

How to set up a scaloid project from scratch

TL;DR: it's hard to set up an Android development environment without the aid of an IDE, and harder still for Scaloid.

  1. Download standalone SDK

  2. Download Android SDK tools/ SDK platform tools/ SDK build tools
    NB: f@ck the GFW; I got around that bastard by modifying /etc/hosts.

  3. Install sbt, reference for GFW: here

  4. android create project --target <target-id> --name scaloidApp --path <path>/scaloidApp --activity MainActivity --package com.example.scaloidapp

  5. Create a directory named project within your project and add the file project/plugins.sbt; in it, add the following line:
    addSbtPlugin("com.hanhuy.sbt" % "android-sdk-plugin" % "1.2.20")

  6. Create project/build.properties and add the following line (newer sbt versions may be used instead):

sbt.version=0.12.4

  7. Create build.sbt in the root directory (example). Remember to import android.Keys._:
import android.Keys._

android.Plugin.androidBuild

name := "scaloidApp"

scalaVersion := "2.11.0"

platformTarget in Android := "android-20"

libraryDependencies += "org.scaloid" %% "scaloid" % "3.4-10"

UPDATE: the scaloid-android-plugin fixed the build =.=

magic HTML.js


HTML.js is a library full of syntactic sugar. It extends HTML elements dynamically so that their methods reflect their child nodes. For example, code like HTML.body.header.hgroup.h1 uses method chaining to mirror the structure of the DOM.

ES5's Object.defineProperty and MutationObserver conjure up the magic. HTML.js provides an eponymous HTML API object initialized by an internal node method, which adds all the tag methods to its argument object. The tag methods are defined via Object.defineProperty with the get option, so they behave like getters: every time the user accesses one of these attributes, the tag method returns HTML-ified elements ready to be chained (HTML-ified elements are normal HTMLElements that have been extended by the internal node method mentioned above).

To keep the tag methods responsive to DOM manipulation, HTML.js opts for a MutationObserver that keeps an eye on the root element. Once elements change, the MutationObserver detects the change and notifies HTML to refresh the methods of the corresponding elements.

However, this syntactic sweetheart fails to belie some design deficits and practical problems. Getter methods shut out legacy browsers, which still hold about 10% market share. MutationObserver itself is not horribly slow, but registering a watchdog on the root is almost certainly a performance killer for massive DOM manipulation.

But the most notorious code smell comes from yet another place: a pure design decision to have functions return either an element or an array. It is certainly one of the sloppiest practices in a dynamic language; in a statically typed language such functions could only have the return type Any. Surely this is uninformative and burdens users with the risk of casting results. Indeed, the author mentions this on the homepage and tries to defend the API design with the excuse of conditional contexts where users can avoid the quandary. But a good library should be as care-free as possible: providing an API that returns one single element is probably better than leaving users to guarantee an element's uniqueness. Ad-hoc polymorphism is determined by function arguments, not return type.

HTML.js's API reminds me of the keyword null. Admittedly it is theoretically feasible to entrust programmers with checking uniqueness/existence. But then why is ぬるぽ (the NullPointerException) still one of the most prominent apparitions haunting our code?

At least, like that old-jar pickled-cabbage beef noodles (至少、像那老坛酸菜牛肉面一样)


Under the prevailing "東山覇権" (Higashiyama hegemony), the word "bitch" has become practically a stock moe attribute. The crushing rule of the twin-tailed, blonde, green-eyed, mahjong-playing returnee has sat atop the enjo-kosai rankings for years, making the mahjong corps' position unshakeable. The moment a coquettish shot appears in the anime, the otaku before the screen exchange knowing looks. And in a certain annual Nico ranking, when the grand BGM plays as 罗格 turns over champion Ako-chan's name, you can feel that kingly bearing: as imposing as a full screen of giant red danmaku.

In the thin-doujin world, too, the hegemon's throne sits high out of reach. 《清纯奔放的木静酱》 is a decent full-color short, but its story doesn't amount to much and can hardly shake the throne. Yet one work stands as a singular classic of the enjo-kosai genre and has shaken queen 新子瞳's royal position: the legendary C79 book that moved all of 2ch, 《せめて、あの雪のように》 ("At Least, Like That Snow").

This book marked the author 藤丸's tenth anniversary in the doujin world. 藤丸 keeps a low profile and is hardly a household name, but the art style is quite good, the linework handsome, and a full-color book is quite practical too: a gentleman worth praising.

Yet this tenth-anniversary work is not known for its practicality, and that is exactly where it is extraordinary. It makes the book not just a commemorative work for its author but a milestone of the enjo-kosai genre. Beyond that, fans have produced a novelization and MADs of the same name, and the work's classic lines and panels have seeped into the anonymous boards. That is how stunning 《雪》 is.

Artistically, 《雪》 is a work that makes heavy use of blank space; you could even say it hands the reader thirty blank pages. Rather than depicting, in heavy ink and mixed fluids, the bed-sheet story of a JK and an old man, the work merely lets the heroine Yuki hold a client's hand in view of the hero. The rest is only a light sketch of the faint, unripe feeling between the boy and girl. It is an "ero work without ero scenes": not one lewd stroke, with the story and its accompanying action unfolding entirely inside the reader's head. And precisely because of this, readers who came for the artwork found their bodily fluids forcibly squeezed from the lower body up into the tear glands.

Character analysis here would be superfluous, even disrespectful to the work. The work is a classic: the heroine is the classic bitch-fairy, white-robed angel; the hero the classic dense coward. A story without surprise developments, but a genuinely classic one, one that crushes the reader's psychological defenses again and again. A few pages of a succubus's cheerful smile, then a few pages of the heroine's helplessness and longing. The flavor of the characters under this pen is neither cloying sweetness nor abrupt bitterness; it is a silver spoon gently stirring coffee, sweet, bitter and mellow dissolving in layers in the mouth, the aftertaste revealing how the beans were roasted. The work leaves the heroine Yuki's background blank, yet supplies her distress, letting readers naturally complete it in their heads.

Thus the portrayal of the heroine 白山雪 is especially classic; every line of hers is classic:
—— I'm so envious of Tsukimura-san. How nice. I want to fall in love too.
—— Hanaoka-kun has no interest in me; that's why I like him. Really like him.
—— All kinds of people, all kinds of thoughts. I understand them all. I was made to know. (The English translation's "I was made to know" is exactly the right use of the passive voice for rendering ambiguous language.)
—— Second-hand goods like me wouldn't sell even at a clearance sale.
These strangely familiar sentences are like archetypes hidden in humanity's collective unconscious, dug up and compressed into a few dozen pages to strike at the reader's spirit. This is what a classic is: enduring and ever new.

Like countless thin books, 《雪》 gives the viewer ample sage time. What differs is that 《雪》 is not confined to the brief, shallow physiological refractory period, but shakes and interrogates the depths of human nature. In the endless debates over enjo-kosai, one always sees people opposing it from a distant moral high ground with notions of "clean" and "dirty", seizing that high ground with the line "imagine whether society could stay clean if this became the norm" before any argument is made. Yet 《雪》 shows the other side: the helplessness and hardship of a girl's growth and life, which force her premature worldliness. Contempt for enjo-kosai may look righteous at the macro level, but at the micro level it is heartless. Perhaps one should not forget that the enjo-kosai imagined to pollute society is, in reality, manufactured by a sordid society.

Of course, the work cannot escape the category of school romance; the contrast between monetary transaction and campus pure love is one of its selling points, suited to otaku whose minds (especially regarding the opposite sex) remain stuck at middle-school level. And the heroine's unknown whereabouts at the end, becoming the hero's memory, provide plenty of room for fantasy. Such an ending mirrors a characteristic of manga: any character can only become a shadow in the viewer's spiritual world. 《雪》 has indeed dug a channel through which readers can fantasize a river of love.

《せめて、あの雪のように》 is a heart-aching book. Its influence has outgrown the work itself, evolving into the set phrase "second-hand goods like me wouldn't want XXX either", which adds yet another flavor to the manga. On reflection, it perhaps adds another kind of practicality. Whenever one sees a young couple being intimate, a tangled melancholy is hard to avoid; or, watching an old man holding a pleated-skirted hand strolling on the grass, feelings well up. But revisit that classic shot and those lines, and the heart instantly feels it: what a sour tang!
Ah, what a sour tang! At least, like that old-jar pickled-cabbage beef noodles.

About the author: 藤丸, manga artist and original illustrator, formerly known as F.blue. He once formed a three-person circle with ひらめ and 綾瀬, but one of them quit around C75 (having gotten hooked on an online game much like 洛奇). 藤丸 changed names around that time and has worked solo since. His works are few but strongly original; his hands are well trained, his few derivative works highly faithful, his paneling orthodox, his linework neat, his plots ordinary: practically a traditional manga artist.
