disambiguation of double definition resulting from generic type erasure

Aaron Novstrup

Disclaimer: I'm not sure this is the right list for what is essentially a
feature request, but since I don't commit to the compiler I didn't think
it appropriate to post to the internals list, and it didn't seem wise to
create a ticket without some public discussion first.

The fact that generic type erasure can lead to duplicate definitions is
a hindrance to Java and Scala developers alike (see this StackOverflow
question for a recent example:
http://stackoverflow.com/questions/3422336/how-can-i-differentiate-between-def-fooaxs-a-and-def-fooa-bxs-a-b).
Fortunately, Scala developers have a nifty workaround in the form of
DummyImplicits:

object Baz {
   def foo(xs: String*) = 1
   def foo(xs: Int*)(implicit e: DummyImplicit) = 2
   def foo(xs: Any*)(implicit e1: DummyImplicit, e2: DummyImplicit) = 3
}
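
Call sites never see the dummies, by the way: Predef supplies a
DummyImplicit instance automatically, so the extra parameter lists are
filled in behind the scenes.  A minimal usage sketch (the BazDemo name is
just for illustration):

object BazDemo {
   def main(args: Array[String]): Unit = {
      println(Baz.foo("a", "b"))  // 1 -- String* overload, no implicits needed
      println(Baz.foo(1, 2, 3))   // 2 -- Int* overload, dummy supplied by Predef
      println(Baz.foo(1, "a"))    // 3 -- Any* overload, two dummies supplied by Predef
   }
}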

Given that this is a general solution, couldn't the compiler
automatically add implicit parameters when such duplication would arise,
thus allowing the developer to write simply:

object Baz {
   def foo(xs: String*) = 1
   def foo(xs: Int*) = 2
   def foo(xs: Any*) = 3
}

One issue I could see is that the DummyImplicit solution complicates
Java interop.  Whereas users currently have to make this choice quite
deliberately, it could come as a surprise if the compiler mangles the
signature automatically.  To my mind, it's still worthwhile.

Regards,
~Aaron

Re: disambiguation of double definition resulting from generic type erasure

Paul Phillips-3
On Mon, Aug 16, 2010 at 05:59:02PM -0700, Aaron Novstrup wrote:
> object Baz {
>    def foo(xs: String*) = 1
>    def foo(xs: Int*)(implicit e: DummyImplicit) = 2
>    def foo(xs: Any*)(implicit e1: DummyImplicit, e2: DummyImplicit) = 3
> }

You could quiet that a bit with a better name and context bounds:

object Baz {
  class O[T]
  implicit def makeO[T] = new O[T]
 
  def foo[T1: O](xs: String*) = 1
  def foo[T1: O, T2: O](xs: Int*) = 2
  def foo[T1: O, T2: O, T3: O](xs: Any*) = 3
}
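
Context bounds are just sugar for implicit parameters, so the overloads
above end up with different numbers of trailing implicit arguments and
therefore distinct erased signatures.  Roughly, assuming the standard
desugaring (the BazExpanded name is mine):

object BazExpanded {
  class O[T]
  implicit def makeO[T]: O[T] = new O[T]

  // what the context bounds expand to, more or less
  def foo[T1](xs: String*)(implicit e1: O[T1]) = 1
  def foo[T1, T2](xs: Int*)(implicit e1: O[T1], e2: O[T2]) = 2
  def foo[T1, T2, T3](xs: Any*)(implicit e1: O[T1], e2: O[T2], e3: O[T3]) = 3
}

At a call site the type arguments are left for the compiler to infer, and
makeO quietly supplies the evidence whatever they end up being.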

> Given that this is a general solution, couldn't the compiler
> automatically add implicit parameters when such duplication would
> arise, thus allowing the developer to write simply:

The day something this hacky goes in is the day I assemble strike team
bravo to find out what kind of bodysnatcher menace we're up against.

--
Paul Phillips      | The most dangerous man to any government is the man who
Apatheist          | is able to think things out [...] Almost inevitably he
Empiricist         | comes to the conclusion that the government he lives under
up hill, pi pals!  | is dishonest, insane, intolerable.   -- H. L. Mencken

Re: disambiguation of double definition resulting from generic type erasure

Jason Zaugg
I think you'll agree that this better captures the intent ;)

scala> class __[_]
defined class __

scala> object overload {
     |    def foo[_ : __](a: Int) = 0
     |    def foo[_ : __ : __](a: String) = 0
     | }
defined module overload

-jason


Re: disambiguation of double definition resulting from generic type erasure

Jason Zaugg
Note that I'm not a huge fan of method overloading; I find that it
interacts poorly with other features of the language [1].

Overloading the meaning of '_', on the other hand...

-jason

[1] http://stackoverflow.com/questions/2510108/why-avoid-method-overloading/2512001#2512001


Re: disambiguation of double definition resulting from generic type erasure

Aaron Novstrup
Jason -

Point well made about method overloading.  However, the real issue is
whether overloading *with* generics should be more difficult than
overloading *without* generics.  For better or worse, Scala supports
method overloading, and the fact that it's more difficult with generics
is an accidental consequence of type erasure rather than a conscious
design decision.

I think it's best to allow developers to decide whether they want to use
overloading, without unnecessary restrictions. If the language is to
discourage overloading, the compiler should do so uniformly regardless
of whether generics are used (preferably by issuing a warning rather
than an error).

Paul -

I suggest you get out the assault rifles and body armor -- observe how
DummyImplicit is used in Array.scala.  And are there no other examples
of the compiler doing strange things to work around bytecode
limitations?  Better to implement the "hack" once in the compiler rather
than pushing it out to user code.

~Aaron



Re: disambiguation of double definition resulting from generic type erasure

Aaron Novstrup
In reply to this post by Jason Zaugg
What about mixins? With

trait A { def foo(xs: String*) = 1 }
trait B { def foo(xs: Int*) = 2 }

`new A with B` fails to compile.  Since the two traits may be defined
completely separately, perhaps in separate libraries, it may not be
possible for the user to resolve the issue.  Could the compiler do it?

The example is admittedly contrived, but I think it poses an interesting
question nonetheless.


Re: disambiguation of double definition resulting from generic type erasure

Paul Phillips-3
In reply to this post by Aaron Novstrup
On Tue, Aug 17, 2010 at 10:58:19AM -0700, Aaron Novstrup wrote:
> If the language is to discourage overloading, the compiler should do
> so uniformly regardless of whether generics are used (preferably by
> issuing a warning rather than an error).

Why? Different things are different things.  Consistency is only
valuable until it isn't.  And the language already discourages
overloading quite effectively, I'd say.  I refer to the sort of
passive-aggressive discouragement one sometimes sees in significant
others who no longer wish to be significant: "maybe if I make things
unpleasant enough around here..."

> I suggest you get out the assault rifles and body armor -- observe how
> DummyImplicit is used in Array.scala.

I don't see the relevance.  It doesn't look anything like what you're
suggesting: but even if it were exactly like it, that wouldn't be an
argument for having the compiler do it automatically.  It'd only be an
admission that unpleasant hacks are sometimes necessary, something I do
not see as controversial.

--
Paul Phillips      | Before a man speaks it is always safe to assume
Stickler           | that he is a fool.  After he speaks, it is seldom
Empiricist         | necessary to assume it.
pull his pi pal!   |     -- H. L. Mencken

Re: disambiguation of double definition resulting from generic type erasure

Aaron Novstrup
>> If the language is to discourage overloading, the compiler should do
>> so uniformly regardless of whether generics are used (preferably by
>> issuing a warning rather than an error).
>
> Why? Different things are different things.  Consistency is only
> valuable until it isn't.

Sure, but how is a method that takes parameterized arguments (e.g. an
Option[String] => R) meaningfully different in this context from one
that doesn't (e.g. a String => R)? Why does the latter deserve a
privileged ability for easy overloading?

> And the language already discourages
> overloading quite effectively, I'd say.  I refer to the sort of
> passive-aggressive discouragement one sometimes see in significant
> others who no longer wish to be significant: "maybe if I make things
> unpleasant enough around here..."

This kind of discouragement is... well, discouraging.  It doesn't serve
adopters well to find out that method overloading is a bad idea only by
using it and getting bitten. A simple compiler warning would be much
more helpful.

> I don't see the relevance.  It doesn't look anything like what you're
> suggesting:

In both cases, DummyImplicits are used as a trick to modify the type
signature so that the right method can be selected at the call-site
without changing the code at the call-site.  Indeed, do DummyImplicits
serve any other purpose?

> It'd only be an admission that unpleasant hacks are sometimes
> necessary, something I do not see as controversial.

My point was precisely that unpleasant hacks are sometimes necessary and
that they're already present in the Scala codebase. No, this alone does
not imply that this particular hack is necessary, but I think I've made
a good case that it should be considered.




Re: disambiguation of double definition resulting from generic type erasure

Paul Phillips-3
On Tue, Aug 17, 2010 at 12:09:33PM -0700, Aaron Novstrup wrote:
> Sure, but how is a method that takes parameterized arguments (e.g. an
> Option[String] => R) meaningfully different in this context from one
> that doesn't (e.g. a String => R)? Why does the latter deserve a
> privileged ability for easy overloading?

LITTLE BILL: I don't deserve this... to die like this.
WILL MUNNY: Deserve's got nothin' to do with it.

> This kind of discouragement is... well, discouraging.  It doesn't
> serve adopters well to find out that method overloading is a bad idea
> only by using it and getting bitten. A simple compiler warning would
> be much more helpful.

To be clear, I wasn't suggesting it offers that form of discouragement
as a matter of policy.  Just that it works out that way.  I don't think
anyone is advocating for worse error messages.

> In both cases, DummyImplicits are used as a trick to modify the type
> signature so that the right method can be selected at the call-site
> without changing the code at the call-site.  Indeed, do DummyImplicits
> serve any other purpose?

I wouldn't sweep a default implicit builder into the same category as an
increasing series of dummy parameters, but either way.  I come at these
matters from the standpoint of what is likely to actually be
implemented, what it would displace (because nothing gets done without
something else not getting done) and trying to anticipate the zillion
hard to anticipate places which end up as nice new pain points.  And
from that standpoint ideas like this are not the least bit appealing.  I
suffer all the same issues everyone else does, so I understand the
desire to hack in a bandaid for your personal top annoyances.  But those
bandaids have this tendency to morph into hungry parasites, and the
accumulation of parasites will suck a body dry.

Insert usual disclaimer that I'm just a guy who writes a lot of scala
code, not someone whose opinion has any particular relevance.

--
Paul Phillips      | Giving every man a vote has no more made men wise
Caged Spirit       | and free than Christianity has made them good.
Empiricist         |     -- H. L. Mencken
pp: i haul pills   |----------* http://www.improving.org/paulp/ *----------

Re: disambiguation of double definition resulting from generic type erasure

Aaron Novstrup
Beating this dead horse some more...

It occurred to me that a cleaner hack is to use a unique dummy type for
each method with erased types in its signature:

object Baz {
   private object dummy1 { implicit val dummy: dummy1.type = this }
   private object dummy2 { implicit val dummy: dummy2.type = this }

   def foo(xs: String*)(implicit e: dummy1.type) = 1
   def foo(xs: Int*)(implicit e: dummy2.type) = 2
}
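
As written, scalac will probably complain about the private dummies
escaping their defining scope through the public signatures, but that's
moot if the compiler is the one generating them.  For reference, a
user-level variant along the same lines that compiles today (the marker
names are mine, and the markers have to stay public):

object Baz {
   class Dummy1; object Dummy1 { implicit val value: Dummy1 = new Dummy1 }
   class Dummy2; object Dummy2 { implicit val value: Dummy2 = new Dummy2 }

   def foo(xs: String*)(implicit e: Dummy1) = 1
   def foo(xs: Int*)(implicit e: Dummy2) = 2
}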

If the compiler created such dummies for every method whose signature is
affected by erasure, we could avoid accidental clashes in mixins (even
between traits from different libraries):

trait Baz1 { def foo(xs: String*) = 1 }
trait Baz2 { def foo(xs: Int*) = 2 }

becomes

trait Baz1 {
   private object dummy1 { implicit val dummy: dummy1.type = this }

   def foo(xs: String*)(implicit e: dummy1.type) = 1
}

trait Baz2 {
   private object dummy1 { implicit val dummy: dummy1.type = this }

   def foo(xs: Int*)(implicit e: dummy1.type) = 2
}

and `new Baz1 with Baz2` compiles.
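
A user-level sketch of the same idea for the mixin case, again assuming
the markers can be public (Qux1, Qux2, Marker1, and Marker2 are made-up
names):

trait Qux1 {
   class Marker1; object Marker1 { implicit val value: Marker1 = new Marker1 }

   def foo(xs: String*)(implicit e: Marker1) = 1
}

trait Qux2 {
   class Marker2; object Marker2 { implicit val value: Marker2 = new Marker2 }

   def foo(xs: Int*)(implicit e: Marker2) = 2
}

Here `new Qux1 with Qux2` compiles because the two foo signatures erase
differently, and the implicit arguments are found in the markers'
companion objects at each call site.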

If there's any interest, I may code up a compiler plugin.


