From 5cc4a9cb1efea46c33322e8f01fffaf02245de69 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 23 Jan 2023 11:02:15 +0100 Subject: [PATCH 001/371] Add changelog for 3.3.0-RC1 --- changelogs/3.3.0-RC1.md | 225 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 225 insertions(+) create mode 100644 changelogs/3.3.0-RC1.md diff --git a/changelogs/3.3.0-RC1.md b/changelogs/3.3.0-RC1.md new file mode 100644 index 000000000000..1d632e49032a --- /dev/null +++ b/changelogs/3.3.0-RC1.md @@ -0,0 +1,225 @@ +# Highlights of the release + +- Stabilize new lazy vals [#16614](https://github.com/lampepfl/dotty/pull/16614) +- Experimental Macro annotations [#16392](https://github.com/lampepfl/dotty/pull/16392) [#16454](https://github.com/lampepfl/dotty/pull/16454) [#16534](https://github.com/lampepfl/dotty/pull/16534) +- Fix stability check for inline parameters [#15511](https://github.com/lampepfl/dotty/pull/15511) +- Make `fewerBraces` a standard feature [#16297](https://github.com/lampepfl/dotty/pull/16297) +- Add new front-end phase for unused entities and add support for unused imports [#16157](https://github.com/lampepfl/dotty/pull/16157) +- Implement -Wvalue-discard warning [#15975](https://github.com/lampepfl/dotty/pull/15975) +- Introduce boundary/break control abstraction. [#16612](https://github.com/lampepfl/dotty/pull/16612) + +# Other changes and fixes + +## Annotations + +- Support use-site meta-annotations [#16445](https://github.com/lampepfl/dotty/pull/16445) + +## Desugaring + +- Reuse typed prefix for `applyDynamic` and `applyDynamicNamed` [#16552](https://github.com/lampepfl/dotty/pull/16552) +- Fix object selftype match error [#16441](https://github.com/lampepfl/dotty/pull/16441) + +## Erasure + +- Dealias before checking for outer references in types [#16525](https://github.com/lampepfl/dotty/pull/16525) +- Fix generic signature for type params bounded by primitive [#16442](https://github.com/lampepfl/dotty/pull/16442) +- Avoid EmptyScope.cloneScope crashing, eg on missing references [#16314](https://github.com/lampepfl/dotty/pull/16314) + +## GADTs + +- Inline GADT state restoring in TypeComparer [#16564](https://github.com/lampepfl/dotty/pull/16564) +- Add extension/conversion to GADT selection healing [#16638](https://github.com/lampepfl/dotty/pull/16638) + +## Incremental compilation + +- Unpickle arguments of parent constructors in Templates lazily [#16688](https://github.com/lampepfl/dotty/pull/16688) + +## Initialization + +- Fix #16438: Supply dummy args for erroneous parent call in init check [#16448](https://github.com/lampepfl/dotty/pull/16448) + +## Inline + +- Dealias in ConstantValue, for inline if cond [#16652](https://github.com/lampepfl/dotty/pull/16652) +- Set Span for top level annotations generated in PostTyper [#16378](https://github.com/lampepfl/dotty/pull/16378) +- Interpolate any type vars from comparing against SelectionProto [#16348](https://github.com/lampepfl/dotty/pull/16348) +- Handle binding of beta reduced inlined lambdas [#16377](https://github.com/lampepfl/dotty/pull/16377) +- Do not add dummy RHS to abstract inline methods [#16510](https://github.com/lampepfl/dotty/pull/16510) +- Warn on inline given aliases with functions as RHS [#16499](https://github.com/lampepfl/dotty/pull/16499) +- Support inline overrides in value classes [#16523](https://github.com/lampepfl/dotty/pull/16523) + +## Java 
interop + +- Represent Java annotations as interfaces so they can be extended, and disallow various misuses of them [#16260](https://github.com/lampepfl/dotty/pull/16260) + +## Opaque Types + +- Delay opaque alias checking until PostTyper [#16644](https://github.com/lampepfl/dotty/pull/16644) + +## Overloading + +- Handle context function arguments in overloading resolution [#16511](https://github.com/lampepfl/dotty/pull/16511) + +## Parser + +- Improve support for Unicode supplementary characters in identifiers and string interpolation (as in Scala 2) [#16278](https://github.com/lampepfl/dotty/pull/16278) +- Require indent after colon at EOL [#16466](https://github.com/lampepfl/dotty/pull/16466) +- Help givens return refined types [#16293](https://github.com/lampepfl/dotty/pull/16293) + +## Pattern Matching + +- Tweak AvoidMap's derivedSelect [#16563](https://github.com/lampepfl/dotty/pull/16563) +- Space: Use RHS of & when refining subtypes [#16573](https://github.com/lampepfl/dotty/pull/16573) +- Freeze constraints in a condition check of maximiseType [#16526](https://github.com/lampepfl/dotty/pull/16526) +- Restrict syntax of typed patterns [#16150](https://github.com/lampepfl/dotty/pull/16150) +- Test case to show that #16252 works with transparent [#16262](https://github.com/lampepfl/dotty/pull/16262) +- Support inline unapplySeq and with leading given parameters [#16358](https://github.com/lampepfl/dotty/pull/16358) +- Handle sealed prefixes in exh checking [#16621](https://github.com/lampepfl/dotty/pull/16621) +- Detect irrefutable quoted patterns [#16674](https://github.com/lampepfl/dotty/pull/16674) + +## Pickling + +- Allow case classes with up to 254 parameters [#16501](https://github.com/lampepfl/dotty/pull/16501) +- Correctly unpickle Scala 2 private case classes in traits [#16519](https://github.com/lampepfl/dotty/pull/16519) + +## Polyfunctions + +- Fix #9996: Crash with function accepting polymorphic function type with singleton result [#16327](https://github.com/lampepfl/dotty/pull/16327) + +## Quotes + +- Remove contents of inline methods [#16345](https://github.com/lampepfl/dotty/pull/16345) +- Fix errors in explicit type annotations in inline match cases [#16257](https://github.com/lampepfl/dotty/pull/16257) +- Handle macro annotation suspends and crashes [#16509](https://github.com/lampepfl/dotty/pull/16509) +- Fix macro annotations `spliceOwner` [#16513](https://github.com/lampepfl/dotty/pull/16513) + +## REPL + +- REPL: Fix crash when printing instances of value classes [#16393](https://github.com/lampepfl/dotty/pull/16393) +- Attempt to fix completion crash [#16267](https://github.com/lampepfl/dotty/pull/16267) +- Fix REPL shadowing bug [#16389](https://github.com/lampepfl/dotty/pull/16389) +- Open up for extensibility [#16276](https://github.com/lampepfl/dotty/pull/16276) +- Don't crash if completions throw [#16687](https://github.com/lampepfl/dotty/pull/16687) + +## Reflection + +- Fix reflect typeMembers to return all members [#15033](https://github.com/lampepfl/dotty/pull/15033) +- Deprecate reflect Flags.Static [#16568](https://github.com/lampepfl/dotty/pull/16568) + +## Reporting + +- Suppress follow-on errors for erroneous import qualifiers [#16658](https://github.com/lampepfl/dotty/pull/16658) +- Fix order in which errors are reported for assignment to val [#16660](https://github.com/lampepfl/dotty/pull/16660) 
+- Fix class name in error message [#16635](https://github.com/lampepfl/dotty/pull/16635) +- Make refined type printing more source compatible [#16303](https://github.com/lampepfl/dotty/pull/16303) +- Add error hint on local inline def used in quotes [#16572](https://github.com/lampepfl/dotty/pull/16572) +- Fix Text wrapping [#16277](https://github.com/lampepfl/dotty/pull/16277) +- Fix -Wunused:import registering constructor `` instead of its owner (also fix false positive for enum) [#16661](https://github.com/lampepfl/dotty/pull/16661) +- Fix #16675 : -Wunused false positive on case class generated method, due to flags used to distinguish case accessors. [#16683](https://github.com/lampepfl/dotty/pull/16683) +- Fix #16680 by registering Ident not containing a symbol [#16689](https://github.com/lampepfl/dotty/pull/16689) +- Fix #16682: CheckUnused missed some used symbols [#16690](https://github.com/lampepfl/dotty/pull/16690) +- Fix the non-miniphase tree traverser [#16684](https://github.com/lampepfl/dotty/pull/16684) + +## Scala-JS + +- Fix #14289: Accept Ident refs to `js.native` in native member rhs. [#16185](https://github.com/lampepfl/dotty/pull/16185) + +## Standard Library + +- Add `CanEqual` instance for `Map` [#15886](https://github.com/lampepfl/dotty/pull/15886) +- Refine `Tuple.Append` return type [#16140](https://github.com/lampepfl/dotty/pull/16140) + +## TASTy format + +- Make it a fatal error if erasure cannot resolve a type [#16373](https://github.com/lampepfl/dotty/pull/16373) + +## Tooling + +- Add -Yimports compiler flag [#16218](https://github.com/lampepfl/dotty/pull/16218) +- Allow BooleanSettings to be set with a colon [#16425](https://github.com/lampepfl/dotty/pull/16425) + +## Transform + +- Avoid stackoverflow in ExplicitOuter [#16381](https://github.com/lampepfl/dotty/pull/16381) +- Make lazy vals run on non-fallback graal image - remove dynamic reflection [#16346](https://github.com/lampepfl/dotty/pull/16346) +- Patch to avoid crash in #16351 [#16354](https://github.com/lampepfl/dotty/pull/16354) +- Don't treat package object's `` methods as package members [#16667](https://github.com/lampepfl/dotty/pull/16667) +- Space: Refine isSubspace property & an example [#16574](https://github.com/lampepfl/dotty/pull/16574) + +## Typer + +- Drop requirement that self types are closed [#16648](https://github.com/lampepfl/dotty/pull/16648) +- Disallow constructor params from appearing in parent types for soundness [#16664](https://github.com/lampepfl/dotty/pull/16664) +- Don't search implicit arguments in singleton type prefix [#16490](https://github.com/lampepfl/dotty/pull/16490) +- Don't rely on isProvisional to determine whether atoms computed [#16489](https://github.com/lampepfl/dotty/pull/16489) +- Support signature polymorphic methods (`MethodHandle` and `VarHandle`) [#16225](https://github.com/lampepfl/dotty/pull/16225) +- Prefer parameterless alternatives during ambiguous overload resolution [#16315](https://github.com/lampepfl/dotty/pull/16315) +- Fix calculation to drop transparent classes [#16344](https://github.com/lampepfl/dotty/pull/16344) +- Test case for issue 16311 [#16317](https://github.com/lampepfl/dotty/pull/16317) +- Skip caching provisional OrType atoms [#16295](https://github.com/lampepfl/dotty/pull/16295) +- Avoid cyclic references due to experimental check when inlining 
[#16195](https://github.com/lampepfl/dotty/pull/16195) +- Track type variable dependencies to guide instantiation decisions [#16042](https://github.com/lampepfl/dotty/pull/16042) +- Two fixes to constraint solving [#16353](https://github.com/lampepfl/dotty/pull/16353) +- Fix regression in cyclic constraint handling [#16514](https://github.com/lampepfl/dotty/pull/16514) +- Sharpen range approximation for applied types with capture set ranges [#16261](https://github.com/lampepfl/dotty/pull/16261) +- Cut the Gordian Knot: Don't widen unions to transparent [#15642](https://github.com/lampepfl/dotty/pull/15642) +- Fix widening logic to keep instantiation within bounds [#16417](https://github.com/lampepfl/dotty/pull/16417) +- Skip ambiguous reference error when symbols are aliases [#16401](https://github.com/lampepfl/dotty/pull/16401) +- Avoid incorrect simplifications when updating bounds in the constraint [#16410](https://github.com/lampepfl/dotty/pull/16410) +- Take `@targetName` into account when resolving extension methods [#16487](https://github.com/lampepfl/dotty/pull/16487) +- Improve ClassTag handling to avoid invalid ClassTag generation and inference failure [#16492](https://github.com/lampepfl/dotty/pull/16492) +- Fix extracting the elemType of a union of arrays [#16569](https://github.com/lampepfl/dotty/pull/16569) +- Make sure annotations are typed in expression contexts [#16699](https://github.com/lampepfl/dotty/pull/16699) +- Throw a type error when using hk-types in unions or intersections [#16712](https://github.com/lampepfl/dotty/pull/16712) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.2.2..3.3.0-RC1` these are: + +``` + 225 Martin Odersky + 73 Dale Wijnand + 58 Szymon Rodziewicz + 54 Nicolas Stucki + 48 Kamil Szewczyk + 48 Paul Coral + 30 Paweł Marks + 28 Florian3k + 28 Yichen Xu + 14 Guillaume Martres + 8 Fengyun Liu + 8 Michał Pałka + 7 Chris Birchall + 7 rochala + 6 Kacper Korban + 6 Sébastien Doeraene + 6 jdudrak + 5 Seth Tisue + 5 Som Snytt + 5 nizhikov + 4 Filip Zybała + 4 Jan Chyb + 4 Michael Pollmeier + 4 Natsu Kagami + 3 Jamie Thompson + 2 Alex + 2 Anatolii Kmetiuk + 2 Dmitrii Naumenko + 2 Lukas Rytz + 2 adampauls + 2 yoshinorin + 1 Alexander Slesarenko + 1 Chris Kipp + 1 Guillaume Raffin + 1 Jakub Kozłowski + 1 Jan-Pieter van den Heuvel + 1 Julien Richard-Foy + 1 Kenji Yoshida + 1 Philippus + 1 Szymon R + 1 Tim Spence + 1 s.bazarsadaev + +``` \ No newline at end of file From 5522929ff3c45741899f9f3c01a7b789eb178023 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 26 Jan 2023 15:19:30 +0100 Subject: [PATCH 002/371] Fix incorrect TASTy version --- tasty/src/dotty/tools/tasty/TastyFormat.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index ded313fb171c..2d18923e1b0c 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -289,7 +289,7 @@ object TastyFormat { * compatibility, but remains backwards compatible, with all * preceeding `MinorVersion`. */ - final val MinorVersion: Int = 2 + final val MinorVersion: Int = 3 /** Natural Number. 
The `ExperimentalVersion` allows for * experimentation with changes to TASTy without committing @@ -305,7 +305,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 0 + final val ExperimentalVersion: Int = 1 /**This method implements a binary relation (`<:<`) between two TASTy versions. * From 57a6de25f532c8ac6d7ed2ee1ee067e0599d524e Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 26 Jan 2023 15:24:51 +0100 Subject: [PATCH 003/371] Add changelog for 3.3.0-RC2 --- changelogs/3.3.0-RC2.md | 229 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 229 insertions(+) create mode 100644 changelogs/3.3.0-RC2.md diff --git a/changelogs/3.3.0-RC2.md b/changelogs/3.3.0-RC2.md new file mode 100644 index 000000000000..57d785816489 --- /dev/null +++ b/changelogs/3.3.0-RC2.md @@ -0,0 +1,229 @@ +This release is nearly identical to 3.3.0-RC1. The only difference is that 3.3.0-RC1 generated output with incorrect TASTy version. + +The following changelog is identical to the changelog of 3.3.0-RC1. + +# Highlights of the release + +- Stabilize new lazy vals [#16614](https://github.com/lampepfl/dotty/pull/16614) +- Experimental Macro annotations [#16392](https://github.com/lampepfl/dotty/pull/16392) [#16454](https://github.com/lampepfl/dotty/pull/16454) [#16534](https://github.com/lampepfl/dotty/pull/16534) +- Fix stability check for inline parameters [#15511](https://github.com/lampepfl/dotty/pull/15511) +- Make `fewerBraces` a standard feature [#16297](https://github.com/lampepfl/dotty/pull/16297) +- Add new front-end phase for unused entities and add support for unused imports [#16157](https://github.com/lampepfl/dotty/pull/16157) +- Implement -Wvalue-discard warning [#15975](https://github.com/lampepfl/dotty/pull/15975) +- Introduce boundary/break control abstraction. 
[#16612](https://github.com/lampepfl/dotty/pull/16612) + +# Other changes and fixes + +## Annotations + +- Support use-site meta-annotations [#16445](https://github.com/lampepfl/dotty/pull/16445) + +## Desugaring + +- Reuse typed prefix for `applyDynamic` and `applyDynamicNamed` [#16552](https://github.com/lampepfl/dotty/pull/16552) +- Fix object selftype match error [#16441](https://github.com/lampepfl/dotty/pull/16441) + +## Erasure + +- Dealias before checking for outer references in types [#16525](https://github.com/lampepfl/dotty/pull/16525) +- Fix generic signature for type params bounded by primitive [#16442](https://github.com/lampepfl/dotty/pull/16442) +- Avoid EmptyScope.cloneScope crashing, eg on missing references [#16314](https://github.com/lampepfl/dotty/pull/16314) + +## GADTs + +- Inline GADT state restoring in TypeComparer [#16564](https://github.com/lampepfl/dotty/pull/16564) +- Add extension/conversion to GADT selection healing [#16638](https://github.com/lampepfl/dotty/pull/16638) + +## Incremental compilation + +- Unpickle arguments of parent constructors in Templates lazily [#16688](https://github.com/lampepfl/dotty/pull/16688) + +## Initialization + +- Fix #16438: Supply dummy args for erroneous parent call in init check [#16448](https://github.com/lampepfl/dotty/pull/16448) + +## Inline + +- Dealias in ConstantValue, for inline if cond [#16652](https://github.com/lampepfl/dotty/pull/16652) +- Set Span for top level annotations generated in PostTyper [#16378](https://github.com/lampepfl/dotty/pull/16378) +- Interpolate any type vars from comparing against SelectionProto [#16348](https://github.com/lampepfl/dotty/pull/16348) +- Handle binding of beta reduced inlined lambdas [#16377](https://github.com/lampepfl/dotty/pull/16377) +- Do not add dummy RHS to abstract inline methods [#16510](https://github.com/lampepfl/dotty/pull/16510) +- Warn on inline given aliases with functions as RHS [#16499](https://github.com/lampepfl/dotty/pull/16499) +- Support inline overrides in value classes [#16523](https://github.com/lampepfl/dotty/pull/16523) + +## Java interop + +- Represent Java annotations as interfaces so they can be extended, and disallow various misuses of them [#16260](https://github.com/lampepfl/dotty/pull/16260) + +## Opaque Types + +- Delay opaque alias checking until PostTyper [#16644](https://github.com/lampepfl/dotty/pull/16644) + +## Overloading + +- Handle context function arguments in overloading resolution [#16511](https://github.com/lampepfl/dotty/pull/16511) + +## Parser + +- Improve support for Unicode supplementary characters in identifiers and string interpolation (as in Scala 2) [#16278](https://github.com/lampepfl/dotty/pull/16278) +- Require indent after colon at EOL [#16466](https://github.com/lampepfl/dotty/pull/16466) +- Help givens return refined types [#16293](https://github.com/lampepfl/dotty/pull/16293) + +## Pattern Matching + +- Tweak AvoidMap's derivedSelect [#16563](https://github.com/lampepfl/dotty/pull/16563) +- Space: Use RHS of & when refining subtypes [#16573](https://github.com/lampepfl/dotty/pull/16573) +- Freeze constraints in a condition check of maximiseType [#16526](https://github.com/lampepfl/dotty/pull/16526) +- Restrict syntax of typed patterns [#16150](https://github.com/lampepfl/dotty/pull/16150) +- Test case to show that #16252 works with transparent 
[#16262](https://github.com/lampepfl/dotty/pull/16262) +- Support inline unapplySeq and with leading given parameters [#16358](https://github.com/lampepfl/dotty/pull/16358) +- Handle sealed prefixes in exh checking [#16621](https://github.com/lampepfl/dotty/pull/16621) +- Detect irrefutable quoted patterns [#16674](https://github.com/lampepfl/dotty/pull/16674) + +## Pickling + +- Allow case classes with up to 254 parameters [#16501](https://github.com/lampepfl/dotty/pull/16501) +- Correctly unpickle Scala 2 private case classes in traits [#16519](https://github.com/lampepfl/dotty/pull/16519) + +## Polyfunctions + +- Fix #9996: Crash with function accepting polymorphic function type with singleton result [#16327](https://github.com/lampepfl/dotty/pull/16327) + +## Quotes + +- Remove contents of inline methods [#16345](https://github.com/lampepfl/dotty/pull/16345) +- Fix errors in explicit type annotations in inline match cases [#16257](https://github.com/lampepfl/dotty/pull/16257) +- Handle macro annotation suspends and crashes [#16509](https://github.com/lampepfl/dotty/pull/16509) +- Fix macro annotations `spliceOwner` [#16513](https://github.com/lampepfl/dotty/pull/16513) + +## REPL + +- REPL: Fix crash when printing instances of value classes [#16393](https://github.com/lampepfl/dotty/pull/16393) +- Attempt to fix completion crash [#16267](https://github.com/lampepfl/dotty/pull/16267) +- Fix REPL shadowing bug [#16389](https://github.com/lampepfl/dotty/pull/16389) +- Open up for extensibility [#16276](https://github.com/lampepfl/dotty/pull/16276) +- Don't crash if completions throw [#16687](https://github.com/lampepfl/dotty/pull/16687) + +## Reflection + +- Fix reflect typeMembers to return all members [#15033](https://github.com/lampepfl/dotty/pull/15033) +- Deprecate reflect Flags.Static [#16568](https://github.com/lampepfl/dotty/pull/16568) + +## Reporting + +- Suppress follow-on errors for erroneous import qualifiers [#16658](https://github.com/lampepfl/dotty/pull/16658) +- Fix order in which errors are reported for assignment to val [#16660](https://github.com/lampepfl/dotty/pull/16660) +- Fix class name in error message [#16635](https://github.com/lampepfl/dotty/pull/16635) +- Make refined type printing more source compatible [#16303](https://github.com/lampepfl/dotty/pull/16303) +- Add error hint on local inline def used in quotes [#16572](https://github.com/lampepfl/dotty/pull/16572) +- Fix Text wrapping [#16277](https://github.com/lampepfl/dotty/pull/16277) +- Fix -Wunused:import registering constructor `` instead of its owner (also fix false positive for enum) [#16661](https://github.com/lampepfl/dotty/pull/16661) +- Fix #16675 : -Wunused false positive on case class generated method, due to flags used to distinguish case accessors. [#16683](https://github.com/lampepfl/dotty/pull/16683) +- Fix #16680 by registering Ident not containing a symbol [#16689](https://github.com/lampepfl/dotty/pull/16689) +- Fix #16682: CheckUnused missed some used symbols [#16690](https://github.com/lampepfl/dotty/pull/16690) +- Fix the non-miniphase tree traverser [#16684](https://github.com/lampepfl/dotty/pull/16684) + +## Scala-JS + +- Fix #14289: Accept Ident refs to `js.native` in native member rhs. 
[#16185](https://github.com/lampepfl/dotty/pull/16185) + +## Standard Library + +- Add `CanEqual` instance for `Map` [#15886](https://github.com/lampepfl/dotty/pull/15886) +- Refine `Tuple.Append` return type [#16140](https://github.com/lampepfl/dotty/pull/16140) + +## TASTy format + +- Make it a fatal error if erasure cannot resolve a type [#16373](https://github.com/lampepfl/dotty/pull/16373) + +## Tooling + +- Add -Yimports compiler flag [#16218](https://github.com/lampepfl/dotty/pull/16218) +- Allow BooleanSettings to be set with a colon [#16425](https://github.com/lampepfl/dotty/pull/16425) + +## Transform + +- Avoid stackoverflow in ExplicitOuter [#16381](https://github.com/lampepfl/dotty/pull/16381) +- Make lazy vals run on non-fallback graal image - remove dynamic reflection [#16346](https://github.com/lampepfl/dotty/pull/16346) +- Patch to avoid crash in #16351 [#16354](https://github.com/lampepfl/dotty/pull/16354) +- Don't treat package object's `` methods as package members [#16667](https://github.com/lampepfl/dotty/pull/16667) +- Space: Refine isSubspace property & an example [#16574](https://github.com/lampepfl/dotty/pull/16574) + +## Typer + +- Drop requirement that self types are closed [#16648](https://github.com/lampepfl/dotty/pull/16648) +- Disallow constructor params from appearing in parent types for soundness [#16664](https://github.com/lampepfl/dotty/pull/16664) +- Don't search implicit arguments in singleton type prefix [#16490](https://github.com/lampepfl/dotty/pull/16490) +- Don't rely on isProvisional to determine whether atoms computed [#16489](https://github.com/lampepfl/dotty/pull/16489) +- Support signature polymorphic methods (`MethodHandle` and `VarHandle`) [#16225](https://github.com/lampepfl/dotty/pull/16225) +- Prefer parameterless alternatives during ambiguous overload resolution [#16315](https://github.com/lampepfl/dotty/pull/16315) +- Fix calculation to drop transparent classes [#16344](https://github.com/lampepfl/dotty/pull/16344) +- Test case for issue 16311 [#16317](https://github.com/lampepfl/dotty/pull/16317) +- Skip caching provisional OrType atoms [#16295](https://github.com/lampepfl/dotty/pull/16295) +- Avoid cyclic references due to experimental check when inlining [#16195](https://github.com/lampepfl/dotty/pull/16195) +- Track type variable dependencies to guide instantiation decisions [#16042](https://github.com/lampepfl/dotty/pull/16042) +- Two fixes to constraint solving [#16353](https://github.com/lampepfl/dotty/pull/16353) +- Fix regression in cyclic constraint handling [#16514](https://github.com/lampepfl/dotty/pull/16514) +- Sharpen range approximation for applied types with capture set ranges [#16261](https://github.com/lampepfl/dotty/pull/16261) +- Cut the Gordian Knot: Don't widen unions to transparent [#15642](https://github.com/lampepfl/dotty/pull/15642) +- Fix widening logic to keep instantiation within bounds [#16417](https://github.com/lampepfl/dotty/pull/16417) +- Skip ambiguous reference error when symbols are aliases [#16401](https://github.com/lampepfl/dotty/pull/16401) +- Avoid incorrect simplifications when updating bounds in the constraint [#16410](https://github.com/lampepfl/dotty/pull/16410) +- Take `@targetName` into account when resolving extension methods [#16487](https://github.com/lampepfl/dotty/pull/16487) +- Improve ClassTag handling to avoid invalid ClassTag 
generation and inference failure [#16492](https://github.com/lampepfl/dotty/pull/16492) +- Fix extracting the elemType of a union of arrays [#16569](https://github.com/lampepfl/dotty/pull/16569) +- Make sure annotations are typed in expression contexts [#16699](https://github.com/lampepfl/dotty/pull/16699) +- Throw a type error when using hk-types in unions or intersections [#16712](https://github.com/lampepfl/dotty/pull/16712) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.2.2..3.3.0-RC1` these are: + +``` + 225 Martin Odersky + 73 Dale Wijnand + 58 Szymon Rodziewicz + 54 Nicolas Stucki + 48 Kamil Szewczyk + 48 Paul Coral + 30 Paweł Marks + 28 Florian3k + 28 Yichen Xu + 14 Guillaume Martres + 8 Fengyun Liu + 8 Michał Pałka + 7 Chris Birchall + 7 rochala + 6 Kacper Korban + 6 Sébastien Doeraene + 6 jdudrak + 5 Seth Tisue + 5 Som Snytt + 5 nizhikov + 4 Filip Zybała + 4 Jan Chyb + 4 Michael Pollmeier + 4 Natsu Kagami + 3 Jamie Thompson + 2 Alex + 2 Anatolii Kmetiuk + 2 Dmitrii Naumenko + 2 Lukas Rytz + 2 adampauls + 2 yoshinorin + 1 Alexander Slesarenko + 1 Chris Kipp + 1 Guillaume Raffin + 1 Jakub Kozłowski + 1 Jan-Pieter van den Heuvel + 1 Julien Richard-Foy + 1 Kenji Yoshida + 1 Philippus + 1 Szymon R + 1 Tim Spence + 1 s.bazarsadaev + +``` \ No newline at end of file From 8dbc9051840e8962f49f42500bf6769d1294d4f2 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 26 Jan 2023 15:25:20 +0100 Subject: [PATCH 004/371] Release 3.3.0-RC2 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 9babd3c9c679..75d3e12baf66 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.2.2" - val baseVersion = "3.3.0-RC1" + val baseVersion = "3.3.0-RC2" // Versions used by the vscode extension to create a new project // This should be the latest published releases. 
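The two changes above — emitting TASTy as MinorVersion 3 / ExperimentalVersion 1 and bumping the build to 3.3.0-RC2 — both exist to keep pre-release output distinguishable from final releases. Below is a small, self-contained sketch of the compatibility rule that the `TastyFormat.scala` comments describe; the names, the simplified rule, and the concrete version numbers are illustrative assumptions, not the actual `TastyFormat` API.

```scala
// Illustrative sketch only: models the compatibility rule described in the
// TastyFormat.scala comments in the patch above. Names and simplifications
// are assumptions for illustration, not the real TastyFormat implementation.
object TastyVersionSketch:

  final case class TastyVersion(major: Int, minor: Int, experimental: Int):
    def isFinal: Boolean = experimental == 0

  /** Can a compiler at version `reader` read a TASTy file written at version `file`?
   *  - a final reader accepts final files with the same or a lower MinorVersion;
   *  - a pre-release reader (ExperimentalVersion > 0) accepts final files with a
   *    strictly lower MinorVersion, plus files at exactly its own version. */
  def canRead(reader: TastyVersion, file: TastyVersion): Boolean =
    reader.major == file.major && {
      if file.isFinal then
        if reader.isFinal then file.minor <= reader.minor
        else file.minor < reader.minor
      else file == reader
    }

  def main(args: Array[String]): Unit =
    val rc1Output = TastyVersion(28, 2, 0) // what 3.3.0-RC1 emitted: indistinguishable from final 3.2 TASTy
    val rc2Output = TastyVersion(28, 3, 1) // what it emits after the fix: experimental 3.3 TASTy
    val reader322 = TastyVersion(28, 2, 0) // a final 3.2.x compiler
    println(canRead(reader322, rc1Output)) // true  -- a 3.2.x compiler would accept RC1 output
    println(canRead(reader322, rc2Output)) // false -- RC2 output is correctly rejected
    println(canRead(TastyVersion(28, 3, 1), TastyVersion(28, 2, 0))) // true -- RC2 still reads final 3.2 TASTy
```

Under this model, RC1's output (minor 2, experimental 0) reads as final 3.2 TASTy, which is exactly what the version bump corrects.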
From 033a3b8ac9bac359f6883dd8c4027f8ca7ba4cc1 Mon Sep 17 00:00:00 2001 From: Kacper Korban Date: Tue, 24 Jan 2023 23:22:54 +0100 Subject: [PATCH 005/371] Add default scaladoc settings to scaladoc artifact publishing --- project/Build.scala | 9 +++++++-- project/ScaladocGeneration.scala | 4 ++++ 2 files changed, 11 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 5fab2b80229a..94deedc50582 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -839,6 +839,7 @@ object Build { "-sourcepath", (Compile / sourceDirectories).value.map(_.getAbsolutePath).distinct.mkString(File.pathSeparator), "-Yexplicit-nulls", ), + (Compile / doc / scalacOptions) ++= ScaladocConfigs.DefaultGenerationSettings.value.settings ) lazy val `scala3-library` = project.in(file("library")).asDottyLibrary(NonBootstrapped) @@ -1877,8 +1878,7 @@ object ScaladocConfigs { ) } - lazy val DefaultGenerationConfig = Def.task { - def distLocation = (dist / pack).value + lazy val DefaultGenerationSettings = Def.task { def projectVersion = version.value def socialLinks = SocialLinks(List( "github::https://github.com/lampepfl/dotty", @@ -1919,6 +1919,11 @@ object ScaladocConfigs { ) } + lazy val DefaultGenerationConfig = Def.task { + def distLocation = (dist / pack).value + DefaultGenerationSettings.value + } + lazy val Scaladoc = Def.task { DefaultGenerationConfig.value .add(UseJavacp(true)) diff --git a/project/ScaladocGeneration.scala b/project/ScaladocGeneration.scala index c6c4393c071f..fd972311da1d 100644 --- a/project/ScaladocGeneration.scala +++ b/project/ScaladocGeneration.scala @@ -141,6 +141,7 @@ object ScaladocGeneration { def remove[T <: Arg[_]: ClassTag]: GenerationConfig def withTargets(targets: Seq[String]): GenerationConfig def serialize: String + def settings: Seq[String] } object GenerationConfig { @@ -173,6 +174,9 @@ object ScaladocGeneration { ++ targets ).mkString(" ") + override def settings: Seq[String] = + args.map(_.serialize) ++ targets + private def argsWithout[T <: Arg[_]]( implicit tag: ClassTag[T] ): (Option[T], Seq[Arg[_]]) = args.foldLeft[(Option[T], Seq[Arg[_]])]((None, Seq.empty)) { From c188f1dc96361bc8a3dcc494efffb55e53938d1b Mon Sep 17 00:00:00 2001 From: Mohammad Yousuf Minhaj Zia Date: Wed, 25 Jan 2023 00:48:25 +0800 Subject: [PATCH 006/371] Added jpath check to `ClassLikeSupport` getParentsAsTreeSymbolTuples Fixes #15927 Check for whether the non-scala3 parent exists before checking the start and end of the span to confirm whether the span exists in getParentsAsTreeSymbolTuples. 
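The message above is terse, so a minimal model of the situation it guards against may help: parents that come from Java or Scala 2 class files (for example `javax.swing.JPanel`) carry positions with no backing source path, so their span cannot be queried safely. Everything in this sketch — the case classes and the failure mode — is an illustrative assumption rather than the actual Quotes or scaladoc API; the real guard is the one added in the diff that follows.

```scala
// Simplified model of the guard added below; the shapes here are assumptions
// for illustration, not the actual Quotes/scaladoc API.
object NonScala3ParentSketch:

  /** A source file either has a path on disk (Scala 3 sources) or not
   *  (parents loaded from Java or Scala 2 class files, e.g. javax.swing.JPanel). */
  final case class SourceFile(jpath: Option[String])

  final case class Position(sourceFile: SourceFile, span: Option[(Int, Int)]):
    // In this model, asking for a span that does not exist fails, which is the
    // situation the guard has to avoid.
    def start: Int = span.map(_._1).getOrElse(sys.error("position has no span"))
    def end: Int   = span.map(_._2).getOrElse(sys.error("position has no span"))

  final case class ParentTree(name: String, pos: Position)

  /** Keep only parents we can document from source: the existence check must
   *  short-circuit before `start`/`end` is ever queried. */
  def documentableParents(parents: List[ParentTree]): List[ParentTree] =
    parents.filter(p => p.pos.sourceFile.jpath.isDefined && p.pos.start != p.pos.end)

  def main(args: Array[String]): Unit =
    val scala3Parent = ParentTree("Numeric[Any]", Position(SourceFile(Some("Foo.scala")), Some((10, 22))))
    val javaParent   = ParentTree("JPanel", Position(SourceFile(None), None))
    println(documentableParents(List(scala3Parent, javaParent)).map(_.name)) // List(Numeric[Any])
```

The point is only the ordering: the `getJPath`-style existence check must short-circuit before any span access.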
--- scaladoc-testcases/src/tests/nonScala3Parent.scala | 13 +++++++++++++ .../tools/scaladoc/tasty/ClassLikeSupport.scala | 3 ++- .../TranslatableSignaturesTestCases.scala | 2 ++ 3 files changed, 17 insertions(+), 1 deletion(-) create mode 100644 scaladoc-testcases/src/tests/nonScala3Parent.scala diff --git a/scaladoc-testcases/src/tests/nonScala3Parent.scala b/scaladoc-testcases/src/tests/nonScala3Parent.scala new file mode 100644 index 000000000000..91183d25b583 --- /dev/null +++ b/scaladoc-testcases/src/tests/nonScala3Parent.scala @@ -0,0 +1,13 @@ +package tests +package nonScala3Parent + +import javax.swing.JPanel +import javax.swing.JFrame + +// https://github.com/lampepfl/dotty/issues/15927 + +trait Foo1 extends Numeric[Any] +trait Foo2 extends JPanel +trait Foo3 extends JFrame +trait Foo4 extends Ordering[Any] +trait Foo5 extends Enumeration diff --git a/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala b/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala index 920621b8577c..38cc90330265 100644 --- a/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala +++ b/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala @@ -266,7 +266,8 @@ trait ClassLikeSupport: def getParentsAsTreeSymbolTuples: List[(Tree, Symbol)] = if noPosClassDefs.contains(c.symbol) then Nil else for - parentTree <- c.parents if parentTree.pos.start != parentTree.pos.end // We assume here that order is correct + // TODO: add exists function to position methods in Quotes and replace the condition here for checking the JPath + parentTree <- c.parents if parentTree.pos.sourceFile.getJPath.isDefined && parentTree.pos.start != parentTree.pos.end // We assume here that order is correct parentSymbol = parentTree match case t: TypeTree => t.tpe.typeSymbol case tree if tree.symbol.isClassConstructor => tree.symbol.owner diff --git a/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala b/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala index ab7c2189e5d5..49316b08dbc0 100644 --- a/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala +++ b/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala @@ -106,3 +106,5 @@ class ImplicitMembers extends SignatureTest( Seq("def"), filterFunc = _.toString.endsWith("OuterClass$ImplicitMemberTarget.html") ) + +class NonScala3Parent extends SignatureTest("nonScala3Parent", SignatureTest.all) From 92c5dade99f41cb820bcefe946cc21e0eaf6934e Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 30 Dec 2022 17:58:05 +0000 Subject: [PATCH 007/371] Split out immutable GadtConstraint --- .../dotty/tools/dotc/core/Constraint.scala | 3 + .../tools/dotc/core/ConstraintHandling.scala | 11 +- .../src/dotty/tools/dotc/core/Contexts.scala | 10 +- .../tools/dotc/core/GadtConstraint.scala | 351 +++++++++--------- .../tools/dotc/core/OrderingConstraint.scala | 11 + .../dotc/core/PatternTypeConstrainer.scala | 2 +- .../dotty/tools/dotc/core/TypeComparer.scala | 26 +- .../tools/dotc/transform/PostTyper.scala | 2 +- .../dotty/tools/dotc/typer/Implicits.scala | 8 +- .../src/dotty/tools/dotc/typer/Typer.scala | 4 +- 10 files changed, 213 insertions(+), 215 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/Constraint.scala b/compiler/src/dotty/tools/dotc/core/Constraint.scala index b849c7aa7093..c634f847e510 100644 --- a/compiler/src/dotty/tools/dotc/core/Constraint.scala +++ b/compiler/src/dotty/tools/dotc/core/Constraint.scala @@ 
-71,6 +71,9 @@ abstract class Constraint extends Showable { */ def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds + /** The current bounds of type parameter `param` */ + def bounds(param: TypeParamRef)(using Context): TypeBounds + /** A new constraint which is derived from this constraint by adding * entries for all type parameters of `poly`. * @param tvars A list of type variables associated with the params, diff --git a/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala b/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala index 6207e0a3d728..9ffe2bda73cb 100644 --- a/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala +++ b/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala @@ -749,16 +749,7 @@ trait ConstraintHandling { } /** The current bounds of type parameter `param` */ - def bounds(param: TypeParamRef)(using Context): TypeBounds = { - val e = constraint.entry(param) - if (e.exists) e.bounds - else { - // TODO: should we change the type of paramInfos to nullable? - val pinfos: List[param.binder.PInfo] | Null = param.binder.paramInfos - if (pinfos != null) pinfos(param.paramNum) // pinfos == null happens in pos/i536.scala - else TypeBounds.empty - } - } + def bounds(param: TypeParamRef)(using Context): TypeBounds = constraint.bounds(param) /** Add type lambda `tl`, possibly with type variables `tvars`, to current constraint * and propagate all bounds. diff --git a/compiler/src/dotty/tools/dotc/core/Contexts.scala b/compiler/src/dotty/tools/dotc/core/Contexts.scala index 5f82e8c8b6ce..398af92e494d 100644 --- a/compiler/src/dotty/tools/dotc/core/Contexts.scala +++ b/compiler/src/dotty/tools/dotc/core/Contexts.scala @@ -141,7 +141,7 @@ object Contexts { def tree: Tree[?] def scope: Scope def typerState: TyperState - def gadt: GadtConstraint + def gadt: GadtConstraintHandling def searchHistory: SearchHistory def source: SourceFile @@ -541,8 +541,8 @@ object Contexts { private var _typerState: TyperState = uninitialized final def typerState: TyperState = _typerState - private var _gadt: GadtConstraint = uninitialized - final def gadt: GadtConstraint = _gadt + private var _gadt: GadtConstraintHandling = uninitialized + final def gadt: GadtConstraintHandling = _gadt private var _searchHistory: SearchHistory = uninitialized final def searchHistory: SearchHistory = _searchHistory @@ -624,7 +624,7 @@ object Contexts { this._scope = typer.scope setTypeAssigner(typer) - def setGadt(gadt: GadtConstraint): this.type = + def setGadt(gadt: GadtConstraintHandling): this.type = util.Stats.record("Context.setGadt") this._gadt = gadt this @@ -721,7 +721,7 @@ object Contexts { .updated(notNullInfosLoc, Nil) .updated(compilationUnitLoc, NoCompilationUnit) c._searchHistory = new SearchRoot - c._gadt = GadtConstraint.empty + c._gadt = GadtConstraintHandling(GadtConstraint.empty) c end FreshContext diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index 7515898a36df..d683c15a9241 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -2,34 +2,145 @@ package dotty.tools package dotc package core -import Decorators._ -import Contexts._ -import Types._ -import Symbols._ +import Contexts.*, Decorators.*, Symbols.*, Types.* +import config.Printers.{gadts, gadtsConstr} import util.{SimpleIdentitySet, SimpleIdentityMap} -import collection.mutable import printing._ +import scala.annotation.tailrec +import 
scala.annotation.internal.sharable +import scala.collection.mutable + object GadtConstraint: - def apply(): GadtConstraint = empty - def empty: GadtConstraint = - new ProperGadtConstraint(OrderingConstraint.empty, SimpleIdentityMap.empty, SimpleIdentityMap.empty, false) + @sharable val empty: GadtConstraint = + GadtConstraint(OrderingConstraint.empty, SimpleIdentityMap.empty, SimpleIdentityMap.empty, false) /** Represents GADT constraints currently in scope */ -sealed trait GadtConstraint ( - private var myConstraint: Constraint, - private var mapping: SimpleIdentityMap[Symbol, TypeVar], - private var reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol], - private var wasConstrained: Boolean -) extends Showable { - this: ConstraintHandling => +class GadtConstraint private ( + private val myConstraint: Constraint, + private val mapping: SimpleIdentityMap[Symbol, TypeVar], + private val reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol], + private val wasConstrained: Boolean, +) extends Showable: + def constraint: Constraint = myConstraint + def symbols: List[Symbol] = mapping.keys + def withConstraint(c: Constraint) = copy(myConstraint = c) + def withWasConstrained = copy(wasConstrained = true) + + def add(sym: Symbol, tv: TypeVar): GadtConstraint = copy( + mapping = mapping.updated(sym, tv), + reverseMapping = reverseMapping.updated(tv.origin, sym), + ) + + /** Is `sym1` ordered to be less than `sym2`? */ + def isLess(sym1: Symbol, sym2: Symbol)(using Context): Boolean = + constraint.isLess(tvarOrError(sym1).origin, tvarOrError(sym2).origin) + + /** Full bounds of `sym`, including TypeRefs to other lower/upper symbols. + * + * @note this performs subtype checks between ordered symbols. + * Using this in isSubType can lead to infinite recursion. Consider `bounds` instead. + */ + def fullBounds(sym: Symbol)(using Context): TypeBounds | Null = mapping(sym) match + case null => null + case tv: TypeVar => fullBounds(tv.origin) // .ensuring(containsNoInternalTypes(_)) + + /** Immediate bounds of `sym`. Does not contain lower/upper symbols (see [[fullBounds]]). */ + def bounds(sym: Symbol)(using Context): TypeBounds | Null = + mapping(sym) match + case null => null + case tv: TypeVar => + def retrieveBounds: TypeBounds = externalize(constraint.bounds(tv.origin)).bounds + retrieveBounds + //.showing(i"gadt bounds $sym: $result", gadts) + //.ensuring(containsNoInternalTypes(_)) + + /** Is the symbol registered in the constraint? + * + * @note this is true even if the symbol is constrained to be equal to another type, unlike [[Constraint.contains]]. 
+ */ + def contains(sym: Symbol)(using Context): Boolean = mapping(sym) != null + + /** GADT constraint narrows bounds of at least one variable */ + def isNarrowing: Boolean = wasConstrained + + def fullBounds(param: TypeParamRef)(using Context): TypeBounds = + nonParamBounds(param).derivedTypeBounds(fullLowerBound(param), fullUpperBound(param)) + + def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = + externalize(constraint.nonParamBounds(param)).bounds + + def fullLowerBound(param: TypeParamRef)(using Context): Type = + constraint.minLower(param).foldLeft(nonParamBounds(param).lo) { + (t, u) => t | externalize(u) + } + + def fullUpperBound(param: TypeParamRef)(using Context): Type = + constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (t, u) => + val eu = externalize(u) + // Any as the upper bound means "no bound", but if F is higher-kinded, + // Any & F = F[_]; this is wrong for us so we need to short-circuit + if t.isAny then eu else t & eu + } + + def externalize(tp: Type, theMap: TypeMap | Null = null)(using Context): Type = tp match + case param: TypeParamRef => reverseMapping(param) match + case sym: Symbol => sym.typeRef + case null => param + case tp: TypeAlias => tp.derivedAlias(externalize(tp.alias, theMap)) + case tp => (if theMap == null then ExternalizeMap() else theMap).mapOver(tp) + + private class ExternalizeMap(using Context) extends TypeMap: + def apply(tp: Type): Type = externalize(tp, this)(using mapCtx) + + def tvarOrError(sym: Symbol)(using Context): TypeVar = + mapping(sym).ensuring(_ != null, i"not a constrainable symbol: $sym").uncheckedNN + + @tailrec final def stripInternalTypeVar(tp: Type): Type = tp match + case tv: TypeVar => + val inst = constraint.instType(tv) + if inst.exists then stripInternalTypeVar(inst) else tv + case _ => tp + + def internalize(tp: Type)(using Context): Type = tp match + case nt: NamedType => + val ntTvar = mapping(nt.symbol) + if ntTvar == null then tp + else ntTvar + case _ => tp + + private def containsNoInternalTypes(tp: Type, theAcc: TypeAccumulator[Boolean] | Null = null)(using Context): Boolean = tp match { + case tpr: TypeParamRef => !reverseMapping.contains(tpr) + case tv: TypeVar => !reverseMapping.contains(tv.origin) + case tp => + (if (theAcc != null) theAcc else new ContainsNoInternalTypesAccumulator()).foldOver(true, tp) + } - import dotty.tools.dotc.config.Printers.{gadts, gadtsConstr} + private class ContainsNoInternalTypesAccumulator(using Context) extends TypeAccumulator[Boolean] { + override def apply(x: Boolean, tp: Type): Boolean = x && containsNoInternalTypes(tp, this) + } - private[core] def getConstraint: Constraint = constraint - private[core] def getMapping: SimpleIdentityMap[Symbol, TypeVar] = mapping - private[core] def getReverseMapping: SimpleIdentityMap[TypeParamRef, Symbol] = reverseMapping - private[core] def getWasConstrained: Boolean = wasConstrained + override def toText(printer: Printer): Texts.Text = printer.toText(this) + + /** Provides more information than toText, by showing the underlying Constraint details. 
*/ + def debugBoundsDescription(using Context): String = i"$this\n$constraint" + + private def copy( + myConstraint: Constraint = myConstraint, + mapping: SimpleIdentityMap[Symbol, TypeVar] = mapping, + reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol] = reverseMapping, + wasConstrained: Boolean = wasConstrained, + ): GadtConstraint = GadtConstraint(myConstraint, mapping, reverseMapping, wasConstrained) +end GadtConstraint + +object GadtConstraintHandling: + def apply(gadt: GadtConstraint): GadtConstraintHandling = new ProperGadtConstraintHandling(gadt) + +sealed trait GadtConstraintHandling(private var myGadt: GadtConstraint) { + this: ConstraintHandling => + + def gadt: GadtConstraint = myGadt + private def gadt_=(g: GadtConstraint) = myGadt = g /** Exposes ConstraintHandling.subsumes */ def subsumes(left: GadtConstraint, right: GadtConstraint, pre: GadtConstraint)(using Context): Boolean = { @@ -57,22 +168,19 @@ sealed trait GadtConstraint ( // and used as orderings. def substDependentSyms(tp: Type, isUpper: Boolean)(using Context): Type = { def loop(tp: Type) = substDependentSyms(tp, isUpper) - tp match { + tp match case tp @ AndType(tp1, tp2) if !isUpper => tp.derivedAndType(loop(tp1), loop(tp2)) case tp @ OrType(tp1, tp2) if isUpper => tp.derivedOrType(loop(tp1), loop(tp2)) case tp: NamedType => - params.indexOf(tp.symbol) match { + params.indexOf(tp.symbol) match case -1 => - mapping(tp.symbol) match { + gadt.internalize(tp) match case tv: TypeVar => tv.origin - case null => tp - } + case _ => tp case i => pt.paramRefs(i) - } case tp => tp - } } val tb = param.info.bounds @@ -86,205 +194,92 @@ sealed trait GadtConstraint ( val tvars = params.lazyZip(poly1.paramRefs).map { (sym, paramRef) => val tv = TypeVar(paramRef, creatorState = null) - mapping = mapping.updated(sym, tv) - reverseMapping = reverseMapping.updated(tv.origin, sym) + gadt = gadt.add(sym, tv) tv } // The replaced symbols are picked up here. addToConstraint(poly1, tvars) - .showing(i"added to constraint: [$poly1] $params%, % gadt = $this", gadts) + .showing(i"added to constraint: [$poly1] $params%, % gadt = $gadt", gadts) } /** Further constrain a symbol already present in the constraint. 
*/ def addBound(sym: Symbol, bound: Type, isUpper: Boolean)(using Context): Boolean = { - @annotation.tailrec def stripInternalTypeVar(tp: Type): Type = tp match { - case tv: TypeVar => - val inst = constraint.instType(tv) - if (inst.exists) stripInternalTypeVar(inst) else tv - case _ => tp - } - - val symTvar: TypeVar = stripInternalTypeVar(tvarOrError(sym)) match { + val symTvar: TypeVar = gadt.stripInternalTypeVar(gadt.tvarOrError(sym)) match case tv: TypeVar => tv case inst => gadts.println(i"instantiated: $sym -> $inst") - return if (isUpper) isSub(inst, bound) else isSub(bound, inst) - } + return if isUpper then isSub(inst, bound) else isSub(bound, inst) - val internalizedBound = bound match { - case nt: NamedType => - val ntTvar = mapping(nt.symbol) - if (ntTvar != null) stripInternalTypeVar(ntTvar) else bound - case _ => bound - } + val internalizedBound = gadt.stripInternalTypeVar(gadt.internalize(bound)) val saved = constraint val result = internalizedBound match case boundTvar: TypeVar => - if (boundTvar eq symTvar) true - else if (isUpper) addLess(symTvar.origin, boundTvar.origin) + if boundTvar eq symTvar then true + else if isUpper + then addLess(symTvar.origin, boundTvar.origin) else addLess(boundTvar.origin, symTvar.origin) case bound => addBoundTransitively(symTvar.origin, bound, isUpper) gadts.println { - val descr = if (isUpper) "upper" else "lower" - val op = if (isUpper) "<:" else ">:" + val descr = if isUpper then "upper" else "lower" + val op = if isUpper then "<:" else ">:" i"adding $descr bound $sym $op $bound = $result" } - if constraint ne saved then wasConstrained = true + if constraint ne saved then gadt = gadt.withWasConstrained result } - /** Is `sym1` ordered to be less than `sym2`? */ - def isLess(sym1: Symbol, sym2: Symbol)(using Context): Boolean = - constraint.isLess(tvarOrError(sym1).origin, tvarOrError(sym2).origin) - - /** Full bounds of `sym`, including TypeRefs to other lower/upper symbols. - * - * @note this performs subtype checks between ordered symbols. - * Using this in isSubType can lead to infinite recursion. Consider `bounds` instead. - */ - def fullBounds(sym: Symbol)(using Context): TypeBounds | Null = - mapping(sym) match { - case null => null - // TODO: Improve flow typing so that ascription becomes redundant - case tv: TypeVar => - fullBounds(tv.origin) - // .ensuring(containsNoInternalTypes(_)) - } - - /** Immediate bounds of `sym`. Does not contain lower/upper symbols (see [[fullBounds]]). */ - def bounds(sym: Symbol)(using Context): TypeBounds | Null = - mapping(sym) match { - case null => null - // TODO: Improve flow typing so that ascription becomes redundant - case tv: TypeVar => - def retrieveBounds: TypeBounds = externalize(bounds(tv.origin)).bounds - retrieveBounds - //.showing(i"gadt bounds $sym: $result", gadts) - //.ensuring(containsNoInternalTypes(_)) - } - - /** Is the symbol registered in the constraint? - * - * @note this is true even if the symbol is constrained to be equal to another type, unlike [[Constraint.contains]]. 
- */ - def contains(sym: Symbol)(using Context): Boolean = mapping(sym) != null - - /** GADT constraint narrows bounds of at least one variable */ - def isNarrowing: Boolean = wasConstrained + def isLess(sym1: Symbol, sym2: Symbol)(using Context): Boolean = gadt.isLess(sym1, sym2) + def fullBounds(sym: Symbol)(using Context): TypeBounds | Null = gadt.fullBounds(sym) + def bounds(sym: Symbol)(using Context): TypeBounds | Null = gadt.bounds(sym) + def contains(sym: Symbol)(using Context): Boolean = gadt.contains(sym) + def isNarrowing: Boolean = gadt.isNarrowing + def symbols: List[Symbol] = gadt.symbols /** See [[ConstraintHandling.approximation]] */ def approximation(sym: Symbol, fromBelow: Boolean, maxLevel: Int = Int.MaxValue)(using Context): Type = { - val res = - approximation(tvarOrError(sym).origin, fromBelow, maxLevel) match - case tpr: TypeParamRef => - // Here we do externalization when the returned type is a TypeParamRef, - // b/c ConstraintHandling.approximation may return internal types when - // the type variable is instantiated. See #15531. - externalize(tpr) - case tp => tp - - gadts.println(i"approximating $sym ~> $res") - res + approximation(gadt.tvarOrError(sym).origin, fromBelow, maxLevel).match + case tpr: TypeParamRef => + // Here we do externalization when the returned type is a TypeParamRef, + // b/c ConstraintHandling.approximation may return internal types when + // the type variable is instantiated. See #15531. + gadt.externalize(tpr) + case tp => tp + .showing(i"approximating $sym ~> $result", gadts) } - def symbols: List[Symbol] = mapping.keys + def fresh: GadtConstraintHandling = GadtConstraintHandling(gadt) - def fresh: GadtConstraint = new ProperGadtConstraint(myConstraint, mapping, reverseMapping, wasConstrained) - - /** Restore the state from other [[GadtConstraint]], probably copied using [[fresh]] */ - def restore(other: GadtConstraint): Unit = - this.myConstraint = other.myConstraint - this.mapping = other.mapping - this.reverseMapping = other.reverseMapping - this.wasConstrained = other.wasConstrained - - def restore(constr: Constraint, mapping: SimpleIdentityMap[Symbol, TypeVar], revMapping: SimpleIdentityMap[TypeParamRef, Symbol], wasConstrained: Boolean): Unit = - this.myConstraint = constr - this.mapping = mapping - this.reverseMapping = revMapping - this.wasConstrained = wasConstrained + /** Restore the GadtConstraint state. 
*/ + def restore(gadt: GadtConstraint): Unit = this.gadt = gadt inline def rollbackGadtUnless(inline op: Boolean): Boolean = - val savedConstr = myConstraint - val savedMapping = mapping - val savedReverseMapping = reverseMapping - val savedWasConstrained = wasConstrained + val saved = gadt var result = false - try - result = op - finally - if !result then - restore(savedConstr, savedMapping, savedReverseMapping, savedWasConstrained) + try result = op + finally if !result then restore(saved) result // ---- Protected/internal ----------------------------------------------- - override protected def constraint = myConstraint - override protected def constraint_=(c: Constraint) = myConstraint = c + override protected def constraint = gadt.constraint + override protected def constraint_=(c: Constraint) = gadt = gadt.withConstraint(c) override protected def isSub(tp1: Type, tp2: Type)(using Context): Boolean = TypeComparer.isSubType(tp1, tp2) override protected def isSame(tp1: Type, tp2: Type)(using Context): Boolean = TypeComparer.isSameType(tp1, tp2) - override def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = - externalize(constraint.nonParamBounds(param)).bounds - - override def fullLowerBound(param: TypeParamRef)(using Context): Type = - constraint.minLower(param).foldLeft(nonParamBounds(param).lo) { - (t, u) => t | externalize(u) - } - - override def fullUpperBound(param: TypeParamRef)(using Context): Type = - constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (t, u) => - val eu = externalize(u) - // Any as the upper bound means "no bound", but if F is higher-kinded, - // Any & F = F[_]; this is wrong for us so we need to short-circuit - if t.isAny then eu else t & eu - } - - // ---- Private ---------------------------------------------------------- - - private def externalize(tp: Type, theMap: TypeMap | Null = null)(using Context): Type = tp match - case param: TypeParamRef => reverseMapping(param) match - case sym: Symbol => sym.typeRef - case null => param - case tp: TypeAlias => tp.derivedAlias(externalize(tp.alias, theMap)) - case tp => (if theMap == null then ExternalizeMap() else theMap).mapOver(tp) - - private class ExternalizeMap(using Context) extends TypeMap: - def apply(tp: Type): Type = externalize(tp, this)(using mapCtx) - - private def tvarOrError(sym: Symbol)(using Context): TypeVar = - mapping(sym).ensuring(_ != null, i"not a constrainable symbol: $sym").uncheckedNN - - private def containsNoInternalTypes(tp: Type, theAcc: TypeAccumulator[Boolean] | Null = null)(using Context): Boolean = tp match { - case tpr: TypeParamRef => !reverseMapping.contains(tpr) - case tv: TypeVar => !reverseMapping.contains(tv.origin) - case tp => - (if (theAcc != null) theAcc else new ContainsNoInternalTypesAccumulator()).foldOver(true, tp) - } - - private class ContainsNoInternalTypesAccumulator(using Context) extends TypeAccumulator[Boolean] { - override def apply(x: Boolean, tp: Type): Boolean = x && containsNoInternalTypes(tp, this) - } + override def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = gadt.nonParamBounds(param) + override def fullLowerBound(param: TypeParamRef)(using Context): Type = gadt.fullLowerBound(param) + override def fullUpperBound(param: TypeParamRef)(using Context): Type = gadt.fullUpperBound(param) // ---- Debug ------------------------------------------------------------ override def constr = gadtsConstr - - override def toText(printer: Printer): Texts.Text = printer.toText(this) - - /** Provides more information 
than toText, by showing the underlying Constraint details. */ - def debugBoundsDescription(using Context): String = i"$this\n$constraint" } -private class ProperGadtConstraint ( - myConstraint: Constraint, - mapping: SimpleIdentityMap[Symbol, TypeVar], - reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol], - wasConstrained: Boolean, -) extends ConstraintHandling with GadtConstraint(myConstraint, mapping, reverseMapping, wasConstrained) +// Hide ConstraintHandling within GadtConstraintHandling +private class ProperGadtConstraintHandling(gadt: GadtConstraint) extends ConstraintHandling with GadtConstraintHandling(gadt) diff --git a/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala b/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala index 212b70336f4b..faea30390d2b 100644 --- a/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala @@ -224,6 +224,17 @@ class OrderingConstraint(private val boundsMap: ParamBounds, def exclusiveUpper(param: TypeParamRef, butNot: TypeParamRef): List[TypeParamRef] = upper(param).filterNot(isLess(butNot, _)) + def bounds(param: TypeParamRef)(using Context): TypeBounds = { + val e = entry(param) + if (e.exists) e.bounds + else { + // TODO: should we change the type of paramInfos to nullable? + val pinfos: List[param.binder.PInfo] | Null = param.binder.paramInfos + if (pinfos != null) pinfos(param.paramNum) // pinfos == null happens in pos/i536.scala + else TypeBounds.empty + } + } + // ---------- Info related to TypeParamRefs ------------------------------------------- def isLess(param1: TypeParamRef, param2: TypeParamRef): Boolean = diff --git a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala index e7f54d088c09..9fff257ee963 100644 --- a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala +++ b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala @@ -261,7 +261,7 @@ trait PatternTypeConstrainer { self: TypeComparer => val assumeInvariantRefinement = migrateTo3 || forceInvariantRefinement || refinementIsInvariant(patternTp) - trace(i"constraining simple pattern type $tp >:< $pt", gadts, (res: Boolean) => i"$res gadt = ${ctx.gadt}") { + trace(i"constraining simple pattern type $tp >:< $pt", gadts, (res: Boolean) => i"$res gadt = ${ctx.gadt.gadt}") { (tp, pt) match { case (AppliedType(tyconS, argsS), AppliedType(tyconP, argsP)) => val saved = state.nn.constraint diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index a0eb5139eb07..0bb1529ef396 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -1442,14 +1442,11 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling if tp2 eq NoType then false else if tp1 eq tp2 then true else - val saved = constraint - val savedGadtConstr = ctx.gadt.getConstraint - val savedMapping = ctx.gadt.getMapping - val savedReverseMapping = ctx.gadt.getReverseMapping - val savedWasConstrained = ctx.gadt.getWasConstrained + val savedCstr = constraint + val savedGadt = ctx.gadt.gadt inline def restore() = - state.constraint = saved - ctx.gadt.restore(savedGadtConstr, savedMapping, savedReverseMapping, savedWasConstrained) + state.constraint = savedCstr + ctx.gadt.restore(savedGadt) val savedSuccessCount = successCount try recCount += 1 @@ -1855,22 +1852,23 @@ class 
TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling */ private def necessaryEither(op1: => Boolean, op2: => Boolean): Boolean = val preConstraint = constraint - val preGadt = ctx.gadt.fresh + val preGadtHandling = ctx.gadt.fresh + val preGadt = preGadtHandling.gadt def allSubsumes(leftGadt: GadtConstraint, rightGadt: GadtConstraint, left: Constraint, right: Constraint): Boolean = - subsumes(left, right, preConstraint) && preGadt.subsumes(leftGadt, rightGadt, preGadt) + subsumes(left, right, preConstraint) && preGadtHandling.subsumes(leftGadt, rightGadt, preGadt) if op1 then val op1Constraint = constraint - val op1Gadt = ctx.gadt.fresh + val op1Gadt = ctx.gadt.gadt constraint = preConstraint ctx.gadt.restore(preGadt) if op2 then - if allSubsumes(op1Gadt, ctx.gadt, op1Constraint, constraint) then - gadts.println(i"GADT CUT - prefer ${ctx.gadt} over $op1Gadt") + if allSubsumes(op1Gadt, ctx.gadt.gadt, op1Constraint, constraint) then + gadts.println(i"GADT CUT - prefer ${ctx.gadt.gadt} over $op1Gadt") constr.println(i"CUT - prefer $constraint over $op1Constraint") - else if allSubsumes(ctx.gadt, op1Gadt, constraint, op1Constraint) then - gadts.println(i"GADT CUT - prefer $op1Gadt over ${ctx.gadt}") + else if allSubsumes(ctx.gadt.gadt, op1Gadt, constraint, op1Constraint) then + gadts.println(i"GADT CUT - prefer $op1Gadt over ${ctx.gadt.gadt}") constr.println(i"CUT - prefer $op1Constraint over $constraint") constraint = op1Constraint ctx.gadt.restore(op1Gadt) diff --git a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala index 5abb32b15d57..19a18305ee65 100644 --- a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +++ b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala @@ -269,7 +269,7 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase case CaseDef(pat, _, _) => val gadtCtx = pat.removeAttachment(typer.Typer.InferredGadtConstraints) match - case Some(gadt) => ctx.fresh.setGadt(gadt) + case Some(gadt) => ctx.fresh.setGadt(GadtConstraintHandling(gadt)) case None => ctx super.transform(tree)(using gadtCtx) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 3e0e7dd5879d..596ad01bb888 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1030,7 +1030,7 @@ trait Implicits: case result: SearchSuccess => if result.tstate ne ctx.typerState then result.tstate.commit() - if result.gstate ne ctx.gadt then + if result.gstate ne ctx.gadt.gadt then ctx.gadt.restore(result.gstate) if hasSkolem(false, result.tree) then report.error(SkolemInInferred(result.tree, pt, argument), ctx.source.atSpan(span)) @@ -1145,7 +1145,7 @@ trait Implicits: SearchFailure(adapted.withType(new MismatchedImplicit(ref, pt, argument))) } else - SearchSuccess(adapted, ref, cand.level, cand.isExtension)(ctx.typerState, ctx.gadt) + SearchSuccess(adapted, ref, cand.level, cand.isExtension)(ctx.typerState, ctx.gadt.gadt) } /** An implicit search; parameters as in `inferImplicit` */ @@ -1343,7 +1343,7 @@ trait Implicits: case _: SearchFailure => SearchSuccess(ref(defn.NotGiven_value), defn.NotGiven_value.termRef, 0)( ctx.typerState.fresh().setCommittable(true), - ctx.gadt + ctx.gadt.gadt ) case _: SearchSuccess => NoMatchingImplicitsFailure @@ -1526,7 +1526,7 @@ trait Implicits: // other candidates need to be considered. 
recursiveRef match case ref: TermRef => - SearchSuccess(tpd.ref(ref).withSpan(span.startPos), ref, 0)(ctx.typerState, ctx.gadt) + SearchSuccess(tpd.ref(ref).withSpan(span.startPos), ref, 0)(ctx.typerState, ctx.gadt.gadt) case _ => searchImplicit(contextual = true) end bestImplicit diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 1a24a94e527e..1cb723b85c27 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -1782,7 +1782,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // see tests/pos/i12226 and issue #12226. It might be possible that this // will end up taking too much memory. If it does, we should just limit // how much GADT constraints we infer - it's always sound to infer less. - pat1.putAttachment(InferredGadtConstraints, ctx.gadt) + pat1.putAttachment(InferredGadtConstraints, ctx.gadt.gadt) if (pt1.isValueType) // insert a cast if body does not conform to expected type if we disregard gadt bounds body1 = body1.ensureConforms(pt1)(using originalCtx) assignType(cpy.CaseDef(tree)(pat1, guard1, body1), pat1, body1) @@ -3835,7 +3835,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer adaptToSubType(wtp) case CompareResult.OKwithGADTUsed if pt.isValueType - && !inContext(ctx.fresh.setGadt(GadtConstraint.empty)) { + && !inContext(ctx.fresh.setGadt(GadtConstraintHandling(GadtConstraint.empty))) { val res = (tree.tpe.widenExpr frozen_<:< pt) if res then // we overshot; a cast is not needed, after all. From 58d0a291e64cb177357872ad4927e712b5df18a5 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 24 Jan 2023 12:16:05 +0000 Subject: [PATCH 008/371] Rename GadtConstraintHandling to GadtState --- .../src/dotty/tools/dotc/core/Contexts.scala | 21 ++++++------ .../tools/dotc/core/GadtConstraint.scala | 33 +++++++------------ .../src/dotty/tools/dotc/core/NamerOps.scala | 6 ++-- .../dotc/core/PatternTypeConstrainer.scala | 4 +-- .../src/dotty/tools/dotc/core/Symbols.scala | 2 +- .../dotty/tools/dotc/core/TypeComparer.scala | 32 +++++++++--------- .../src/dotty/tools/dotc/core/TypeOps.scala | 6 ++-- .../tools/dotc/inlines/InlineReducer.scala | 4 +-- .../tools/dotc/transform/PostTyper.scala | 2 +- .../dotty/tools/dotc/typer/Implicits.scala | 10 +++--- .../dotty/tools/dotc/typer/Inferencing.scala | 4 +-- .../src/dotty/tools/dotc/typer/Namer.scala | 2 +- .../src/dotty/tools/dotc/typer/Typer.scala | 8 ++--- .../quoted/runtime/impl/QuotesImpl.scala | 2 +- 14 files changed, 63 insertions(+), 73 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/Contexts.scala b/compiler/src/dotty/tools/dotc/core/Contexts.scala index 398af92e494d..2f28975dd066 100644 --- a/compiler/src/dotty/tools/dotc/core/Contexts.scala +++ b/compiler/src/dotty/tools/dotc/core/Contexts.scala @@ -141,7 +141,8 @@ object Contexts { def tree: Tree[?] 
def scope: Scope def typerState: TyperState - def gadt: GadtConstraintHandling + def gadt: GadtConstraint = gadtState.gadt + def gadtState: GadtState def searchHistory: SearchHistory def source: SourceFile @@ -410,7 +411,7 @@ object Contexts { val constrCtx = outersIterator.dropWhile(_.outer.owner == owner).next() superOrThisCallContext(owner, constrCtx.scope) .setTyperState(typerState) - .setGadt(gadt) + .setGadtState(gadtState) .fresh .setScope(this.scope) } @@ -541,8 +542,8 @@ object Contexts { private var _typerState: TyperState = uninitialized final def typerState: TyperState = _typerState - private var _gadt: GadtConstraintHandling = uninitialized - final def gadt: GadtConstraintHandling = _gadt + private var _gadtState: GadtState = uninitialized + final def gadtState: GadtState = _gadtState private var _searchHistory: SearchHistory = uninitialized final def searchHistory: SearchHistory = _searchHistory @@ -567,7 +568,7 @@ object Contexts { _owner = origin.owner _tree = origin.tree _scope = origin.scope - _gadt = origin.gadt + _gadtState = origin.gadtState _searchHistory = origin.searchHistory _source = origin.source _moreProperties = origin.moreProperties @@ -624,12 +625,12 @@ object Contexts { this._scope = typer.scope setTypeAssigner(typer) - def setGadt(gadt: GadtConstraintHandling): this.type = - util.Stats.record("Context.setGadt") - this._gadt = gadt + def setGadtState(gadtState: GadtState): this.type = + util.Stats.record("Context.setGadtState") + this._gadtState = gadtState this def setFreshGADTBounds: this.type = - setGadt(gadt.fresh) + setGadtState(gadtState.fresh) def setSearchHistory(searchHistory: SearchHistory): this.type = util.Stats.record("Context.setSearchHistory") @@ -721,7 +722,7 @@ object Contexts { .updated(notNullInfosLoc, Nil) .updated(compilationUnitLoc, NoCompilationUnit) c._searchHistory = new SearchRoot - c._gadt = GadtConstraintHandling(GadtConstraint.empty) + c._gadtState = GadtState(GadtConstraint.empty) c end FreshContext diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index d683c15a9241..a863a982a44d 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -133,20 +133,14 @@ class GadtConstraint private ( ): GadtConstraint = GadtConstraint(myConstraint, mapping, reverseMapping, wasConstrained) end GadtConstraint -object GadtConstraintHandling: - def apply(gadt: GadtConstraint): GadtConstraintHandling = new ProperGadtConstraintHandling(gadt) +object GadtState: + def apply(gadt: GadtConstraint): GadtState = ProperGadtState(gadt) -sealed trait GadtConstraintHandling(private var myGadt: GadtConstraint) { - this: ConstraintHandling => +sealed trait GadtState { + this: ConstraintHandling => // Hide ConstraintHandling within GadtConstraintHandling - def gadt: GadtConstraint = myGadt - private def gadt_=(g: GadtConstraint) = myGadt = g - - /** Exposes ConstraintHandling.subsumes */ - def subsumes(left: GadtConstraint, right: GadtConstraint, pre: GadtConstraint)(using Context): Boolean = { - def extractConstraint(g: GadtConstraint) = g.constraint - subsumes(extractConstraint(left), extractConstraint(right), extractConstraint(pre)) - } + def gadt: GadtConstraint + def gadt_=(g: GadtConstraint): Unit override protected def legalBound(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Type = // GADT constraints never involve wildcards and are not propagated outside @@ -233,13 +227,6 @@ 
sealed trait GadtConstraintHandling(private var myGadt: GadtConstraint) { result } - def isLess(sym1: Symbol, sym2: Symbol)(using Context): Boolean = gadt.isLess(sym1, sym2) - def fullBounds(sym: Symbol)(using Context): TypeBounds | Null = gadt.fullBounds(sym) - def bounds(sym: Symbol)(using Context): TypeBounds | Null = gadt.bounds(sym) - def contains(sym: Symbol)(using Context): Boolean = gadt.contains(sym) - def isNarrowing: Boolean = gadt.isNarrowing - def symbols: List[Symbol] = gadt.symbols - /** See [[ConstraintHandling.approximation]] */ def approximation(sym: Symbol, fromBelow: Boolean, maxLevel: Int = Int.MaxValue)(using Context): Type = { approximation(gadt.tvarOrError(sym).origin, fromBelow, maxLevel).match @@ -252,7 +239,7 @@ sealed trait GadtConstraintHandling(private var myGadt: GadtConstraint) { .showing(i"approximating $sym ~> $result", gadts) } - def fresh: GadtConstraintHandling = GadtConstraintHandling(gadt) + def fresh: GadtState = GadtState(gadt) /** Restore the GadtConstraint state. */ def restore(gadt: GadtConstraint): Unit = this.gadt = gadt @@ -281,5 +268,7 @@ sealed trait GadtConstraintHandling(private var myGadt: GadtConstraint) { override def constr = gadtsConstr } -// Hide ConstraintHandling within GadtConstraintHandling -private class ProperGadtConstraintHandling(gadt: GadtConstraint) extends ConstraintHandling with GadtConstraintHandling(gadt) +// Hide ConstraintHandling within GadtState +private class ProperGadtState(private var myGadt: GadtConstraint) extends ConstraintHandling with GadtState: + def gadt: GadtConstraint = myGadt + def gadt_=(gadt: GadtConstraint): Unit = myGadt = gadt diff --git a/compiler/src/dotty/tools/dotc/core/NamerOps.scala b/compiler/src/dotty/tools/dotc/core/NamerOps.scala index 66912537dbce..db6f72590818 100644 --- a/compiler/src/dotty/tools/dotc/core/NamerOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NamerOps.scala @@ -212,11 +212,11 @@ object NamerOps: * by (ab?)-using GADT constraints. See pos/i941.scala. 
*/ def linkConstructorParams(sym: Symbol, tparams: List[Symbol], rhsCtx: Context)(using Context): Unit = - rhsCtx.gadt.addToConstraint(tparams) + rhsCtx.gadtState.addToConstraint(tparams) tparams.lazyZip(sym.owner.typeParams).foreach { (psym, tparam) => val tr = tparam.typeRef - rhsCtx.gadt.addBound(psym, tr, isUpper = false) - rhsCtx.gadt.addBound(psym, tr, isUpper = true) + rhsCtx.gadtState.addBound(psym, tr, isUpper = false) + rhsCtx.gadtState.addBound(psym, tr, isUpper = true) } end NamerOps diff --git a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala index 9fff257ee963..5e8a960608e6 100644 --- a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala +++ b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala @@ -261,12 +261,12 @@ trait PatternTypeConstrainer { self: TypeComparer => val assumeInvariantRefinement = migrateTo3 || forceInvariantRefinement || refinementIsInvariant(patternTp) - trace(i"constraining simple pattern type $tp >:< $pt", gadts, (res: Boolean) => i"$res gadt = ${ctx.gadt.gadt}") { + trace(i"constraining simple pattern type $tp >:< $pt", gadts, (res: Boolean) => i"$res gadt = ${ctx.gadt}") { (tp, pt) match { case (AppliedType(tyconS, argsS), AppliedType(tyconP, argsP)) => val saved = state.nn.constraint val result = - ctx.gadt.rollbackGadtUnless { + ctx.gadtState.rollbackGadtUnless { tyconS.typeParams.lazyZip(argsS).lazyZip(argsP).forall { (param, argS, argP) => val variance = param.paramVarianceSign if variance == 0 || assumeInvariantRefinement || diff --git a/compiler/src/dotty/tools/dotc/core/Symbols.scala b/compiler/src/dotty/tools/dotc/core/Symbols.scala index d14be2b0dfb9..aa3ae0c3c513 100644 --- a/compiler/src/dotty/tools/dotc/core/Symbols.scala +++ b/compiler/src/dotty/tools/dotc/core/Symbols.scala @@ -686,7 +686,7 @@ object Symbols { addToGadt: Boolean = true, flags: FlagSet = EmptyFlags)(using Context): Symbol = { val sym = newSymbol(ctx.owner, name, Case | flags, info, coord = span) - if (addToGadt && name.isTypeName) ctx.gadt.addToConstraint(sym) + if (addToGadt && name.isTypeName) ctx.gadtState.addToConstraint(sym) sym } diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index 0bb1529ef396..cd1e55ef028c 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -116,7 +116,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling private def isBottom(tp: Type) = tp.widen.isRef(NothingClass) protected def gadtBounds(sym: Symbol)(using Context) = ctx.gadt.bounds(sym) - protected def gadtAddBound(sym: Symbol, b: Type, isUpper: Boolean): Boolean = ctx.gadt.addBound(sym, b, isUpper) + protected def gadtAddBound(sym: Symbol, b: Type, isUpper: Boolean): Boolean = ctx.gadtState.addBound(sym, b, isUpper) protected def typeVarInstance(tvar: TypeVar)(using Context): Type = tvar.underlying @@ -1443,10 +1443,10 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling else if tp1 eq tp2 then true else val savedCstr = constraint - val savedGadt = ctx.gadt.gadt + val savedGadt = ctx.gadt inline def restore() = state.constraint = savedCstr - ctx.gadt.restore(savedGadt) + ctx.gadtState.restore(savedGadt) val savedSuccessCount = successCount try recCount += 1 @@ -1852,34 +1852,34 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling */ private def 
necessaryEither(op1: => Boolean, op2: => Boolean): Boolean = val preConstraint = constraint - val preGadtHandling = ctx.gadt.fresh - val preGadt = preGadtHandling.gadt + val preGadt = ctx.gadt def allSubsumes(leftGadt: GadtConstraint, rightGadt: GadtConstraint, left: Constraint, right: Constraint): Boolean = - subsumes(left, right, preConstraint) && preGadtHandling.subsumes(leftGadt, rightGadt, preGadt) + subsumes(left, right, preConstraint) + && subsumes(leftGadt.constraint, rightGadt.constraint, preGadt.constraint) if op1 then val op1Constraint = constraint - val op1Gadt = ctx.gadt.gadt + val op1Gadt = ctx.gadt constraint = preConstraint - ctx.gadt.restore(preGadt) + ctx.gadtState.restore(preGadt) if op2 then - if allSubsumes(op1Gadt, ctx.gadt.gadt, op1Constraint, constraint) then - gadts.println(i"GADT CUT - prefer ${ctx.gadt.gadt} over $op1Gadt") + if allSubsumes(op1Gadt, ctx.gadt, op1Constraint, constraint) then + gadts.println(i"GADT CUT - prefer ${ctx.gadt} over $op1Gadt") constr.println(i"CUT - prefer $constraint over $op1Constraint") - else if allSubsumes(ctx.gadt.gadt, op1Gadt, constraint, op1Constraint) then - gadts.println(i"GADT CUT - prefer $op1Gadt over ${ctx.gadt.gadt}") + else if allSubsumes(ctx.gadt, op1Gadt, constraint, op1Constraint) then + gadts.println(i"GADT CUT - prefer $op1Gadt over ${ctx.gadt}") constr.println(i"CUT - prefer $op1Constraint over $constraint") constraint = op1Constraint - ctx.gadt.restore(op1Gadt) + ctx.gadtState.restore(op1Gadt) else gadts.println(i"GADT CUT - no constraint is preferable, reverting to $preGadt") constr.println(i"CUT - no constraint is preferable, reverting to $preConstraint") constraint = preConstraint - ctx.gadt.restore(preGadt) + ctx.gadtState.restore(preGadt) else constraint = op1Constraint - ctx.gadt.restore(op1Gadt) + ctx.gadtState.restore(op1Gadt) true else op2 end necessaryEither @@ -2051,7 +2051,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling gadts.println(i"narrow gadt bound of $tparam: ${tparam.info} from ${if (isUpper) "above" else "below"} to $bound ${bound.toString} ${bound.isRef(tparam)}") if (bound.isRef(tparam)) false else - ctx.gadt.rollbackGadtUnless(gadtAddBound(tparam, bound, isUpper)) + ctx.gadtState.rollbackGadtUnless(gadtAddBound(tparam, bound, isUpper)) } } diff --git a/compiler/src/dotty/tools/dotc/core/TypeOps.scala b/compiler/src/dotty/tools/dotc/core/TypeOps.scala index ea8dcee5fca5..d9da11c561e8 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeOps.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeOps.scala @@ -687,8 +687,8 @@ object TypeOps: val bound1 = massage(bound) if (bound1 ne bound) { if (checkCtx eq ctx) checkCtx = ctx.fresh.setFreshGADTBounds - if (!checkCtx.gadt.contains(sym)) checkCtx.gadt.addToConstraint(sym) - checkCtx.gadt.addBound(sym, bound1, fromBelow) + if (!checkCtx.gadt.contains(sym)) checkCtx.gadtState.addToConstraint(sym) + checkCtx.gadtState.addBound(sym, bound1, fromBelow) typr.println("install GADT bound $bound1 for when checking F-bounded $sym") } } @@ -872,7 +872,7 @@ object TypeOps: case tp: TypeRef if tp.symbol.exists && !tp.symbol.isClass => foldOver(tp.symbol :: xs, tp) case tp => foldOver(xs, tp) val syms2 = getAbstractSymbols(Nil, tp2).reverse - if syms2.nonEmpty then ctx.gadt.addToConstraint(syms2) + if syms2.nonEmpty then ctx.gadtState.addToConstraint(syms2) // If parent contains a reference to an abstract type, then we should // refine subtype checking to eliminate abstract types according to diff --git 
a/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala b/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala index 42e86b71eff8..e1b2aaa02866 100644 --- a/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala +++ b/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala @@ -311,11 +311,11 @@ class InlineReducer(inliner: Inliner)(using Context): def addTypeBindings(typeBinds: TypeBindsMap)(using Context): Unit = typeBinds.foreachBinding { case (sym, shouldBeMinimized) => newTypeBinding(sym, - ctx.gadt.approximation(sym, fromBelow = shouldBeMinimized, maxLevel = Int.MaxValue)) + ctx.gadtState.approximation(sym, fromBelow = shouldBeMinimized, maxLevel = Int.MaxValue)) } def registerAsGadtSyms(typeBinds: TypeBindsMap)(using Context): Unit = - if (typeBinds.size > 0) ctx.gadt.addToConstraint(typeBinds.keys) + if (typeBinds.size > 0) ctx.gadtState.addToConstraint(typeBinds.keys) pat match { case Typed(pat1, tpt) => diff --git a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala index 19a18305ee65..2039a8f19558 100644 --- a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +++ b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala @@ -269,7 +269,7 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase case CaseDef(pat, _, _) => val gadtCtx = pat.removeAttachment(typer.Typer.InferredGadtConstraints) match - case Some(gadt) => ctx.fresh.setGadt(GadtConstraintHandling(gadt)) + case Some(gadt) => ctx.fresh.setGadtState(GadtState(gadt)) case None => ctx super.transform(tree)(using gadtCtx) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 596ad01bb888..03d3011b4bcd 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1030,8 +1030,8 @@ trait Implicits: case result: SearchSuccess => if result.tstate ne ctx.typerState then result.tstate.commit() - if result.gstate ne ctx.gadt.gadt then - ctx.gadt.restore(result.gstate) + if result.gstate ne ctx.gadt then + ctx.gadtState.restore(result.gstate) if hasSkolem(false, result.tree) then report.error(SkolemInInferred(result.tree, pt, argument), ctx.source.atSpan(span)) implicits.println(i"success: $result") @@ -1145,7 +1145,7 @@ trait Implicits: SearchFailure(adapted.withType(new MismatchedImplicit(ref, pt, argument))) } else - SearchSuccess(adapted, ref, cand.level, cand.isExtension)(ctx.typerState, ctx.gadt.gadt) + SearchSuccess(adapted, ref, cand.level, cand.isExtension)(ctx.typerState, ctx.gadt) } /** An implicit search; parameters as in `inferImplicit` */ @@ -1343,7 +1343,7 @@ trait Implicits: case _: SearchFailure => SearchSuccess(ref(defn.NotGiven_value), defn.NotGiven_value.termRef, 0)( ctx.typerState.fresh().setCommittable(true), - ctx.gadt.gadt + ctx.gadt ) case _: SearchSuccess => NoMatchingImplicitsFailure @@ -1526,7 +1526,7 @@ trait Implicits: // other candidates need to be considered. 
recursiveRef match case ref: TermRef => - SearchSuccess(tpd.ref(ref).withSpan(span.startPos), ref, 0)(ctx.typerState, ctx.gadt.gadt) + SearchSuccess(tpd.ref(ref).withSpan(span.startPos), ref, 0)(ctx.typerState, ctx.gadt) case _ => searchImplicit(contextual = true) end bestImplicit diff --git a/compiler/src/dotty/tools/dotc/typer/Inferencing.scala b/compiler/src/dotty/tools/dotc/typer/Inferencing.scala index 2aef3433228b..3442207653d4 100644 --- a/compiler/src/dotty/tools/dotc/typer/Inferencing.scala +++ b/compiler/src/dotty/tools/dotc/typer/Inferencing.scala @@ -262,7 +262,7 @@ object Inferencing { && ctx.gadt.contains(tp.symbol) => val sym = tp.symbol - val res = ctx.gadt.approximation(sym, fromBelow = variance < 0) + val res = ctx.gadtState.approximation(sym, fromBelow = variance < 0) gadts.println(i"approximated $tp ~~ $res") res @@ -432,7 +432,7 @@ object Inferencing { } // We add the created symbols to GADT constraint here. - if (res.nonEmpty) ctx.gadt.addToConstraint(res) + if (res.nonEmpty) ctx.gadtState.addToConstraint(res) res } diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 6cdd0150518b..6f85efb0fc8a 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -1864,7 +1864,7 @@ class Namer { typer: Typer => // so we must allow constraining its type parameters // compare with typedDefDef, see tests/pos/gadt-inference.scala rhsCtx.setFreshGADTBounds - rhsCtx.gadt.addToConstraint(typeParams) + rhsCtx.gadtState.addToConstraint(typeParams) } def typedAheadRhs(pt: Type) = diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 1cb723b85c27..eb09d30e60f3 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -1782,7 +1782,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // see tests/pos/i12226 and issue #12226. It might be possible that this // will end up taking too much memory. If it does, we should just limit // how much GADT constraints we infer - it's always sound to infer less. 
- pat1.putAttachment(InferredGadtConstraints, ctx.gadt.gadt) + pat1.putAttachment(InferredGadtConstraints, ctx.gadt) if (pt1.isValueType) // insert a cast if body does not conform to expected type if we disregard gadt bounds body1 = body1.ensureConforms(pt1)(using originalCtx) assignType(cpy.CaseDef(tree)(pat1, guard1, body1), pat1, body1) @@ -2362,7 +2362,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer ctx.outer.outersIterator.takeWhile(!_.owner.is(Method)) .filter(ctx => ctx.owner.isClass && ctx.owner.typeParams.nonEmpty) .toList.reverse - .foreach(ctx => rhsCtx.gadt.addToConstraint(ctx.owner.typeParams)) + .foreach(ctx => rhsCtx.gadtState.addToConstraint(ctx.owner.typeParams)) if tparamss.nonEmpty then rhsCtx.setFreshGADTBounds @@ -2371,7 +2371,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // we're typing a polymorphic definition's body, // so we allow constraining all of its type parameters // constructors are an exception as we don't allow constraining type params of classes - rhsCtx.gadt.addToConstraint(tparamSyms) + rhsCtx.gadtState.addToConstraint(tparamSyms) else if !sym.isPrimaryConstructor then linkConstructorParams(sym, tparamSyms, rhsCtx) @@ -3835,7 +3835,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer adaptToSubType(wtp) case CompareResult.OKwithGADTUsed if pt.isValueType - && !inContext(ctx.fresh.setGadt(GadtConstraintHandling(GadtConstraint.empty))) { + && !inContext(ctx.fresh.setGadtState(GadtState(GadtConstraint.empty))) { val res = (tree.tpe.widenExpr frozen_<:< pt) if res then // we overshot; a cast is not needed, after all. diff --git a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala index dd6471a882bd..4d08e0582d1d 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala @@ -3130,7 +3130,7 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler if typeHoles.isEmpty then ctx else val ctx1 = ctx.fresh.setFreshGADTBounds.addMode(dotc.core.Mode.GadtConstraintInference) - ctx1.gadt.addToConstraint(typeHoles) + ctx1.gadtState.addToConstraint(typeHoles) ctx1 val matchings = QuoteMatcher.treeMatch(scrutinee, pat1)(using ctx1) From abbb54956c96013b9fb512a008481e819e51a4c2 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 13 Feb 2023 22:23:12 +0000 Subject: [PATCH 009/371] Avoid bidirectional GADT typebounds from fullBounds --- .../tools/dotc/core/GadtConstraint.scala | 22 +++++++- .../src/dotty/tools/dotc/typer/Typer.scala | 2 +- tests/pos/i14287.min.scala | 7 +++ tests/pos/i14287.scala | 16 ++++++ tests/pos/i15523.avoid.scala | 8 +++ tests/pos/i15523.scala | 7 +++ tests/pos/i16777.scala | 52 +++++++++++++++++++ 7 files changed, 111 insertions(+), 3 deletions(-) create mode 100644 tests/pos/i14287.min.scala create mode 100644 tests/pos/i14287.scala create mode 100644 tests/pos/i15523.avoid.scala create mode 100644 tests/pos/i15523.scala create mode 100644 tests/pos/i16777.scala diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index a863a982a44d..211c8e637b1f 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -71,12 +71,26 @@ class GadtConstraint private ( externalize(constraint.nonParamBounds(param)).bounds def fullLowerBound(param: TypeParamRef)(using Context): Type = - 
constraint.minLower(param).foldLeft(nonParamBounds(param).lo) { + val self = externalize(param) + constraint.minLower(param).filterNot { p => + val sym = paramSymbol(p) + sym.name.is(NameKinds.UniqueName) && { + val hi = sym.info.hiBound + !hi.isExactlyAny && self <:< hi + } + }.foldLeft(nonParamBounds(param).lo) { (t, u) => t | externalize(u) } def fullUpperBound(param: TypeParamRef)(using Context): Type = - constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (t, u) => + val self = externalize(param) + constraint.minUpper(param).filterNot { p => + val sym = paramSymbol(p) + sym.name.is(NameKinds.UniqueName) && { + val lo = sym.info.loBound + !lo.isExactlyNothing && lo <:< self + } + }.foldLeft(nonParamBounds(param).hi) { (t, u) => val eu = externalize(u) // Any as the upper bound means "no bound", but if F is higher-kinded, // Any & F = F[_]; this is wrong for us so we need to short-circuit @@ -96,6 +110,10 @@ class GadtConstraint private ( def tvarOrError(sym: Symbol)(using Context): TypeVar = mapping(sym).ensuring(_ != null, i"not a constrainable symbol: $sym").uncheckedNN + private def paramSymbol(p: TypeParamRef): Symbol = reverseMapping(p) match + case sym: Symbol => sym + case null => NoSymbol + @tailrec final def stripInternalTypeVar(tp: Type): Type = tp match case tv: TypeVar => val inst = constraint.instType(tv) diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index eb09d30e60f3..cb2758c088b6 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -1743,7 +1743,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer else report.error(new DuplicateBind(b, cdef), b.srcPos) if (!ctx.isAfterTyper) { val bounds = ctx.gadt.fullBounds(sym) - if (bounds != null) sym.info = bounds + if (bounds != null) sym.info = checkNonCyclic(sym, bounds, reportErrors = true) } b case t: UnApply if t.symbol.is(Inline) => Inlines.inlinedUnapply(t) diff --git a/tests/pos/i14287.min.scala b/tests/pos/i14287.min.scala new file mode 100644 index 000000000000..8f7773ab0ac8 --- /dev/null +++ b/tests/pos/i14287.min.scala @@ -0,0 +1,7 @@ +enum Foo[+F[_]]: + case Bar[B[_]](value: Foo[B]) extends Foo[B] + +class Test: + def test[X[_]](foo: Foo[X]): Foo[X] = foo match + case Foo.Bar(Foo.Bar(x)) => Foo.Bar(x) + case _ => foo diff --git a/tests/pos/i14287.scala b/tests/pos/i14287.scala new file mode 100644 index 000000000000..1291dc8adefc --- /dev/null +++ b/tests/pos/i14287.scala @@ -0,0 +1,16 @@ +// scalac: -Yno-deep-subtypes +enum Free[+F[_], A]: + case Return(a: A) + case Suspend(s: F[A]) + case FlatMap[F[_], A, B]( + s: Free[F, A], + f: A => Free[F, B]) extends Free[F, B] + + def flatMap[F2[x] >: F[x], B](f: A => Free[F2,B]): Free[F2,B] = + FlatMap(this, f) + + @scala.annotation.tailrec + final def step: Free[F, A] = this match + case FlatMap(FlatMap(fx, f), g) => fx.flatMap(x => f(x).flatMap(y => g(y))).step + case FlatMap(Return(x), f) => f(x).step + case _ => this diff --git a/tests/pos/i15523.avoid.scala b/tests/pos/i15523.avoid.scala new file mode 100644 index 000000000000..afbfc1a69d60 --- /dev/null +++ b/tests/pos/i15523.avoid.scala @@ -0,0 +1,8 @@ +// scalac: -Werror +// like the original, but with a case body `a` +// which caused type avoidance to infinitely recurse +sealed trait Parent +final case class Leaf[A, B >: A](a: A, b: B) extends Parent + +def run(x: Parent) = x match + case Leaf(a, _) => a diff --git a/tests/pos/i15523.scala 
b/tests/pos/i15523.scala new file mode 100644 index 000000000000..cf63613c29ac --- /dev/null +++ b/tests/pos/i15523.scala @@ -0,0 +1,7 @@ +// scalac: -Werror +sealed trait Parent +final case class Leaf[A, B >: A](a: A, b: B) extends Parent + +def run(x: Parent): Unit = x match { + case Leaf(a, b) => +} diff --git a/tests/pos/i16777.scala b/tests/pos/i16777.scala new file mode 100644 index 000000000000..4218aea29d9f --- /dev/null +++ b/tests/pos/i16777.scala @@ -0,0 +1,52 @@ +// scalac: -Ykind-projector:underscores + +sealed abstract class Free[+S[_, _], +E, +A] { + @inline final def flatMap[S1[e, a] >: S[e, a], B, E1 >: E](fun: A => Free[S1, E1, B]): Free[S1, E1, B] = Free.FlatMapped[S1, E, E1, A, B](this, fun) + @inline final def map[B](fun: A => B): Free[S, E, B] = flatMap(a => Free.pure[S, B](fun(a))) + @inline final def as[B](as: => B): Free[S, E, B] = map(_ => as) + @inline final def *>[S1[e, a] >: S[e, a], B, E1 >: E](sc: Free[S1, E1, B]): Free[S1, E1, B] = flatMap(_ => sc) + @inline final def <*[S1[e, a] >: S[e, a], B, E1 >: E](sc: Free[S1, E1, B]): Free[S1, E1, A] = flatMap(r => sc.as(r)) + + @inline final def void: Free[S, E, Unit] = map(_ => ()) + + // FIXME: Scala 3.1.4 bug: false unexhaustive match warning + /// @nowarn("msg=pattern case: Free.FlatMapped") + @inline final def foldMap[S1[e, a] >: S[e, a], G[+_, +_]](transform: S1 ~>> G)(implicit G: Monad2[G]): G[E, A] = { + this match { + case Free.Pure(a) => G.pure(a) + case Free.Suspend(a) => transform.apply(a) + case Free.FlatMapped(sub, cont) => + sub match { + case Free.FlatMapped(sub2, cont2) => sub2.flatMap(a => cont2(a).flatMap(cont)).foldMap(transform) + case another => G.flatMap(another.foldMap(transform))(cont(_).foldMap(transform)) + } + } + } +} + +trait ~>>[-F[_, _], +G[_, _]] { + def apply[E, A](f: F[E, A]): G[E, A] +} + +object Free { + @inline def pure[S[_, _], A](a: A): Free[S, Nothing, A] = Pure(a) + @inline def lift[S[_, _], E, A](s: S[E, A]): Free[S, E, A] = Suspend(s) + + final case class Pure[S[_, _], A](a: A) extends Free[S, Nothing, A] { + override def toString: String = s"Pure:[$a]" + } + final case class Suspend[S[_, _], E, A](a: S[E, A]) extends Free[S, E, A] { + override def toString: String = s"Suspend:[$a]" + } + final case class FlatMapped[S[_, _], E, E1 >: E, A, B](sub: Free[S, E, A], cont: A => Free[S, E1, B]) extends Free[S, E1, B] { + override def toString: String = s"FlatMapped:[sub=$sub]" + } +} + +type Monad2[F[+_, +_]] = Monad3[λ[(`-R`, `+E`, `+A`) => F[E, A]]] + +trait Monad3[F[-_, +_, +_]] { + def flatMap[R, E, A, B](r: F[R, E, A])(f: A => F[R, E, B]): F[R, E, B] + def flatten[R, E, A](r: F[R, E, F[R, E, A]]): F[R, E, A] = flatMap(r)(identity) + def pure[A](a: A): F[Any, Nothing, A] +} From 6d141f38875fd3a71ae57d5b826c461e5a249a10 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 14 Feb 2023 13:26:11 +0000 Subject: [PATCH 010/371] Drop failsafe checkNonCyclic and document GADT fullBounds change --- compiler/src/dotty/tools/dotc/core/GadtConstraint.scala | 9 ++++++++- compiler/src/dotty/tools/dotc/typer/Typer.scala | 2 +- 2 files changed, 9 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index 211c8e637b1f..4b580c06d11f 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -77,6 +77,13 @@ class GadtConstraint private ( sym.name.is(NameKinds.UniqueName) && { val hi = sym.info.hiBound 
!hi.isExactlyAny && self <:< hi + // drop any lower param that is a GADT symbol + // and is upper-bounded by a non-Any super-type of the original parameter + // e.g. in pos/i14287.min + // B$1 had info <: X and fullBounds >: B$2 <: X, and + // B$2 had info <: B$1 and fullBounds <: B$1 + // We can use the info of B$2 to drop the lower-bound of B$1 + // and return non-bidirectional bounds B$1 <: X and B$2 <: B$1. } }.foldLeft(nonParamBounds(param).lo) { (t, u) => t | externalize(u) @@ -88,7 +95,7 @@ class GadtConstraint private ( val sym = paramSymbol(p) sym.name.is(NameKinds.UniqueName) && { val lo = sym.info.loBound - !lo.isExactlyNothing && lo <:< self + !lo.isExactlyNothing && lo <:< self // same as fullLowerBounds } }.foldLeft(nonParamBounds(param).hi) { (t, u) => val eu = externalize(u) diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index cb2758c088b6..eb09d30e60f3 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -1743,7 +1743,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer else report.error(new DuplicateBind(b, cdef), b.srcPos) if (!ctx.isAfterTyper) { val bounds = ctx.gadt.fullBounds(sym) - if (bounds != null) sym.info = checkNonCyclic(sym, bounds, reportErrors = true) + if (bounds != null) sym.info = bounds } b case t: UnApply if t.symbol.is(Inline) => Inlines.inlinedUnapply(t) From 4cc0e0d80af513866b0d4a65aa471f9296fc82fb Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 15 Feb 2023 21:39:51 +0000 Subject: [PATCH 011/371] GADT: move dropping GADT symbols into foldLeft --- .../tools/dotc/core/GadtConstraint.scala | 51 ++++++++----------- 1 file changed, 21 insertions(+), 30 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index 4b580c06d11f..c98445c61a7e 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -3,6 +3,7 @@ package dotc package core import Contexts.*, Decorators.*, Symbols.*, Types.* +import NameKinds.UniqueName import config.Printers.{gadts, gadtsConstr} import util.{SimpleIdentitySet, SimpleIdentityMap} import printing._ @@ -72,36 +73,30 @@ class GadtConstraint private ( def fullLowerBound(param: TypeParamRef)(using Context): Type = val self = externalize(param) - constraint.minLower(param).filterNot { p => - val sym = paramSymbol(p) - sym.name.is(NameKinds.UniqueName) && { - val hi = sym.info.hiBound - !hi.isExactlyAny && self <:< hi - // drop any lower param that is a GADT symbol - // and is upper-bounded by a non-Any super-type of the original parameter - // e.g. in pos/i14287.min - // B$1 had info <: X and fullBounds >: B$2 <: X, and - // B$2 had info <: B$1 and fullBounds <: B$1 - // We can use the info of B$2 to drop the lower-bound of B$1 - // and return non-bidirectional bounds B$1 <: X and B$2 <: B$1. - } - }.foldLeft(nonParamBounds(param).lo) { - (t, u) => t | externalize(u) + constraint.minLower(param).foldLeft(nonParamBounds(param).lo) { (acc, p) => + externalize(p) match + case tp: TypeRef + // drop any lower param that is a GADT symbol + // and is upper-bounded by a non-Any super-type of the original parameter + // e.g. 
in pos/i14287.min + // B$1 had info <: X and fullBounds >: B$2 <: X, and + // B$2 had info <: B$1 and fullBounds <: B$1 + // We can use the info of B$2 to drop the lower-bound of B$1 + // and return non-bidirectional bounds B$1 <: X and B$2 <: B$1. + if tp.name.is(UniqueName) && !tp.info.hiBound.isExactlyAny && self <:< tp.info.hiBound => acc + case tp => acc | tp } def fullUpperBound(param: TypeParamRef)(using Context): Type = val self = externalize(param) - constraint.minUpper(param).filterNot { p => - val sym = paramSymbol(p) - sym.name.is(NameKinds.UniqueName) && { - val lo = sym.info.loBound - !lo.isExactlyNothing && lo <:< self // same as fullLowerBounds - } - }.foldLeft(nonParamBounds(param).hi) { (t, u) => - val eu = externalize(u) - // Any as the upper bound means "no bound", but if F is higher-kinded, - // Any & F = F[_]; this is wrong for us so we need to short-circuit - if t.isAny then eu else t & eu + constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (acc, u) => + externalize(u) match + case tp: TypeRef // same as fullLowerBounds + if tp.name.is(UniqueName) && !tp.info.loBound.isExactlyNothing && tp.info.loBound <:< self => acc + case tp => + // Any as the upper bound means "no bound", but if F is higher-kinded, + // Any & F = F[_]; this is wrong for us so we need to short-circuit + if acc.isAny then tp else acc & tp } def externalize(tp: Type, theMap: TypeMap | Null = null)(using Context): Type = tp match @@ -117,10 +112,6 @@ class GadtConstraint private ( def tvarOrError(sym: Symbol)(using Context): TypeVar = mapping(sym).ensuring(_ != null, i"not a constrainable symbol: $sym").uncheckedNN - private def paramSymbol(p: TypeParamRef): Symbol = reverseMapping(p) match - case sym: Symbol => sym - case null => NoSymbol - @tailrec final def stripInternalTypeVar(tp: Type): Type = tp match case tv: TypeVar => val inst = constraint.instType(tv) From 5453d5c5de4e2b3588330614ec5c872332c89564 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 15 Feb 2023 22:25:14 +0000 Subject: [PATCH 012/371] GADT: Use isPatternBound, ofc... --- compiler/src/dotty/tools/dotc/core/GadtConstraint.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index c98445c61a7e..74d668e7ca87 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -83,7 +83,7 @@ class GadtConstraint private ( // B$2 had info <: B$1 and fullBounds <: B$1 // We can use the info of B$2 to drop the lower-bound of B$1 // and return non-bidirectional bounds B$1 <: X and B$2 <: B$1. 
- if tp.name.is(UniqueName) && !tp.info.hiBound.isExactlyAny && self <:< tp.info.hiBound => acc + if tp.symbol.isPatternBound && !tp.info.hiBound.isExactlyAny && self <:< tp.info.hiBound => acc case tp => acc | tp } @@ -92,7 +92,7 @@ class GadtConstraint private ( constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (acc, u) => externalize(u) match case tp: TypeRef // same as fullLowerBounds - if tp.name.is(UniqueName) && !tp.info.loBound.isExactlyNothing && tp.info.loBound <:< self => acc + if tp.symbol.isPatternBound && !tp.info.loBound.isExactlyNothing && tp.info.loBound <:< self => acc case tp => // Any as the upper bound means "no bound", but if F is higher-kinded, // Any & F = F[_]; this is wrong for us so we need to short-circuit From ace96f7628d318192bd8840d4ce87da3dba5266d Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 15 Feb 2023 22:30:09 +0000 Subject: [PATCH 013/371] GADT: Use =:= instead of Any/Nothing --- .../tools/dotc/core/GadtConstraint.scala | 20 +++++++++---------- 1 file changed, 9 insertions(+), 11 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index 74d668e7ca87..bb65cce84042 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -75,15 +75,14 @@ class GadtConstraint private ( val self = externalize(param) constraint.minLower(param).foldLeft(nonParamBounds(param).lo) { (acc, p) => externalize(p) match - case tp: TypeRef - // drop any lower param that is a GADT symbol - // and is upper-bounded by a non-Any super-type of the original parameter - // e.g. in pos/i14287.min - // B$1 had info <: X and fullBounds >: B$2 <: X, and - // B$2 had info <: B$1 and fullBounds <: B$1 - // We can use the info of B$2 to drop the lower-bound of B$1 - // and return non-bidirectional bounds B$1 <: X and B$2 <: B$1. - if tp.symbol.isPatternBound && !tp.info.hiBound.isExactlyAny && self <:< tp.info.hiBound => acc + // drop any lower param that is a GADT symbol + // and is upper-bounded by a non-Any super-type of the original parameter + // e.g. in pos/i14287.min + // B$1 had info <: X and fullBounds >: B$2 <: X, and + // B$2 had info <: B$1 and fullBounds <: B$1 + // We can use the info of B$2 to drop the lower-bound of B$1 + // and return non-bidirectional bounds B$1 <: X and B$2 <: B$1. 
+ case tp: TypeRef if tp.symbol.isPatternBound && self =:= tp.info.hiBound => acc case tp => acc | tp } @@ -91,8 +90,7 @@ class GadtConstraint private ( val self = externalize(param) constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (acc, u) => externalize(u) match - case tp: TypeRef // same as fullLowerBounds - if tp.symbol.isPatternBound && !tp.info.loBound.isExactlyNothing && tp.info.loBound <:< self => acc + case tp: TypeRef if tp.symbol.isPatternBound && self =:= tp.info.loBound => acc // like fullLowerBound case tp => // Any as the upper bound means "no bound", but if F is higher-kinded, // Any & F = F[_]; this is wrong for us so we need to short-circuit From fad1584d1b4c518c5637eca516b4f67c9461a642 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 1 Feb 2023 17:39:54 +0100 Subject: [PATCH 014/371] Fix static lazy field holder for GraalVM --- .../lazyvals/InitializedAccessInt.scala | 30 +++++++++++++++++++ .../lazyvals/InitializedObject.scala | 22 ++++++++++++++ .../tools/benchmarks/lazyvals/LazyVals.scala | 18 +++++++++++ .../tools/backend/jvm/BCodeSkelBuilder.scala | 2 +- .../backend/jvm/DottyBackendInterface.scala | 7 +++-- .../dotty/tools/dotc/transform/LazyVals.scala | 11 ++----- 6 files changed, 77 insertions(+), 13 deletions(-) create mode 100644 bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedAccessInt.scala create mode 100644 bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedObject.scala diff --git a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedAccessInt.scala b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedAccessInt.scala new file mode 100644 index 000000000000..2a115ad63496 --- /dev/null +++ b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedAccessInt.scala @@ -0,0 +1,30 @@ +package dotty.tools.benchmarks.lazyvals + +import org.openjdk.jmh.annotations.* +import org.openjdk.jmh.infra.Blackhole +import LazyVals.LazyIntHolder +import java.util.concurrent.TimeUnit + +@BenchmarkMode(Array(Mode.AverageTime)) +@Fork(2) +@Threads(1) +@Warmup(iterations = 5) +@Measurement(iterations = 5) +@OutputTimeUnit(TimeUnit.NANOSECONDS) +@State(Scope.Benchmark) +class InitializedAccessInt { + + var holder: LazyIntHolder = _ + + @Setup + def prepare: Unit = { + holder = new LazyIntHolder + holder.value + } + + @Benchmark + def measureInitialized(bh: Blackhole) = { + bh.consume(holder) + bh.consume(holder.value) + } +} diff --git a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedObject.scala b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedObject.scala new file mode 100644 index 000000000000..672cc4bf6544 --- /dev/null +++ b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedObject.scala @@ -0,0 +1,22 @@ +package dotty.tools.benchmarks.lazyvals + +import org.openjdk.jmh.annotations.* +import org.openjdk.jmh.infra.Blackhole +import LazyVals.ObjectHolder +import java.util.concurrent.TimeUnit + +@BenchmarkMode(Array(Mode.AverageTime)) +@Fork(2) +@Threads(1) +@Warmup(iterations = 5) +@Measurement(iterations = 5) +@OutputTimeUnit(TimeUnit.NANOSECONDS) +@State(Scope.Benchmark) +class InitializedObject { + + @Benchmark + def measureInitialized(bh: Blackhole) = { + bh.consume(ObjectHolder) + bh.consume(ObjectHolder.value) + } +} diff --git a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala index 
0afd93d086be..68379f9e142c 100644 --- a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala +++ b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala @@ -50,4 +50,22 @@ object LazyVals { } } } + + class LazyIntHolder { + lazy val value: Int = { + (System.nanoTime() % 1000).toInt + } + } + + object ObjectHolder { + lazy val value: String = { + System.nanoTime() % 5 match { + case 0 => "abc" + case 1 => "def" + case 2 => "ghi" + case 3 => "jkl" + case 4 => "mno" + } + } + } } diff --git a/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala b/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala index 1885210a6687..9c1ff1f26763 100644 --- a/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala +++ b/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala @@ -151,7 +151,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { // !!! Part of this logic is duplicated in JSCodeGen.genCompilationUnit claszSymbol.info.decls.foreach { f => - if f.isField && !f.name.is(LazyBitMapName) then + if f.isField && !f.name.is(LazyBitMapName) && !f.name.is(LazyLocalName) then f.setFlag(JavaStatic) } diff --git a/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala b/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala index ecdd0ae98803..f8f683a429f6 100644 --- a/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala +++ b/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala @@ -22,7 +22,7 @@ import dotty.tools.dotc.report import tpd._ import StdNames.nme -import NameKinds.LazyBitMapName +import NameKinds.{LazyBitMapName, LazyLocalName} import Names.Name class DottyBackendInterface(val outputDirectory: AbstractFile, val superCallsMap: ReadOnlyMap[Symbol, Set[ClassSymbol]])(using val ctx: Context) { @@ -129,10 +129,11 @@ object DottyBackendInterface { * the new lazy val encoding: https://github.com/lampepfl/dotty/issues/7140 */ def isStaticModuleField(using Context): Boolean = - sym.owner.isStaticModuleClass && sym.isField && !sym.name.is(LazyBitMapName) + sym.owner.isStaticModuleClass && sym.isField && !sym.name.is(LazyBitMapName) && !sym.name.is(LazyLocalName) def isStaticMember(using Context): Boolean = (sym ne NoSymbol) && - (sym.is(JavaStatic) || sym.isScalaStatic || sym.isStaticModuleField) + (sym.is(JavaStatic) || sym.isScalaStatic || sym.isStaticModuleField) + // guard against no sumbol cause this code is executed to select which call type(static\dynamic) to use to call array.clone /** diff --git a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala index 0a9a2b1214b2..0861350c30a9 100644 --- a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala +++ b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala @@ -466,13 +466,9 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { val containerSymbol = newSymbol(claz, containerName, x.symbol.flags &~ containerFlagsMask | containerFlags | Private, defn.ObjectType, coord = x.symbol.coord).enteredAfter(this) containerSymbol.addAnnotation(Annotation(defn.VolatileAnnot, containerSymbol.span)) // private @volatile var _x: AnyRef containerSymbol.addAnnotations(x.symbol.annotations) // pass annotations from original definition - val stat = x.symbol.isStatic - if stat then - containerSymbol.setFlag(JavaStatic) + containerSymbol.removeAnnotation(defn.ScalaStaticAnnot) + containerSymbol.resetFlag(JavaStatic) val getOffset = - if stat then - Select(ref(defn.LazyValsModule), 
lazyNme.RLazyVals.getStaticFieldOffset) - else Select(ref(defn.LazyValsModule), lazyNme.RLazyVals.getOffsetStatic) val containerTree = ValDef(containerSymbol, nullLiteral) @@ -490,9 +486,6 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { val offset = ref(offsetSymbol.nn) val swapOver = - if stat then - tpd.clsOf(x.symbol.owner.typeRef) - else This(claz) val (accessorDef, initMethodDef) = mkThreadSafeDef(x, claz, containerSymbol, offset, swapOver) From e466fa4541002f299bebf301e7872fcbba41a0ea Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 1 Feb 2023 18:19:19 +0100 Subject: [PATCH 015/371] No need to reset JavaStatic as its removed with the amsk --- compiler/src/dotty/tools/dotc/transform/LazyVals.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala index 0861350c30a9..8d3702190763 100644 --- a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala +++ b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala @@ -467,7 +467,6 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { containerSymbol.addAnnotation(Annotation(defn.VolatileAnnot, containerSymbol.span)) // private @volatile var _x: AnyRef containerSymbol.addAnnotations(x.symbol.annotations) // pass annotations from original definition containerSymbol.removeAnnotation(defn.ScalaStaticAnnot) - containerSymbol.resetFlag(JavaStatic) val getOffset = Select(ref(defn.LazyValsModule), lazyNme.RLazyVals.getOffsetStatic) val containerTree = ValDef(containerSymbol, nullLiteral) From ef8e8553e6e653f95ca44a692279ead1ad7b71d8 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 1 Feb 2023 18:25:07 +0100 Subject: [PATCH 016/371] Removing getStaticFieldOffset as it's not used anymore --- compiler/src/dotty/tools/dotc/transform/LazyVals.scala | 1 - library/src/scala/runtime/LazyVals.scala | 8 -------- 2 files changed, 9 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala index 8d3702190763..e4cb21a279d6 100644 --- a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala +++ b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala @@ -674,7 +674,6 @@ object LazyVals { val cas: TermName = N.cas.toTermName val getOffset: TermName = N.getOffset.toTermName val getOffsetStatic: TermName = "getOffsetStatic".toTermName - val getStaticFieldOffset: TermName = "getStaticFieldOffset".toTermName val getDeclaredField: TermName = "getDeclaredField".toTermName } val flag: TermName = "flag".toTermName diff --git a/library/src/scala/runtime/LazyVals.scala b/library/src/scala/runtime/LazyVals.scala index 5d1e8e74b89d..a75042671efa 100644 --- a/library/src/scala/runtime/LazyVals.scala +++ b/library/src/scala/runtime/LazyVals.scala @@ -142,14 +142,6 @@ object LazyVals { r } - def getStaticFieldOffset(field: java.lang.reflect.Field): Long = { - @nowarn - val r = unsafe.staticFieldOffset(field) - if (debug) - println(s"getStaticFieldOffset(${field.getDeclaringClass}, ${field.getName}) = $r") - r - } - def getOffsetStatic(field: java.lang.reflect.Field) = @nowarn val r = unsafe.objectFieldOffset(field) From 2bfbe7553098e050c8403557ff35a75902b8a0b7 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 2 Feb 2023 16:45:19 +0100 Subject: [PATCH 017/371] Revert deletion of getStaticFieldOffset for now --- library/src/scala/runtime/LazyVals.scala | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git 
a/library/src/scala/runtime/LazyVals.scala b/library/src/scala/runtime/LazyVals.scala index a75042671efa..5d1e8e74b89d 100644 --- a/library/src/scala/runtime/LazyVals.scala +++ b/library/src/scala/runtime/LazyVals.scala @@ -142,6 +142,14 @@ object LazyVals { r } + def getStaticFieldOffset(field: java.lang.reflect.Field): Long = { + @nowarn + val r = unsafe.staticFieldOffset(field) + if (debug) + println(s"getStaticFieldOffset(${field.getDeclaringClass}, ${field.getName}) = $r") + r + } + def getOffsetStatic(field: java.lang.reflect.Field) = @nowarn val r = unsafe.objectFieldOffset(field) From 41cfb62df29b9cf893eb65034af169e0266a3632 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Fri, 3 Feb 2023 15:21:56 +0100 Subject: [PATCH 018/371] Update printing tests to have matching AST --- .../printing/transformed/lazy-vals-new.check | 24 +++++++++---------- 1 file changed, 12 insertions(+), 12 deletions(-) diff --git a/tests/printing/transformed/lazy-vals-new.check b/tests/printing/transformed/lazy-vals-new.check index 406417845c20..4b81cd457a38 100644 --- a/tests/printing/transformed/lazy-vals-new.check +++ b/tests/printing/transformed/lazy-vals-new.check @@ -10,19 +10,19 @@ package { @static private def (): Unit = { A.OFFSET$_m_0 = - scala.runtime.LazyVals.getStaticFieldOffset( + scala.runtime.LazyVals.getOffsetStatic( classOf[Object {...}].getDeclaredField("x$lzy1")) () } @static @static val OFFSET$_m_0: Long = - scala.runtime.LazyVals.getStaticFieldOffset( + scala.runtime.LazyVals.getOffsetStatic( classOf[Object {...}].getDeclaredField("x$lzy1")) private def writeReplace(): Object = new scala.runtime.ModuleSerializationProxy(classOf[A]) - @volatile private lazy var x$lzy1: Object = null + @volatile private lazy var x$lzy1: Object = null lazy def x(): Int = { - val result: Object = A#x$lzy1 + val result: Object = A.x$lzy1 if result.isInstanceOf[Int] then scala.Int.unbox(result) else if result.eq(scala.runtime.LazyVals.NullValue) then scala.Int.unbox(null) else scala.Int.unbox(A.x$lzyINIT1()) @@ -30,10 +30,10 @@ package { private def x$lzyINIT1(): Object = while do { - val current: Object = A#x$lzy1 + val current: Object = A.x$lzy1 if current.eq(null) then if - scala.runtime.LazyVals.objCAS(classOf[A], A.OFFSET$_m_0, null, + scala.runtime.LazyVals.objCAS(this, A.OFFSET$_m_0, null, scala.runtime.LazyVals.Evaluating) then { @@ -49,15 +49,15 @@ package { } finally if - scala.runtime.LazyVals.objCAS(classOf[A], A.OFFSET$_m_0, + scala.runtime.LazyVals.objCAS(this, A.OFFSET$_m_0, scala.runtime.LazyVals.Evaluating, result).unary_!() then { val lock: scala.runtime.LazyVals.LazyVals$Waiting = - A#x$lzy1.asInstanceOf[ + A.x$lzy1.asInstanceOf[ scala.runtime.LazyVals.LazyVals$Waiting] - scala.runtime.LazyVals.objCAS(classOf[A], A.OFFSET$_m_0, - lock, result) + scala.runtime.LazyVals.objCAS(this, A.OFFSET$_m_0, lock, + result) lock.countDown() } else () @@ -71,8 +71,8 @@ package { then if current.eq(scala.runtime.LazyVals.Evaluating) then { - scala.runtime.LazyVals.objCAS(classOf[A], A.OFFSET$_m_0, - current, new scala.runtime.LazyVals.LazyVals$Waiting()) + scala.runtime.LazyVals.objCAS(this, A.OFFSET$_m_0, current, + new scala.runtime.LazyVals.LazyVals$Waiting()) () } else From 805c49f2067ff1a0eedfd2bef4b023d908af7e17 Mon Sep 17 00:00:00 2001 From: Vasil Vasilev Date: Thu, 2 Feb 2023 14:13:09 +0100 Subject: [PATCH 019/371] Add support for disabling redirected output in the REPL driver for usage in worksheets in the Scala Plugin for IntelliJ IDEA - Calling `setOut/setErr` in a concurrent environment 
without any synchronization (such as the Scala compile server in the Scala Plugin for IntelliJ IDEA, which is used to execute Scala 3 worksheets) can lead to unpredictable outcomes where the out/err streams are not restored properly after changing. - This change adds a new default method `redirectOutput` which can be overriden by others to control the redirecting behavior of the REPL driver. --- .../src/dotty/tools/repl/ReplDriver.scala | 26 +++++++++++-------- 1 file changed, 15 insertions(+), 11 deletions(-) diff --git a/compiler/src/dotty/tools/repl/ReplDriver.scala b/compiler/src/dotty/tools/repl/ReplDriver.scala index b072d58f6bb7..0f29591e2121 100644 --- a/compiler/src/dotty/tools/repl/ReplDriver.scala +++ b/compiler/src/dotty/tools/repl/ReplDriver.scala @@ -187,19 +187,23 @@ class ReplDriver(settings: Array[String], // TODO: i5069 final def bind(name: String, value: Any)(using state: State): State = state + protected def redirectOutput: Boolean = true + // redirecting the output allows us to test `println` in scripted tests private def withRedirectedOutput(op: => State): State = { - val savedOut = System.out - val savedErr = System.err - try { - System.setOut(out) - System.setErr(out) - op - } - finally { - System.setOut(savedOut) - System.setErr(savedErr) - } + if redirectOutput then + val savedOut = System.out + val savedErr = System.err + try { + System.setOut(out) + System.setErr(out) + op + } + finally { + System.setOut(savedOut) + System.setErr(savedErr) + } + else op } private def newRun(state: State, reporter: StoreReporter = newStoreReporter) = { From b36f3192ed3bc75877837b7fa5182d827fc4239e Mon Sep 17 00:00:00 2001 From: Vasil Vasilev Date: Mon, 6 Feb 2023 12:08:05 +0100 Subject: [PATCH 020/371] Add scaladoc documentation for `ReplDriver#redirectOutput` --- compiler/src/dotty/tools/repl/ReplDriver.scala | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/compiler/src/dotty/tools/repl/ReplDriver.scala b/compiler/src/dotty/tools/repl/ReplDriver.scala index 0f29591e2121..905f4f06de08 100644 --- a/compiler/src/dotty/tools/repl/ReplDriver.scala +++ b/compiler/src/dotty/tools/repl/ReplDriver.scala @@ -187,6 +187,17 @@ class ReplDriver(settings: Array[String], // TODO: i5069 final def bind(name: String, value: Any)(using state: State): State = state + /** + * Controls whether the `System.out` and `System.err` streams are set to the provided constructor parameter instance + * of [[java.io.PrintStream]] during the execution of the repl. On by default. + * + * Disabling this can be beneficial when executing a repl instance inside a concurrent environment, for example a + * thread pool (such as the Scala compile server in the Scala Plugin for IntelliJ IDEA). + * + * In such environments, indepently executing `System.setOut` and `System.setErr` without any synchronization can + * lead to unpredictable results when restoring the original streams (dependent on the order of execution), leaving + * the Java process in an inconsistent state. 
+ */ protected def redirectOutput: Boolean = true // redirecting the output allows us to test `println` in scripted tests From ff006d0677777d64810cf825fb4871b01b323509 Mon Sep 17 00:00:00 2001 From: odersky Date: Sat, 11 Feb 2023 15:37:04 +0100 Subject: [PATCH 021/371] Add missing criterion to subtype check Fixes #16850 --- compiler/src/dotty/tools/dotc/core/TypeComparer.scala | 1 + tests/neg/i16850.check | 10 ++++++++++ tests/neg/i16850.scala | 10 ++++++++++ 3 files changed, 21 insertions(+) create mode 100644 tests/neg/i16850.check create mode 100644 tests/neg/i16850.scala diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index cd1e55ef028c..6428c5315263 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -309,6 +309,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling thirdTryNamed(tp2) else ( (tp1.name eq tp2.name) + && !sym1.is(Private) && tp2.isPrefixDependentMemberRef && isSubPrefix(tp1.prefix, tp2.prefix) && tp1.signature == tp2.signature diff --git a/tests/neg/i16850.check b/tests/neg/i16850.check new file mode 100644 index 000000000000..6c9c7f7e0eac --- /dev/null +++ b/tests/neg/i16850.check @@ -0,0 +1,10 @@ +-- [E007] Type Mismatch Error: tests/neg/i16850.scala:7:33 ------------------------------------------------------------- +7 | def add(elm: Y): Unit = list = elm :: list // error + | ^^^ + | Found: (elm : Y) + | Required: Class.this.Y² + | + | where: Y is a type in class Class + | Y² is a type in trait Trait + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i16850.scala b/tests/neg/i16850.scala new file mode 100644 index 000000000000..e7904fcd44e7 --- /dev/null +++ b/tests/neg/i16850.scala @@ -0,0 +1,10 @@ + +trait Trait : + type Y + var list: List[Y] = Nil + +class Class[Y] extends Trait : + def add(elm: Y): Unit = list = elm :: list // error + +object Object extends Class[Int] : + add(42) From f3347dbc802cb4661466028b0a9ddc83a4a1a0a0 Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Wed, 18 Jan 2023 16:52:16 +0100 Subject: [PATCH 022/371] =?UTF-8?q?Avoid=20timeouts=20in=20community?= =?UTF-8?q?=E2=80=93build-C?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Increased timeout due to timeouts when running on dotty community build --- community-build/community-projects/requests-scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/community-build/community-projects/requests-scala b/community-build/community-projects/requests-scala index 6d4a223bc33d..23b4895710f1 160000 --- a/community-build/community-projects/requests-scala +++ b/community-build/community-projects/requests-scala @@ -1 +1 @@ -Subproject commit 6d4a223bc33def14ae9a4def24a3f5c258451e8e +Subproject commit 23b4895710f17bf892563b28755b225c8be7f7e3 From 8562128549869d4e9aa2db5bf5c215a3a6b49f33 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 17 Feb 2023 13:33:31 +0100 Subject: [PATCH 023/371] Add changelog for 3.3.0-RC3 --- changelogs/3.3.0-RC3.md | 23 +++++++++++++++++++++++ 1 file changed, 23 insertions(+) create mode 100644 changelogs/3.3.0-RC3.md diff --git a/changelogs/3.3.0-RC3.md b/changelogs/3.3.0-RC3.md new file mode 100644 index 000000000000..79a47fcf0bb9 --- /dev/null +++ b/changelogs/3.3.0-RC3.md @@ -0,0 +1,23 @@ +# Backported fixes + +- Added jpath check to `ClassLikeSupport` 
getParentsAsTreeSymbolTuples [#16759](https://github.com/lampepfl/dotty/pull/16759) +- Split out immutable GadtConstraint [#16602](https://github.com/lampepfl/dotty/pull/16602) +- Avoid bidirectional GADT typebounds from fullBounds [#15683](https://github.com/lampepfl/dotty/pull/15683) +- Fix static lazy field holder for GraalVM [#16800](https://github.com/lampepfl/dotty/pull/16800) +- Add support for disabling redirected output in the REPL driver for usage in worksheets in the Scala Plugin for IntelliJ IDEA [#16810](https://github.com/lampepfl/dotty/pull/16810) +- Add missing criterion to subtype check [#16889](https://github.com/lampepfl/dotty/pull/16889) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0-RC2..3.3.0-RC3` these are: + +``` + 7 Dale Wijnand + 5 Szymon Rodziewicz + 2 Paweł Marks + 2 Vasil Vasilev + 1 Martin Odersky + 1 Mohammad Yousuf Minhaj Zia +``` From b3c1c98c47769930ef6108e3f641b5f5509dfabe Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 17 Feb 2023 13:35:41 +0100 Subject: [PATCH 024/371] Release 3.3.0-RC3 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 75d3e12baf66..4360add9578a 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.2.2" - val baseVersion = "3.3.0-RC2" + val baseVersion = "3.3.0-RC3" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.2.2" + val previousDottyVersion = "3.3.0-RC2" object CompatMode { final val BinaryCompatible = 0 From 014be6f46c769cf83f458d2ec5f19d4fb8344548 Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Tue, 14 Feb 2023 10:36:21 +0100 Subject: [PATCH 025/371] Fix HK quoted pattern type variables The issue was in the encoding into `{ExprMatchModule,TypeMatchModule}.unapply`. Specifically with the `TypeBindings` argument. This arguments holds the list of type variable definitions (`tpd.Bind` trees). We used a `Tuple` to list all the types inside. The problem is that higher-kinded type variables do not conform with the upper bounds of the tuple elements. The solution is to use an HList with any-kinded elements. 
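To make the kinding issue concrete, here is a small standalone sketch; it mirrors the `KList`/`KCons`/`KNil` members this patch adds to `scala.quoted.runtime.QuoteMatching`, but is otherwise not part of the change. Tuple elements must be proper types, whereas the any-kinded list can carry proper and higher-kinded type bindings side by side:

    // Standalone sketch, not part of the patch: same shape as the KList/KCons/KNil
    // members added to QuoteMatching in this commit.
    object KListSketch:
      type KList
      type KCons[+H <: AnyKind, +T <: KList] <: KList
      type KNil <: KList

      // A proper type (Int) and a type constructor (List) can share one bindings
      // list, because KCons elements are only bounded by AnyKind:
      type Bindings = KCons[Int, KCons[List, KNil]]

      // The old Tuple encoding cannot express this, since Tuple elements must be
      // proper types:
      // type Bad = Tuple2[Int, List]   // does not kind-check: List takes a type parameter
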
--- compiler/src/dotty/tools/dotc/ast/tpd.scala | 8 ++++++-- .../src/dotty/tools/dotc/core/Definitions.scala | 3 +++ .../tools/dotc/typer/QuotesAndSplices.scala | 4 ++-- .../quoted/runtime/impl/QuoteMatcher.scala | 4 ++-- .../scala/quoted/runtime/impl/QuotesImpl.scala | 4 ++-- .../scala/quoted/runtime/QuoteMatching.scala | 11 ++++++++--- .../hk-quoted-type-patterns/Macro_1.scala | 17 +++++++++++++++++ .../hk-quoted-type-patterns/Test_2.scala | 5 +++++ 8 files changed, 45 insertions(+), 11 deletions(-) create mode 100644 tests/pos-macros/hk-quoted-type-patterns/Macro_1.scala create mode 100644 tests/pos-macros/hk-quoted-type-patterns/Test_2.scala diff --git a/compiler/src/dotty/tools/dotc/ast/tpd.scala b/compiler/src/dotty/tools/dotc/ast/tpd.scala index dd1e46c62223..01d61986dee4 100644 --- a/compiler/src/dotty/tools/dotc/ast/tpd.scala +++ b/compiler/src/dotty/tools/dotc/ast/tpd.scala @@ -1498,7 +1498,7 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { } } - /** Creates the tuple type tree repesentation of the type trees in `ts` */ + /** Creates the tuple type tree representation of the type trees in `ts` */ def tupleTypeTree(elems: List[Tree])(using Context): Tree = { val arity = elems.length if arity <= Definitions.MaxTupleArity then @@ -1509,10 +1509,14 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { else nestedPairsTypeTree(elems) } - /** Creates the nested pairs type tree repesentation of the type trees in `ts` */ + /** Creates the nested pairs type tree representation of the type trees in `ts` */ def nestedPairsTypeTree(ts: List[Tree])(using Context): Tree = ts.foldRight[Tree](TypeTree(defn.EmptyTupleModule.termRef))((x, acc) => AppliedTypeTree(TypeTree(defn.PairClass.typeRef), x :: acc :: Nil)) + /** Creates the nested higher-kinded pairs type tree representation of the type trees in `ts` */ + def hkNestedPairsTypeTree(ts: List[Tree])(using Context): Tree = + ts.foldRight[Tree](TypeTree(defn.QuoteMatching_KNil.typeRef))((x, acc) => AppliedTypeTree(TypeTree(defn.QuoteMatching_KCons.typeRef), x :: acc :: Nil)) + /** Replaces all positions in `tree` with zero-extent positions */ private def focusPositions(tree: Tree)(using Context): Tree = { val transformer = new tpd.TreeMap { diff --git a/compiler/src/dotty/tools/dotc/core/Definitions.scala b/compiler/src/dotty/tools/dotc/core/Definitions.scala index ed86050436e8..56409ad050f6 100644 --- a/compiler/src/dotty/tools/dotc/core/Definitions.scala +++ b/compiler/src/dotty/tools/dotc/core/Definitions.scala @@ -865,6 +865,9 @@ class Definitions { @tu lazy val QuoteMatching_ExprMatchModule: Symbol = QuoteMatchingClass.requiredClass("ExprMatchModule") @tu lazy val QuoteMatching_TypeMatch: Symbol = QuoteMatchingClass.requiredMethod("TypeMatch") @tu lazy val QuoteMatching_TypeMatchModule: Symbol = QuoteMatchingClass.requiredClass("TypeMatchModule") + @tu lazy val QuoteMatchingModule: Symbol = requiredModule("scala.quoted.runtime.QuoteMatching") + @tu lazy val QuoteMatching_KNil: Symbol = QuoteMatchingModule.requiredType("KNil") + @tu lazy val QuoteMatching_KCons: Symbol = QuoteMatchingModule.requiredType("KCons") @tu lazy val ToExprModule: Symbol = requiredModule("scala.quoted.ToExpr") @tu lazy val ToExprModule_BooleanToExpr: Symbol = ToExprModule.requiredMethod("BooleanToExpr") diff --git a/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala b/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala index 2fe5770c5b4b..65d8abfdf6a7 100644 --- a/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala +++ 
b/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala @@ -364,7 +364,7 @@ trait QuotesAndSplices { * * ``` * case scala.internal.quoted.Expr.unapply[ - * Tuple1[t @ _], // Type binging definition + * KList[t @ _, KNil], // Type binging definition * Tuple2[Type[t], Expr[List[t]]] // Typing the result of the pattern match * ]( * Tuple2.unapply @@ -411,7 +411,7 @@ trait QuotesAndSplices { val replaceBindings = new ReplaceBindings val patType = defn.tupleType(splices.tpes.map(tpe => replaceBindings(tpe.widen))) - val typeBindingsTuple = tpd.tupleTypeTree(typeBindings.values.toList) + val typeBindingsTuple = tpd.hkNestedPairsTypeTree(typeBindings.values.toList) val replaceBindingsInTree = new TreeMap { private var bindMap = Map.empty[Symbol, Symbol] diff --git a/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala b/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala index d85d92de5455..7c952dbbe142 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala @@ -121,9 +121,9 @@ object QuoteMatcher { private def withEnv[T](env: Env)(body: Env ?=> T): T = body(using env) - def treeMatch(scrutineeTerm: Tree, patternTerm: Tree)(using Context): Option[Tuple] = + def treeMatch(scrutineeTree: Tree, patternTree: Tree)(using Context): Option[Tuple] = given Env = Map.empty - scrutineeTerm =?= patternTerm + scrutineeTree =?= patternTree /** Check that all trees match with `mtch` and concatenate the results with &&& */ private def matchLists[T](l1: List[T], l2: List[T])(mtch: (T, T) => Matching): Matching = (l1, l2) match { diff --git a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala index 4d08e0582d1d..d1806947fa5d 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala @@ -3093,14 +3093,14 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler new TypeImpl(tree, SpliceScope.getCurrent).asInstanceOf[scala.quoted.Type[T]] object ExprMatch extends ExprMatchModule: - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: scala.quoted.Expr[Any])(using pattern: scala.quoted.Expr[Any]): Option[Tup] = + def unapply[TypeBindings, Tup <: Tuple](scrutinee: scala.quoted.Expr[Any])(using pattern: scala.quoted.Expr[Any]): Option[Tup] = val scrutineeTree = reflect.asTerm(scrutinee) val patternTree = reflect.asTerm(pattern) treeMatch(scrutineeTree, patternTree).asInstanceOf[Option[Tup]] end ExprMatch object TypeMatch extends TypeMatchModule: - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: scala.quoted.Type[?])(using pattern: scala.quoted.Type[?]): Option[Tup] = + def unapply[TypeBindings, Tup <: Tuple](scrutinee: scala.quoted.Type[?])(using pattern: scala.quoted.Type[?]): Option[Tup] = val scrutineeTree = reflect.TypeTree.of(using scrutinee) val patternTree = reflect.TypeTree.of(using pattern) treeMatch(scrutineeTree, patternTree).asInstanceOf[Option[Tup]] diff --git a/library/src/scala/quoted/runtime/QuoteMatching.scala b/library/src/scala/quoted/runtime/QuoteMatching.scala index 2a76143e9868..c95ffe87b5dc 100644 --- a/library/src/scala/quoted/runtime/QuoteMatching.scala +++ b/library/src/scala/quoted/runtime/QuoteMatching.scala @@ -17,7 +17,7 @@ trait QuoteMatching: * - `ExprMatch.unapply('{ f(0, myInt) })('{ f(patternHole[Int], patternHole[Int]) }, _)` * will return `Some(Tuple2('{0}, '{ myInt }))` * - `ExprMatch.unapply('{ f(0, "abc") 
})('{ f(0, patternHole[Int]) }, _)` - * will return `None` due to the missmatch of types in the hole + * will return `None` due to the mismatch of types in the hole * * Holes: * - scala.quoted.runtime.Patterns.patternHole[T]: hole that matches an expression `x` of type `Expr[U]` @@ -27,7 +27,7 @@ trait QuoteMatching: * @param pattern `Expr[Any]` containing the pattern tree * @return None if it did not match, `Some(tup)` if it matched where `tup` contains `Expr[Ti]`` */ - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: Expr[Any])(using pattern: Expr[Any]): Option[Tup] + def unapply[TypeBindings, Tup <: Tuple](scrutinee: Expr[Any])(using pattern: Expr[Any]): Option[Tup] } val TypeMatch: TypeMatchModule @@ -40,5 +40,10 @@ trait QuoteMatching: * @param pattern `Type[?]` containing the pattern tree * @return None if it did not match, `Some(tup)` if it matched where `tup` contains `Type[Ti]`` */ - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: Type[?])(using pattern: Type[?]): Option[Tup] + def unapply[TypeBindings, Tup <: Tuple](scrutinee: Type[?])(using pattern: Type[?]): Option[Tup] } + +object QuoteMatching: + type KList + type KCons[+H <: AnyKind, +T <: KList] <: KList + type KNil <: KList diff --git a/tests/pos-macros/hk-quoted-type-patterns/Macro_1.scala b/tests/pos-macros/hk-quoted-type-patterns/Macro_1.scala new file mode 100644 index 000000000000..0d2df1504918 --- /dev/null +++ b/tests/pos-macros/hk-quoted-type-patterns/Macro_1.scala @@ -0,0 +1,17 @@ +import scala.quoted._ + +private def impl(x: Expr[Any])(using Quotes): Expr[Unit] = { + x match + case '{ foo[x] } => + assert(Type.show[x] == "scala.Int", Type.show[x]) + case '{ type f[X]; foo[`f`] } => + assert(Type.show[f] == "[A >: scala.Nothing <: scala.Any] => scala.collection.immutable.List[A]", Type.show[f]) + case '{ type f <: AnyKind; foo[`f`] } => + assert(Type.show[f] == "[K >: scala.Nothing <: scala.Any, V >: scala.Nothing <: scala.Any] => scala.collection.immutable.Map[K, V]", Type.show[f]) + case x => throw MatchError(x.show) + '{} +} + +inline def test(inline x: Any): Unit = ${ impl('x) } + +def foo[T <: AnyKind]: Any = ??? diff --git a/tests/pos-macros/hk-quoted-type-patterns/Test_2.scala b/tests/pos-macros/hk-quoted-type-patterns/Test_2.scala new file mode 100644 index 000000000000..3cb9113f2452 --- /dev/null +++ b/tests/pos-macros/hk-quoted-type-patterns/Test_2.scala @@ -0,0 +1,5 @@ +@main +def Test = + test(foo[Int]) + test(foo[List]) + test(foo[Map]) From f0f6bafb87cae7245656059fe72b213958723cea Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Tue, 21 Feb 2023 19:01:42 +0100 Subject: [PATCH 026/371] Fix caching issue caused by incorrect isProvisional check A static TypeRef can still be provisional if it's currently being completed (see the logic in `Namer#TypeDefCompleter#typeSig`). Fixes #16950. 
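To spell the fix out for readers skimming the diff: the symbol's own provisionality is now consulted before the static-symbol short-circuit, so a symbol whose completer is still running is never cached as non-provisional. A hypothetical, self-contained model of the corrected predicate follows; none of these names are real compiler API, and the actual change is the small edit to `Types.scala` below.

    // Hypothetical stand-ins only: SymModel and deeperCheck are not compiler API.
    final case class SymModel(isProvisional: Boolean, isStatic: Boolean)

    def mightBeProvisional(sym: SymModel, deeperCheck: => Boolean): Boolean =
      sym.isProvisional ||               // consulted even for static symbols (the fix)
        (!sym.isStatic && deeperCheck)   // pre-existing path for non-static references

    // A static symbol that is still being completed is now reported as provisional:
    //   mightBeProvisional(SymModel(isProvisional = true, isStatic = true), deeperCheck = false)
    //   // => true
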
--- compiler/src/dotty/tools/dotc/core/Types.scala | 5 ++--- tests/pos/i16950.scala | 11 +++++++++++ 2 files changed, 13 insertions(+), 3 deletions(-) create mode 100644 tests/pos/i16950.scala diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index 15b0b00ed0f3..03fc7274beaa 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -118,10 +118,9 @@ object Types { if t.mightBeProvisional then t.mightBeProvisional = t match case t: TypeRef => - !t.currentSymbol.isStatic && { + t.currentSymbol.isProvisional || !t.currentSymbol.isStatic && { (t: Type).mightBeProvisional = false // break cycles - t.symbol.isProvisional - || test(t.prefix, theAcc) + test(t.prefix, theAcc) || t.denot.infoOrCompleter.match case info: LazyType => true case info: AliasingBounds => test(info.alias, theAcc) diff --git a/tests/pos/i16950.scala b/tests/pos/i16950.scala new file mode 100644 index 000000000000..ac95a477136e --- /dev/null +++ b/tests/pos/i16950.scala @@ -0,0 +1,11 @@ +object Foo: + def bar(x : Bar.YOf[Any]): Unit = ??? + +trait K: + type CType <: Bar.YOf[Any] + def foo : K = + val x : CType = ??? + x // was: error: Found: CType, Expected: K + +object Bar: + type YOf[T] = K { type M } From 1a77625064f5285ecf786e465f69476fed51b0d2 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 20 Feb 2023 14:25:28 +0100 Subject: [PATCH 027/371] Fix race condition in new LazyVals --- library/src/scala/runtime/LazyVals.scala | 2 +- tests/run/i16806.check | 2 ++ tests/run/i16806.scala | 42 ++++++++++++++++++++++++ 3 files changed, 45 insertions(+), 1 deletion(-) create mode 100644 tests/run/i16806.check create mode 100644 tests/run/i16806.scala diff --git a/library/src/scala/runtime/LazyVals.scala b/library/src/scala/runtime/LazyVals.scala index 5d1e8e74b89d..416ffc91d34a 100644 --- a/library/src/scala/runtime/LazyVals.scala +++ b/library/src/scala/runtime/LazyVals.scala @@ -45,7 +45,7 @@ object LazyVals { /* ------------- Start of public API ------------- */ - sealed trait LazyValControlState + sealed trait LazyValControlState extends Serializable /** * Used to indicate the state of a lazy val that is being diff --git a/tests/run/i16806.check b/tests/run/i16806.check new file mode 100644 index 000000000000..af917347162a --- /dev/null +++ b/tests/run/i16806.check @@ -0,0 +1,2 @@ +Success +Success \ No newline at end of file diff --git a/tests/run/i16806.scala b/tests/run/i16806.scala new file mode 100644 index 000000000000..f45652080458 --- /dev/null +++ b/tests/run/i16806.scala @@ -0,0 +1,42 @@ +import java.util.concurrent.Semaphore +import scala.runtime.LazyVals.Evaluating + +object Repro { + + case object DFBit + final class DFError extends Exception("") + final class DFType[+T](val value: T | DFError) extends AnyVal + + def asIR(dfType: DFType[DFBit.type]): DFBit.type = dfType.value match + case dfTypeIR: DFBit.type => dfTypeIR + case err: DFError => throw new DFError + + object Holder { + val s = new Semaphore(1, false) + final lazy val Bit = { + s.release() + new DFType[DFBit.type](DFBit) + } + } + + @main + def Test = + val a = new Thread() { + override def run(): Unit = + Holder.s.acquire() + val x = Holder.Bit.value + assert(!x.isInstanceOf[Evaluating.type]) + println("Success") + } + val b = new Thread() { + override def run(): Unit = + Holder.s.acquire() + val x = Holder.Bit.value + assert(!x.isInstanceOf[Evaluating.type]) + println("Success") + } + a.start() + b.start() + a.join(300) + 
b.join(300) +} \ No newline at end of file From 4a7e92bdbfc2f4a0939667e2e408924b81f29af1 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 20 Feb 2023 16:51:05 +0100 Subject: [PATCH 028/371] Do not depend on runtime lib in tests --- tests/run/i16806.scala | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/tests/run/i16806.scala b/tests/run/i16806.scala index f45652080458..0b0dfe1f6b35 100644 --- a/tests/run/i16806.scala +++ b/tests/run/i16806.scala @@ -1,5 +1,4 @@ import java.util.concurrent.Semaphore -import scala.runtime.LazyVals.Evaluating object Repro { @@ -25,14 +24,14 @@ object Repro { override def run(): Unit = Holder.s.acquire() val x = Holder.Bit.value - assert(!x.isInstanceOf[Evaluating.type]) + assert(x.isInstanceOf[DFBit.type]) println("Success") } val b = new Thread() { override def run(): Unit = Holder.s.acquire() val x = Holder.Bit.value - assert(!x.isInstanceOf[Evaluating.type]) + assert(x.isInstanceOf[DFBit.type]) println("Success") } a.start() From 580126233bb98eb2d211b6b0e9cb449de3b8751c Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 27 Feb 2023 12:11:24 +0100 Subject: [PATCH 029/371] Disable test for Scalajs --- tests/run/i16806.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/tests/run/i16806.scala b/tests/run/i16806.scala index 0b0dfe1f6b35..16c0fb0d3ef5 100644 --- a/tests/run/i16806.scala +++ b/tests/run/i16806.scala @@ -1,3 +1,4 @@ +//scalajs: --skip import java.util.concurrent.Semaphore object Repro { From 81c6d6ebeb4f8ab12dcde2a23e45ce208a4dfa54 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 27 Feb 2023 12:40:41 +0100 Subject: [PATCH 030/371] Add comment describing why LazyValControlState extends Serializable --- library/src/scala/runtime/LazyVals.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/library/src/scala/runtime/LazyVals.scala b/library/src/scala/runtime/LazyVals.scala index 416ffc91d34a..d8c89c7abf28 100644 --- a/library/src/scala/runtime/LazyVals.scala +++ b/library/src/scala/runtime/LazyVals.scala @@ -45,6 +45,7 @@ object LazyVals { /* ------------- Start of public API ------------- */ + // This trait extends Serializable to fix #16806 that caused a race condition sealed trait LazyValControlState extends Serializable /** From aa601a1d55095ed533862c8461491a7ac719a011 Mon Sep 17 00:00:00 2001 From: Paul Coral Date: Wed, 15 Feb 2023 16:44:55 +0100 Subject: [PATCH 031/371] Fix #16822 - Ignore synthetic local private - Update test suit --- .../tools/dotc/transform/CheckUnused.scala | 19 +++++++-------- .../fatal-warnings/i15503i.scala | 23 ++++++++++++++----- 2 files changed, 27 insertions(+), 15 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 59878757c39b..6c47c12ac07c 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -81,7 +81,7 @@ class CheckUnused extends MiniPhase: ctx override def prepareForIdent(tree: tpd.Ident)(using Context): Context = - if tree.symbol.exists then + if tree.symbol.exists then _key.unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) else if tree.hasType then _key.unusedDataApply(_.registerUsed(tree.tpe.classSymbol, Some(tree.name))) @@ -103,7 +103,7 @@ class CheckUnused extends MiniPhase: override def prepareForValDef(tree: tpd.ValDef)(using Context): Context = _key.unusedDataApply{ud => // do not register the ValDef generated for `object` - if !tree.symbol.is(Module) 
then + if !tree.symbol.is(Module) then ud.registerDef(tree) ud.addIgnoredUsage(tree.symbol) } @@ -335,7 +335,7 @@ object CheckUnused: * The optional name will be used to target the right import * as the same element can be imported with different renaming */ - def registerUsed(sym: Symbol, name: Option[Name])(using Context): Unit = + def registerUsed(sym: Symbol, name: Option[Name])(using Context): Unit = if !isConstructorOfSynth(sym) && !doNotRegister(sym) then if sym.isConstructor && sym.exists then registerUsed(sym.owner, None) // constructor are "implicitly" imported with the class @@ -371,7 +371,7 @@ object CheckUnused: implicitParamInScope += memDef else explicitParamInScope += memDef - else if currScopeType.top == ScopeType.Local then + else if currScopeType.top == ScopeType.Local then localDefInScope += memDef else if memDef.shouldReportPrivateDef then privateDefInScope += memDef @@ -578,10 +578,10 @@ object CheckUnused: else false - private def usedDefContains(using Context): Boolean = + private def usedDefContains(using Context): Boolean = sym.everySymbol.exists(usedDef.apply) - private def everySymbol(using Context): List[Symbol] = + private def everySymbol(using Context): List[Symbol] = List(sym, sym.companionClass, sym.companionModule, sym.moduleClass).filter(_.exists) end extension @@ -614,10 +614,11 @@ object CheckUnused: private def isValidParam(using Context): Boolean = val sym = memDef.symbol (sym.is(Param) || sym.isAllOf(PrivateParamAccessor | Local, butNot = CaseAccessor)) && - !isSyntheticMainParam(sym) && - !sym.shouldNotReportParamOwner + !isSyntheticMainParam(sym) && + !sym.shouldNotReportParamOwner && + (!sym.exists || !sym.owner.isAllOf(Synthetic | PrivateLocal)) - private def shouldReportPrivateDef(using Context): Boolean = + private def shouldReportPrivateDef(using Context): Boolean = currScopeType.top == ScopeType.Template && !memDef.symbol.isConstructor && memDef.symbol.is(Private, butNot = SelfName | Synthetic | CaseAccessor) extension (imp: tpd.Import) diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index 33e04f34daa8..7eae207d952d 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -78,13 +78,13 @@ package foo.test.companionprivate: package foo.test.i16678: def foo(func: Int => String, value: Int): String = func(value) // OK - def run = + def run = println(foo(number => number.toString, value = 5)) // OK println(foo(number => "", value = 5)) // error println(foo(func = number => "", value = 5)) // error println(foo(func = number => number.toString, value = 5)) // OK println(foo(func = _.toString, value = 5)) // OK - + package foo.test.possibleclasses: case class AllCaseClass( k: Int, // OK @@ -93,7 +93,7 @@ package foo.test.possibleclasses: s: Int, // error /* But not these */ val t: Int, // OK private val z: Int // error - ) + ) case class AllCaseUsed( k: Int, // OK @@ -113,7 +113,7 @@ package foo.test.possibleclasses: s: Int, // error val t: Int, // OK private val z: Int // error - ) + ) class AllUsed( k: Int, // OK @@ -124,10 +124,21 @@ package foo.test.possibleclasses: private val z: Int // OK ) { def a = k + y + s + t + z - } + } package foo.test.from.i16675: case class PositiveNumber private (i: Int) // OK object PositiveNumber: - def make(i: Int): Option[PositiveNumber] = //OK + def make(i: Int): Option[PositiveNumber] = //OK Option.when(i >= 0)(PositiveNumber(i)) // OK + +package foo.test.i16822: + enum 
ExampleEnum { + case Build(context: String) // OK + case List // OK + } + + def demo = { + val x = ExampleEnum.List // OK + println(x) // OK + } From 85fa542c390233350ddee0960133375fe9adff63 Mon Sep 17 00:00:00 2001 From: Kacper Korban Date: Thu, 16 Feb 2023 15:53:07 +0100 Subject: [PATCH 032/371] Register usage of symbols in non-inferred type trees in CheckUnused fixes lampepfl#16930 --- .../tools/dotc/transform/CheckUnused.scala | 59 ++++++++++--------- .../fatal-warnings/i16930.scala | 22 +++++++ 2 files changed, 54 insertions(+), 27 deletions(-) create mode 100644 tests/neg-custom-args/fatal-warnings/i16930.scala diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 6c47c12ac07c..663f7f15b96f 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -46,12 +46,11 @@ class CheckUnused extends MiniPhase: */ private val _key = Property.Key[UnusedData] - extension (k: Property.Key[UnusedData]) - private def unusedDataApply[U](f: UnusedData => U)(using Context): Context = - ctx.property(_key).foreach(f) - ctx - private def getUnusedData(using Context): Option[UnusedData] = - ctx.property(_key) + private def unusedDataApply[U](f: UnusedData => U)(using Context): Context = + ctx.property(_key).foreach(f) + ctx + private def getUnusedData(using Context): Option[UnusedData] = + ctx.property(_key) override def phaseName: String = CheckUnused.phaseName @@ -71,7 +70,7 @@ class CheckUnused extends MiniPhase: // ========== END + REPORTING ========== override def transformUnit(tree: tpd.Tree)(using Context): tpd.Tree = - _key.unusedDataApply(ud => reportUnused(ud.getUnused)) + unusedDataApply(ud => reportUnused(ud.getUnused)) tree // ========== MiniPhase Prepare ========== @@ -81,15 +80,15 @@ class CheckUnused extends MiniPhase: ctx override def prepareForIdent(tree: tpd.Ident)(using Context): Context = - if tree.symbol.exists then - _key.unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) + if tree.symbol.exists then + unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) else if tree.hasType then - _key.unusedDataApply(_.registerUsed(tree.tpe.classSymbol, Some(tree.name))) + unusedDataApply(_.registerUsed(tree.tpe.classSymbol, Some(tree.name))) else ctx override def prepareForSelect(tree: tpd.Select)(using Context): Context = - _key.unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) + unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) override def prepareForBlock(tree: tpd.Block)(using Context): Context = pushInBlockTemplatePackageDef(tree) @@ -101,7 +100,7 @@ class CheckUnused extends MiniPhase: pushInBlockTemplatePackageDef(tree) override def prepareForValDef(tree: tpd.ValDef)(using Context): Context = - _key.unusedDataApply{ud => + unusedDataApply{ud => // do not register the ValDef generated for `object` if !tree.symbol.is(Module) then ud.registerDef(tree) @@ -109,7 +108,7 @@ class CheckUnused extends MiniPhase: } override def prepareForDefDef(tree: tpd.DefDef)(using Context): Context = - _key.unusedDataApply{ ud => + unusedDataApply{ ud => import ud.registerTrivial tree.registerTrivial ud.registerDef(tree) @@ -117,17 +116,17 @@ class CheckUnused extends MiniPhase: } override def prepareForTypeDef(tree: tpd.TypeDef)(using Context): Context = - _key.unusedDataApply{ ud => + unusedDataApply{ ud => if !tree.symbol.is(Param) then // Ignore type parameter (as Scala 2) ud.registerDef(tree) 
ud.addIgnoredUsage(tree.symbol) } override def prepareForBind(tree: tpd.Bind)(using Context): Context = - _key.unusedDataApply(_.registerPatVar(tree)) + unusedDataApply(_.registerPatVar(tree)) override def prepareForTypeTree(tree: tpd.TypeTree)(using Context): Context = - typeTraverser(_key.unusedDataApply).traverse(tree.tpe) + if !tree.isInstanceOf[tpd.InferredTypeTree] then typeTraverser(unusedDataApply).traverse(tree.tpe) ctx // ========== MiniPhase Transform ========== @@ -145,27 +144,27 @@ class CheckUnused extends MiniPhase: tree override def transformValDef(tree: tpd.ValDef)(using Context): tpd.Tree = - _key.unusedDataApply(_.removeIgnoredUsage(tree.symbol)) + unusedDataApply(_.removeIgnoredUsage(tree.symbol)) tree override def transformDefDef(tree: tpd.DefDef)(using Context): tpd.Tree = - _key.unusedDataApply(_.removeIgnoredUsage(tree.symbol)) + unusedDataApply(_.removeIgnoredUsage(tree.symbol)) tree override def transformTypeDef(tree: tpd.TypeDef)(using Context): tpd.Tree = - _key.unusedDataApply(_.removeIgnoredUsage(tree.symbol)) + unusedDataApply(_.removeIgnoredUsage(tree.symbol)) tree // ---------- MiniPhase HELPERS ----------- private def pushInBlockTemplatePackageDef(tree: tpd.Block | tpd.Template | tpd.PackageDef)(using Context): Context = - _key.unusedDataApply { ud => + unusedDataApply { ud => ud.pushScope(UnusedData.ScopeType.fromTree(tree)) } ctx private def popOutBlockTemplatePackageDef()(using Context): Context = - _key.unusedDataApply { ud => + unusedDataApply { ud => ud.popScope() } ctx @@ -188,7 +187,7 @@ class CheckUnused extends MiniPhase: val newCtx = if tree.symbol.exists then ctx.withOwner(tree.symbol) else ctx tree match case imp:tpd.Import => - _key.unusedDataApply(_.registerImport(imp)) + unusedDataApply(_.registerImport(imp)) traverseChildren(tree)(using newCtx) case ident: Ident => prepareForIdent(ident) @@ -198,7 +197,7 @@ class CheckUnused extends MiniPhase: traverseChildren(tree)(using newCtx) case _: (tpd.Block | tpd.Template | tpd.PackageDef) => //! DIFFERS FROM MINIPHASE - _key.unusedDataApply { ud => + unusedDataApply { ud => ud.inNewScope(ScopeType.fromTree(tree))(traverseChildren(tree)(using newCtx)) } case t:tpd.ValDef => @@ -216,9 +215,10 @@ class CheckUnused extends MiniPhase: case t: tpd.Bind => prepareForBind(t) traverseChildren(tree)(using newCtx) + case _: tpd.InferredTypeTree => case t@tpd.TypeTree() => //! DIFFERS FROM MINIPHASE - typeTraverser(_key.unusedDataApply).traverse(t.tpe) + typeTraverser(unusedDataApply).traverse(t.tpe) traverseChildren(tree)(using newCtx) case _ => //! 
DIFFERS FROM MINIPHASE @@ -228,9 +228,14 @@ class CheckUnused extends MiniPhase: /** This is a type traverser which catch some special Types not traversed by the term traverser above */ private def typeTraverser(dt: (UnusedData => Any) => Unit)(using Context) = new TypeTraverser: - override def traverse(tp: Type): Unit = tp match - case AnnotatedType(_, annot) => dt(_.registerUsed(annot.symbol, None)) - case _ => traverseChildren(tp) + override def traverse(tp: Type): Unit = + if tp.typeSymbol.exists then dt(_.registerUsed(tp.typeSymbol, Some(tp.typeSymbol.name))) + tp match + case AnnotatedType(_, annot) => + dt(_.registerUsed(annot.symbol, None)) + traverseChildren(tp) + case _ => + traverseChildren(tp) /** Do the actual reporting given the result of the anaylsis */ private def reportUnused(res: UnusedData.UnusedResult)(using Context): Unit = diff --git a/tests/neg-custom-args/fatal-warnings/i16930.scala b/tests/neg-custom-args/fatal-warnings/i16930.scala new file mode 100644 index 000000000000..1f6c5bf1a09f --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16930.scala @@ -0,0 +1,22 @@ +// scalac: -Wunused:imports + +trait Outer: + trait Used + trait Unused + +object Test { + val outer: Outer = ??? + import outer.{Used, Unused} // error + def foo(x: Any): Used = x.asInstanceOf[Used] +} + +trait Outer1: + trait UnusedToo1 + trait Unused1 + def unusedToo1: UnusedToo1 + +object Test1 { + val outer1: Outer1 = ??? + import outer1.{Unused1, UnusedToo1} // error // error + def foo() = outer1.unusedToo1 // in this case UnusedToo1 is not used explicitly, only inferred +} From ab28b090dc7e577bb7bf18313cd3c2920f2b70aa Mon Sep 17 00:00:00 2001 From: Paul Coral Date: Sat, 18 Feb 2023 16:41:59 +0100 Subject: [PATCH 033/371] Traverse annotations instead of just registering - Traverse the tree of annotations - Update test suits --- .../tools/dotc/transform/CheckUnused.scala | 20 +++++++++---------- .../fatal-warnings/i15503i.scala | 9 +++++++++ 2 files changed, 19 insertions(+), 10 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 663f7f15b96f..a07bf1e45247 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -28,6 +28,7 @@ import dotty.tools.dotc.core.Types.ConstantType import dotty.tools.dotc.core.NameKinds.WildcardParamName import dotty.tools.dotc.core.Types.TermRef import dotty.tools.dotc.core.Types.NameFilter +import dotty.tools.dotc.core.Symbols.Symbol @@ -80,7 +81,7 @@ class CheckUnused extends MiniPhase: ctx override def prepareForIdent(tree: tpd.Ident)(using Context): Context = - if tree.symbol.exists then + if tree.symbol.exists then unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) else if tree.hasType then unusedDataApply(_.registerUsed(tree.tpe.classSymbol, Some(tree.name))) @@ -102,6 +103,7 @@ class CheckUnused extends MiniPhase: override def prepareForValDef(tree: tpd.ValDef)(using Context): Context = unusedDataApply{ud => // do not register the ValDef generated for `object` + traverseAnnotations(tree.symbol) if !tree.symbol.is(Module) then ud.registerDef(tree) ud.addIgnoredUsage(tree.symbol) @@ -111,6 +113,7 @@ class CheckUnused extends MiniPhase: unusedDataApply{ ud => import ud.registerTrivial tree.registerTrivial + traverseAnnotations(tree.symbol) ud.registerDef(tree) ud.addIgnoredUsage(tree.symbol) } @@ -118,11 +121,13 @@ class CheckUnused extends MiniPhase: override def 
prepareForTypeDef(tree: tpd.TypeDef)(using Context): Context = unusedDataApply{ ud => if !tree.symbol.is(Param) then // Ignore type parameter (as Scala 2) + traverseAnnotations(tree.symbol) ud.registerDef(tree) ud.addIgnoredUsage(tree.symbol) } override def prepareForBind(tree: tpd.Bind)(using Context): Context = + traverseAnnotations(tree.symbol) unusedDataApply(_.registerPatVar(tree)) override def prepareForTypeTree(tree: tpd.TypeTree)(using Context): Context = @@ -237,6 +242,10 @@ class CheckUnused extends MiniPhase: case _ => traverseChildren(tp) + /** This traverse the annotations of the symbol */ + private def traverseAnnotations(sym: Symbol)(using Context): Unit = + sym.denot.annotations.foreach(annot => traverser.traverse(annot.tree)) + /** Do the actual reporting given the result of the anaylsis */ private def reportUnused(res: UnusedData.UnusedResult)(using Context): Unit = import CheckUnused.WarnTypes @@ -279,7 +288,6 @@ object CheckUnused: private class UnusedData: import dotty.tools.dotc.transform.CheckUnused.UnusedData.UnusedResult import collection.mutable.{Set => MutSet, Map => MutMap, Stack => MutStack} - import dotty.tools.dotc.core.Symbols.Symbol import UnusedData.ScopeType /** The current scope during the tree traversal */ @@ -329,11 +337,6 @@ object CheckUnused: execInNewScope popScope() - /** Register all annotations of this symbol's denotation */ - def registerUsedAnnotation(sym: Symbol)(using Context): Unit = - val annotSym = sym.denot.annotations.map(_.symbol) - annotSym.foreach(s => registerUsed(s, None)) - /** * Register a found (used) symbol along with its name * @@ -368,8 +371,6 @@ object CheckUnused: /** Register (or not) some `val` or `def` according to the context, scope and flags */ def registerDef(memDef: tpd.MemberDef)(using Context): Unit = - // register the annotations for usage - registerUsedAnnotation(memDef.symbol) if memDef.isValidMemberDef then if memDef.isValidParam then if memDef.symbol.isOneOf(GivenOrImplicit) then @@ -383,7 +384,6 @@ object CheckUnused: /** Register pattern variable */ def registerPatVar(patvar: tpd.Bind)(using Context): Unit = - registerUsedAnnotation(patvar.symbol) if !patvar.symbol.isUnusedAnnot then patVarsInScope += patvar diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index 7eae207d952d..ccf9344319d2 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -142,3 +142,12 @@ package foo.test.i16822: val x = ExampleEnum.List // OK println(x) // OK } + +package foo.test.i16877: + import scala.collection.immutable.HashMap // OK + import scala.annotation.StaticAnnotation // OK + + class ExampleAnnotation(val a: Object) extends StaticAnnotation // OK + + @ExampleAnnotation(new HashMap()) // OK + class Test //OK From d4f8c740c432b0d1ef328235ee651ce91a09094e Mon Sep 17 00:00:00 2001 From: Paul Coral Date: Sat, 18 Feb 2023 17:16:38 +0100 Subject: [PATCH 034/371] Ignore parameter of accessors - Do not report parameter of accessors - Update test suit --- .../tools/dotc/transform/CheckUnused.scala | 2 +- .../fatal-warnings/i15503e.scala | 3 ++ .../fatal-warnings/i15503i.scala | 43 +++++++++++++++++++ 3 files changed, 47 insertions(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index a07bf1e45247..a1ccccdb12e2 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ 
b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -621,7 +621,7 @@ object CheckUnused: (sym.is(Param) || sym.isAllOf(PrivateParamAccessor | Local, butNot = CaseAccessor)) && !isSyntheticMainParam(sym) && !sym.shouldNotReportParamOwner && - (!sym.exists || !sym.owner.isAllOf(Synthetic | PrivateLocal)) + (!sym.exists || !(sym.owner.isAllOf(Synthetic | PrivateLocal) || sym.owner.is(Accessor))) private def shouldReportPrivateDef(using Context): Boolean = currScopeType.top == ScopeType.Template && !memDef.symbol.isConstructor && memDef.symbol.is(Private, butNot = SelfName | Synthetic | CaseAccessor) diff --git a/tests/neg-custom-args/fatal-warnings/i15503e.scala b/tests/neg-custom-args/fatal-warnings/i15503e.scala index 79112942a205..cd56587327cd 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503e.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503e.scala @@ -52,3 +52,6 @@ package foo.test.trivial: def f77(x: Int) = foo // error } object Y + +package foo.test.i16955: + class S(var r: String) // OK \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index ccf9344319d2..82fb9acf7ace 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -126,6 +126,49 @@ package foo.test.possibleclasses: def a = k + y + s + t + z } +package foo.test.possibleclasses.withvar: + case class AllCaseClass( + k: Int, // OK + private var y: Int // OK /* Kept as it can be taken from pattern */ + )( + s: Int, // error /* But not these */ + var t: Int, // OK + private var z: Int // error + ) + + case class AllCaseUsed( + k: Int, // OK + private var y: Int // OK + )( + s: Int, // OK + var t: Int, // OK + private var z: Int // OK + ) { + def a = k + y + s + t + z + } + + class AllClass( + k: Int, // error + private var y: Int // error + )( + s: Int, // error + var t: Int, // OK + private var z: Int // error + ) + + class AllUsed( + k: Int, // OK + private var y: Int // OK + )( + s: Int, // OK + var t: Int, // OK + private var z: Int // OK + ) { + def a = k + y + s + t + z + } + + + package foo.test.from.i16675: case class PositiveNumber private (i: Int) // OK object PositiveNumber: From 49686f87ba6ab3695a4192c155eff9715bc79dd9 Mon Sep 17 00:00:00 2001 From: Paul Coral Date: Sun, 19 Feb 2023 12:35:23 +0100 Subject: [PATCH 035/371] Improve override detection in CheckUnused - CheckUnused detects override from base type in addition of `override` flag - Update test suit --- .../tools/dotc/transform/CheckUnused.scala | 19 +++++++++++++------ .../fatal-warnings/i15503e.scala | 14 +++++++++++++- 2 files changed, 26 insertions(+), 7 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index a1ccccdb12e2..49ce64b00b88 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -574,12 +574,14 @@ object CheckUnused: private def shouldNotReportParamOwner(using Context): Boolean = if sym.exists then val owner = sym.owner - trivialDefs(owner) || - owner.is(Flags.Override) || + trivialDefs(owner) || // is a trivial def owner.isPrimaryConstructor || - owner.annotations.exists ( + owner.annotations.exists ( // @depreacated _.symbol == ctx.definitions.DeprecatedAnnot - ) + ) || + owner.isAllOf(Synthetic | PrivateLocal) || + owner.is(Accessor) || + owner.isOverriden else false @@ -589,6 +591,11 @@ 
object CheckUnused: private def everySymbol(using Context): List[Symbol] = List(sym, sym.companionClass, sym.companionModule, sym.moduleClass).filter(_.exists) + /** A function is overriden. Either has `override flags` or parent has a matching member (type and name) */ + private def isOverriden(using Context): Boolean = + sym.is(Flags.Override) || + (if sym.exists then sym.owner.thisType.parents.exists(p => sym.matchingMember(p).exists) else false) + end extension extension (defdef: tpd.DefDef) @@ -620,8 +627,8 @@ object CheckUnused: val sym = memDef.symbol (sym.is(Param) || sym.isAllOf(PrivateParamAccessor | Local, butNot = CaseAccessor)) && !isSyntheticMainParam(sym) && - !sym.shouldNotReportParamOwner && - (!sym.exists || !(sym.owner.isAllOf(Synthetic | PrivateLocal) || sym.owner.is(Accessor))) + !sym.shouldNotReportParamOwner + private def shouldReportPrivateDef(using Context): Boolean = currScopeType.top == ScopeType.Template && !memDef.symbol.isConstructor && memDef.symbol.is(Private, butNot = SelfName | Synthetic | CaseAccessor) diff --git a/tests/neg-custom-args/fatal-warnings/i15503e.scala b/tests/neg-custom-args/fatal-warnings/i15503e.scala index cd56587327cd..56aec702a39e 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503e.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503e.scala @@ -54,4 +54,16 @@ package foo.test.trivial: object Y package foo.test.i16955: - class S(var r: String) // OK \ No newline at end of file + class S(var r: String) // OK + +package foo.test.i16865: + trait Foo: + def fn(a: Int, b: Int): Int // OK + trait Bar extends Foo + + object Ex extends Bar: + def fn(a: Int, b: Int): Int = b + 3 // OK + + object Ex2 extends Bar: + override def fn(a: Int, b: Int): Int = b + 3 // OK + From e03fa1b7fef5a6d5671208ac6b7bf1b57f935ec1 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 27 Feb 2023 16:03:10 +0100 Subject: [PATCH 036/371] WUnused: Fix unused warnining in synthetic symbols --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 8 ++++++-- tests/neg-custom-args/fatal-warnings/i16925.scala | 8 ++++++++ tests/neg-custom-args/fatal-warnings/i16926.scala | 7 +++++++ 3 files changed, 21 insertions(+), 2 deletions(-) create mode 100644 tests/neg-custom-args/fatal-warnings/i16925.scala create mode 100644 tests/neg-custom-args/fatal-warnings/i16926.scala diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 49ce64b00b88..9f3f5aded50c 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -7,13 +7,13 @@ import dotty.tools.dotc.ast.untpd.ImportSelector import dotty.tools.dotc.config.ScalaSettings import dotty.tools.dotc.core.Contexts.* import dotty.tools.dotc.core.Decorators.{em, i} -import dotty.tools.dotc.core.Flags._ +import dotty.tools.dotc.core.Flags.* import dotty.tools.dotc.core.Phases.Phase import dotty.tools.dotc.core.StdNames import dotty.tools.dotc.report import dotty.tools.dotc.reporting.Message import dotty.tools.dotc.typer.ImportInfo -import dotty.tools.dotc.util.Property +import dotty.tools.dotc.util.{Property, SourcePosition, SrcPos} import dotty.tools.dotc.core.Mode import dotty.tools.dotc.core.Types.TypeTraverser import dotty.tools.dotc.core.Types.Type @@ -302,6 +302,7 @@ object CheckUnused: * See the `isAccessibleAsIdent` extension method below in the file */ private val usedInScope = MutStack(MutSet[(Symbol,Boolean, Option[Name])]()) + private val 
usedInPosition = MutSet[(SrcPos, Name)]() /* unused import collected during traversal */ private val unusedImport = MutSet[ImportSelector]() @@ -351,6 +352,7 @@ object CheckUnused: usedInScope.top += ((sym, sym.isAccessibleAsIdent, name)) usedInScope.top += ((sym.companionModule, sym.isAccessibleAsIdent, name)) usedInScope.top += ((sym.companionClass, sym.isAccessibleAsIdent, name)) + name.map(n => usedInPosition += ((sym.sourcePos, n))) /** Register a symbol that should be ignored */ def addIgnoredUsage(sym: Symbol)(using Context): Unit = @@ -455,6 +457,7 @@ object CheckUnused: if ctx.settings.WunusedHas.locals then localDefInScope .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) .map(d => d.namePos -> WarnTypes.LocalDefs).toList else Nil @@ -483,6 +486,7 @@ object CheckUnused: if ctx.settings.WunusedHas.patvars then patVarsInScope .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) .map(d => d.namePos -> WarnTypes.PatVars).toList else Nil diff --git a/tests/neg-custom-args/fatal-warnings/i16925.scala b/tests/neg-custom-args/fatal-warnings/i16925.scala new file mode 100644 index 000000000000..5cc94f53cdd4 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16925.scala @@ -0,0 +1,8 @@ +// scalac: -Wunused:all + +def hello = + for { + i <- 1 to 2 if true + _ = println(i) // OK + } yield () + diff --git a/tests/neg-custom-args/fatal-warnings/i16926.scala b/tests/neg-custom-args/fatal-warnings/i16926.scala new file mode 100644 index 000000000000..23f167f4ce30 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16926.scala @@ -0,0 +1,7 @@ +// scalac: -Wunused:all + +def hello(): Unit = + for { + i <- (0 to 10).toList + (a, b) = "hello" -> "world" // OK + } yield println(s"$a $b") From 606608a7935630be69a0b9e4fa149ce0e3bc55fa Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Tue, 28 Feb 2023 12:55:33 +0100 Subject: [PATCH 037/371] Move tests --- tests/neg-custom-args/fatal-warnings/i15503i.scala | 14 ++++++++++++++ tests/neg-custom-args/fatal-warnings/i16925.scala | 8 -------- tests/neg-custom-args/fatal-warnings/i16926.scala | 7 ------- 3 files changed, 14 insertions(+), 15 deletions(-) delete mode 100644 tests/neg-custom-args/fatal-warnings/i16925.scala delete mode 100644 tests/neg-custom-args/fatal-warnings/i16926.scala diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index 82fb9acf7ace..ab83e1dafb3b 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -194,3 +194,17 @@ package foo.test.i16877: @ExampleAnnotation(new HashMap()) // OK class Test //OK + +package foo.test.i16926: + def hello(): Unit = + for { + i <- (0 to 10).toList + (a, b) = "hello" -> "world" // OK + } yield println(s"$a $b") + +package foo.test.i16925: + def hello = + for { + i <- 1 to 2 if true + _ = println(i) // OK + } yield () \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i16925.scala b/tests/neg-custom-args/fatal-warnings/i16925.scala deleted file mode 100644 index 5cc94f53cdd4..000000000000 --- a/tests/neg-custom-args/fatal-warnings/i16925.scala +++ /dev/null @@ -1,8 +0,0 @@ -// scalac: -Wunused:all - -def hello = - for { - i <- 1 to 2 if true - _ = println(i) // OK - } yield () - diff --git 
a/tests/neg-custom-args/fatal-warnings/i16926.scala b/tests/neg-custom-args/fatal-warnings/i16926.scala
deleted file mode 100644
index 23f167f4ce30..000000000000
--- a/tests/neg-custom-args/fatal-warnings/i16926.scala
+++ /dev/null
@@ -1,7 +0,0 @@
-// scalac: -Wunused:all
-
-def hello(): Unit =
-  for {
-    i <- (0 to 10).toList
-    (a, b) = "hello" -> "world" // OK
-  } yield println(s"$a $b")

From b050bdaf495f0e04b77fb60e3ed107176be75dcd Mon Sep 17 00:00:00 2001
From: Szymon Rodziewicz
Date: Tue, 28 Feb 2023 17:54:21 +0100
Subject: [PATCH 038/371] Remove unused import

---
 compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
index 9f3f5aded50c..e7e6e1c4952c 100644
--- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
+++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
@@ -13,7 +13,7 @@ import dotty.tools.dotc.core.StdNames
 import dotty.tools.dotc.report
 import dotty.tools.dotc.reporting.Message
 import dotty.tools.dotc.typer.ImportInfo
-import dotty.tools.dotc.util.{Property, SourcePosition, SrcPos}
+import dotty.tools.dotc.util.{Property, SrcPos}
 import dotty.tools.dotc.core.Mode
 import dotty.tools.dotc.core.Types.TypeTraverser
 import dotty.tools.dotc.core.Types.Type

From 2d41b4624a1a805137e55de1a7d7dd4acd921ffe Mon Sep 17 00:00:00 2001
From: Szymon Rodziewicz
Date: Mon, 13 Mar 2023 15:11:53 +0100
Subject: [PATCH 039/371] Fix WUnused with idents in derived code

---
 .../tools/dotc/transform/CheckUnused.scala | 14 +++++++-------
 .../fatal-warnings/i15503i.scala | 19 ++++++++++++++++++-
 2 files changed, 25 insertions(+), 8 deletions(-)

diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
index e7e6e1c4952c..c0ea483efea9 100644
--- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
+++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
@@ -15,19 +15,14 @@ import dotty.tools.dotc.reporting.Message
 import dotty.tools.dotc.typer.ImportInfo
 import dotty.tools.dotc.util.{Property, SrcPos}
 import dotty.tools.dotc.core.Mode
-import dotty.tools.dotc.core.Types.TypeTraverser
-import dotty.tools.dotc.core.Types.Type
-import dotty.tools.dotc.core.Types.AnnotatedType
+import dotty.tools.dotc.core.Types.{AnnotatedType, ConstantType, NoType, TermRef, Type, TypeTraverser}
 import dotty.tools.dotc.core.Flags.flagsString
 import dotty.tools.dotc.core.Flags
 import dotty.tools.dotc.core.Names.Name
 import dotty.tools.dotc.transform.MegaPhase.MiniPhase
 import dotty.tools.dotc.core.Annotations
 import dotty.tools.dotc.core.Definitions
-import dotty.tools.dotc.core.Types.ConstantType
 import dotty.tools.dotc.core.NameKinds.WildcardParamName
-import dotty.tools.dotc.core.Types.TermRef
-import dotty.tools.dotc.core.Types.NameFilter
 import dotty.tools.dotc.core.Symbols.Symbol
@@ -82,6 +77,12 @@ class CheckUnused extends MiniPhase:
   override def prepareForIdent(tree: tpd.Ident)(using Context): Context =
     if tree.symbol.exists then
+      val prefixes = LazyList.iterate(tree.typeOpt.normalizedPrefix)(_.normalizedPrefix).takeWhile(_ != NoType)
+      for {
+        prefix <- prefixes
+      } {
+        unusedDataApply(_.registerUsed(prefix.classSymbol, None))
+      }
       unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name)))
     else if tree.hasType then
       unusedDataApply(_.registerUsed(tree.tpe.classSymbol, Some(tree.name)))
@@ -409,7 +410,6 @@ object CheckUnused:
       val kept = used.filterNot { t =>
         val (sym, isAccessible, optName) = t
         // keep the symbol for outer scope, if it matches **no** import
-
         // This is the first matching wildcard selector
         var selWildCard: Option[ImportSelector] = None
diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala
index ab83e1dafb3b..9f8416146af4 100644
--- a/tests/neg-custom-args/fatal-warnings/i15503i.scala
+++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala
@@ -207,4 +207,21 @@ package foo.test.i16925:
     for {
       i <- 1 to 2 if true
       _ = println(i) // OK
-    } yield ()
\ No newline at end of file
+    } yield ()
+
+package foo.test.i16679:
+  object myPackage:
+    trait CaseClassName[A]:
+      def name: String
+    object CaseClassName:
+      trait CaseClassByStringName[A] extends CaseClassName[A]
+      import scala.deriving.Mirror
+      object CaseClassByStringName:
+        inline final def derived[A](using inline A: Mirror.Of[A]): CaseClassByStringName[A] =
+          new CaseClassByStringName[A]:
+            def name: String = A.toString
+
+  object secondPackage:
+    import myPackage.CaseClassName // OK
+    case class CoolClass(i: Int) derives CaseClassName.CaseClassByStringName
+    println(summon[CaseClassName[CoolClass]].name)

From 87d9e947b95656c33890061a7ce2d3e7a5aa8758 Mon Sep 17 00:00:00 2001
From: Szymon Rodziewicz
Date: Mon, 13 Mar 2023 15:38:12 +0100
Subject: [PATCH 040/371] Add failsafe for a case where prefixes in CheckUnused/prepareForIdent formed an infinite cycle

---
 compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 1 +
 1 file changed, 1 insertion(+)

diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
index c0ea483efea9..66b0876668be 100644
--- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
+++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
@@ -78,6 +78,7 @@ class CheckUnused extends MiniPhase:
   override def prepareForIdent(tree: tpd.Ident)(using Context): Context =
     if tree.symbol.exists then
       val prefixes = LazyList.iterate(tree.typeOpt.normalizedPrefix)(_.normalizedPrefix).takeWhile(_ != NoType)
+        .take(10) // Failsafe for the odd case if there was an infinite cycle
       for {
         prefix <- prefixes
       } {

From 8bdef2ff1d6e7189f5ee959ba0b4ac66f7713c47 Mon Sep 17 00:00:00 2001
From: Szymon Rodziewicz
Date: Tue, 14 Mar 2023 15:53:10 +0100
Subject: [PATCH 041/371] Fix for formatting and traverse call of inlined tree in wunused

---
 .../src/dotty/tools/dotc/transform/CheckUnused.scala | 12 +++++++-----
 1 file changed, 7 insertions(+), 5 deletions(-)

diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
index 66b0876668be..35153dbf66e9 100644
--- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
+++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
@@ -1,7 +1,7 @@
 package dotty.tools.dotc.transform

 import dotty.tools.dotc.ast.tpd
-import dotty.tools.dotc.ast.tpd.TreeTraverser
+import dotty.tools.dotc.ast.tpd.{Inlined, TreeTraverser}
 import dotty.tools.dotc.ast.untpd
 import dotty.tools.dotc.ast.untpd.ImportSelector
 import dotty.tools.dotc.config.ScalaSettings
@@ -59,6 +59,7 @@ class CheckUnused extends MiniPhase:
   // ========== SETUP ============
   override def prepareForUnit(tree: tpd.Tree)(using Context): Context =
+    println(tree)
     val data = UnusedData()
     val fresh = ctx.fresh.setProperty(_key, data)
     fresh
@@ -75,15 +76,16 @@ class CheckUnused extends MiniPhase:
     traverser.traverse(tree)
     ctx

+  def prepareForInlined(tree: Inlined)(using Context): Context =
+    traverser.traverse(tree.call)
+    ctx
+
   override def prepareForIdent(tree: tpd.Ident)(using Context): Context =
     if tree.symbol.exists then
       val prefixes = LazyList.iterate(tree.typeOpt.normalizedPrefix)(_.normalizedPrefix).takeWhile(_ != NoType)
         .take(10) // Failsafe for the odd case if there was an infinite cycle
-      for {
-        prefix <- prefixes
-      } {
+      for prefix <- prefixes do
         unusedDataApply(_.registerUsed(prefix.classSymbol, None))
-      }
       unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name)))
     else if tree.hasType then
       unusedDataApply(_.registerUsed(tree.tpe.classSymbol, Some(tree.name)))

From 813a43b9ec34e59250d8d28b4d9fac777b1e44a9 Mon Sep 17 00:00:00 2001
From: Szymon Rodziewicz
Date: Tue, 14 Mar 2023 16:11:07 +0100
Subject: [PATCH 042/371] Add test for wunused Inlined call

---
 .../tools/dotc/transform/CheckUnused.scala | 3 +--
 .../fatal-warnings/i15503i.scala | 22 ++++++++++++++++++-
 2 files changed, 22 insertions(+), 3 deletions(-)

diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
index 35153dbf66e9..5e4ed6f6e0df 100644
--- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
+++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
@@ -59,7 +59,6 @@ class CheckUnused extends MiniPhase:
   // ========== SETUP ============
   override def prepareForUnit(tree: tpd.Tree)(using Context): Context =
-    println(tree)
     val data = UnusedData()
     val fresh = ctx.fresh.setProperty(_key, data)
     fresh
@@ -76,7 +75,7 @@ class CheckUnused extends MiniPhase:
     traverser.traverse(tree)
     ctx

-  def prepareForInlined(tree: Inlined)(using Context): Context =
+  override def prepareForInlined(tree: tpd.Inlined)(using Context): Context =
     traverser.traverse(tree.call)
     ctx
diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala
index 9f8416146af4..3dd4d1fc61e7 100644
--- a/tests/neg-custom-args/fatal-warnings/i15503i.scala
+++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala
@@ -209,7 +209,7 @@ package foo.test.i16925:
       _ = println(i) // OK
     } yield ()

-package foo.test.i16679:
+package foo.test.i16679a:
   object myPackage:
     trait CaseClassName[A]:
       def name: String
@@ -225,3 +225,23 @@ package foo.test.i16679:
     import myPackage.CaseClassName // OK
     case class CoolClass(i: Int) derives CaseClassName.CaseClassByStringName
     println(summon[CaseClassName[CoolClass]].name)
+
+package foo.test.i16679b:
+  object myPackage:
+    trait CaseClassName[A]:
+      def name: String
+
+    object CaseClassName:
+      import scala.deriving.Mirror
+      inline final def derived[A](using inline A: Mirror.Of[A]): CaseClassName[A] =
+        new CaseClassName[A]:
+          def name: String = A.toString
+
+  object Foo:
+    given x: myPackage.CaseClassName[secondPackage.CoolClass] = null
+
+  object secondPackage:
+    import myPackage.CaseClassName // OK
+    import Foo.x
+    case class CoolClass(i: Int)
+    println(summon[myPackage.CaseClassName[CoolClass]])

From 06acf90c86e8285b510746556f2b87671a742ebf Mon Sep 17 00:00:00 2001
From: Szymon Rodziewicz
Date: Tue, 7 Mar 2023 15:18:52 +0100
Subject: [PATCH 043/371] WUnused: Fix for symbols with synthetic names and unused transparent inlines

---
 .../tools/dotc/transform/CheckUnused.scala | 27 +++++++++++++++++-
 .../fatal-warnings/i15503i.scala | 28 +++++++++++++++++++
 2 files changed, 54 insertions(+), 1 deletion(-)

diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
index 5e4ed6f6e0df..cc51d9bfbe64 100644
--- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
+++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala
@@ -10,7 +10,7 @@ import dotty.tools.dotc.core.Decorators.{em, i}
 import dotty.tools.dotc.core.Flags.*
 import dotty.tools.dotc.core.Phases.Phase
 import dotty.tools.dotc.core.StdNames
-import dotty.tools.dotc.report
+import dotty.tools.dotc.{ast, report}
 import dotty.tools.dotc.reporting.Message
 import dotty.tools.dotc.typer.ImportInfo
 import dotty.tools.dotc.util.{Property, SrcPos}
@@ -432,6 +432,20 @@ object CheckUnused:
           else
             exists
         }
+
+        // do not report unused transparent inline imports
+        for {
+          imp <- imports
+          sel <- imp.selectors
+        } {
+          if unusedImport.contains(sel) then
+            val tpd.Import(qual, _) = imp
+            val importedMembers = qual.tpe.member(sel.name).alternatives.map(_.symbol)
+            val isTransparentAndInline = importedMembers.exists(s => s.is(Transparent) && s.is(Inline))
+            if isTransparentAndInline then
+              unusedImport -= sel
+        }
+
         // if there's an outer scope
         if usedInScope.nonEmpty then
           // we keep the symbols not referencing an import in this scope
@@ -450,6 +464,7 @@ object CheckUnused:
      */
     def getUnused(using Context): UnusedResult =
       popScope()
+
       val sortedImp =
         if ctx.settings.WunusedHas.imports || ctx.settings.WunusedHas.strictNoImplicitWarn then
          unusedImport.map(d => d.srcPos -> WarnTypes.Imports).toList
@@ -460,6 +475,7 @@ object CheckUnused:
          localDefInScope
            .filterNot(d => d.symbol.usedDefContains)
            .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name})
+            .filterNot(d => containsSyntheticSuffix(d.symbol))
            .map(d => d.namePos -> WarnTypes.LocalDefs).toList
        else
          Nil
@@ -467,6 +483,7 @@ object CheckUnused:
        if ctx.settings.WunusedHas.explicits then
          explicitParamInScope
            .filterNot(d => d.symbol.usedDefContains)
+            .filterNot(d => containsSyntheticSuffix(d.symbol))
            .map(d => d.namePos -> WarnTypes.ExplicitParams).toList
        else
          Nil
@@ -474,6 +491,7 @@ object CheckUnused:
        if ctx.settings.WunusedHas.implicits then
          implicitParamInScope
            .filterNot(d => d.symbol.usedDefContains)
+            .filterNot(d => containsSyntheticSuffix(d.symbol))
            .map(d => d.namePos -> WarnTypes.ImplicitParams).toList
        else
          Nil
@@ -481,6 +499,7 @@ object CheckUnused:
        if ctx.settings.WunusedHas.privates then
          privateDefInScope
            .filterNot(d => d.symbol.usedDefContains)
+            .filterNot(d => containsSyntheticSuffix(d.symbol))
            .map(d => d.namePos -> WarnTypes.PrivateMembers).toList
        else
          Nil
@@ -488,6 +507,7 @@ object CheckUnused:
        if ctx.settings.WunusedHas.patvars then
          patVarsInScope
            .filterNot(d => d.symbol.usedDefContains)
+            .filterNot(d => containsSyntheticSuffix(d.symbol))
            .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name})
            .map(d => d.namePos -> WarnTypes.PatVars).toList
        else
@@ -500,6 +520,11 @@ object CheckUnused:
    end getUnused

    //============================ HELPERS ====================================
+    /**
+     * Heuristic to detect synthetic suffixes in names of symbols
+     */
+    private def containsSyntheticSuffix(symbol: Symbol)(using Context): Boolean =
+      symbol.name.mangledString.contains("$")
    /**
     * Is the the constructor of synthetic package object
     * Should be ignored as it is always imported/used in package
diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala
index 3dd4d1fc61e7..9ac2ec5ef622 100644
---
a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -209,6 +209,34 @@ package foo.test.i16925: _ = println(i) // OK } yield () +package foo.test.i16863a: + import scala.quoted.* + def fn(using Quotes) = + val x = Expr(1) + '{ $x + 2 } // OK + +package foo.test.i16863b: + import scala.quoted.* + def fn[A](using Quotes, Type[A]) = // OK + val numeric = Expr.summon[Numeric[A]].getOrElse(???) + '{ $numeric.fromInt(3) } // OK + +package foo.test.i16863c: + import scala.quoted.* + def fn[A](expr: Expr[Any])(using Quotes) = + val imp = expr match + case '{ ${ _ }: a } => Expr.summon[Numeric[a]] // OK + println(imp) + +package foo.test.i16863d: + import scala.quoted.* + import scala.compiletime.asMatchable // OK + def fn[A](using Quotes, Type[A]) = + import quotes.reflect.* + val imp = TypeRepr.of[A].widen.asMatchable match + case Refinement(_,_,_) => () + println(imp) + package foo.test.i16679a: object myPackage: trait CaseClassName[A]: From 0f6c42e9630402a58d0bccfb2e06d9badfb479e3 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 8 Mar 2023 13:24:54 +0100 Subject: [PATCH 044/371] Adjust assertions in test --- .../fatal-warnings/i15503-scala2/scala2-t11681.scala | 4 ++-- tests/neg-custom-args/fatal-warnings/i15503b.scala | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala index f04129a19e48..18aa6879eeba 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala @@ -100,9 +100,9 @@ trait Anonymous { trait Context[A] trait Implicits { def f[A](implicit ctx: Context[A]) = answer // error - def g[A: Context] = answer // error + def g[A: Context] = answer // OK } -class Bound[A: Context] // error +class Bound[A: Context] // OK object Answers { def answer: Int = 42 } diff --git a/tests/neg-custom-args/fatal-warnings/i15503b.scala b/tests/neg-custom-args/fatal-warnings/i15503b.scala index 19bcd01a8dde..8a4a055150f9 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503b.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503b.scala @@ -75,7 +75,7 @@ package foo.scala2.tests: object Types { def l1() = { - object HiObject { def f = this } // error + object HiObject { def f = this } // OK class Hi { // error def f1: Hi = new Hi def f2(x: Hi) = x From fbc65010bffac8fe6de36b958216fcfd430e1770 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Fri, 10 Mar 2023 17:36:15 +0100 Subject: [PATCH 045/371] Check if import contains transparent inline in registerImport --- .../tools/dotc/transform/CheckUnused.scala | 29 +++++++++---------- .../fatal-warnings/i15503f.scala | 2 +- .../fatal-warnings/i15503g.scala | 4 +-- 3 files changed, 17 insertions(+), 18 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index cc51d9bfbe64..9b2fd122f68a 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -10,7 +10,7 @@ import dotty.tools.dotc.core.Decorators.{em, i} import dotty.tools.dotc.core.Flags.* import dotty.tools.dotc.core.Phases.Phase import dotty.tools.dotc.core.StdNames -import dotty.tools.dotc.{ast, report} +import dotty.tools.dotc.report import dotty.tools.dotc.reporting.Message import 
dotty.tools.dotc.typer.ImportInfo import dotty.tools.dotc.util.{Property, SrcPos} @@ -368,7 +368,7 @@ object CheckUnused: /** Register an import */ def registerImport(imp: tpd.Import)(using Context): Unit = - if !tpd.languageImport(imp.expr).nonEmpty && !imp.isGeneratedByEnum then + if !tpd.languageImport(imp.expr).nonEmpty && !imp.isGeneratedByEnum && !isTransparentAndInline(imp) then impInScope.top += imp unusedImport ++= imp.selectors.filter { s => !shouldSelectorBeReported(imp, s) && !isImportExclusion(s) @@ -433,19 +433,6 @@ object CheckUnused: exists } - // not report unused transparent inline imports - for { - imp <- imports - sel <- imp.selectors - } { - if unusedImport.contains(sel) then - val tpd.Import(qual, _) = imp - val importedMembers = qual.tpe.member(sel.name).alternatives.map(_.symbol) - val isTransparentAndInline = importedMembers.exists(s => s.is(Transparent) && s.is(Inline)) - if isTransparentAndInline then - unusedImport -= sel - } - // if there's an outer scope if usedInScope.nonEmpty then // we keep the symbols not referencing an import in this scope @@ -520,6 +507,18 @@ object CheckUnused: end getUnused //============================ HELPERS ==================================== + + /** + * Checks if import selects a def that is transparent and inline + */ + private def isTransparentAndInline(imp: tpd.Import)(using Context): Boolean = + (for { + sel <- imp.selectors + } yield { + val qual = imp.expr + val importedMembers = qual.tpe.member(sel.name).alternatives.map(_.symbol) + importedMembers.exists(s => s.is(Transparent) && s.is(Inline)) + }).exists(identity) /** * Heuristic to detect synthetic suffixes in names of symbols */ diff --git a/tests/neg-custom-args/fatal-warnings/i15503f.scala b/tests/neg-custom-args/fatal-warnings/i15503f.scala index db695da3490b..d36cd01be74e 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503f.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503f.scala @@ -5,7 +5,7 @@ val default_int = 1 def f1(a: Int) = a // OK def f2(a: Int) = 1 // OK -def f3(a: Int)(using Int) = a // error +def f3(a: Int)(using Int) = a // OK def f4(a: Int)(using Int) = default_int // error def f6(a: Int)(using Int) = summon[Int] // OK def f7(a: Int)(using Int) = summon[Int] + a // OK diff --git a/tests/neg-custom-args/fatal-warnings/i15503g.scala b/tests/neg-custom-args/fatal-warnings/i15503g.scala index d4daea944184..a0822e7e1611 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503g.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503g.scala @@ -5,8 +5,8 @@ val default_int = 1 def f1(a: Int) = a // OK def f2(a: Int) = default_int // error -def f3(a: Int)(using Int) = a // error -def f4(a: Int)(using Int) = default_int // error // error +def f3(a: Int)(using Int) = a // OK +def f4(a: Int)(using Int) = default_int // error def f6(a: Int)(using Int) = summon[Int] // error def f7(a: Int)(using Int) = summon[Int] + a // OK From 4070dbda7b0aaccaff8a577b1a2708f3bba753a4 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Tue, 28 Mar 2023 17:32:00 +0200 Subject: [PATCH 046/371] Warn for synthetic using/givens with wunused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 2 +- tests/neg-custom-args/fatal-warnings/i15503f.scala | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 9b2fd122f68a..bf1ec37ebab4 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ 
b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -478,7 +478,7 @@ object CheckUnused: if ctx.settings.WunusedHas.implicits then implicitParamInScope .filterNot(d => d.symbol.usedDefContains) - .filterNot(d => containsSyntheticSuffix(d.symbol)) + .filterNot(d => containsSyntheticSuffix(d.symbol) && !d.rawMods.is(Given)) .map(d => d.namePos -> WarnTypes.ImplicitParams).toList else Nil diff --git a/tests/neg-custom-args/fatal-warnings/i15503f.scala b/tests/neg-custom-args/fatal-warnings/i15503f.scala index d36cd01be74e..db695da3490b 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503f.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503f.scala @@ -5,7 +5,7 @@ val default_int = 1 def f1(a: Int) = a // OK def f2(a: Int) = 1 // OK -def f3(a: Int)(using Int) = a // OK +def f3(a: Int)(using Int) = a // error def f4(a: Int)(using Int) = default_int // error def f6(a: Int)(using Int) = summon[Int] // OK def f7(a: Int)(using Int) = summon[Int] + a // OK From 4644e5ed47ed52036869a9c424772c47500a4586 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Tue, 28 Mar 2023 19:29:09 +0200 Subject: [PATCH 047/371] Wunused: only filter out non-zero span-length givens --- .../src/dotty/tools/dotc/transform/CheckUnused.scala | 9 ++++++++- 1 file changed, 8 insertions(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index bf1ec37ebab4..665c0b4284ca 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -478,7 +478,7 @@ object CheckUnused: if ctx.settings.WunusedHas.implicits then implicitParamInScope .filterNot(d => d.symbol.usedDefContains) - .filterNot(d => containsSyntheticSuffix(d.symbol) && !d.rawMods.is(Given)) + .filterNot(d => containsSyntheticSuffix(d.symbol) && (!d.rawMods.is(Given) || hasZeroLengthSpan(d.symbol))) .map(d => d.namePos -> WarnTypes.ImplicitParams).toList else Nil @@ -519,11 +519,18 @@ object CheckUnused: val importedMembers = qual.tpe.member(sel.name).alternatives.map(_.symbol) importedMembers.exists(s => s.is(Transparent) && s.is(Inline)) }).exists(identity) + /** * Heuristic to detect synthetic suffixes in names of symbols */ private def containsSyntheticSuffix(symbol: Symbol)(using Context): Boolean = symbol.name.mangledString.contains("$") + + /** + * Heuristic to detect generated symbols by checking if symbol has zero length span in source + */ + private def hasZeroLengthSpan(symbol: Symbol)(using Context): Boolean = + symbol.span.end - symbol.span.start == 0 /** * Is the the constructor of synthetic package object * Should be ignored as it is always imported/used in package From b72eade34401c963db8fdcf85799d4d9dc1fc9f8 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 29 Mar 2023 14:52:46 +0200 Subject: [PATCH 048/371] Skip all symbols with $ in name in Wunused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 7 +------ 1 file changed, 1 insertion(+), 6 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 665c0b4284ca..f960f7b9e60c 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -478,7 +478,7 @@ object CheckUnused: if ctx.settings.WunusedHas.implicits then implicitParamInScope .filterNot(d => d.symbol.usedDefContains) - .filterNot(d => containsSyntheticSuffix(d.symbol) 
&& (!d.rawMods.is(Given) || hasZeroLengthSpan(d.symbol))) + .filterNot(d => containsSyntheticSuffix(d.symbol)) .map(d => d.namePos -> WarnTypes.ImplicitParams).toList else Nil @@ -526,11 +526,6 @@ object CheckUnused: private def containsSyntheticSuffix(symbol: Symbol)(using Context): Boolean = symbol.name.mangledString.contains("$") - /** - * Heuristic to detect generated symbols by checking if symbol has zero length span in source - */ - private def hasZeroLengthSpan(symbol: Symbol)(using Context): Boolean = - symbol.span.end - symbol.span.start == 0 /** * Is the the constructor of synthetic package object * Should be ignored as it is always imported/used in package From 432e829d3bdae27a2f62e18fb8878c74c0676ceb Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 29 Mar 2023 15:51:50 +0200 Subject: [PATCH 049/371] Add a failing case with named using to test Wunused:implicits --- tests/neg-custom-args/fatal-warnings/i15503f.scala | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/tests/neg-custom-args/fatal-warnings/i15503f.scala b/tests/neg-custom-args/fatal-warnings/i15503f.scala index db695da3490b..67c595d74f40 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503f.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503f.scala @@ -5,8 +5,9 @@ val default_int = 1 def f1(a: Int) = a // OK def f2(a: Int) = 1 // OK -def f3(a: Int)(using Int) = a // error -def f4(a: Int)(using Int) = default_int // error +def f3(a: Int)(using Int) = a // OK +def f4(a: Int)(using Int) = default_int // OK def f6(a: Int)(using Int) = summon[Int] // OK def f7(a: Int)(using Int) = summon[Int] + a // OK +def f8(a: Int)(using foo: Int) = a // error From 24080f11f1799f33acc94524c3e215234d02549a Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 29 Mar 2023 16:10:21 +0200 Subject: [PATCH 050/371] Replace for with exists in isTransparentInline in WUNused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 6 ++---- 1 file changed, 2 insertions(+), 4 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index f960f7b9e60c..d7c88a1fca40 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -512,13 +512,11 @@ object CheckUnused: * Checks if import selects a def that is transparent and inline */ private def isTransparentAndInline(imp: tpd.Import)(using Context): Boolean = - (for { - sel <- imp.selectors - } yield { + imp.selectors.exists { sel => val qual = imp.expr val importedMembers = qual.tpe.member(sel.name).alternatives.map(_.symbol) importedMembers.exists(s => s.is(Transparent) && s.is(Inline)) - }).exists(identity) + } /** * Heuristic to detect synthetic suffixes in names of symbols From 7cbdadf4858b75f76edfbadb4012eff6f663152f Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 29 Mar 2023 19:19:13 +0200 Subject: [PATCH 051/371] Skip extension method params in WUnused --- .../src/dotty/tools/dotc/transform/CheckUnused.scala | 2 +- tests/neg-custom-args/fatal-warnings/i15503g.scala | 9 ++++++++- 2 files changed, 9 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index d7c88a1fca40..65ab7f12189f 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -651,7 +651,7 @@ object CheckUnused: extension 
(memDef: tpd.MemberDef) private def isValidMemberDef(using Context): Boolean = - !memDef.symbol.isUnusedAnnot && !memDef.symbol.isAllOf(Flags.AccessorCreationFlags) && !memDef.name.isWildcard + !memDef.symbol.isUnusedAnnot && !memDef.symbol.isAllOf(Flags.AccessorCreationFlags) && !memDef.name.isWildcard && !memDef.symbol.owner.is(Extension) private def isValidParam(using Context): Boolean = val sym = memDef.symbol diff --git a/tests/neg-custom-args/fatal-warnings/i15503g.scala b/tests/neg-custom-args/fatal-warnings/i15503g.scala index a0822e7e1611..8b3fd7561a4b 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503g.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503g.scala @@ -12,4 +12,11 @@ def f7(a: Int)(using Int) = summon[Int] + a // OK /* --- Trivial method check --- */ def g1(x: Int) = 1 // OK -def g2(x: Int) = ??? // OK \ No newline at end of file +def g2(x: Int) = ??? // OK + +package foo.test.i17101: + type Test[A] = A + extension[A] (x: Test[A]) { // OK + def value: A = x + def causesIssue: Unit = println("oh no") + } From ac0603346e0e0297f85abd11de95e500f7c2cc5a Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 27 Mar 2023 16:49:09 +0200 Subject: [PATCH 052/371] Fix wunused false positive when deriving alias type --- .../src/dotty/tools/dotc/transform/CheckUnused.scala | 12 ++++++++++-- libste | 0 2 files changed, 10 insertions(+), 2 deletions(-) create mode 100644 libste diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 65ab7f12189f..80349bf1f0c7 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -589,14 +589,22 @@ object CheckUnused: /** Given an import and accessibility, return an option of selector that match import<->symbol */ private def isInImport(imp: tpd.Import, isAccessible: Boolean, symName: Option[Name])(using Context): Option[ImportSelector] = val tpd.Import(qual, sels) = imp - val qualHasSymbol = qual.tpe.member(sym.name).alternatives.map(_.symbol).contains(sym) + val dealiasedSym = dealias(sym) + val qualHasSymbol = qual.tpe.member(sym.name).alternatives.map(_.symbol).map(dealias).contains(dealiasedSym) def selector = sels.find(sel => (sel.name.toTermName == sym.name || sel.name.toTypeName == sym.name) && symName.map(n => n.toTermName == sel.rename).getOrElse(true)) + def dealiasedSelector = sels.flatMap(sel => qual.tpe.member(sym.name).alternatives.map(m => (sel, m.symbol))).collect { + case (sel, sym) if dealias(sym) == dealiasedSym => sel + }.headOption def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) if qualHasSymbol && !isAccessible && sym.exists then - selector.orElse(wildcard) // selector with name or wildcard (or given) + selector.orElse(dealiasedSelector).orElse(wildcard) // selector with name or wildcard (or given) else None + private def dealias(symbol: Symbol)(using Context): Symbol = + if(symbol.isType && symbol.asType.denot.isAliasType) then + symbol.asType.typeRef.dealias.typeSymbol + else symbol /** Annotated with @unused */ private def isUnusedAnnot(using Context): Boolean = sym.annotations.exists(a => a.symbol == ctx.definitions.UnusedAnnot) diff --git a/libste b/libste new file mode 100644 index 000000000000..e69de29bb2d1 From 41e74189e4ce0c2fb1fe8265546199d4e76e8ae2 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 3 Apr 2023 15:53:05 +0200 Subject: [PATCH 053/371] Fix wunused for deriving 
alias type that has a different name --- .../src/dotty/tools/dotc/transform/CheckUnused.scala | 7 +++++-- libste | 0 tests/neg-custom-args/fatal-warnings/i15503i.scala | 11 +++++++++++ 3 files changed, 16 insertions(+), 2 deletions(-) delete mode 100644 libste diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 80349bf1f0c7..4a6109e3ffa0 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -590,9 +590,12 @@ object CheckUnused: private def isInImport(imp: tpd.Import, isAccessible: Boolean, symName: Option[Name])(using Context): Option[ImportSelector] = val tpd.Import(qual, sels) = imp val dealiasedSym = dealias(sym) - val qualHasSymbol = qual.tpe.member(sym.name).alternatives.map(_.symbol).map(dealias).contains(dealiasedSym) + val typeSelections = sels.flatMap(n => qual.tpe.member(n.name.toTypeName).alternatives) + val termSelections = sels.flatMap(n => qual.tpe.member(n.name.toTermName).alternatives) + val allSelections = typeSelections ::: termSelections :::qual.tpe.member(sym.name).alternatives + val qualHasSymbol = allSelections.map(_.symbol).map(dealias).contains(dealiasedSym) def selector = sels.find(sel => (sel.name.toTermName == sym.name || sel.name.toTypeName == sym.name) && symName.map(n => n.toTermName == sel.rename).getOrElse(true)) - def dealiasedSelector = sels.flatMap(sel => qual.tpe.member(sym.name).alternatives.map(m => (sel, m.symbol))).collect { + def dealiasedSelector = sels.flatMap(sel => allSelections.map(m => (sel, m.symbol))).collect { case (sel, sym) if dealias(sym) == dealiasedSym => sel }.headOption def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) diff --git a/libste b/libste deleted file mode 100644 index e69de29bb2d1..000000000000 diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index 9ac2ec5ef622..a76f96b3c89b 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -273,3 +273,14 @@ package foo.test.i16679b: import Foo.x case class CoolClass(i: Int) println(summon[myPackage.CaseClassName[CoolClass]]) + +package foo.test.i17156: + package a: + trait Foo[A] + + package b: + type Xd = Foo + + package c: + import b.Xd + trait Z derives Xd From 8262192141c486bfe13f75a72903b26645df6c83 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 3 Apr 2023 19:45:58 +0200 Subject: [PATCH 054/371] Fix test for wunused alias deriving --- tests/neg-custom-args/fatal-warnings/i15503i.scala | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index a76f96b3c89b..737e5f0739ca 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -277,9 +277,12 @@ package foo.test.i16679b: package foo.test.i17156: package a: trait Foo[A] + object Foo: + inline def derived[T]: Foo[T] = new Foo{} package b: - type Xd = Foo + import a.Foo + type Xd[A] = Foo[A] package c: import b.Xd From fd70247d37d44e2599e131a989913c08686571d0 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Tue, 4 Apr 2023 14:43:12 +0200 Subject: [PATCH 055/371] Fix selecting unaliased selector in wunused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 8 +++++--- 
1 file changed, 5 insertions(+), 3 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 4a6109e3ffa0..7062478305fa 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -590,12 +590,13 @@ object CheckUnused: private def isInImport(imp: tpd.Import, isAccessible: Boolean, symName: Option[Name])(using Context): Option[ImportSelector] = val tpd.Import(qual, sels) = imp val dealiasedSym = dealias(sym) + val simpleSelections = qual.tpe.member(sym.name).alternatives val typeSelections = sels.flatMap(n => qual.tpe.member(n.name.toTypeName).alternatives) val termSelections = sels.flatMap(n => qual.tpe.member(n.name.toTermName).alternatives) - val allSelections = typeSelections ::: termSelections :::qual.tpe.member(sym.name).alternatives - val qualHasSymbol = allSelections.map(_.symbol).map(dealias).contains(dealiasedSym) + val selectionsToDealias = typeSelections ::: termSelections + val qualHasSymbol = simpleSelections.map(_.symbol).contains(sym) || (simpleSelections ::: selectionsToDealias).map(_.symbol).map(dealias).contains(dealiasedSym) def selector = sels.find(sel => (sel.name.toTermName == sym.name || sel.name.toTypeName == sym.name) && symName.map(n => n.toTermName == sel.rename).getOrElse(true)) - def dealiasedSelector = sels.flatMap(sel => allSelections.map(m => (sel, m.symbol))).collect { + def dealiasedSelector = sels.flatMap(sel => selectionsToDealias.map(m => (sel, m.symbol))).collect { case (sel, sym) if dealias(sym) == dealiasedSym => sel }.headOption def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) @@ -604,6 +605,7 @@ object CheckUnused: else None + private def dealias(symbol: Symbol)(using Context): Symbol = if(symbol.isType && symbol.asType.denot.isAliasType) then symbol.asType.typeRef.dealias.typeSymbol From ec298fa4e37781aa6cf6eef860771884cd33fd46 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 10 Apr 2023 19:41:15 +0200 Subject: [PATCH 056/371] Dealias only conditionally when symbol is derived val type in wunused --- .../tools/dotc/transform/CheckUnused.scala | 27 ++++++++++--------- 1 file changed, 15 insertions(+), 12 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 7062478305fa..5a178ff2ec1f 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -24,7 +24,7 @@ import dotty.tools.dotc.core.Annotations import dotty.tools.dotc.core.Definitions import dotty.tools.dotc.core.NameKinds.WildcardParamName import dotty.tools.dotc.core.Symbols.Symbol - +import dotty.tools.dotc.core.StdNames.nme /** @@ -109,6 +109,9 @@ class CheckUnused extends MiniPhase: traverseAnnotations(tree.symbol) if !tree.symbol.is(Module) then ud.registerDef(tree) + if tree.name.mangledString.startsWith(nme.derived.mangledString + "$") + && tree.typeOpt != NoType then + ud.registerUsed(tree.typeOpt.typeSymbol, None, true) ud.addIgnoredUsage(tree.symbol) } @@ -304,7 +307,7 @@ object CheckUnused: * * See the `isAccessibleAsIdent` extension method below in the file */ - private val usedInScope = MutStack(MutSet[(Symbol,Boolean, Option[Name])]()) + private val usedInScope = MutStack(MutSet[(Symbol,Boolean, Option[Name], Boolean)]()) private val usedInPosition = MutSet[(SrcPos, Name)]() /* 
unused import collected during traversal */ private val unusedImport = MutSet[ImportSelector]() @@ -347,14 +350,14 @@ object CheckUnused: * The optional name will be used to target the right import * as the same element can be imported with different renaming */ - def registerUsed(sym: Symbol, name: Option[Name])(using Context): Unit = + def registerUsed(sym: Symbol, name: Option[Name], isDerived: Boolean = false)(using Context): Unit = if !isConstructorOfSynth(sym) && !doNotRegister(sym) then if sym.isConstructor && sym.exists then registerUsed(sym.owner, None) // constructor are "implicitly" imported with the class else - usedInScope.top += ((sym, sym.isAccessibleAsIdent, name)) - usedInScope.top += ((sym.companionModule, sym.isAccessibleAsIdent, name)) - usedInScope.top += ((sym.companionClass, sym.isAccessibleAsIdent, name)) + usedInScope.top += ((sym, sym.isAccessibleAsIdent, name, isDerived)) + usedInScope.top += ((sym.companionModule, sym.isAccessibleAsIdent, name, isDerived)) + usedInScope.top += ((sym.companionClass, sym.isAccessibleAsIdent, name, isDerived)) name.map(n => usedInPosition += ((sym.sourcePos, n))) /** Register a symbol that should be ignored */ @@ -408,15 +411,15 @@ object CheckUnused: // used symbol in this scope val used = usedInScope.pop().toSet // used imports in this scope - val imports = impInScope.pop().toSet + val imports = impInScope.pop() val kept = used.filterNot { t => - val (sym, isAccessible, optName) = t + val (sym, isAccessible, optName, isDerived) = t // keep the symbol for outer scope, if it matches **no** import // This is the first matching wildcard selector var selWildCard: Option[ImportSelector] = None val exists = imports.exists { imp => - sym.isInImport(imp, isAccessible, optName) match + sym.isInImport(imp, isAccessible, optName, isDerived) match case None => false case optSel@Some(sel) if sel.isWildcard => if selWildCard.isEmpty then selWildCard = optSel @@ -587,7 +590,7 @@ object CheckUnused: } /** Given an import and accessibility, return an option of selector that match import<->symbol */ - private def isInImport(imp: tpd.Import, isAccessible: Boolean, symName: Option[Name])(using Context): Option[ImportSelector] = + private def isInImport(imp: tpd.Import, isAccessible: Boolean, symName: Option[Name], isDerived: Boolean)(using Context): Option[ImportSelector] = val tpd.Import(qual, sels) = imp val dealiasedSym = dealias(sym) val simpleSelections = qual.tpe.member(sym.name).alternatives @@ -596,9 +599,9 @@ object CheckUnused: val selectionsToDealias = typeSelections ::: termSelections val qualHasSymbol = simpleSelections.map(_.symbol).contains(sym) || (simpleSelections ::: selectionsToDealias).map(_.symbol).map(dealias).contains(dealiasedSym) def selector = sels.find(sel => (sel.name.toTermName == sym.name || sel.name.toTypeName == sym.name) && symName.map(n => n.toTermName == sel.rename).getOrElse(true)) - def dealiasedSelector = sels.flatMap(sel => selectionsToDealias.map(m => (sel, m.symbol))).collect { + def dealiasedSelector = if(isDerived) sels.flatMap(sel => selectionsToDealias.map(m => (sel, m.symbol))).collect { case (sel, sym) if dealias(sym) == dealiasedSym => sel - }.headOption + }.headOption else None def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) if qualHasSymbol && !isAccessible && sym.exists then selector.orElse(dealiasedSelector).orElse(wildcard) // selector with name or wildcard (or given) From 87f8449af369ef933431bf2e4b5c6549847ed65f Mon Sep 17 00:00:00 2001 From: 
Szymon Rodziewicz Date: Wed, 29 Mar 2023 19:09:47 +0200 Subject: [PATCH 057/371] Fix WUnused for accessible symbols that are renamed --- .../dotty/tools/dotc/transform/CheckUnused.scala | 2 +- tests/neg-custom-args/fatal-warnings/i15503i.scala | 14 ++++++++++++++ 2 files changed, 15 insertions(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 5a178ff2ec1f..fcbb61ab90e2 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -603,7 +603,7 @@ object CheckUnused: case (sel, sym) if dealias(sym) == dealiasedSym => sel }.headOption else None def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) - if qualHasSymbol && !isAccessible && sym.exists then + if qualHasSymbol && (!isAccessible || symName.exists(_ != sym.name)) && sym.exists then selector.orElse(dealiasedSelector).orElse(wildcard) // selector with name or wildcard (or given) else None diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index 737e5f0739ca..fbdf47dae17a 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -287,3 +287,17 @@ package foo.test.i17156: package c: import b.Xd trait Z derives Xd + +package foo.test.i17117: + package example { + object test1 { + val test = "test" + } + + object test2 { + + import example.test1 as t1 + + val test = t1.test + } + } \ No newline at end of file From 0d2977e856b1ee6b7b5e5db0bac05ff05dd8fdc3 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 12 Apr 2023 14:35:53 +0200 Subject: [PATCH 058/371] Compare simple name and handle NO_NAME case in WUnused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index fcbb61ab90e2..5c5f382de1b2 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -603,7 +603,7 @@ object CheckUnused: case (sel, sym) if dealias(sym) == dealiasedSym => sel }.headOption else None def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) - if qualHasSymbol && (!isAccessible || symName.exists(_ != sym.name)) && sym.exists then + if qualHasSymbol && (!isAccessible || (sym.name != nme.NO_NAME && symName.exists(_.toSimpleName != sym.name.toSimpleName))) && sym.exists then selector.orElse(dealiasedSelector).orElse(wildcard) // selector with name or wildcard (or given) else None From c6a6656c7002ef6b3da108d2d35fdeaee4a2adfd Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 12 Apr 2023 17:00:18 +0200 Subject: [PATCH 059/371] Extracted isRenamedSymbol def --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 5c5f382de1b2..cd1a21ece440 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -603,11 +603,13 @@ object CheckUnused: case (sel, sym) if dealias(sym) == dealiasedSym => sel }.headOption else 
None def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) - if qualHasSymbol && (!isAccessible || (sym.name != nme.NO_NAME && symName.exists(_.toSimpleName != sym.name.toSimpleName))) && sym.exists then + if qualHasSymbol && (!isAccessible || isRenamedSymbol(sym, symName)) && sym.exists then selector.orElse(dealiasedSelector).orElse(wildcard) // selector with name or wildcard (or given) else None + private def isRenamedSymbol(sym: Symbol, symNameInScope: Option[Name]) = + sym.name != nme.NO_NAME && symName.exists(_.toSimpleName != sym.name.toSimpleName) private def dealias(symbol: Symbol)(using Context): Symbol = if(symbol.isType && symbol.asType.denot.isAliasType) then From 79b87a06f92e7f549ac3bab0d1a6cb5ca494e77f Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 12 Apr 2023 17:16:39 +0200 Subject: [PATCH 060/371] Fix isRenamedSymbol method in WUnused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index cd1a21ece440..04f993e4c805 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -603,13 +603,13 @@ object CheckUnused: case (sel, sym) if dealias(sym) == dealiasedSym => sel }.headOption else None def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) - if qualHasSymbol && (!isAccessible || isRenamedSymbol(sym, symName)) && sym.exists then + if qualHasSymbol && (!isAccessible || sym.isRenamedSymbol(symName)) && sym.exists then selector.orElse(dealiasedSelector).orElse(wildcard) // selector with name or wildcard (or given) else None - private def isRenamedSymbol(sym: Symbol, symNameInScope: Option[Name]) = - sym.name != nme.NO_NAME && symName.exists(_.toSimpleName != sym.name.toSimpleName) + private def isRenamedSymbol(symNameInScope: Option[Name])(using Context) = + sym.name != nme.NO_NAME && symNameInScope.exists(_.toSimpleName != sym.name.toSimpleName) private def dealias(symbol: Symbol)(using Context): Symbol = if(symbol.isType && symbol.asType.denot.isAliasType) then From 2a2a1117a28fbc937ee815d5f86eec7c2a0c9971 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Wed, 29 Mar 2023 18:08:33 +0200 Subject: [PATCH 061/371] Fix WUnused false positive in for --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 1 + tests/neg-custom-args/fatal-warnings/i15503i.scala | 2 +- 2 files changed, 2 insertions(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 04f993e4c805..c1d936cb1ca0 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -473,6 +473,7 @@ object CheckUnused: if ctx.settings.WunusedHas.explicits then explicitParamInScope .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) .filterNot(d => containsSyntheticSuffix(d.symbol)) .map(d => d.namePos -> WarnTypes.ExplicitParams).toList else diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index fbdf47dae17a..daf487a2fad0 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala 
+++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -300,4 +300,4 @@ package foo.test.i17117: val test = t1.test } - } \ No newline at end of file + } From fd7b9627855a112833967a37c1146fdda0251685 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 13:32:30 +0200 Subject: [PATCH 062/371] Do not register used symbol when position doesnt exist in wunused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 3 ++- .../fatal-warnings/i15503-scala2/scala2-t11681.scala | 2 +- tests/neg-custom-args/fatal-warnings/i15503i.scala | 4 ++-- 3 files changed, 5 insertions(+), 4 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index c1d936cb1ca0..4ee50c03ab85 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -358,7 +358,8 @@ object CheckUnused: usedInScope.top += ((sym, sym.isAccessibleAsIdent, name, isDerived)) usedInScope.top += ((sym.companionModule, sym.isAccessibleAsIdent, name, isDerived)) usedInScope.top += ((sym.companionClass, sym.isAccessibleAsIdent, name, isDerived)) - name.map(n => usedInPosition += ((sym.sourcePos, n))) + if sym.sourcePos.exists then + name.map(n => usedInPosition += ((sym.sourcePos, n))) /** Register a symbol that should be ignored */ def addIgnoredUsage(sym: Symbol)(using Context): Unit = diff --git a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala index 18aa6879eeba..912dbb456f3b 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala @@ -60,7 +60,7 @@ class Revaluing(u: Int) { def f = u } // OK case class CaseyKasem(k: Int) // OK -case class CaseyAtTheBat(k: Int)(s: String) // error +case class CaseyAtTheBat(k: Int)(s: String) // ok trait Ignorance { def f(readResolve: Int) = answer // error diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index daf487a2fad0..436ee7ca0c0c 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -90,7 +90,7 @@ package foo.test.possibleclasses: k: Int, // OK private val y: Int // OK /* Kept as it can be taken from pattern */ )( - s: Int, // error /* But not these */ + s: Int, val t: Int, // OK private val z: Int // error ) @@ -131,7 +131,7 @@ package foo.test.possibleclasses.withvar: k: Int, // OK private var y: Int // OK /* Kept as it can be taken from pattern */ )( - s: Int, // error /* But not these */ + s: Int, var t: Int, // OK private var z: Int // error ) From 6356a39d7e0075125dd4c876a0470caed3ff8223 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Tue, 4 Apr 2023 17:20:02 +0200 Subject: [PATCH 063/371] Make CheckUnused run both after Typer and Inlining --- compiler/src/dotty/tools/dotc/Compiler.scala | 3 +- .../tools/dotc/transform/CheckUnused.scala | 92 ++++++++++++------- 2 files changed, 63 insertions(+), 32 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/Compiler.scala b/compiler/src/dotty/tools/dotc/Compiler.scala index b03953afb37c..a488a2802ec2 100644 --- a/compiler/src/dotty/tools/dotc/Compiler.scala +++ b/compiler/src/dotty/tools/dotc/Compiler.scala @@ -35,7 +35,7 @@ class Compiler { protected def frontendPhases: List[List[Phase]] = List(new Parser) :: // 
Compiler frontend: scanner, parser List(new TyperPhase) :: // Compiler frontend: namer, typer - List(new CheckUnused) :: // Check for unused elements + List(CheckUnused.PostTyper) :: // Check for unused elements List(new YCheckPositions) :: // YCheck positions List(new sbt.ExtractDependencies) :: // Sends information on classes' dependencies to sbt via callbacks List(new semanticdb.ExtractSemanticDB) :: // Extract info into .semanticdb files @@ -50,6 +50,7 @@ class Compiler { List(new Pickler) :: // Generate TASTY info List(new Inlining) :: // Inline and execute macros List(new PostInlining) :: // Add mirror support for inlined code + List(CheckUnused.PostInlining) :: // Check for unused elements List(new Staging) :: // Check staging levels and heal staged types List(new Splicing) :: // Replace level 1 splices with holes List(new PickleQuotes) :: // Turn quoted trees into explicit run-time data structures diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 4ee50c03ab85..8d6aea8c97d1 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -33,22 +33,15 @@ import dotty.tools.dotc.core.StdNames.nme * Basically, it gathers definition/imports and their usage. If a * definition/imports does not have any usage, then it is reported. */ -class CheckUnused extends MiniPhase: - import CheckUnused.UnusedData - - /** - * The key used to retrieve the "unused entity" analysis metadata, - * from the compilation `Context` - */ - private val _key = Property.Key[UnusedData] +class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _key: Property.Key[CheckUnused.UnusedData]) extends MiniPhase: + import CheckUnused.* + import UnusedData.* private def unusedDataApply[U](f: UnusedData => U)(using Context): Context = ctx.property(_key).foreach(f) ctx - private def getUnusedData(using Context): Option[UnusedData] = - ctx.property(_key) - override def phaseName: String = CheckUnused.phaseName + override def phaseName: String = CheckUnused.phaseNamePrefix + suffix override def description: String = CheckUnused.description @@ -60,13 +53,21 @@ class CheckUnused extends MiniPhase: override def prepareForUnit(tree: tpd.Tree)(using Context): Context = val data = UnusedData() + tree.getAttachment(_key).foreach(oldData => + data.unusedAggregate = oldData.unusedAggregate + ) val fresh = ctx.fresh.setProperty(_key, data) + tree.putAttachment(_key, data) fresh // ========== END + REPORTING ========== override def transformUnit(tree: tpd.Tree)(using Context): tpd.Tree = - unusedDataApply(ud => reportUnused(ud.getUnused)) + unusedDataApply { ud => + aggregateUnused(ud, ud.getUnused) + if(phaseMode == PhaseMode.Report) then + ud.unusedAggregate.foreach(reportUnused) + } tree // ========== MiniPhase Prepare ========== @@ -252,31 +253,45 @@ class CheckUnused extends MiniPhase: private def traverseAnnotations(sym: Symbol)(using Context): Unit = sym.denot.annotations.foreach(annot => traverser.traverse(annot.tree)) + private def aggregateUnused(data: UnusedData, res: UnusedData.UnusedResult)(using Context): Unit = + data.unusedAggregate match { + case None => + data.unusedAggregate = Some(res) + case Some(prevUnused) => + val intersection = res.warnings.filter(sym => prevUnused.warnings.contains(sym)) + data.unusedAggregate = Some(UnusedResult(intersection)) + } + + + /** Do the actual reporting given the result of the anaylsis */ private def 
reportUnused(res: UnusedData.UnusedResult)(using Context): Unit = - import CheckUnused.WarnTypes res.warnings.foreach { s => s match - case (t, WarnTypes.Imports) => + case UnusedSymbol(t, _, WarnTypes.Imports) => report.warning(s"unused import", t) - case (t, WarnTypes.LocalDefs) => + case UnusedSymbol(t, _, WarnTypes.LocalDefs) => report.warning(s"unused local definition", t) - case (t, WarnTypes.ExplicitParams) => + case UnusedSymbol(t, _, WarnTypes.ExplicitParams) => report.warning(s"unused explicit parameter", t) - case (t, WarnTypes.ImplicitParams) => + case UnusedSymbol(t, _, WarnTypes.ImplicitParams) => report.warning(s"unused implicit parameter", t) - case (t, WarnTypes.PrivateMembers) => + case UnusedSymbol(t, _, WarnTypes.PrivateMembers) => report.warning(s"unused private member", t) - case (t, WarnTypes.PatVars) => + case UnusedSymbol(t, _, WarnTypes.PatVars) => report.warning(s"unused pattern variable", t) } end CheckUnused object CheckUnused: - val phaseName: String = "checkUnused" + val phaseNamePrefix: String = "checkUnused" val description: String = "check for unused elements" + enum PhaseMode: + case Aggregate + case Report + private enum WarnTypes: case Imports case LocalDefs @@ -285,6 +300,15 @@ object CheckUnused: case PrivateMembers case PatVars + /** + * The key used to retrieve the "unused entity" analysis metadata, + * from the compilation `Context` + */ + private val _key = Property.StickyKey[UnusedData] + + val PostTyper = new CheckUnused(PhaseMode.Aggregate, "PostTyper", _key) + val PostInlining = new CheckUnused(PhaseMode.Report, "PostInlining", _key) + /** * A stateful class gathering the infos on : * - imports @@ -292,13 +316,14 @@ object CheckUnused: * - usage */ private class UnusedData: - import dotty.tools.dotc.transform.CheckUnused.UnusedData.UnusedResult import collection.mutable.{Set => MutSet, Map => MutMap, Stack => MutStack} - import UnusedData.ScopeType + import UnusedData.* /** The current scope during the tree traversal */ var currScopeType: MutStack[ScopeType] = MutStack(ScopeType.Other) + var unusedAggregate: Option[UnusedResult] = None + /* IMPORTS */ private val impInScope = MutStack(MutSet[tpd.Import]()) /** @@ -453,12 +478,13 @@ object CheckUnused: * * The given `List` is sorted by line and then column of the position */ + def getUnused(using Context): UnusedResult = popScope() val sortedImp = if ctx.settings.WunusedHas.imports || ctx.settings.WunusedHas.strictNoImplicitWarn then - unusedImport.map(d => d.srcPos -> WarnTypes.Imports).toList + unusedImport.map(d => UnusedSymbol(d.srcPos, d.name, WarnTypes.Imports)).toList else Nil val sortedLocalDefs = @@ -467,7 +493,7 @@ object CheckUnused: .filterNot(d => d.symbol.usedDefContains) .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) .filterNot(d => containsSyntheticSuffix(d.symbol)) - .map(d => d.namePos -> WarnTypes.LocalDefs).toList + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.LocalDefs)).toList else Nil val sortedExplicitParams = @@ -476,7 +502,7 @@ object CheckUnused: .filterNot(d => d.symbol.usedDefContains) .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) .filterNot(d => containsSyntheticSuffix(d.symbol)) - .map(d => d.namePos -> WarnTypes.ExplicitParams).toList + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.ExplicitParams)).toList else Nil val sortedImplicitParams = @@ -484,7 +510,7 @@ object CheckUnused: implicitParamInScope .filterNot(d => 
d.symbol.usedDefContains) .filterNot(d => containsSyntheticSuffix(d.symbol)) - .map(d => d.namePos -> WarnTypes.ImplicitParams).toList + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.ImplicitParams)).toList else Nil val sortedPrivateDefs = @@ -492,7 +518,7 @@ object CheckUnused: privateDefInScope .filterNot(d => d.symbol.usedDefContains) .filterNot(d => containsSyntheticSuffix(d.symbol)) - .map(d => d.namePos -> WarnTypes.PrivateMembers).toList + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.PrivateMembers)).toList else Nil val sortedPatVars = @@ -501,14 +527,14 @@ object CheckUnused: .filterNot(d => d.symbol.usedDefContains) .filterNot(d => containsSyntheticSuffix(d.symbol)) .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) - .map(d => d.namePos -> WarnTypes.PatVars).toList + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.PatVars)).toList else Nil val warnings = List(sortedImp, sortedLocalDefs, sortedExplicitParams, sortedImplicitParams, sortedPrivateDefs, sortedPatVars).flatten.sortBy { s => - val pos = s._1.sourcePos + val pos = s.pos.sourcePos (pos.line, pos.column) } - UnusedResult(warnings, Nil) + UnusedResult(warnings) end getUnused //============================ HELPERS ==================================== @@ -707,7 +733,11 @@ object CheckUnused: case _:tpd.Block => Local case _ => Other + case class UnusedSymbol(pos: SrcPos, name: Name, warnType: WarnTypes) /** A container for the results of the used elements analysis */ - case class UnusedResult(warnings: List[(dotty.tools.dotc.util.SrcPos, WarnTypes)], usedImports: List[(tpd.Import, untpd.ImportSelector)]) + case class UnusedResult(warnings: List[UnusedSymbol]) + object UnusedResult: + val Empty = UnusedResult(Nil) + end CheckUnused From 774c4e9e8e4a5b74472ab07032c74a7b4880e1d5 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 14:18:44 +0200 Subject: [PATCH 064/371] Fix instantation of CheckUnused phase --- compiler/src/dotty/tools/dotc/Compiler.scala | 4 ++-- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 6 ++++-- tests/neg-custom-args/fatal-warnings/i15503a.scala | 4 ++-- 3 files changed, 8 insertions(+), 6 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/Compiler.scala b/compiler/src/dotty/tools/dotc/Compiler.scala index a488a2802ec2..15d4a39c511f 100644 --- a/compiler/src/dotty/tools/dotc/Compiler.scala +++ b/compiler/src/dotty/tools/dotc/Compiler.scala @@ -35,7 +35,7 @@ class Compiler { protected def frontendPhases: List[List[Phase]] = List(new Parser) :: // Compiler frontend: scanner, parser List(new TyperPhase) :: // Compiler frontend: namer, typer - List(CheckUnused.PostTyper) :: // Check for unused elements + List(new CheckUnused.PostTyper) :: // Check for unused elements List(new YCheckPositions) :: // YCheck positions List(new sbt.ExtractDependencies) :: // Sends information on classes' dependencies to sbt via callbacks List(new semanticdb.ExtractSemanticDB) :: // Extract info into .semanticdb files @@ -50,7 +50,7 @@ class Compiler { List(new Pickler) :: // Generate TASTY info List(new Inlining) :: // Inline and execute macros List(new PostInlining) :: // Add mirror support for inlined code - List(CheckUnused.PostInlining) :: // Check for unused elements + List(new CheckUnused.PostInlining) :: // Check for unused elements List(new Staging) :: // Check staging levels and heal staged types List(new Splicing) :: // Replace level 1 splices with holes List(new PickleQuotes) :: // Turn quoted trees into 
explicit run-time data structures diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 8d6aea8c97d1..c1037ea59e75 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -52,6 +52,7 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke // ========== SETUP ============ override def prepareForUnit(tree: tpd.Tree)(using Context): Context = + println(this) val data = UnusedData() tree.getAttachment(_key).foreach(oldData => data.unusedAggregate = oldData.unusedAggregate @@ -306,8 +307,9 @@ object CheckUnused: */ private val _key = Property.StickyKey[UnusedData] - val PostTyper = new CheckUnused(PhaseMode.Aggregate, "PostTyper", _key) - val PostInlining = new CheckUnused(PhaseMode.Report, "PostInlining", _key) + class PostTyper extends CheckUnused(PhaseMode.Aggregate, "PostTyper", _key) + + class PostInlining extends CheckUnused(PhaseMode.Report, "PostInlining", _key) /** * A stateful class gathering the infos on : diff --git a/tests/neg-custom-args/fatal-warnings/i15503a.scala b/tests/neg-custom-args/fatal-warnings/i15503a.scala index 868c488ddb84..cd7282490fc9 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503a.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503a.scala @@ -63,12 +63,12 @@ object FooTypeName: object InlineChecks: object InlineFoo: - import collection.mutable.Set // OK + import collection.mutable.Set // ok import collection.mutable.Map // error inline def getSet = Set(1) object InlinedBar: - import collection.mutable.Set // error + import collection.mutable.Set // ok import collection.mutable.Map // error val a = InlineFoo.getSet From a69b49f6d0b9c8fb76755056ef8c4bc297a7f78a Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 14:37:21 +0200 Subject: [PATCH 065/371] Remove unnecessary logging in CheckUnused phase --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index c1037ea59e75..1b879321184d 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -52,7 +52,6 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke // ========== SETUP ============ override def prepareForUnit(tree: tpd.Tree)(using Context): Context = - println(this) val data = UnusedData() tree.getAttachment(_key).foreach(oldData => data.unusedAggregate = oldData.unusedAggregate From 7966b5cd71b1d2d7eec4ad67880204062194f423 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 14:48:20 +0200 Subject: [PATCH 066/371] Add test cases for macro wunused --- .../fatal-warnings/i16876/Macro.scala | 23 +++++++++++++++++++ .../fatal-warnings/i16876/Test.scala | 11 +++++++++ 2 files changed, 34 insertions(+) create mode 100644 tests/neg-custom-args/fatal-warnings/i16876/Macro.scala create mode 100644 tests/neg-custom-args/fatal-warnings/i16876/Test.scala diff --git a/tests/neg-custom-args/fatal-warnings/i16876/Macro.scala b/tests/neg-custom-args/fatal-warnings/i16876/Macro.scala new file mode 100644 index 000000000000..2823de1f72c5 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16876/Macro.scala @@ -0,0 +1,23 @@ +import scala.quoted.* + +def findMethodSymbol(using q: Quotes)(s: 
quotes.reflect.Symbol): quotes.reflect.Symbol = + if s.isDefDef then + s + else + findMethodSymbol(using q)(s.owner) +end findMethodSymbol + + +inline def adder: Int = ${ + adderImpl +} + +def adderImpl(using q: Quotes): Expr[Int] = + import quotes.reflect.* + + val inputs = findMethodSymbol(using q)(q.reflect.Symbol.spliceOwner).tree match + case DefDef(_, params, _, _) => + params.last match + case TermParamClause(valDefs) => + valDefs.map(vd => Ref(vd.symbol).asExprOf[Int]) + inputs.reduce((exp1, exp2) => '{ $exp1 + $exp2 }) \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i16876/Test.scala b/tests/neg-custom-args/fatal-warnings/i16876/Test.scala new file mode 100644 index 000000000000..d9229d31cd6d --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16876/Test.scala @@ -0,0 +1,11 @@ +// scalac: -Wunused:all + +object Foo { + private def myMethod(a: Int, b: Int, c: Int) = adder // ok + myMethod(1, 2, 3) + + private def myMethodFailing(a: Int, b: Int, c: Int) = a + 0 // error // error + myMethodFailing(1, 2, 3) +} + + From 644fee2feb1e9f3181dccab86faea7341fc09154 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 15:47:59 +0200 Subject: [PATCH 067/371] Apply review suggestions to WUnused PR --- .../tools/dotc/transform/CheckUnused.scala | 17 +++++++++-------- 1 file changed, 9 insertions(+), 8 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 1b879321184d..c7e68079c0e8 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -25,7 +25,7 @@ import dotty.tools.dotc.core.Definitions import dotty.tools.dotc.core.NameKinds.WildcardParamName import dotty.tools.dotc.core.Symbols.Symbol import dotty.tools.dotc.core.StdNames.nme - +import scala.math.Ordering /** * A compiler phase that checks for unused imports or definitions @@ -64,7 +64,7 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke override def transformUnit(tree: tpd.Tree)(using Context): tpd.Tree = unusedDataApply { ud => - aggregateUnused(ud, ud.getUnused) + finishAggregation(ud) if(phaseMode == PhaseMode.Report) then ud.unusedAggregate.foreach(reportUnused) } @@ -253,12 +253,13 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke private def traverseAnnotations(sym: Symbol)(using Context): Unit = sym.denot.annotations.foreach(annot => traverser.traverse(annot.tree)) - private def aggregateUnused(data: UnusedData, res: UnusedData.UnusedResult)(using Context): Unit = + private def finishAggregation(data: UnusedData)(using Context): Unit = + val unusedInThisStage = data.getUnused data.unusedAggregate match { case None => - data.unusedAggregate = Some(res) + data.unusedAggregate = Some(unusedInThisStage) case Some(prevUnused) => - val intersection = res.warnings.filter(sym => prevUnused.warnings.contains(sym)) + val intersection = unusedInThisStage.warnings.intersect(prevUnused.warnings) data.unusedAggregate = Some(UnusedResult(intersection)) } @@ -266,7 +267,7 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke /** Do the actual reporting given the result of the anaylsis */ private def reportUnused(res: UnusedData.UnusedResult)(using Context): Unit = - res.warnings.foreach { s => + res.warnings.toList.sortBy(_.pos.line)(using Ordering[Int]).foreach { s => s match case UnusedSymbol(t, _, WarnTypes.Imports) => 
report.warning(s"unused import", t) @@ -321,7 +322,7 @@ object CheckUnused: import UnusedData.* /** The current scope during the tree traversal */ - var currScopeType: MutStack[ScopeType] = MutStack(ScopeType.Other) + val currScopeType: MutStack[ScopeType] = MutStack(ScopeType.Other) var unusedAggregate: Option[UnusedResult] = None @@ -736,7 +737,7 @@ object CheckUnused: case class UnusedSymbol(pos: SrcPos, name: Name, warnType: WarnTypes) /** A container for the results of the used elements analysis */ - case class UnusedResult(warnings: List[UnusedSymbol]) + case class UnusedResult(warnings: Set[UnusedSymbol]) object UnusedResult: val Empty = UnusedResult(Nil) From e369d90b300d0d468f8d9595ed4756938df4cd16 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 15:55:29 +0200 Subject: [PATCH 068/371] Move finishAggregation to UnusedData class in CheckUnused --- .../tools/dotc/transform/CheckUnused.scala | 28 +++++++++---------- 1 file changed, 14 insertions(+), 14 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index c7e68079c0e8..468481e52441 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -64,7 +64,7 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke override def transformUnit(tree: tpd.Tree)(using Context): tpd.Tree = unusedDataApply { ud => - finishAggregation(ud) + ud.finishAggregation() if(phaseMode == PhaseMode.Report) then ud.unusedAggregate.foreach(reportUnused) } @@ -253,17 +253,6 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke private def traverseAnnotations(sym: Symbol)(using Context): Unit = sym.denot.annotations.foreach(annot => traverser.traverse(annot.tree)) - private def finishAggregation(data: UnusedData)(using Context): Unit = - val unusedInThisStage = data.getUnused - data.unusedAggregate match { - case None => - data.unusedAggregate = Some(unusedInThisStage) - case Some(prevUnused) => - val intersection = unusedInThisStage.warnings.intersect(prevUnused.warnings) - data.unusedAggregate = Some(UnusedResult(intersection)) - } - - /** Do the actual reporting given the result of the anaylsis */ private def reportUnused(res: UnusedData.UnusedResult)(using Context): Unit = @@ -371,6 +360,17 @@ object CheckUnused: execInNewScope popScope() + def finishAggregation(using Context)(): Unit = + val unusedInThisStage = this.getUnused + this.unusedAggregate match { + case None => + this.unusedAggregate = Some(unusedInThisStage) + case Some(prevUnused) => + val intersection = unusedInThisStage.warnings.intersect(prevUnused.warnings) + this.unusedAggregate = Some(UnusedResult(intersection)) + } + + /** * Register a found (used) symbol along with its name * @@ -536,7 +536,7 @@ object CheckUnused: val pos = s.pos.sourcePos (pos.line, pos.column) } - UnusedResult(warnings) + UnusedResult(warnings.toSet) end getUnused //============================ HELPERS ==================================== @@ -739,7 +739,7 @@ object CheckUnused: /** A container for the results of the used elements analysis */ case class UnusedResult(warnings: Set[UnusedSymbol]) object UnusedResult: - val Empty = UnusedResult(Nil) + val Empty = UnusedResult(Set.empty) end CheckUnused From 03dba67dd6d02aac7bcaec4a634241adca140646 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Mon, 10 Apr 2023 20:19:48 +0200 Subject: [PATCH 069/371] WIP: Disable 
WUnused for params of non-private defs --- .../src/dotty/tools/dotc/transform/CheckUnused.scala | 12 +++++++++++- .../fatal-warnings/i15503-scala2/scala2-t11681.scala | 2 +- 2 files changed, 12 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 468481e52441..e76b2abe95c6 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -118,6 +118,10 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke override def prepareForDefDef(tree: tpd.DefDef)(using Context): Context = unusedDataApply{ ud => + if !tree.rawMods.is(Private) then + tree.termParamss.flatten.foreach { p => + ud.addIgnoredParam(p.symbol) + } import ud.registerTrivial tree.registerTrivial traverseAnnotations(tree.symbol) @@ -350,6 +354,8 @@ object CheckUnused: /** Trivial definitions, avoid registering params */ private val trivialDefs = MutSet[Symbol]() + private val paramsToSkip = MutSet[Symbol]() + /** * Push a new Scope of the given type, executes the given Unit and * pop it back to the original type. @@ -396,6 +402,10 @@ object CheckUnused: def removeIgnoredUsage(sym: Symbol)(using Context): Unit = doNotRegister --= sym.everySymbol + def addIgnoredParam(sym: Symbol)(using Context): Unit = + paramsToSkip += sym + + /** Register an import */ def registerImport(imp: tpd.Import)(using Context): Unit = @@ -411,7 +421,7 @@ object CheckUnused: if memDef.isValidParam then if memDef.symbol.isOneOf(GivenOrImplicit) then implicitParamInScope += memDef - else + else if !paramsToSkip.contains(memDef.symbol) then explicitParamInScope += memDef else if currScopeType.top == ScopeType.Local then localDefInScope += memDef diff --git a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala index 912dbb456f3b..65354870f743 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala @@ -9,7 +9,7 @@ trait InterFace { } trait BadAPI extends InterFace { - def f(a: Int, + private def f(a: Int, b: String, // error c: Double): Int = { println(c) From 7017b8e4b1e90741468531dbc164695abb3686a5 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 15:32:56 +0200 Subject: [PATCH 070/371] Handle implicit params and adjust tests in WUnused --- .../tools/dotc/transform/CheckUnused.scala | 5 ++-- .../i15503-scala2/scala2-t11681.scala | 22 +++++++++--------- .../fatal-warnings/i15503e.scala | 18 ++++++++------- .../fatal-warnings/i15503f.scala | 17 +++++++------- .../fatal-warnings/i15503g.scala | 23 ++++++++++--------- .../fatal-warnings/i15503h.scala | 2 +- .../fatal-warnings/i15503i.scala | 10 +++++--- 7 files changed, 52 insertions(+), 45 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index e76b2abe95c6..df916fc76a3b 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -405,8 +405,6 @@ object CheckUnused: def addIgnoredParam(sym: Symbol)(using Context): Unit = paramsToSkip += sym - - /** Register an import */ def registerImport(imp: tpd.Import)(using Context): Unit = if !tpd.languageImport(imp.expr).nonEmpty && !imp.isGeneratedByEnum && 
!isTransparentAndInline(imp) then @@ -420,7 +418,8 @@ object CheckUnused: if memDef.isValidMemberDef then if memDef.isValidParam then if memDef.symbol.isOneOf(GivenOrImplicit) then - implicitParamInScope += memDef + if !paramsToSkip.contains(memDef.symbol) then + implicitParamInScope += memDef else if !paramsToSkip.contains(memDef.symbol) then explicitParamInScope += memDef else if currScopeType.top == ScopeType.Local then diff --git a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala index 65354870f743..13d540dc2a5d 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala @@ -33,7 +33,7 @@ trait BadAPI extends InterFace { override def equals(other: Any): Boolean = true // OK - def i(implicit s: String) = answer // error + def i(implicit s: String) = answer // ok /* def future(x: Int): Int = { @@ -63,7 +63,7 @@ case class CaseyKasem(k: Int) // OK case class CaseyAtTheBat(k: Int)(s: String) // ok trait Ignorance { - def f(readResolve: Int) = answer // error + def f(readResolve: Int) = answer // ok } class Reusing(u: Int) extends Unusing(u) // OK @@ -78,28 +78,28 @@ trait Unimplementation { } trait DumbStuff { - def f(implicit dummy: DummyImplicit) = answer // todo // error - def g(dummy: DummyImplicit) = answer // error + def f(implicit dummy: DummyImplicit) = answer // ok + def g(dummy: DummyImplicit) = answer // ok } trait Proofs { - def f[A, B](implicit ev: A =:= B) = answer // todo // error - def g[A, B](implicit ev: A <:< B) = answer // todo // error - def f2[A, B](ev: A =:= B) = answer // error - def g2[A, B](ev: A <:< B) = answer // error + def f[A, B](implicit ev: A =:= B) = answer // ok + def g[A, B](implicit ev: A <:< B) = answer // ok + def f2[A, B](ev: A =:= B) = answer // ok + def g2[A, B](ev: A <:< B) = answer // ok } trait Anonymous { - def f = (i: Int) => answer // error + def f = (i: Int) => answer // ok def f1 = (_: Int) => answer // OK def f2: Int => Int = _ + 1 // OK - def g = for (i <- List(1)) yield answer // error + def g = for (i <- List(1)) yield answer // ok } trait Context[A] trait Implicits { - def f[A](implicit ctx: Context[A]) = answer // error + def f[A](implicit ctx: Context[A]) = answer // ok def g[A: Context] = answer // OK } class Bound[A: Context] // OK diff --git a/tests/neg-custom-args/fatal-warnings/i15503e.scala b/tests/neg-custom-args/fatal-warnings/i15503e.scala index 56aec702a39e..6d166aff7347 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503e.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503e.scala @@ -1,14 +1,16 @@ // scalac: -Wunused:explicits -/* This goes around the "trivial method" detection */ -val default_val = 1 +object Foo { + /* This goes around the "trivial method" detection */ + val default_val = 1 -def f1(a: Int) = a // OK -def f2(a: Int) = default_val // error -def f3(a: Int)(using Int) = a // OK -def f4(a: Int)(using Int) = default_val // error -def f6(a: Int)(using Int) = summon[Int] // error -def f7(a: Int)(using Int) = summon[Int] + a // OK + private def f1(a: Int) = a // OK + private def f2(a: Int) = default_val // error + private def f3(a: Int)(using Int) = a // OK + private def f4(a: Int)(using Int) = default_val // error + private def f6(a: Int)(using Int) = summon[Int] // error + private def f7(a: Int)(using Int) = summon[Int] + a // OK +} package scala2main.unused.args: object happyBirthday { diff --git 
a/tests/neg-custom-args/fatal-warnings/i15503f.scala b/tests/neg-custom-args/fatal-warnings/i15503f.scala index 67c595d74f40..f909272af732 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503f.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503f.scala @@ -3,11 +3,12 @@ /* This goes around the "trivial method" detection */ val default_int = 1 -def f1(a: Int) = a // OK -def f2(a: Int) = 1 // OK -def f3(a: Int)(using Int) = a // OK -def f4(a: Int)(using Int) = default_int // OK -def f6(a: Int)(using Int) = summon[Int] // OK -def f7(a: Int)(using Int) = summon[Int] + a // OK -def f8(a: Int)(using foo: Int) = a // error - +object Xd { + private def f1(a: Int) = a // OK + private def f2(a: Int) = 1 // OK + private def f3(a: Int)(using Int) = a // OK + private def f4(a: Int)(using Int) = default_int // OK + private def f6(a: Int)(using Int) = summon[Int] // OK + private def f7(a: Int)(using Int) = summon[Int] + a // OK + private def f8(a: Int)(using foo: Int) = a // error +} diff --git a/tests/neg-custom-args/fatal-warnings/i15503g.scala b/tests/neg-custom-args/fatal-warnings/i15503g.scala index 8b3fd7561a4b..2185bfed711d 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503g.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503g.scala @@ -1,18 +1,19 @@ // scalac: -Wunused:params /* This goes around the "trivial method" detection */ -val default_int = 1 +object Foo { + val default_int = 1 -def f1(a: Int) = a // OK -def f2(a: Int) = default_int // error -def f3(a: Int)(using Int) = a // OK -def f4(a: Int)(using Int) = default_int // error -def f6(a: Int)(using Int) = summon[Int] // error -def f7(a: Int)(using Int) = summon[Int] + a // OK - -/* --- Trivial method check --- */ -def g1(x: Int) = 1 // OK -def g2(x: Int) = ??? // OK + private def f1(a: Int) = a // OK + private def f2(a: Int) = default_int // error + private def f3(a: Int)(using Int) = a // OK + private def f4(a: Int)(using Int) = default_int // error + private def f6(a: Int)(using Int) = summon[Int] // error + private def f7(a: Int)(using Int) = summon[Int] + a // OK + /* --- Trivial method check --- */ + private def g1(x: Int) = 1 // OK + private def g2(x: Int) = ??? 
// OK +} package foo.test.i17101: type Test[A] = A diff --git a/tests/neg-custom-args/fatal-warnings/i15503h.scala b/tests/neg-custom-args/fatal-warnings/i15503h.scala index f8d1d6f2202f..3bab6cdbd098 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503h.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503h.scala @@ -7,7 +7,7 @@ class A { val b = 2 // OK private def c = 2 // error - def d(using x:Int): Int = b // error + def d(using x:Int): Int = b // ok def e(x: Int) = 1 // OK def f = val x = 1 // error diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index 436ee7ca0c0c..e0f708e30f87 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -17,10 +17,10 @@ class A { private def c2 = 2 // OK def c3 = c2 - def d1(using x:Int): Int = default_int // error + def d1(using x:Int): Int = default_int // ok def d2(using x:Int): Int = x // OK - def e1(x: Int) = default_int // error + def e1(x: Int) = default_int // ok def e2(x: Int) = x // OK def f = val x = 1 // error @@ -44,7 +44,11 @@ package foo.test.scala.annotation: val default_int = 12 def a1(a: Int) = a // OK - def a2(a: Int) = default_int // error + def a2(a: Int) = default_int // ok + + private def a2_p(a: Int) = default_int // error + def a2_p_used = a2_p(3) + def a3(@unused a: Int) = default_int //OK def b1 = From b4e5cb7a7b862e386b9056c0c81cb275f4fd53fb Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Thu, 13 Apr 2023 17:28:33 +0200 Subject: [PATCH 071/371] Fix tests for WUnused/disable for public defs --- .../fatal-warnings/i15503e.scala | 30 +++++++++---------- 1 file changed, 15 insertions(+), 15 deletions(-) diff --git a/tests/neg-custom-args/fatal-warnings/i15503e.scala b/tests/neg-custom-args/fatal-warnings/i15503e.scala index 6d166aff7347..57664cd08dcd 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503e.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503e.scala @@ -14,7 +14,7 @@ object Foo { package scala2main.unused.args: object happyBirthday { - def main(args: Array[String]): Unit = println("Hello World") // error + def main(args: Array[String]): Unit = println("Hello World") // ok } package scala2main: @@ -31,7 +31,7 @@ package scala3main: package foo.test.lambda.param: val default_val = 1 val a = (i: Int) => i // OK - val b = (i: Int) => default_val // error + val b = (i: Int) => default_val // OK val c = (_: Int) => default_val // OK package foo.test.trivial: @@ -39,19 +39,19 @@ package foo.test.trivial: class C { def answer: 42 = 42 object X - def g0(x: Int) = ??? // OK - def f0(x: Int) = () // OK - def f1(x: Int) = throw new RuntimeException // OK - def f2(x: Int) = 42 // OK - def f3(x: Int): Option[Int] = None // OK - def f4(x: Int) = classOf[Int] // OK - def f5(x: Int) = answer + 27 // OK - def f6(x: Int) = X // OK - def f7(x: Int) = Y // OK - def f8(x: Int): List[C] = Nil // OK - def f9(x: Int): List[Int] = List(1,2,3,4) // error - def foo:Int = 32 // OK - def f77(x: Int) = foo // error + private def g0(x: Int) = ??? 
// OK + private def f0(x: Int) = () // OK + private def f1(x: Int) = throw new RuntimeException // OK + private def f2(x: Int) = 42 // OK + private def f3(x: Int): Option[Int] = None // OK + private def f4(x: Int) = classOf[Int] // OK + private def f5(x: Int) = answer + 27 // OK + private def f6(x: Int) = X // OK + private def f7(x: Int) = Y // OK + private def f8(x: Int): List[C] = Nil // OK + private def f9(x: Int): List[Int] = List(1,2,3,4) // error + private def foo:Int = 32 // OK + private def f77(x: Int) = foo // error } object Y From 2af117f75ad95b29407b9545fdd9b345e9a48fdd Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 14 Apr 2023 18:27:58 +0200 Subject: [PATCH 072/371] Add missing test for Wunused --- tests/neg-custom-args/fatal-warnings/i15503i.scala | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index e0f708e30f87..3b653bfadb0a 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -292,6 +292,17 @@ package foo.test.i17156: import b.Xd trait Z derives Xd + +package foo.test.i17175: + val continue = true + def foo = + for { + i <- 1.until(10) // OK + if continue + } { + println(i) + } + package foo.test.i17117: package example { object test1 { From 84458c712a5757224280d5c5bbdb55841cf1c370 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Fri, 14 Apr 2023 12:58:37 +0200 Subject: [PATCH 073/371] Bring in #17263 to fix the tests. --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 2 +- tests/neg-custom-args/fatal-warnings/i15503i.scala | 3 --- 2 files changed, 1 insertion(+), 4 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index df916fc76a3b..ce05e6c125de 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -118,7 +118,7 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke override def prepareForDefDef(tree: tpd.DefDef)(using Context): Context = unusedDataApply{ ud => - if !tree.rawMods.is(Private) then + if !tree.symbol.is(Private) then tree.termParamss.flatten.foreach { p => ud.addIgnoredParam(p.symbol) } diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala index 3b653bfadb0a..fefead7f01a3 100644 --- a/tests/neg-custom-args/fatal-warnings/i15503i.scala +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -46,9 +46,6 @@ package foo.test.scala.annotation: def a1(a: Int) = a // OK def a2(a: Int) = default_int // ok - private def a2_p(a: Int) = default_int // error - def a2_p_used = a2_p(3) - def a3(@unused a: Int) = default_int //OK def b1 = From 3156fe97c70cb931d60e8299117b4c9eff093d26 Mon Sep 17 00:00:00 2001 From: Anatolii Kmetiuk Date: Fri, 17 Mar 2023 14:59:39 +0100 Subject: [PATCH 074/371] Check the status of coursier download in CoursierScalaTests.scala This should provide more insight on why #17119 happens. 
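The essence of the change is to capture the exit code of each external command together with its output and fail fast when the code is non-zero, so a failed download of the `cs` launcher is reported immediately. A minimal self-contained sketch of that pattern, using only `scala.sys.process` (the helper name `runAndCheck` is illustrative and not part of the test suite):

    import scala.collection.mutable.ListBuffer
    import scala.sys.process.*

    def runAndCheck(command: String, options: String*): List[String] =
      val out = ListBuffer.empty[String]
      // forward both stdout and stderr into `out`, keep the exit code
      val code = Process(command +: options).!(ProcessLogger(out += _, out += _))
      if code != 0 then
        sys.error(s"'$command ${options.mkString(" ")}' exited with $code:\n${out.mkString("\n")}")
      out.toList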
--- .../tools/coursier/CoursierScalaTests.scala | 18 ++++++++++++------ 1 file changed, 12 insertions(+), 6 deletions(-) diff --git a/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala b/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala index 979fea0684b2..944bf1957d43 100644 --- a/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala +++ b/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala @@ -148,11 +148,11 @@ class CoursierScalaTests: object CoursierScalaTests: - def execCmd(command: String, options: String*): List[String] = + def execCmd(command: String, options: String*): (Int, List[String]) = val cmd = (command :: options.toList).toSeq.mkString(" ") val out = new ListBuffer[String] - cmd.!(ProcessLogger(out += _, out += _)) - out.toList + val code = cmd.!(ProcessLogger(out += _, out += _)) + (code, out.toList) def csScalaCmd(options: String*): List[String] = csCmd("dotty.tools.MainGenericRunner", options*) @@ -166,10 +166,16 @@ object CoursierScalaTests: case Nil => args case _ => "--" +: args val newJOpts = jOpts.map(s => s"--java-opt ${s.stripPrefix("-J")}").mkString(" ") - execCmd("./cs", (s"""launch "org.scala-lang:scala3-compiler_3:${sys.env("DOTTY_BOOTSTRAPPED_VERSION")}" $newJOpts --main-class "$entry" --property "scala.usejavacp=true"""" +: newOptions)*) + execCmd("./cs", (s"""launch "org.scala-lang:scala3-compiler_3:${sys.env("DOTTY_BOOTSTRAPPED_VERSION")}" $newJOpts --main-class "$entry" --property "scala.usejavacp=true"""" +: newOptions)*)._2 /** Get coursier script */ @BeforeClass def setup(): Unit = - val ver = execCmd("uname").head.replace('L', 'l').replace('D', 'd') - execCmd("curl", s"-fLo cs https://git.io/coursier-cli-$ver") #&& execCmd("chmod", "+x cs") + val ver = execCmd("uname")._2.head.replace('L', 'l').replace('D', 'd') + def runAndCheckCmd(cmd: String, options: String*): Unit = + val (code, out) = execCmd(cmd, options*) + if code != 0 then + fail(s"Failed to run $cmd ${options.mkString(" ")}, exit code: $code, output: ${out.mkString("\n")}") + + runAndCheckCmd("curl", s"-fLo cs https://git.io/coursier-cli-$ver") + runAndCheckCmd("chmod", "+x cs") From 22e6ffe7529bd888275218c02b377207e9fc2c0c Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 17 Apr 2023 12:55:54 +0200 Subject: [PATCH 075/371] Add changelog for 3.3.0-RC4 --- changelogs/3.3.0-RC4.md | 35 +++++++++++++++++++++++++++++++++++ 1 file changed, 35 insertions(+) create mode 100644 changelogs/3.3.0-RC4.md diff --git a/changelogs/3.3.0-RC4.md b/changelogs/3.3.0-RC4.md new file mode 100644 index 000000000000..4c4a490237b6 --- /dev/null +++ b/changelogs/3.3.0-RC4.md @@ -0,0 +1,35 @@ +# Backported fixes + +- Fix HK quoted pattern type variables [#16907](https//github.com/lampepfl/dotty/pull/16907) +- Fix caching issue caused by incorrect isProvisional check [#16989](https://github.com/lampepfl/dotty/pull/16989) +- Fix race condition in new LazyVals [#16975](https://github.com/lampepfl/dotty/pull/16975) +- Fix "-Wunused: False positive on parameterless enum member" [#16927](https://github.com/lampepfl/dotty/pull/16927) +- Register usage of symbols in non-inferred type trees in CheckUnused [#16939](https://github.com/lampepfl/dotty/pull/16939) +- Traverse annotations instead of just registering in -W [#16956](https://github.com/lampepfl/dotty/pull/16956) +- Ignore parameter of accessors in -Wunused [#16957](https://github.com/lampepfl/dotty/pull/16957) +- Improve override detection 
in CheckUnused [#16965](https://github.com/lampepfl/dotty/pull/16965) +- WUnused: Fix unused warning in synthetic symbols [#17020](https://github.com/lampepfl/dotty/pull/17020) +- Fix WUnused with idents in derived code [#17095](https//github.com/lampepfl/dotty/pull/17095) +- WUnused: Fix for symbols with synthetic names and unused transparent inlines [#17061](https//github.com/lampepfl/dotty/pull/17061) +- Skip extension method params in WUnused [#17178](https//github.com/lampepfl/dotty/pull/17178) +- Fix wunused false positive when deriving alias type [#17157](https//github.com/lampepfl/dotty/pull/17157) +- Fix WUnused for accessible symbols that are renamed [#17177](https//github.com/lampepfl/dotty/pull/17177) +- Fix WUnused false positive in for [#17176](https//github.com/lampepfl/dotty/pull/17176) +- Make CheckUnused run both after Typer and Inlining [#17206](https//github.com/lampepfl/dotty/pull/17206) +- Disable WUnused for params of non-private defs [#17223](https//github.com/lampepfl/dotty/pull/17223) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0-RC3..3.3.0-RC4` these are: + +``` + 41 Szymon Rodziewicz + 4 Paul Coral + 3 Paweł Marks + 1 Guillaume Martres + 1 Kacper Korban + 1 Nicolas Stucki + +``` From 5990252b77128f9599d7b21c2444d3cbaa6fbb7a Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 17 Apr 2023 14:46:55 +0200 Subject: [PATCH 076/371] Release 3.3.0-RC4 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 4360add9578a..dddddf20c1ce 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.2.2" - val baseVersion = "3.3.0-RC3" + val baseVersion = "3.3.0-RC4" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. 
*/ - val previousDottyVersion = "3.3.0-RC2" + val previousDottyVersion = "3.3.0-RC3" object CompatMode { final val BinaryCompatible = 0 From 40502e03042577d10283195ff4c3c3e72c4bcfaf Mon Sep 17 00:00:00 2001 From: Matt Bovel Date: Mon, 3 Apr 2023 10:55:26 +0200 Subject: [PATCH 077/371] Drop network tests in requests community-build --- community-build/community-projects/requests-scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/community-build/community-projects/requests-scala b/community-build/community-projects/requests-scala index 23b4895710f1..8e4a40588491 160000 --- a/community-build/community-projects/requests-scala +++ b/community-build/community-projects/requests-scala @@ -1 +1 @@ -Subproject commit 23b4895710f17bf892563b28755b225c8be7f7e3 +Subproject commit 8e4a40588491608aa40099f79c881d54a5094e75 From 72e5dd28c91ca1f4421aa8c244d239bfbb8969c0 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Tue, 25 Apr 2023 14:04:49 +0200 Subject: [PATCH 078/371] Fix compiler crash in WUnused --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 3 ++- tests/neg-custom-args/fatal-warnings/i17335.scala | 4 ++++ 2 files changed, 6 insertions(+), 1 deletion(-) create mode 100644 tests/neg-custom-args/fatal-warnings/i17335.scala diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index ce05e6c125de..32afe9c8a1e7 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -47,7 +47,8 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke override def isRunnable(using Context): Boolean = ctx.settings.Wunused.value.nonEmpty && - !ctx.isJava + !ctx.isJava && + super.isRunnable // ========== SETUP ============ diff --git a/tests/neg-custom-args/fatal-warnings/i17335.scala b/tests/neg-custom-args/fatal-warnings/i17335.scala new file mode 100644 index 000000000000..6629e2f151c9 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i17335.scala @@ -0,0 +1,4 @@ +// scalac: -Wunused:all + +def aMethod() = + doStuff { (x) => x } // error From 46d9c07c0b4aca30f87371043702c624730bdf08 Mon Sep 17 00:00:00 2001 From: Szymon Rodziewicz Date: Tue, 25 Apr 2023 14:12:56 +0200 Subject: [PATCH 079/371] Change the order of checks --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index 32afe9c8a1e7..371df57045b4 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -46,9 +46,9 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke override def description: String = CheckUnused.description override def isRunnable(using Context): Boolean = + super.isRunnable && ctx.settings.Wunused.value.nonEmpty && - !ctx.isJava && - super.isRunnable + !ctx.isJava // ========== SETUP ============ From 716b8670210af194452c0009009d7b31ebd25158 Mon Sep 17 00:00:00 2001 From: Kacper Korban Date: Wed, 19 Apr 2023 15:50:29 +0200 Subject: [PATCH 080/371] Wunused: Check if symbol exists before isValidMemberDef check closes lampepfl#17309 --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 8 ++++++-- 1 file changed, 6 insertions(+), 2 deletions(-) diff --git 
a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index ce05e6c125de..b3083223c555 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -681,7 +681,7 @@ object CheckUnused: /** A function is overriden. Either has `override flags` or parent has a matching member (type and name) */ private def isOverriden(using Context): Boolean = sym.is(Flags.Override) || - (if sym.exists then sym.owner.thisType.parents.exists(p => sym.matchingMember(p).exists) else false) + (sym.exists && sym.owner.thisType.parents.exists(p => sym.matchingMember(p).exists)) end extension @@ -708,7 +708,11 @@ object CheckUnused: extension (memDef: tpd.MemberDef) private def isValidMemberDef(using Context): Boolean = - !memDef.symbol.isUnusedAnnot && !memDef.symbol.isAllOf(Flags.AccessorCreationFlags) && !memDef.name.isWildcard && !memDef.symbol.owner.is(Extension) + memDef.symbol.exists + && !memDef.symbol.isUnusedAnnot + && !memDef.symbol.isAllOf(Flags.AccessorCreationFlags) + && !memDef.name.isWildcard + && !memDef.symbol.owner.is(ExtensionMethod) private def isValidParam(using Context): Boolean = val sym = memDef.symbol From 12cd96e1de30c8f59183feabe25fe688a95a9441 Mon Sep 17 00:00:00 2001 From: Kacper Korban Date: Fri, 21 Apr 2023 01:35:46 +0200 Subject: [PATCH 081/371] Wunused: Include import selector bounds in unused checks closes lampepfl#17314 --- .../tools/dotc/transform/CheckUnused.scala | 34 ++++++++++++------- .../fatal-warnings/i17314b.scala | 14 ++++++++ tests/pos-special/fatal-warnings/i17314.scala | 33 ++++++++++++++++++ .../pos-special/fatal-warnings/i17314a.scala | 12 +++++++ 4 files changed, 81 insertions(+), 12 deletions(-) create mode 100644 tests/neg-custom-args/fatal-warnings/i17314b.scala create mode 100644 tests/pos-special/fatal-warnings/i17314.scala create mode 100644 tests/pos-special/fatal-warnings/i17314a.scala diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index ce05e6c125de..baef470c8e88 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -202,8 +202,11 @@ class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _ke override def traverse(tree: tpd.Tree)(using Context): Unit = val newCtx = if tree.symbol.exists then ctx.withOwner(tree.symbol) else ctx tree match - case imp:tpd.Import => + case imp: tpd.Import => unusedDataApply(_.registerImport(imp)) + imp.selectors.filter(_.isGiven).map(_.bound).collect { + case untpd.TypedSplice(tree1) => tree1 + }.foreach(traverse(_)(using newCtx)) traverseChildren(tree)(using newCtx) case ident: Ident => prepareForIdent(ident) @@ -449,13 +452,12 @@ object CheckUnused: val used = usedInScope.pop().toSet // used imports in this scope val imports = impInScope.pop() - val kept = used.filterNot { t => - val (sym, isAccessible, optName, isDerived) = t + val kept = used.filterNot { (sym, isAccessible, optName, isDerived) => // keep the symbol for outer scope, if it matches **no** import // This is the first matching wildcard selector var selWildCard: Option[ImportSelector] = None - val exists = imports.exists { imp => + val matchedExplicitImport = imports.exists { imp => sym.isInImport(imp, isAccessible, optName, isDerived) match case None => false case optSel@Some(sel) if sel.isWildcard => @@ -466,11 +468,11 @@ object 
CheckUnused: unusedImport -= sel true } - if !exists && selWildCard.isDefined then + if !matchedExplicitImport && selWildCard.isDefined then unusedImport -= selWildCard.get true // a matching import exists so the symbol won't be kept for outer scope else - exists + matchedExplicitImport } // if there's an outer scope @@ -610,12 +612,17 @@ object CheckUnused: * return true */ private def shouldSelectorBeReported(imp: tpd.Import, sel: ImportSelector)(using Context): Boolean = - if ctx.settings.WunusedHas.strictNoImplicitWarn then + ctx.settings.WunusedHas.strictNoImplicitWarn && ( sel.isWildcard || imp.expr.tpe.member(sel.name.toTermName).alternatives.exists(_.symbol.isOneOf(GivenOrImplicit)) || imp.expr.tpe.member(sel.name.toTypeName).alternatives.exists(_.symbol.isOneOf(GivenOrImplicit)) - else - false + ) + + extension (tree: ImportSelector) + def boundTpe: Type = tree.bound match { + case untpd.TypedSplice(tree1) => tree1.tpe + case _ => NoType + } extension (sym: Symbol) /** is accessible without import in current context */ @@ -628,7 +635,7 @@ object CheckUnused: && c.owner.thisType.member(sym.name).alternatives.contains(sym) } - /** Given an import and accessibility, return an option of selector that match import<->symbol */ + /** Given an import and accessibility, return selector that matches import<->symbol */ private def isInImport(imp: tpd.Import, isAccessible: Boolean, symName: Option[Name], isDerived: Boolean)(using Context): Option[ImportSelector] = val tpd.Import(qual, sels) = imp val dealiasedSym = dealias(sym) @@ -641,9 +648,12 @@ object CheckUnused: def dealiasedSelector = if(isDerived) sels.flatMap(sel => selectionsToDealias.map(m => (sel, m.symbol))).collect { case (sel, sym) if dealias(sym) == dealiasedSym => sel }.headOption else None - def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven) || sym.is(Implicit))) + def givenSelector = if sym.is(Given) || sym.is(Implicit) + then sels.filter(sel => sel.isGiven && !sel.bound.isEmpty).find(sel => sel.boundTpe =:= sym.info) + else None + def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven && sel.bound.isEmpty) || sym.is(Implicit))) if qualHasSymbol && (!isAccessible || sym.isRenamedSymbol(symName)) && sym.exists then - selector.orElse(dealiasedSelector).orElse(wildcard) // selector with name or wildcard (or given) + selector.orElse(dealiasedSelector).orElse(givenSelector).orElse(wildcard) // selector with name or wildcard (or given) else None diff --git a/tests/neg-custom-args/fatal-warnings/i17314b.scala b/tests/neg-custom-args/fatal-warnings/i17314b.scala new file mode 100644 index 000000000000..384767765cf4 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i17314b.scala @@ -0,0 +1,14 @@ +// scalac: -Wunused:all + +package foo: + class Foo[T] + given Foo[Int] = new Foo[Int] + + +package bar: + import foo.{given foo.Foo[Int]} // error + import foo.Foo + + given Foo[Int] = ??? 
+ + val repro: Foo[Int] = summon[Foo[Int]] diff --git a/tests/pos-special/fatal-warnings/i17314.scala b/tests/pos-special/fatal-warnings/i17314.scala new file mode 100644 index 000000000000..23f988741bed --- /dev/null +++ b/tests/pos-special/fatal-warnings/i17314.scala @@ -0,0 +1,33 @@ +// scalac: "-Wunused:all" + +import java.net.URI + +object circelike { + import scala.compiletime.summonInline + import scala.deriving.Mirror + + type Codec[T] + type Configuration + trait ConfiguredCodec[T] + object ConfiguredCodec: + inline final def derived[A](using conf: Configuration)(using + inline mirror: Mirror.Of[A] + ): ConfiguredCodec[A] = + new ConfiguredCodec[A]: + val codec = summonInline[Codec[URI]] // simplification +} + +object foo { + import circelike.{Codec, Configuration} + + given Configuration = ??? + given Codec[URI] = ??? +} + +object bar { + import circelike.Codec + import circelike.{Configuration, ConfiguredCodec} + import foo.{given Configuration, given Codec[URI]} + + case class Operator(url: URI) derives ConfiguredCodec +} diff --git a/tests/pos-special/fatal-warnings/i17314a.scala b/tests/pos-special/fatal-warnings/i17314a.scala new file mode 100644 index 000000000000..468b956fb04c --- /dev/null +++ b/tests/pos-special/fatal-warnings/i17314a.scala @@ -0,0 +1,12 @@ +// scalac: -Wunused:all + +package foo: + class Foo[T] + given Foo[Int] = new Foo[Int] + + +package bar: + import foo.{given foo.Foo[Int]} + import foo.Foo + + val repro: Foo[Int] = summon[Foo[Int]] From 74b0aa46c7e5e4fc1b2a3828cc6aa5ac0acd1f47 Mon Sep 17 00:00:00 2001 From: Michael Pilquist Date: Fri, 3 Feb 2023 20:43:05 -0500 Subject: [PATCH 082/371] Remove experimental from Mirror#fromProductTyped --- library/src/scala/deriving/Mirror.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/library/src/scala/deriving/Mirror.scala b/library/src/scala/deriving/Mirror.scala index 5de219dfe5c4..57453a516567 100644 --- a/library/src/scala/deriving/Mirror.scala +++ b/library/src/scala/deriving/Mirror.scala @@ -52,7 +52,6 @@ object Mirror { extension [T](p: ProductOf[T]) /** Create a new instance of type `T` with elements taken from product `a`. */ - @annotation.experimental def fromProductTyped[A <: scala.Product, Elems <: p.MirroredElemTypes](a: A)(using m: ProductOf[A] { type MirroredElemTypes = Elems }): T = p.fromProduct(a) From a55322d74c9fb18cb6e0562018dacee4623a9731 Mon Sep 17 00:00:00 2001 From: Michael Pilquist Date: Sat, 4 Feb 2023 08:42:54 -0500 Subject: [PATCH 083/371] Update experimental definitions list --- .../tasty-inspector/stdlibExperimentalDefinitions.scala | 4 ---- 1 file changed, 4 deletions(-) diff --git a/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala b/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala index 30e7c5af6c2a..062fa25e0ca5 100644 --- a/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala +++ b/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala @@ -58,10 +58,6 @@ val experimentalDefinitionInLibrary = Set( //// New feature: into "scala.annotation.allowConversions", - //// New APIs: Mirror - // Can be stabilized in 3.3.0 or later. - "scala.deriving.Mirror$.fromProductTyped", // This API is a bit convoluted. We may need some more feedback before we can stabilize it. 
- //// New feature: Macro annotations "scala.annotation.MacroAnnotation", From 909b56c0d4b40901053af1dad3858a882286c855 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 27 Apr 2023 10:16:47 +0200 Subject: [PATCH 084/371] Add changelog for 3.3.0-RC5 --- changelogs/3.3.0-RC5.md | 22 ++++++++++++++++++++++ 1 file changed, 22 insertions(+) create mode 100644 changelogs/3.3.0-RC5.md diff --git a/changelogs/3.3.0-RC5.md b/changelogs/3.3.0-RC5.md new file mode 100644 index 000000000000..a9cc120ae39a --- /dev/null +++ b/changelogs/3.3.0-RC5.md @@ -0,0 +1,22 @@ +# Backported fixes + +- Remove experimental from `Mirror#fromProductTyped` [#16829](https//github.com/lampepfl/dotty/pull/16829) +- Wunused: Check if symbol exists before `isValidMemberDef` check [#17316](https://github.com/lampepfl/dotty/pull/17316) +- Wunused: Include import selector bounds in unused checks [#17323](https://github.com/lampepfl/dotty/pull/17323) +- Fix compiler crash in WUnused [#17340](https://github.com/lampepfl/dotty/pull/17340) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0-RC4..3.3.0-RC5` these are: + +``` + 2 Kacper Korban + 2 Michael Pilquist + 2 Paweł Marks + 2 Szymon Rodziewicz + 1 Matt Bovel + + +``` From 597144e8805fc21e2f0a938f6701326df7aafe30 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 27 Apr 2023 10:17:23 +0200 Subject: [PATCH 085/371] Release 3.3.0-RC5 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index dddddf20c1ce..03d482a83407 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.2.2" - val baseVersion = "3.3.0-RC4" + val baseVersion = "3.3.0-RC5" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.3.0-RC3" + val previousDottyVersion = "3.3.0-RC4" object CompatMode { final val BinaryCompatible = 0 From 3b9b83d9c2f0ba31bda219756729c45b7052f45b Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Fri, 3 Feb 2023 13:55:56 +0000 Subject: [PATCH 086/371] Patmat: Use less type variables in prefix inference In code like: class Outer: sealed trait Foo case class Bar() extends Foo def mat(foo: Foo) = foo match case Bar() => When in the course of decomposing the scrutinee's type, which is `Outer.this.Foo`, we're trying to instantiate subclass `Outer.this.Bar`, the `Outer.this` is fixed - it needn't be inferred, via type variables and type bounds. Cutting down on type variables, particularly when GADT symbols are also present, can really speed up the operation, including making code that used to hang forever compile speedily. 
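Concretely, the difference is in how a prefix is chosen when instantiating a child like `Outer.this.Bar`: previously such a prefix was, in general, approximated by a fresh type variable bounded by the enclosing class, whereas now a `ThisType` that already occurs in the scrutinee is reused as-is. A toy model of just that decision (the types below are illustrative stand-ins, not the actual `dotc` API):

    sealed trait Prefix
    case class ThisOf(owner: String) extends Prefix         // e.g. `Outer.this`
    case class FreshVar(upperBound: String) extends Prefix  // e.g. `?X <: Outer`

    def prefixFor(childOwner: String, thisTypesInScrutinee: Set[String]): Prefix =
      if thisTypesInScrutinee.contains(childOwner) then
        ThisOf(childOwner)   // the prefix is already fixed by the scrutinee
      else
        FreshVar(childOwner) // fall back to inference via a type variable

The regression test added at the end of this patch (tests/pos/i16785.scala) exercises the path-dependent case that previously made the compiler hang on such a match.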
--- .../src/dotty/tools/dotc/core/TypeOps.scala | 53 +++++++++++++------ tests/patmat/i12408.check | 2 +- tests/pos/i16785.scala | 11 ++++ 3 files changed, 50 insertions(+), 16 deletions(-) create mode 100644 tests/pos/i16785.scala diff --git a/compiler/src/dotty/tools/dotc/core/TypeOps.scala b/compiler/src/dotty/tools/dotc/core/TypeOps.scala index d9da11c561e8..c91412988e82 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeOps.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeOps.scala @@ -2,7 +2,7 @@ package dotty.tools package dotc package core -import Contexts._, Types._, Symbols._, Names._, Flags._ +import Contexts._, Types._, Symbols._, Names._, NameKinds.*, Flags._ import SymDenotations._ import util.Spans._ import util.Stats @@ -839,24 +839,51 @@ object TypeOps: } } - // Prefix inference, replace `p.C.this.Child` with `X.Child` where `X <: p.C` - // Note: we need to strip ThisType in `p` recursively. + /** Gather GADT symbols and `ThisType`s found in `tp2`, ie. the scrutinee. */ + object TraverseTp2 extends TypeTraverser: + val thisTypes = util.HashSet[ThisType]() + val gadtSyms = new mutable.ListBuffer[Symbol] + + def traverse(tp: Type) = { + val tpd = tp.dealias + if tpd ne tp then traverse(tpd) + else tp match + case tp: ThisType if !tp.tref.symbol.isStaticOwner && !thisTypes.contains(tp) => + thisTypes += tp + traverseChildren(tp.tref) + case tp: TypeRef if tp.symbol.isAbstractOrParamType => + gadtSyms += tp.symbol + traverseChildren(tp) + case _ => + traverseChildren(tp) + } + TraverseTp2.traverse(tp2) + val thisTypes = TraverseTp2.thisTypes + val gadtSyms = TraverseTp2.gadtSyms.toList + + // Prefix inference, given `p.C.this.Child`: + // 1. return it as is, if `C.this` is found in `tp`, i.e. the scrutinee; or + // 2. replace it with `X.Child` where `X <: p.C`, stripping ThisType in `p` recursively. // - // See tests/patmat/i3938.scala + // See tests/patmat/i3938.scala, tests/pos/i15029.more.scala, tests/pos/i16785.scala class InferPrefixMap extends TypeMap { var prefixTVar: Type | Null = null def apply(tp: Type): Type = tp match { - case ThisType(tref: TypeRef) if !tref.symbol.isStaticOwner => + case tp @ ThisType(tref) if !tref.symbol.isStaticOwner => val symbol = tref.symbol - if (symbol.is(Module)) + if thisTypes.contains(tp) then + prefixTVar = tp // e.g. tests/pos/i16785.scala, keep Outer.this + prefixTVar.uncheckedNN + else if symbol.is(Module) then TermRef(this(tref.prefix), symbol.sourceModule) else if (prefixTVar != null) this(tref) else { prefixTVar = WildcardType // prevent recursive call from assigning it - val tvars = tref.typeParams.map { tparam => newTypeVar(tparam.paramInfo.bounds) } + // e.g. 
tests/pos/i15029.more.scala, create a TypeVar for `Instances`' B, so we can disregard `Ints` + val tvars = tref.typeParams.map { tparam => newTypeVar(tparam.paramInfo.bounds, DepParamName.fresh(tparam.paramName)) } val tref2 = this(tref.applyIfParameterized(tvars)) - prefixTVar = newTypeVar(TypeBounds.upper(tref2)) + prefixTVar = newTypeVar(TypeBounds.upper(tref2), DepParamName.fresh(tref.name)) prefixTVar.uncheckedNN } case tp => mapOver(tp) @@ -864,15 +891,11 @@ object TypeOps: } val inferThisMap = new InferPrefixMap - val tvars = tp1.typeParams.map { tparam => newTypeVar(tparam.paramInfo.bounds) } + val tvars = tp1.typeParams.map { tparam => newTypeVar(tparam.paramInfo.bounds, DepParamName.fresh(tparam.paramName)) } val protoTp1 = inferThisMap.apply(tp1).appliedTo(tvars) - val getAbstractSymbols = new TypeAccumulator[List[Symbol]]: - def apply(xs: List[Symbol], tp: Type) = tp.dealias match - case tp: TypeRef if tp.symbol.exists && !tp.symbol.isClass => foldOver(tp.symbol :: xs, tp) - case tp => foldOver(xs, tp) - val syms2 = getAbstractSymbols(Nil, tp2).reverse - if syms2.nonEmpty then ctx.gadtState.addToConstraint(syms2) + if gadtSyms.nonEmpty then + ctx.gadtState.addToConstraint(gadtSyms) // If parent contains a reference to an abstract type, then we should // refine subtype checking to eliminate abstract types according to diff --git a/tests/patmat/i12408.check b/tests/patmat/i12408.check index ada7b8c21fa8..60acc2cba84e 100644 --- a/tests/patmat/i12408.check +++ b/tests/patmat/i12408.check @@ -1,2 +1,2 @@ -13: Pattern Match Exhaustivity: X[] & (X.this : X[T]).A(_), X[] & (X.this : X[T]).C(_) +13: Pattern Match Exhaustivity: A(_), C(_) 21: Pattern Match diff --git a/tests/pos/i16785.scala b/tests/pos/i16785.scala new file mode 100644 index 000000000000..1cfabf5a4312 --- /dev/null +++ b/tests/pos/i16785.scala @@ -0,0 +1,11 @@ +class VarImpl[Lbl, A] + +class Outer[|*|[_, _], Lbl1]: + type Var[A1] = VarImpl[Lbl1, A1] + + sealed trait Foo[G] + case class Bar[T, U]() + extends Foo[Var[T] |*| Var[U]] + + def go[X](scr: Foo[Var[X]]): Unit = scr match // was: compile hang + case Bar() => () From 9c1cdc882bbba6b302663852fc36252e395449b5 Mon Sep 17 00:00:00 2001 From: Adrien Piquerez Date: Fri, 28 Apr 2023 14:40:34 +0200 Subject: [PATCH 087/371] Fix #17187: allow patches with same span --- .../dotty/tools/dotc/parsing/Parsers.scala | 9 ++-- .../dotty/tools/dotc/rewrites/Rewrites.scala | 5 +-- .../dotty/tools/dotc/CompilationTests.scala | 1 + tests/rewrites/i12340.check | 9 ++++ tests/rewrites/i12340.scala | 7 +++ tests/rewrites/i17187.check | 44 +++++++++++++++++++ tests/rewrites/i17187.scala | 33 ++++++++++++++ 7 files changed, 100 insertions(+), 8 deletions(-) create mode 100644 tests/rewrites/i17187.check create mode 100644 tests/rewrites/i17187.scala diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index ff5e95f3aa03..fbf31cc8cbbd 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -704,7 +704,11 @@ object Parsers { val t = enclosed(INDENT, body) if needsBraces(t) then patch(source, Span(startOpening, endOpening), " {") - patch(source, Span(closingOffset(source.nextLine(in.lastOffset))), indentWidth.toPrefix ++ "}\n") + val next = in.next + def closedByEndMarker = + next.token == END && (next.offset - next.lineOffset) == indentWidth.toPrefix.size + if closedByEndMarker then patch(source, Span(next.offset), "} // ") + else patch(source, 
Span(closingOffset(source.nextLine(in.lastOffset))), indentWidth.toPrefix ++ "}\n") t end indentedToBraces @@ -1411,9 +1415,6 @@ object Parsers { val start = in.skipToken() if stats.isEmpty || !matchesAndSetEnd(stats.last) then syntaxError(em"misaligned end marker", Span(start, in.lastCharOffset)) - else if overlapsPatch(source, Span(start, start)) then - patch(source, Span(start, start), "") - patch(source, Span(start, in.lastCharOffset), s"} // end $endName") in.token = IDENTIFIER // Leaving it as the original token can confuse newline insertion in.nextToken() end checkEndMarker diff --git a/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala b/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala index 96e88e5c68ae..f2dfac88d464 100644 --- a/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala +++ b/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala @@ -23,10 +23,7 @@ object Rewrites { private[Rewrites] val pbuf = new mutable.ListBuffer[Patch]() def addPatch(span: Span, replacement: String): Unit = - pbuf.indexWhere(p => p.span.start == span.start && p.span.end == span.end) match { - case i if i >= 0 => pbuf.update(i, Patch(span, replacement)) - case _ => pbuf += Patch(span, replacement) - } + pbuf += Patch(span, replacement) def apply(cs: Array[Char]): Array[Char] = { val delta = pbuf.map(_.delta).sum diff --git a/compiler/test/dotty/tools/dotc/CompilationTests.scala b/compiler/test/dotty/tools/dotc/CompilationTests.scala index b8b38cce92e4..191bfa0a1e93 100644 --- a/compiler/test/dotty/tools/dotc/CompilationTests.scala +++ b/compiler/test/dotty/tools/dotc/CompilationTests.scala @@ -83,6 +83,7 @@ class CompilationTests { compileFile("tests/rewrites/i9632.scala", defaultOptions.and("-indent", "-rewrite")), compileFile("tests/rewrites/i11895.scala", defaultOptions.and("-indent", "-rewrite")), compileFile("tests/rewrites/i12340.scala", unindentOptions.and("-rewrite")), + compileFile("tests/rewrites/i17187.scala", unindentOptions.and("-rewrite")), ).checkRewrites() } diff --git a/tests/rewrites/i12340.check b/tests/rewrites/i12340.check index c6cb9af8bb57..3ee2867f3e62 100644 --- a/tests/rewrites/i12340.check +++ b/tests/rewrites/i12340.check @@ -3,6 +3,15 @@ class C { def f = 42 } // end C +class A { + class B { + class C { + def foo = 42 + } + + } // end B +} + def f(i: Int) = { if i < 42 then println(i) diff --git a/tests/rewrites/i12340.scala b/tests/rewrites/i12340.scala index bf907ef9f276..10fcdd256f5b 100644 --- a/tests/rewrites/i12340.scala +++ b/tests/rewrites/i12340.scala @@ -3,6 +3,13 @@ class C: def f = 42 end C +class A: + class B: + class C: + def foo = 42 + + end B + def f(i: Int) = if i < 42 then println(i) diff --git a/tests/rewrites/i17187.check b/tests/rewrites/i17187.check new file mode 100644 index 000000000000..1e6a07c79738 --- /dev/null +++ b/tests/rewrites/i17187.check @@ -0,0 +1,44 @@ + +object A { + object B { + def a = 2 + } +} + +def m1 = { + def b = { + def c = 2 + } +} + +def m2 = + if true then { + val x = 3 + if (false) + x + else { + val y = 4 + y + } + } + +def m3 = + try { + val n2 = 21 + val n1 = 4 + n2 / n1 + } + catch { + case _ => 4 + } + +def m4 = { + val n2 = 21 + try { + val n1 = 4 + n2 / n1 + } + catch { + case _ => 4 + } +} diff --git a/tests/rewrites/i17187.scala b/tests/rewrites/i17187.scala new file mode 100644 index 000000000000..e6a55e00b39a --- /dev/null +++ b/tests/rewrites/i17187.scala @@ -0,0 +1,33 @@ + +object A: + object B: + def a = 2 + +def m1 = + def b = + def c = 2 + +def m2 = + if true then + val x = 3 + if (false) + x + 
else + val y = 4 + y + +def m3 = + try + val n2 = 21 + val n1 = 4 + n2 / n1 + catch + case _ => 4 + +def m4 = + val n2 = 21 + try + val n1 = 4 + n2 / n1 + catch + case _ => 4 From c1028a225f6a1e40c8eb6c94b8fa530fee6b8ae5 Mon Sep 17 00:00:00 2001 From: Adrien Piquerez Date: Mon, 1 May 2023 09:43:44 +0200 Subject: [PATCH 088/371] Revert exact match in overlaps As suggested by @som-snytt in https://github.com/lampepfl/dotty/pull/17366#pullrequestreview-1406509043 --- compiler/src/dotty/tools/dotc/util/Spans.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/util/Spans.scala b/compiler/src/dotty/tools/dotc/util/Spans.scala index ba537e9aec01..e1487408f36b 100644 --- a/compiler/src/dotty/tools/dotc/util/Spans.scala +++ b/compiler/src/dotty/tools/dotc/util/Spans.scala @@ -86,7 +86,6 @@ object Spans { || containsInner(this, that.end) || containsInner(that, this.start) || containsInner(that, this.end) - || this.start == that.start && this.end == that.end // exact match in one point ) } From 9a1e7cb5b23c807dfe5918a818cd46ac277ae8dd Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Micha=C5=82=20Pa=C5=82ka?= Date: Wed, 10 May 2023 13:41:35 +0200 Subject: [PATCH 089/371] Raise a warning instead of an error for a type ascription on a pattern other than a variable or a number literal. This partially reverts the changes from https://github.com/lampepfl/dotty/pull/16150. This change is motivated by not breaking source compatibility for a number of projects in the Open Community Build. --- .../dotty/tools/dotc/parsing/Parsers.scala | 10 +-- docs/_docs/internals/syntax.md | 5 +- docs/_docs/reference/syntax.md | 5 +- .../fatal-warnings}/i10994.scala | 0 .../fatal-warnings}/i15893.scala | 6 +- tests/neg/t5702-neg-bad-and-wild.check | 10 ++- tests/pending/run/i15893.scala | 6 +- tests/pos/i10994.scala | 2 + tests/pos/i15893.scala | 61 +++++++++++++++++++ 9 files changed, 83 insertions(+), 22 deletions(-) rename tests/{neg => neg-custom-args/fatal-warnings}/i10994.scala (100%) rename tests/{neg => neg-custom-args/fatal-warnings}/i15893.scala (89%) create mode 100644 tests/pos/i10994.scala create mode 100644 tests/pos/i15893.scala diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index ff5e95f3aa03..c42b302912a9 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -2821,14 +2821,14 @@ object Parsers { if (isIdent(nme.raw.BAR)) { in.nextToken(); pattern1(location) :: patternAlts(location) } else Nil - /** Pattern1 ::= PatVar Ascription - * | [‘-’] integerLiteral Ascription - * | [‘-’] floatingPointLiteral Ascription - * | Pattern2 + /** Pattern1 ::= Pattern2 [Ascription] */ def pattern1(location: Location = Location.InPattern): Tree = val p = pattern2() - if (isVarPattern(p) || p.isInstanceOf[Number]) && in.isColon then + if in.isColon then + val isVariableOrNumber = isVarPattern(p) || p.isInstanceOf[Number] + if !isVariableOrNumber then + warning(em"Only variable and number literal patterns can have type ascriptions") in.nextToken() ascription(p, location) else p diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index 76664569bb17..cf6a6d053566 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -319,10 +319,7 @@ TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] Pattern ::= Pattern1 { ‘|’ Pattern1 } 
Alternative(pats) -Pattern1 ::= PatVar ‘:’ RefinedType Bind(name, Typed(Ident(wildcard), tpe)) - | [‘-’] integerLiteral ‘:’ RefinedType Typed(pat, tpe) - | [‘-’] floatingPointLiteral ‘:’ RefinedType Typed(pat, tpe) - | Pattern2 +Pattern1 ::= Pattern2 [‘:’ RefinedType] Bind(name, Typed(Ident(wildcard), tpe)) Pattern2 ::= [id ‘@’] InfixPattern [‘*’] Bind(name, pat) InfixPattern ::= SimplePattern { id [nl] SimplePattern } InfixOp(pat, op, pat) SimplePattern ::= PatVar Ident(wildcard) diff --git a/docs/_docs/reference/syntax.md b/docs/_docs/reference/syntax.md index 7e4b81b1ef5a..8d110f685f9d 100644 --- a/docs/_docs/reference/syntax.md +++ b/docs/_docs/reference/syntax.md @@ -312,10 +312,7 @@ TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] Pattern ::= Pattern1 { ‘|’ Pattern1 } -Pattern1 ::= PatVar ‘:’ RefinedType - | [‘-’] integerLiteral ‘:’ RefinedType - | [‘-’] floatingPointLiteral ‘:’ RefinedType - | Pattern2 +Pattern1 ::= Pattern2 [‘:’ RefinedType] Pattern2 ::= [id ‘@’] InfixPattern [‘*’] InfixPattern ::= SimplePattern { id [nl] SimplePattern } SimplePattern ::= PatVar diff --git a/tests/neg/i10994.scala b/tests/neg-custom-args/fatal-warnings/i10994.scala similarity index 100% rename from tests/neg/i10994.scala rename to tests/neg-custom-args/fatal-warnings/i10994.scala diff --git a/tests/neg/i15893.scala b/tests/neg-custom-args/fatal-warnings/i15893.scala similarity index 89% rename from tests/neg/i15893.scala rename to tests/neg-custom-args/fatal-warnings/i15893.scala index 997c51179099..f23e6150106a 100644 --- a/tests/neg/i15893.scala +++ b/tests/neg-custom-args/fatal-warnings/i15893.scala @@ -22,7 +22,7 @@ transparent inline def transparentInlineMod2(inline n: NatT): NatT = inline n m case Succ(Zero()) => Succ(Zero()) case Succ(Succ(predPredN)) => transparentInlineMod2(predPredN) -def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match // exhaustivity warning; unexpected +def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match case Zero(): Zero => Zero() // error case Succ(Zero()): Succ[Zero] => Succ(Zero()) // error case Succ(Succ(predPredN)): Succ[Succ[_]] => dependentlyTypedMod2(predPredN) // error @@ -57,5 +57,5 @@ inline def transparentInlineFoo(inline n: NatT): NatT = inline transparentInline println(transparentInlineMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected println(transparentInlineFoo(Succ(Succ(Succ(Zero()))))) // prints Zero(), as expected println(dependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // runtime error; unexpected -// println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // doesn't compile; unexpected -// println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // doesn't compile; unexpected + println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected diff --git a/tests/neg/t5702-neg-bad-and-wild.check b/tests/neg/t5702-neg-bad-and-wild.check index 731195411069..36ac71b2e1e7 100644 --- a/tests/neg/t5702-neg-bad-and-wild.check +++ b/tests/neg/t5702-neg-bad-and-wild.check @@ -10,10 +10,10 @@ | pattern expected | | longer explanation available when compiling with `-explain` --- [E040] Syntax Error: tests/neg/t5702-neg-bad-and-wild.scala:13:22 --------------------------------------------------- +-- [E040] Syntax Error: tests/neg/t5702-neg-bad-and-wild.scala:13:23 
--------------------------------------------------- 13 | case List(1, _*3:) => // error // error - | ^ - | ')' expected, but ':' found + | ^ + | an identifier expected, but ')' found -- [E032] Syntax Error: tests/neg/t5702-neg-bad-and-wild.scala:15:18 --------------------------------------------------- 15 | case List(x*, 1) => // error: pattern expected | ^ @@ -56,6 +56,10 @@ | Recursive value $1$ needs type | | longer explanation available when compiling with `-explain` +-- Warning: tests/neg/t5702-neg-bad-and-wild.scala:13:22 --------------------------------------------------------------- +13 | case List(1, _*3:) => // error // error + | ^ + | Only variable and number literal patterns can have type ascriptions -- Warning: tests/neg/t5702-neg-bad-and-wild.scala:22:20 --------------------------------------------------------------- 22 | val K(x @ _*) = k | ^ diff --git a/tests/pending/run/i15893.scala b/tests/pending/run/i15893.scala index dedec2138f2a..d9cd2822e971 100644 --- a/tests/pending/run/i15893.scala +++ b/tests/pending/run/i15893.scala @@ -24,7 +24,7 @@ transparent inline def transparentInlineMod2(inline n: NatT): NatT = inline n m case Succ(Zero()) => Succ(Zero()) case Succ(Succ(predPredN)) => transparentInlineMod2(predPredN) */ -def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match // exhaustivity warning; unexpected +def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match case Zero(): Zero => Zero() case Succ(Zero()): Succ[Zero] => Succ(Zero()) case Succ(Succ(predPredN)): Succ[Succ[_]] => dependentlyTypedMod2(predPredN) @@ -61,5 +61,5 @@ inline def transparentInlineFoo(inline n: NatT): NatT = inline transparentInline println(transparentInlineFoo(Succ(Succ(Succ(Zero()))))) // prints Zero(), as expected */ println(dependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // runtime error; unexpected -// println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // doesn't compile; unexpected -// println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // doesn't compile; unexpected +// println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected +// println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected diff --git a/tests/pos/i10994.scala b/tests/pos/i10994.scala new file mode 100644 index 000000000000..99ae647466b1 --- /dev/null +++ b/tests/pos/i10994.scala @@ -0,0 +1,2 @@ +def foo = true match + case (b: Boolean): Boolean => () diff --git a/tests/pos/i15893.scala b/tests/pos/i15893.scala new file mode 100644 index 000000000000..af6e7ae38ad2 --- /dev/null +++ b/tests/pos/i15893.scala @@ -0,0 +1,61 @@ +sealed trait NatT +case class Zero() extends NatT +case class Succ[+N <: NatT](n: N) extends NatT + +type Mod2[N <: NatT] <: NatT = N match + case Zero => Zero + case Succ[Zero] => Succ[Zero] + case Succ[Succ[predPredN]] => Mod2[predPredN] + +def mod2(n: NatT): NatT = n match + case Zero() => Zero() + case Succ(Zero()) => Succ(Zero()) + case Succ(Succ(predPredN)) => mod2(predPredN) + +inline def inlineMod2(inline n: NatT): NatT = inline n match + case Zero() => Zero() + case Succ(Zero()) => Succ(Zero()) + case Succ(Succ(predPredN)) => inlineMod2(predPredN) + +transparent inline def transparentInlineMod2(inline n: NatT): NatT = inline n match + case Zero() => Zero() + case Succ(Zero()) => Succ(Zero()) + case Succ(Succ(predPredN)) => transparentInlineMod2(predPredN) + +def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match + case Zero(): Zero => Zero() // warning + 
case Succ(Zero()): Succ[Zero] => Succ(Zero()) // warning + case Succ(Succ(predPredN)): Succ[Succ[_]] => dependentlyTypedMod2(predPredN) // warning + +inline def inlineDependentlyTypedMod2[N <: NatT](inline n: N): Mod2[N] = inline n match + case Zero(): Zero => Zero() // warning + case Succ(Zero()): Succ[Zero] => Succ(Zero()) // warning + case Succ(Succ(predPredN)): Succ[Succ[_]] => inlineDependentlyTypedMod2(predPredN) // warning + +transparent inline def transparentInlineDependentlyTypedMod2[N <: NatT](inline n: N): Mod2[N] = inline n match + case Zero(): Zero => Zero() // warning + case Succ(Zero()): Succ[Zero] => Succ(Zero()) // warning + case Succ(Succ(predPredN)): Succ[Succ[_]] => transparentInlineDependentlyTypedMod2(predPredN) // warning + +def foo(n: NatT): NatT = mod2(n) match + case Succ(Zero()) => Zero() + case _ => n + +inline def inlineFoo(inline n: NatT): NatT = inline inlineMod2(n) match + case Succ(Zero()) => Zero() + case _ => n + +inline def transparentInlineFoo(inline n: NatT): NatT = inline transparentInlineMod2(n) match + case Succ(Zero()) => Zero() + case _ => n + +@main def main(): Unit = + println(mod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(foo(Succ(Succ(Succ(Zero()))))) // prints Zero(), as expected + println(inlineMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(inlineFoo(Succ(Succ(Succ(Zero()))))) // prints Succ(Succ(Succ(Zero()))); unexpected + println(transparentInlineMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(transparentInlineFoo(Succ(Succ(Succ(Zero()))))) // prints Zero(), as expected + println(dependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // runtime error; unexpected + println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected From 16d68f6238ef668b53c5e762622ae87f0d1cd002 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Micha=C5=82=20Pa=C5=82ka?= Date: Thu, 11 May 2023 13:26:45 +0200 Subject: [PATCH 090/371] * Preserve the more restrictive syntax for typed patterns in the language specification * Make the parser's warning a migration warning --- .../src/dotty/tools/dotc/parsing/Parsers.scala | 15 +++++++++++++-- .../test/dotty/tools/dotc/CompilationTests.scala | 1 + docs/_docs/internals/syntax.md | 5 ++++- docs/_docs/reference/syntax.md | 5 ++++- tests/neg-custom-args/i10994.check | 7 +++++++ tests/neg-custom-args/i10994.scala | 2 ++ tests/neg/t5702-neg-bad-and-wild.check | 5 ++++- tests/pos/i10994.scala | 2 +- 8 files changed, 36 insertions(+), 6 deletions(-) create mode 100644 tests/neg-custom-args/i10994.check create mode 100644 tests/neg-custom-args/i10994.scala diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index c42b302912a9..605afc07cc72 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -2821,14 +2821,25 @@ object Parsers { if (isIdent(nme.raw.BAR)) { in.nextToken(); pattern1(location) :: patternAlts(location) } else Nil - /** Pattern1 ::= Pattern2 [Ascription] + /** Pattern1 ::= PatVar Ascription + * | [‘-’] integerLiteral Ascription + * | [‘-’] floatingPointLiteral Ascription + * | Pattern2 */ def pattern1(location: Location = Location.InPattern): Tree = val p = pattern2() if in.isColon then val isVariableOrNumber = isVarPattern(p) || p.isInstanceOf[Number] if 
!isVariableOrNumber then - warning(em"Only variable and number literal patterns can have type ascriptions") + report.gradualErrorOrMigrationWarning( + em"""Type ascriptions after patterns other than: + | * variable pattern, e.g. `case x: String =>` + | * number literal pattern, e.g. `case 10.5: Double =>` + |are no longer supported. Remove the type ascription or move it to a separate variable pattern.""", + in.sourcePos(), + warnFrom = `3.3`, + errorFrom = future + ) in.nextToken() ascription(p, location) else p diff --git a/compiler/test/dotty/tools/dotc/CompilationTests.scala b/compiler/test/dotty/tools/dotc/CompilationTests.scala index b8b38cce92e4..fdbd9216f1b7 100644 --- a/compiler/test/dotty/tools/dotc/CompilationTests.scala +++ b/compiler/test/dotty/tools/dotc/CompilationTests.scala @@ -188,6 +188,7 @@ class CompilationTests { compileFile("tests/neg-custom-args/i13026.scala", defaultOptions.and("-print-lines")), compileFile("tests/neg-custom-args/i13838.scala", defaultOptions.and("-Ximplicit-search-limit", "1000")), compileFile("tests/neg-custom-args/jdk-9-app.scala", defaultOptions.and("-release:8")), + compileFile("tests/neg-custom-args/i10994.scala", defaultOptions.and("-source", "future")), ).checkExpectedErrors() } diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index cf6a6d053566..8e7de0efe19e 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -319,7 +319,10 @@ TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] Pattern ::= Pattern1 { ‘|’ Pattern1 } Alternative(pats) -Pattern1 ::= Pattern2 [‘:’ RefinedType] Bind(name, Typed(Ident(wildcard), tpe)) +Pattern1 ::= PatVar ‘:’ RefinedType Bind(name, Typed(Ident(wildcard), tpe)) + | [‘-’] integerLiteral ‘:’ RefinedType Typed(pat, tpe) + | [‘-’] floatingPointLiteral ‘:’ RefinedType Typed(pat, tpe) + | Pattern2 Pattern2 ::= [id ‘@’] InfixPattern [‘*’] Bind(name, pat) InfixPattern ::= SimplePattern { id [nl] SimplePattern } InfixOp(pat, op, pat) SimplePattern ::= PatVar Ident(wildcard) diff --git a/docs/_docs/reference/syntax.md b/docs/_docs/reference/syntax.md index 8d110f685f9d..bc709fb1f870 100644 --- a/docs/_docs/reference/syntax.md +++ b/docs/_docs/reference/syntax.md @@ -312,7 +312,10 @@ TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] Pattern ::= Pattern1 { ‘|’ Pattern1 } -Pattern1 ::= Pattern2 [‘:’ RefinedType] +Pattern1 ::= PatVar ‘:’ RefinedType + | [‘-’] integerLiteral ‘:’ RefinedType + | [‘-’] floatingPointLiteral ‘:’ RefinedType + | Pattern2 Pattern2 ::= [id ‘@’] InfixPattern [‘*’] InfixPattern ::= SimplePattern { id [nl] SimplePattern } SimplePattern ::= PatVar diff --git a/tests/neg-custom-args/i10994.check b/tests/neg-custom-args/i10994.check new file mode 100644 index 000000000000..c540a04657c3 --- /dev/null +++ b/tests/neg-custom-args/i10994.check @@ -0,0 +1,7 @@ +-- Error: tests/neg-custom-args/i10994.scala:2:19 ---------------------------------------------------------------------- +2 | case (b: Boolean): Boolean => () // error + | ^ + | Type ascriptions after patterns other than: + | * variable pattern, e.g. `case x: String =>` + | * number literal pattern, e.g. `case 10.5: Double =>` + | are no longer supported. Remove the type ascription or move it to a separate variable pattern. 
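For context, a minimal sketch of the pattern shapes this diagnostic distinguishes, assuming default `-source 3.3` semantics where the ascribed non-variable case only warns (the `Demo` object and its cases are illustrative, not taken from the patch):

```
object Demo:
  def show(x: Any): Unit = x match
    case s: String             => ()  // variable pattern with ascription: accepted
    case 10.5: Double          => ()  // number literal with ascription: accepted
    case (b: Boolean): Boolean => ()  // any other ascribed pattern: migration warning in 3.3,
                                      // an error under -source future (as in the check file above)
    case _                     => ()
```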
diff --git a/tests/neg-custom-args/i10994.scala b/tests/neg-custom-args/i10994.scala new file mode 100644 index 000000000000..65695ccf4352 --- /dev/null +++ b/tests/neg-custom-args/i10994.scala @@ -0,0 +1,2 @@ +def foo = true match + case (b: Boolean): Boolean => () // error diff --git a/tests/neg/t5702-neg-bad-and-wild.check b/tests/neg/t5702-neg-bad-and-wild.check index 36ac71b2e1e7..c461b76ea70b 100644 --- a/tests/neg/t5702-neg-bad-and-wild.check +++ b/tests/neg/t5702-neg-bad-and-wild.check @@ -59,7 +59,10 @@ -- Warning: tests/neg/t5702-neg-bad-and-wild.scala:13:22 --------------------------------------------------------------- 13 | case List(1, _*3:) => // error // error | ^ - | Only variable and number literal patterns can have type ascriptions + | Type ascriptions after patterns other than: + | * variable pattern, e.g. `case x: String =>` + | * number literal pattern, e.g. `case 10.5: Double =>` + | are no longer supported. Remove the type ascription or move it to a separate variable pattern. -- Warning: tests/neg/t5702-neg-bad-and-wild.scala:22:20 --------------------------------------------------------------- 22 | val K(x @ _*) = k | ^ diff --git a/tests/pos/i10994.scala b/tests/pos/i10994.scala index 99ae647466b1..b7b6a3661649 100644 --- a/tests/pos/i10994.scala +++ b/tests/pos/i10994.scala @@ -1,2 +1,2 @@ def foo = true match - case (b: Boolean): Boolean => () + case (b: Boolean): Boolean => () // warning From 58256dd0fd1aa78ced4b1587275f7150d670d71e Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 12 May 2023 13:29:49 +0200 Subject: [PATCH 091/371] Add changelog for 3.3.0-RC6 --- changelogs/3.3.0-RC6.md | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) create mode 100644 changelogs/3.3.0-RC6.md diff --git a/changelogs/3.3.0-RC6.md b/changelogs/3.3.0-RC6.md new file mode 100644 index 000000000000..ab98f0055974 --- /dev/null +++ b/changelogs/3.3.0-RC6.md @@ -0,0 +1,18 @@ +# Backported fixes + +- Patmat: Use less type variables in prefix inference [#16827](https//github.com/lampepfl/dotty/pull/16827) +- Just warn on type ascription on a pattern [#17454](https://github.com/lampepfl/dotty/pull/17454) +- Fix #17187: allow patches with same span [#17366](https://github.com/lampepfl/dotty/pull/17366) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0-RC5..3.3.0-RC6` these are: + +``` + 2 Adrien Piquerez + 2 Michał Pałka + 2 Paweł Marks + 1 Dale Wijnand +``` From 9bae88a8eac9944e73edc9e7f0155e0bd5b381ee Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 12 May 2023 13:26:29 +0200 Subject: [PATCH 092/371] Release 3.3.0-RC6 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 03d482a83407..9109a925a450 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.2.2" - val baseVersion = "3.3.0-RC5" + val baseVersion = "3.3.0-RC6" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. 
*/ - val previousDottyVersion = "3.3.0-RC4" + val previousDottyVersion = "3.3.0-RC5" object CompatMode { final val BinaryCompatible = 0 From 410e5df4ec58444ee5c63ac1ea4f5ddd8eb8d15f Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 23 May 2023 12:43:54 +0200 Subject: [PATCH 093/371] Set TASTy Version to 28.3.0 --- tasty/src/dotty/tools/tasty/TastyFormat.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index 2d18923e1b0c..ac0357068c55 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -305,7 +305,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. * From 92152f4225890c93cb6c2660dadfff9519d1f1e8 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 23 May 2023 13:02:20 +0200 Subject: [PATCH 094/371] Add changelog for 3.3.0 --- changelogs/3.3.0.md | 268 ++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 268 insertions(+) create mode 100644 changelogs/3.3.0.md diff --git a/changelogs/3.3.0.md b/changelogs/3.3.0.md new file mode 100644 index 000000000000..e3cc3703fadd --- /dev/null +++ b/changelogs/3.3.0.md @@ -0,0 +1,268 @@ +# Highlights of the release + +- Stabilize new lazy vals [#16614](https://github.com/lampepfl/dotty/pull/16614) +- Experimental Macro annotations [#16392](https://github.com/lampepfl/dotty/pull/16392) [#16454](https://github.com/lampepfl/dotty/pull/16454) [#16534](https://github.com/lampepfl/dotty/pull/16534) +- Fix stability check for inline parameters [#15511](https://github.com/lampepfl/dotty/pull/15511) +- Make `fewerBraces` a standard feature [#16297](https://github.com/lampepfl/dotty/pull/16297) +- Add new front-end phase for unused entities and add support for unused imports [#16157](https://github.com/lampepfl/dotty/pull/16157) +- Implement -Wvalue-discard warning [#15975](https://github.com/lampepfl/dotty/pull/15975) +- Introduce boundary/break control abstraction. 
[#16612](https://github.com/lampepfl/dotty/pull/16612) + +# Other changes and fixes + +## Annotations + +- Support use-site meta-annotations [#16445](https://github.com/lampepfl/dotty/pull/16445) + +## Desugaring + +- Reuse typed prefix for `applyDynamic` and `applyDynamicNamed` [#16552](https://github.com/lampepfl/dotty/pull/16552) +- Fix object selftype match error [#16441](https://github.com/lampepfl/dotty/pull/16441) + +## Erasure + +- Dealias before checking for outer references in types [#16525](https://github.com/lampepfl/dotty/pull/16525) +- Fix generic signature for type params bounded by primitive [#16442](https://github.com/lampepfl/dotty/pull/16442) +- Avoid EmptyScope.cloneScope crashing, eg on missing references [#16314](https://github.com/lampepfl/dotty/pull/16314) + +## GADTs + +- Inline GADT state restoring in TypeComparer [#16564](https://github.com/lampepfl/dotty/pull/16564) +- Add extension/conversion to GADT selection healing [#16638](https://github.com/lampepfl/dotty/pull/16638) +- Split out immutable GadtConstraint [#16602](https://github.com/lampepfl/dotty/pull/16602) +- Avoid bidirectional GADT typebounds from fullBounds [#15683](https://github.com/lampepfl/dotty/pull/15683) + +## Incremental compilation + +- Unpickle arguments of parent constructors in Templates lazily [#16688](https://github.com/lampepfl/dotty/pull/16688) + +## Initialization + +- Fix #16438: Supply dummy args for erroneous parent call in init check [#16448](https://github.com/lampepfl/dotty/pull/16448) + +## Inline + +- Dealias in ConstantValue, for inline if cond [#16652](https://github.com/lampepfl/dotty/pull/16652) +- Set Span for top level annotations generated in PostTyper [#16378](https://github.com/lampepfl/dotty/pull/16378) +- Interpolate any type vars from comparing against SelectionProto [#16348](https://github.com/lampepfl/dotty/pull/16348) +- Handle binding of beta reduced inlined lambdas [#16377](https://github.com/lampepfl/dotty/pull/16377) +- Do not add dummy RHS to abstract inline methods [#16510](https://github.com/lampepfl/dotty/pull/16510) +- Warn on inline given aliases with functions as RHS [#16499](https://github.com/lampepfl/dotty/pull/16499) +- Support inline overrides in value classes [#16523](https://github.com/lampepfl/dotty/pull/16523) + +## Java interop + +- Represent Java annotations as interfaces so they can be extended, and disallow various misuses of them [#16260](https://github.com/lampepfl/dotty/pull/16260) + +## Linting + +- Fix -Wunused:import registering constructor `` instead of its owner (also fix false positive for enum) [#16661](https://github.com/lampepfl/dotty/pull/16661) +- Fix #16675 : -Wunused false positive on case class generated method, due to flags used to distinguish case accessors. 
[#16683](https://github.com/lampepfl/dotty/pull/16683) +- Fix #16682: CheckUnused missed some used symbols [#16690](https://github.com/lampepfl/dotty/pull/16690) +- Fix "-Wunused: False positive on parameterless enum member" [#16927](https://github.com/lampepfl/dotty/pull/16927) +- Register usage of symbols in non-inferred type trees in CheckUnused [#16939](https://github.com/lampepfl/dotty/pull/16939) +- Traverse annotations instead of just registering in -Wunused [#16956](https://github.com/lampepfl/dotty/pull/16956) +- Ignore parameter of accessors in -Wunused [#16957](https://github.com/lampepfl/dotty/pull/16957) +- Ignore parameter of accessors in -Wunused [#16957](https://github.com/lampepfl/dotty/pull/16957) +- Improve override detection in CheckUnused [#16965](https://github.com/lampepfl/dotty/pull/16965) +- WUnused: Fix unused warning in synthetic symbols [#17020](https://github.com/lampepfl/dotty/pull/17020) +- Fix WUnused with idents in derived code [#17095](https//github.com/lampepfl/dotty/pull/17095) +- WUnused: Fix for symbols with synthetic names and unused transparent inlines [#17061](https//github.com/lampepfl/dotty/pull/17061) +- Skip extension method params in WUnused [#17178](https//github.com/lampepfl/dotty/pull/17178) +- Fix wunused false positive when deriving alias type [#17157](https//github.com/lampepfl/dotty/pull/17157) +- Fix WUnused for accessible symbols that are renamed [#17177](https//github.com/lampepfl/dotty/pull/17177) +- Fix WUnused false positive in for [#17176](https//github.com/lampepfl/dotty/pull/17176) +- Make CheckUnused run both after Typer and Inlining [#17206](https//github.com/lampepfl/dotty/pull/17206) +- Disable WUnused for params of non-private defs [#17223](https//github.com/lampepfl/dotty/pull/17223) +- Wunused: Check if symbol exists before `isValidMemberDef` check [#17316](https://github.com/lampepfl/dotty/pull/17316) +- Wunused: Include import selector bounds in unused checks [#17323](https://github.com/lampepfl/dotty/pull/17323) +- Fix compiler crash in WUnused [#17340](https://github.com/lampepfl/dotty/pull/17340) + +## Opaque Types + +- Delay opaque alias checking until PostTyper [#16644](https://github.com/lampepfl/dotty/pull/16644) + +## Overloading + +- Handle context function arguments in overloading resolution [#16511](https://github.com/lampepfl/dotty/pull/16511) + +## Parser + +- Improve support for Unicode supplementary characters in identifiers and string interpolation (as in Scala 2) [#16278](https://github.com/lampepfl/dotty/pull/16278) +- Require indent after colon at EOL [#16466](https://github.com/lampepfl/dotty/pull/16466) +- Help givens return refined types [#16293](https://github.com/lampepfl/dotty/pull/16293) + +## Pattern Matching + +- Tweak AvoidMap's derivedSelect [#16563](https://github.com/lampepfl/dotty/pull/16563) +- Space: Use RHS of & when refining subtypes [#16573](https://github.com/lampepfl/dotty/pull/16573) +- Freeze constraints in a condition check of maximiseType [#16526](https://github.com/lampepfl/dotty/pull/16526) +- Restrict syntax of typed patterns [#16150](https://github.com/lampepfl/dotty/pull/16150) +- Test case to show that #16252 works with transparent [#16262](https://github.com/lampepfl/dotty/pull/16262) +- Support inline unapplySeq and with leading given parameters [#16358](https://github.com/lampepfl/dotty/pull/16358) +- Handle sealed 
prefixes in exh checking [#16621](https://github.com/lampepfl/dotty/pull/16621) +- Detect irrefutable quoted patterns [#16674](https://github.com/lampepfl/dotty/pull/16674) +- Patmat: Use less type variables in prefix inference [#16827](https//github.com/lampepfl/dotty/pull/16827) + +## Pickling + +- Allow case classes with up to 254 parameters [#16501](https://github.com/lampepfl/dotty/pull/16501) +- Correctly unpickle Scala 2 private case classes in traits [#16519](https://github.com/lampepfl/dotty/pull/16519) + +## Polyfunctions + +- Fix #9996: Crash with function accepting polymorphic function type with singleton result [#16327](https://github.com/lampepfl/dotty/pull/16327) + +## Quotes + +- Remove contents of inline methods [#16345](https://github.com/lampepfl/dotty/pull/16345) +- Fix errors in explicit type annotations in inline match cases [#16257](https://github.com/lampepfl/dotty/pull/16257) +- Handle macro annotation suspends and crashes [#16509](https://github.com/lampepfl/dotty/pull/16509) +- Fix macro annotations `spliceOwner` [#16513](https://github.com/lampepfl/dotty/pull/16513) +- Fix HK quoted pattern type variables [#16907](https//github.com/lampepfl/dotty/pull/16907) + +## REPL + +- REPL: Fix crash when printing instances of value classes [#16393](https://github.com/lampepfl/dotty/pull/16393) +- Attempt to fix completion crash [#16267](https://github.com/lampepfl/dotty/pull/16267) +- Fix REPL shadowing bug [#16389](https://github.com/lampepfl/dotty/pull/16389) +- Open up for extensibility [#16276](https://github.com/lampepfl/dotty/pull/16276) +- Don't crash if completions throw [#16687](https://github.com/lampepfl/dotty/pull/16687) + +## Reflection + +- Fix reflect typeMembers to return all members [#15033](https://github.com/lampepfl/dotty/pull/15033) +- Deprecate reflect Flags.Static [#16568](https://github.com/lampepfl/dotty/pull/16568) + +## Reporting + +- Suppress follow-on errors for erroneous import qualifiers [#16658](https://github.com/lampepfl/dotty/pull/16658) +- Fix order in which errors are reported for assignment to val [#16660](https://github.com/lampepfl/dotty/pull/16660) +- Fix class name in error message [#16635](https://github.com/lampepfl/dotty/pull/16635) +- Make refined type printing more source compatible [#16303](https://github.com/lampepfl/dotty/pull/16303) +- Add error hint on local inline def used in quotes [#16572](https://github.com/lampepfl/dotty/pull/16572) +- Fix Text wrapping [#16277](https://github.com/lampepfl/dotty/pull/16277) +- Fix #16680 by registering Ident not containing a symbol [#16689](https://github.com/lampepfl/dotty/pull/16689) +- Fix the non-miniphase tree traverser [#16684](https://github.com/lampepfl/dotty/pull/16684) +- Just warn on type ascription on a pattern [#17454](https://github.com/lampepfl/dotty/pull/17454) + +## Scala-JS + +- Fix #14289: Accept Ident refs to `js.native` in native member rhs. 
[#16185](https://github.com/lampepfl/dotty/pull/16185) + +## Scaladoc + +- Added jpath check to `ClassLikeSupport` getParentsAsTreeSymbolTuples [#16759](https://github.com/lampepfl/dotty/pull/16759) + +## Standard Library + +- Add `CanEqual` instance for `Map` [#15886](https://github.com/lampepfl/dotty/pull/15886) +- Refine `Tuple.Append` return type [#16140](https://github.com/lampepfl/dotty/pull/16140) +- Remove experimental from `Mirror#fromProductTyped` [#16829](https//github.com/lampepfl/dotty/pull/16829) + +## TASTy format + +- Make it a fatal error if erasure cannot resolve a type [#16373](https://github.com/lampepfl/dotty/pull/16373) + +## Tooling + +- Add -Yimports compiler flag [#16218](https://github.com/lampepfl/dotty/pull/16218) +- Allow BooleanSettings to be set with a colon [#16425](https://github.com/lampepfl/dotty/pull/16425) +- Add support for disabling redirected output in the REPL driver for usage in worksheets in the Scala Plugin for IntelliJ IDEA [#16810](https://github.com/lampepfl/dotty/pull/16810) +- Fix #17187: allow patches with same span [#17366](https://github.com/lampepfl/dotty/pull/17366) + +## Transform + +- Avoid stackoverflow in ExplicitOuter [#16381](https://github.com/lampepfl/dotty/pull/16381) +- Make lazy vals run on non-fallback graal image - remove dynamic reflection [#16346](https://github.com/lampepfl/dotty/pull/16346) +- Patch to avoid crash in #16351 [#16354](https://github.com/lampepfl/dotty/pull/16354) +- Don't treat package object's `` methods as package members [#16667](https://github.com/lampepfl/dotty/pull/16667) +- Space: Refine isSubspace property & an example [#16574](https://github.com/lampepfl/dotty/pull/16574) +- Fix static lazy field holder for GraalVM [#16800](https://github.com/lampepfl/dotty/pull/16800) +- Fix race condition in new LazyVals [#16975](https://github.com/lampepfl/dotty/pull/16975) + +## Typer + +- Drop requirement that self types are closed [#16648](https://github.com/lampepfl/dotty/pull/16648) +- Disallow constructor params from appearing in parent types for soundness [#16664](https://github.com/lampepfl/dotty/pull/16664) +- Don't search implicit arguments in singleton type prefix [#16490](https://github.com/lampepfl/dotty/pull/16490) +- Don't rely on isProvisional to determine whether atoms computed [#16489](https://github.com/lampepfl/dotty/pull/16489) +- Support signature polymorphic methods (`MethodHandle` and `VarHandle`) [#16225](https://github.com/lampepfl/dotty/pull/16225) +- Prefer parameterless alternatives during ambiguous overload resolution [#16315](https://github.com/lampepfl/dotty/pull/16315) +- Fix calculation to drop transparent classes [#16344](https://github.com/lampepfl/dotty/pull/16344) +- Test case for issue 16311 [#16317](https://github.com/lampepfl/dotty/pull/16317) +- Skip caching provisional OrType atoms [#16295](https://github.com/lampepfl/dotty/pull/16295) +- Avoid cyclic references due to experimental check when inlining [#16195](https://github.com/lampepfl/dotty/pull/16195) +- Track type variable dependencies to guide instantiation decisions [#16042](https://github.com/lampepfl/dotty/pull/16042) +- Two fixes to constraint solving [#16353](https://github.com/lampepfl/dotty/pull/16353) +- Fix regression in cyclic constraint handling [#16514](https://github.com/lampepfl/dotty/pull/16514) +- Sharpen range approximation for applied types with 
capture set ranges [#16261](https://github.com/lampepfl/dotty/pull/16261) +- Cut the Gordian Knot: Don't widen unions to transparent [#15642](https://github.com/lampepfl/dotty/pull/15642) +- Fix widening logic to keep instantiation within bounds [#16417](https://github.com/lampepfl/dotty/pull/16417) +- Skip ambiguous reference error when symbols are aliases [#16401](https://github.com/lampepfl/dotty/pull/16401) +- Avoid incorrect simplifications when updating bounds in the constraint [#16410](https://github.com/lampepfl/dotty/pull/16410) +- Take `@targetName` into account when resolving extension methods [#16487](https://github.com/lampepfl/dotty/pull/16487) +- Improve ClassTag handling to avoid invalid ClassTag generation and inference failure [#16492](https://github.com/lampepfl/dotty/pull/16492) +- Fix extracting the elemType of a union of arrays [#16569](https://github.com/lampepfl/dotty/pull/16569) +- Make sure annotations are typed in expression contexts [#16699](https://github.com/lampepfl/dotty/pull/16699) +- Throw a type error when using hk-types in unions or intersections [#16712](https://github.com/lampepfl/dotty/pull/16712) +- Add missing criterion to subtype check [#16889](https://github.com/lampepfl/dotty/pull/16889) +- Fix caching issue caused by incorrect isProvisional check [#16989](https://github.com/lampepfl/dotty/pull/16989) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.2.2..3.3.0` these are: + +``` + 226 Martin Odersky + 106 Szymon Rodziewicz + 81 Dale Wijnand + 56 Nicolas Stucki + 52 Paul Coral + 48 Kamil Szewczyk + 45 Paweł Marks + 28 Florian3k + 28 Yichen Xu + 15 Guillaume Martres + 10 Michał Pałka + 9 Kacper Korban + 8 Fengyun Liu + 7 Chris Birchall + 7 rochala + 6 Sébastien Doeraene + 6 jdudrak + 5 Seth Tisue + 5 Som Snytt + 5 nizhikov + 4 Filip Zybała + 4 Jan Chyb + 4 Michael Pollmeier + 4 Natsu Kagami + 3 Anatolii Kmetiuk + 3 Jamie Thompson + 2 Adrien Piquerez + 2 Alex + 2 Dmitrii Naumenko + 2 Lukas Rytz + 2 Michael Pilquist + 2 Vasil Vasilev + 2 adampauls + 2 yoshinorin + 1 Alexander Slesarenko + 1 Chris Kipp + 1 Guillaume Raffin + 1 Jakub Kozłowski + 1 Jan-Pieter van den Heuvel + 1 Julien Richard-Foy + 1 Kenji Yoshida + 1 Matt Bovel + 1 Mohammad Yousuf Minhaj Zia + 1 Philippus + 1 Szymon R + 1 Tim Spence + 1 s.bazarsadaev + + +``` \ No newline at end of file From 5879ff1caa82b4b5c32f67e88c85370c7fdbc5a3 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 23 May 2023 13:35:45 +0200 Subject: [PATCH 095/371] Release 3.3.0 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 9109a925a450..5aca4ace8d6a 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.2.2" - val baseVersion = "3.3.0-RC6" + val baseVersion = "3.3.0" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. 
*/ - val previousDottyVersion = "3.3.0-RC5" + val previousDottyVersion = "3.2.2" object CompatMode { final val BinaryCompatible = 0 From 390f836c5ac0d8aaa30b331745e361b75139f68a Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 24 May 2023 13:29:23 +0200 Subject: [PATCH 096/371] Add changelog for 3.3.1-RC1 --- changelogs/3.3.1-RC1.md | 288 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 288 insertions(+) create mode 100644 changelogs/3.3.1-RC1.md diff --git a/changelogs/3.3.1-RC1.md b/changelogs/3.3.1-RC1.md new file mode 100644 index 000000000000..4e52eb874891 --- /dev/null +++ b/changelogs/3.3.1-RC1.md @@ -0,0 +1,288 @@ +# Highlights of the release + +- Support records in JavaParsers [#16762](https://github.com/lampepfl/dotty/pull/16762) +- Port JVM backend refactor from Scala 2 [#15322](https://github.com/lampepfl/dotty/pull/15322) + +# Other changes and fixes + +## Backend + +- Disallow mixins where super calls bind to vals [#16908](https://github.com/lampepfl/dotty/pull/16908) +- Fix #15107: Avoid re-emitting a LineNumber after only LabelNodes. [#16813](https://github.com/lampepfl/dotty/pull/16813) + +## Coverage + +- Fix #17042: Preserve the shape of secondary ctors in instrumentCoverage. [#17111](https://github.com/lampepfl/dotty/pull/17111) + +## Default parameters + +- Dupe fix when finding default arg getters [#17058](https://github.com/lampepfl/dotty/pull/17058) + +## Documentation + +- Fix: ensure syntax blocks for ebnf are marked as such [#16837](https://github.com/lampepfl/dotty/pull/16837) + +## Erasure + +- Handle `@companionClass` and `@companionMethod` meta-annotations [#17091](https://github.com/lampepfl/dotty/pull/17091) + +## Extension Methods + +- Support extension methods imported from different objects [#17050](https://github.com/lampepfl/dotty/pull/17050) + +## GADTs + +- Fix tuple member selection so it works with GADT healing [#16766](https://github.com/lampepfl/dotty/pull/16766) +- Fix upper bound constraints, that are higher-kinded [#16744](https://github.com/lampepfl/dotty/pull/16744) +- Split out immutable GadtConstraint [#16602](https://github.com/lampepfl/dotty/pull/16602) + +## Implicits + +- Improve subtyping check for not yet eta-expanded higher kinded types [#17139](https://github.com/lampepfl/dotty/pull/17139) +- Harden tpd.Apply/TypeApply in case of errors [#16887](https://github.com/lampepfl/dotty/pull/16887) +- Try to be more subtle when inferring type parameters of class parents [#16896](https://github.com/lampepfl/dotty/pull/16896) +- Include `P` in the implicit scope of `P.this.type` [#17088](https://github.com/lampepfl/dotty/pull/17088) + +## Incremental Compilation + +- Fix under-compilation when the method type in a SAM changes [#16996](https://github.com/lampepfl/dotty/pull/16996) + +## Infrastructure + +- Set reference version to 3.3.0-RC6 [#17504](https://github.com/lampepfl/dotty/pull/17504) +- Fix #17119: Download Coursier from GitHub directly [#17141](https://github.com/lampepfl/dotty/pull/17141) + +## Inline + +- Remove NamedArg from inlined arguments [#17228](https://github.com/lampepfl/dotty/pull/17228) +- Don't generate a Select for a TermRef with NoPrefix [#16754](https://github.com/lampepfl/dotty/pull/16754) +- Prepare bodies of inline forwarders eagerly [#16757](https://github.com/lampepfl/dotty/pull/16757) +- Do not remove inline method implementations until PruneErasedDefs 
[#17408](https://github.com/lampepfl/dotty/pull/17408) + +## Java Interop + +- ClassfileParser: allow missing param names (for JDK 21) [#17536](https://github.com/lampepfl/dotty/pull/17536) + +## Linting + +- Improve -Wunused: locals, privates with unset vars warning #16639 [#17160](https://github.com/lampepfl/dotty/pull/17160) +- Fix wunused false positive when deriving alias type [#17157](https://github.com/lampepfl/dotty/pull/17157) +- Port `-Wnonunit-statement` setting for dotty [#16936](https://github.com/lampepfl/dotty/pull/16936) + +## Match Types + +- Normalize match type usage during implicit lookup [#17457](https://github.com/lampepfl/dotty/pull/17457) +- Fix #13757: Explicitly disallow higher-kinded scrutinees of match types. [#17322](https://github.com/lampepfl/dotty/pull/17322) +- Fix match type reduction with wildcard type arguments [#17065](https://github.com/lampepfl/dotty/pull/17065) +- Fix check whether classtag can be generated for match types [#16708](https://github.com/lampepfl/dotty/pull/16708) + +## Parser + +- Allow lines starting with `.` to fall outside previous indentation widths [#17056](https://github.com/lampepfl/dotty/pull/17056) + +## Pattern Matching + +- Fix #11541: Specialize ClassTag[T] in exhaustivity check [#17385](https://github.com/lampepfl/dotty/pull/17385) +- Check outer class prefixes in type projections when pattern matching [#17136](https://github.com/lampepfl/dotty/pull/17136) +- Make unchecked cases non-`@unchecked` and non-unreachable [#16958](https://github.com/lampepfl/dotty/pull/16958) +- Fix #16899: Better handle X instanceOf P where X is T1 | T2 [#17382](https://github.com/lampepfl/dotty/pull/17382) + +## Pickling + +- ClassfileParser: Avoid cycle when accessing companion in inner class lookup [#16882](https://github.com/lampepfl/dotty/pull/16882) + +## Polyfunctions + +- Fix type aliases in beta-reduction of polyfunctions [#17054](https://github.com/lampepfl/dotty/pull/17054) + +## Quotes + +- Register `paramProxy` and `thisProxy` in `Quote` type [#17541](https://github.com/lampepfl/dotty/pull/17541) +- Only check newVal/newMethod privateWithin on -Xcheck-macros [#17437](https://github.com/lampepfl/dotty/pull/17437) +- Unencode quote and splice trees [#17342](https://github.com/lampepfl/dotty/pull/17342) +- Correctly type Expr.ofTupleFromSeq for arity > 22 [#17261](https://github.com/lampepfl/dotty/pull/17261) +- Use TermRef to distinguish distinct Type[T] instances [#17205](https://github.com/lampepfl/dotty/pull/17205) +- Check level consistency of SingletonTypeTree as a type [#17209](https://github.com/lampepfl/dotty/pull/17209) +- Fix splice type variable pattern detection [#17048](https://github.com/lampepfl/dotty/pull/17048) +- Avoid creation of `@SplicedType` quote local refrences [#17051](https://github.com/lampepfl/dotty/pull/17051) +- Dealias type references when healing types in quotes [#17049](https://github.com/lampepfl/dotty/pull/17049) +- Replace quoted type variables in signature of HOAS pattern result [#16951](https://github.com/lampepfl/dotty/pull/16951) +- Beta-reduce directly applied PolymorphicFunction [#16623](https://github.com/lampepfl/dotty/pull/16623) +- Use `Object.toString` for `quoted.{Expr, Type}` [#16663](https://github.com/lampepfl/dotty/pull/16663) +- Fix Splicer.isEscapedVariable [#16838](https://github.com/lampepfl/dotty/pull/16838) +- Fix references to class 
members defined in quotes [#17107](https://github.com/lampepfl/dotty/pull/17107) +- Handle pickled forward references in pickled expressions [#16855](https://github.com/lampepfl/dotty/pull/16855) +- Fix #16615 - crashes of path dependent types in spliced Type.of [#16773](https://github.com/lampepfl/dotty/pull/16773) +- Disallow local term references in staged types [#16362](https://github.com/lampepfl/dotty/pull/16362) +- Refactor level checking / type healing logic [#17082](https://github.com/lampepfl/dotty/pull/17082) +- Dealias quoted types when staging [#17059](https://github.com/lampepfl/dotty/pull/17059) +- Fix quotes with references to path dependent types [#17081](https://github.com/lampepfl/dotty/pull/17081) +- Make arguments order in quote hole deterministic [#17405](https://github.com/lampepfl/dotty/pull/17405) +- Only transform the body of the quote with QuoteTransformer [#17451](https://github.com/lampepfl/dotty/pull/17451) +- Place staged type captures in Quote AST [#17424](https://github.com/lampepfl/dotty/pull/17424) +- Add SplicePattern AST to parse and type quote pattern splices [#17396](https://github.com/lampepfl/dotty/pull/17396) + +## Reflection + +- -Xcheck-macros: add hint when a symbol in created twice [#16733](https://github.com/lampepfl/dotty/pull/16733) +- Assert that symbols created using reflect API have correct privateWithin symbols [#17352](https://github.com/lampepfl/dotty/pull/17352) +- Fix reflect.LambdaType type test [#16972](https://github.com/lampepfl/dotty/pull/16972) +- Improve `New`/`Select` -Ycheck message [#16746](https://github.com/lampepfl/dotty/pull/16746) +- Improve error message for CyclicReference in macros [#16749](https://github.com/lampepfl/dotty/pull/16749) +- Add reflect `defn.FunctionClass` overloads [#16849](https://github.com/lampepfl/dotty/pull/16849) + +## REPL + +- Always load REPL classes in macros including the output directory [#16866](https://github.com/lampepfl/dotty/pull/16866) + +## Reporting + +- Improve missing argument list error [#17126](https://github.com/lampepfl/dotty/pull/17126) +- Improve implicit parameter error message with aliases [#17125](https://github.com/lampepfl/dotty/pull/17125) +- Improve "constructor proxy shadows outer" handling [#17154](https://github.com/lampepfl/dotty/pull/17154) +- Clarify ambiguous reference error message [#16137](https://github.com/lampepfl/dotty/pull/16137) +- Hint about forbidden combination of implicit values and conversions [#16735](https://github.com/lampepfl/dotty/pull/16735) +- Attach explanation message to diagnostic message [#16787](https://github.com/lampepfl/dotty/pull/16787) +- Propagate implicit search errors from implicit macros [#16840](https://github.com/lampepfl/dotty/pull/16840) +- Detail UnapplyInvalidReturnType error message [#17167](https://github.com/lampepfl/dotty/pull/17167) +- Add way to debug -Xcheck-macros tree checking [#16973](https://github.com/lampepfl/dotty/pull/16973) +- Enrich and finesse compiler crash reporting [#17031](https://github.com/lampepfl/dotty/pull/17031) +- Allow @implicitNotFound messages as explanations [#16893](https://github.com/lampepfl/dotty/pull/16893) +- Include top-level symbols from same file in outer ambiguity error [#17033](https://github.com/lampepfl/dotty/pull/17033) +- Do not issue deprecation warnings when declaring deprecated case classes 
[#17165](https://github.com/lampepfl/dotty/pull/17165) + +## Scala-JS + +- Fix #17344: Make implicit references to this above dynamic imports explicit. [#17357](https://github.com/lampepfl/dotty/pull/17357) +- Fix #12621: Better error message for JS trait ctor param. [#16811](https://github.com/lampepfl/dotty/pull/16811) +- Fix #16801: Handle Closure's of s.r.FunctionXXL. [#16809](https://github.com/lampepfl/dotty/pull/16809) +- Fix #17549: Unify how Memoize and Constructors decide what fields need storing. [#17560](https://github.com/lampepfl/dotty/pull/17560) + +## Scaladoc + +- Feat: Add a blog configuration with yaml [#17214](https://github.com/lampepfl/dotty/pull/17214) +- Don't render the "$" for module [#17302](https://github.com/lampepfl/dotty/pull/17302) +- Fix: Add scrollbar to the sidebar [#17203](https://github.com/lampepfl/dotty/pull/17203) +- Scaladoc: fix crash when processing extends call [#17260](https://github.com/lampepfl/dotty/pull/17260) +- Fix: Modify the CSS so that the logo of the generated documentation is adaptive [#17172](https://github.com/lampepfl/dotty/pull/17172) +- Fix: Remove the duplicate parameter when generating the scaladoc. [#17097](https://github.com/lampepfl/dotty/pull/17097) +- Fix: padding top in mobile version [#17019](https://github.com/lampepfl/dotty/pull/17019) +- Fix: tap target of the menu in Mobile version [#17018](https://github.com/lampepfl/dotty/pull/17018) +- Scaladoc: Fix expand icon not changing on anchor link [#17053](https://github.com/lampepfl/dotty/pull/17053) +- Scaladoc: fix inkuire generation for PolyTypes [#17129](https://github.com/lampepfl/dotty/pull/17129) +- Re port scroll bar [#17463](https://github.com/lampepfl/dotty/pull/17463) +- Handle empty files and truncated YAML front matter [#17527](https://github.com/lampepfl/dotty/pull/17527) + +## SemanticDB + +- Make sure symbol exists before calling owner [#16860](https://github.com/lampepfl/dotty/pull/16860) +- Support LambdaType (convert from HKTypeLambda) [#16056](https://github.com/lampepfl/dotty/pull/16056) + +## Specification + +- Apply `class-shadowing.md` to the Spec [#16839](https://github.com/lampepfl/dotty/pull/16839) +- Adding base for future Spec into the compiler repo [#16825](https://github.com/lampepfl/dotty/pull/16825) + +## Standard Library + +- Optimization: avoid NotGiven allocations [#17090](https://github.com/lampepfl/dotty/pull/17090) + +## Tooling + +- Disable `ExtractSemanticDB` phase when writing to output directory defined as JAR. 
[#16790](https://github.com/lampepfl/dotty/pull/16790) +- Print owner of bind symbol with -Yprint-debug-owners [#16854](https://github.com/lampepfl/dotty/pull/16854) +- Small fixes to allow using Metals with scaladoc with sbt [#16816](https://github.com/lampepfl/dotty/pull/16816) + +## Transform + +- Move CrossVersionChecks before FirstTransform [#17301](https://github.com/lampepfl/dotty/pull/17301) +- Fix needsOuterIfReferenced [#17159](https://github.com/lampepfl/dotty/pull/17159) +- Drop incorrect super accessor in trait subclass [#17062](https://github.com/lampepfl/dotty/pull/17062) +- Generate toString only for synthetic companions of case classes [#16890](https://github.com/lampepfl/dotty/pull/16890) +- Check trait constructor for accessibility even if not called at Typer [#17094](https://github.com/lampepfl/dotty/pull/17094) +- Fix #17435: A simpler fix [#17436](https://github.com/lampepfl/dotty/pull/17436) + +## Typer + +- Preserve type bounds for inlined definitions in posttyper [#17190](https://github.com/lampepfl/dotty/pull/17190) +- Change logic to find members of recursive types [#17386](https://github.com/lampepfl/dotty/pull/17386) +- Recognize named arguments in isFunctionWithUnknownParamType [#17161](https://github.com/lampepfl/dotty/pull/17161) +- Better comparisons for type projections [#17092](https://github.com/lampepfl/dotty/pull/17092) +- Allow selectDynamic and applyDynamic to be extension methods [#17106](https://github.com/lampepfl/dotty/pull/17106) +- Fix use of accessibleFrom when finding default arg getters [#16977](https://github.com/lampepfl/dotty/pull/16977) +- Map class literal constant types [#16988](https://github.com/lampepfl/dotty/pull/16988) +- Always use adapted type in withDenotation [#16901](https://github.com/lampepfl/dotty/pull/16901) +- Restrict captureWildcards to only be used if needed [#16799](https://github.com/lampepfl/dotty/pull/16799) +- Don't capture wildcards if in closure or by-name [#16732](https://github.com/lampepfl/dotty/pull/16732) +- Infer: Don't minimise to Nothing if there's an upper bound [#16786](https://github.com/lampepfl/dotty/pull/16786) +- Perform Matchable check only if type test is needed [#16824](https://github.com/lampepfl/dotty/pull/16824) +- Don't eta expand unary varargs methods [#16892](https://github.com/lampepfl/dotty/pull/16892) +- Fix beta-reduction with `Nothing` and `null` args [#16938](https://github.com/lampepfl/dotty/pull/16938) +- Generate kind-correct wildcards when selecting from a wildcard [#17025](https://github.com/lampepfl/dotty/pull/17025) +- Fix #16405 ctd - wildcards prematurely resolving to Nothing [#16764](https://github.com/lampepfl/dotty/pull/16764) +- Test: add regression test for #7790 [#17473](https://github.com/lampepfl/dotty/pull/17473) +- Properly handle `AnyVal`s as refinement members of `Selectable`s [#16286](https://github.com/lampepfl/dotty/pull/16286) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0..3.3.1-RC1` these are: + +``` + 148 Nicolas Stucki + 65 Martin Odersky + 51 Szymon Rodziewicz + 49 Dale Wijnand + 49 Quentin Bernet + 38 Chris Kipp + 19 David Hua + 18 Lucas + 18 ysthakur + 15 Fengyun Liu + 15 Paweł Marks + 14 Guillaume Martres + 14 Jamie Thompson + 11 Sébastien Doeraene + 9 Timothée Andres + 8 Kacper Korban + 7 Matt Bovel + 7 Som Snytt + 6 Julien 
Richard-Foy + 6 Lucas Leblanc + 5 Michał Pałka + 4 Anatolii Kmetiuk + 4 Guillaume Raffin + 4 Paul Coral + 4 Wojciech Mazur + 4 Yichen Xu + 3 Decel + 3 Jan Chyb + 2 Adrien Piquerez + 2 Arman Bilge + 2 Carl + 2 Florian3k + 2 Kenji Yoshida + 2 Michael Pilquist + 2 Natsu Kagami + 2 Seth Tisue + 2 Tomasz Godzik + 2 Vasil Vasilev + 2 Yadu Krishnan + 1 Bersier + 1 Flavio Brasil + 1 Jan-Pieter van den Heuvel + 1 Lukas Rytz + 1 Miles Yucht + 1 Mohammad Yousuf Minhaj Zia + 1 Ondra Pelech + 1 Philippus + 1 Rikito Taniguchi + 1 Simon R + 1 brandonspark + 1 github-actions[bot] + 1 liang3zy22 + 1 s.bazarsadaev + 1 Łukasz Wroński + +``` From dfb23f95afa8bec461674140c99b19ea3a9ab010 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 24 May 2023 13:30:49 +0200 Subject: [PATCH 097/371] Release 3.3.1-RC1 --- tasty/src/dotty/tools/tasty/TastyFormat.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index 226fc14acb39..39d559234868 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -290,7 +290,7 @@ object TastyFormat { * compatibility, but remains backwards compatible, with all * preceeding `MinorVersion`. */ - final val MinorVersion: Int = 4 + final val MinorVersion: Int = 3 /** Natural Number. The `ExperimentalVersion` allows for * experimentation with changes to TASTy without committing @@ -306,7 +306,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. * From 0fa1c91eb181e695e425bb8a022daf61c49ab214 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 29 May 2023 16:25:59 +0200 Subject: [PATCH 098/371] Add info about 3.3 to source compat doc --- docs/_docs/reference/language-versions/source-compatibility.md | 3 +++ 1 file changed, 3 insertions(+) diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 077f06b2b4db..3e9954a6d55a 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -23,6 +23,9 @@ The default Scala language syntax version currently supported by the Dotty compi - [`3.2-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2-migration$.html): the same as `3.2`, but in conjunction with `-rewrite`, offer code rewrites from Scala `3.0/3.1` to `3.2`. - [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes that will be introduced in `3.x` versions after `3.2`. Some Scala 2 specific idioms are dropped in this version. The feature set supported by this version may grow over time as features become stabilised for preview. +- [`3.3`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3$.html): the same as `3.2`, but in addition: + -[Fewer braces syntax](https://docs.scala-lang.org/scala3/reference/other-new-features/indentation.html#optional-braces-for-method-arguments-1) is enabled by default. 
+- [`3.3-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3-migration$.html): the same as `3.3` - [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.2`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. From 724340e3bfd5c970655fe6e7f49d2d91697c96fe Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 30 May 2023 00:13:10 -0700 Subject: [PATCH 099/371] Update docs/_docs/reference/language-versions/source-compatibility.md Co-authored-by: Jamie Thompson --- docs/_docs/reference/language-versions/source-compatibility.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 3e9954a6d55a..131bb100a91b 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -27,7 +27,7 @@ Some Scala 2 specific idioms are dropped in this version. The feature set suppor -[Fewer braces syntax](https://docs.scala-lang.org/scala3/reference/other-new-features/indentation.html#optional-braces-for-method-arguments-1) is enabled by default. - [`3.3-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3-migration$.html): the same as `3.3` -- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.2`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. +- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.3`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. There are two ways to specify a language version : From 232180f07a4415863b19f7e87ead852694effaf3 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 30 May 2023 16:12:44 +0200 Subject: [PATCH 100/371] Update source-compatibility.md reorder the source versions --- .../reference/language-versions/source-compatibility.md | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 131bb100a91b..5cb705a16b82 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -21,12 +21,11 @@ The default Scala language syntax version currently supported by the Dotty compi - [stricter pattern bindings](https://docs.scala-lang.org/scala3/reference/changed-features/pattern-bindings.html) are now enabled (part of `future` in earlier `3.x` releases), producing warnings for refutable patterns. These warnings can be silenced to achieve the same runtime behavior, but in `future` they become errors and refutable patterns will not compile. - [Nonlocal returns](https://docs.scala-lang.org/scala3/reference/dropped-features/nonlocal-returns.html) now produce a warning upon usage (they are still an error under `future`). 
- [`3.2-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2-migration$.html): the same as `3.2`, but in conjunction with `-rewrite`, offer code rewrites from Scala `3.0/3.1` to `3.2`. -- [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes that will be introduced in `3.x` versions after `3.2`. -Some Scala 2 specific idioms are dropped in this version. The feature set supported by this version may grow over time as features become stabilised for preview. - [`3.3`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3$.html): the same as `3.2`, but in addition: -[Fewer braces syntax](https://docs.scala-lang.org/scala3/reference/other-new-features/indentation.html#optional-braces-for-method-arguments-1) is enabled by default. - [`3.3-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3-migration$.html): the same as `3.3` - +- [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes that will be introduced in `3.x` versions after `3.3`. +Some Scala 2 specific idioms are dropped in this version. The feature set supported by this version may grow over time as features become stabilised for preview. - [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.3`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. There are two ways to specify a language version : From 8f019275f8dd54fc9554d00292e02fd331ce3427 Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Tue, 30 May 2023 10:00:08 +0200 Subject: [PATCH 101/371] Dealias types in New before matching quotes Fixes #17606 --- .../scala/quoted/runtime/impl/QuoteMatcher.scala | 2 +- tests/pos-macros/i17606/Macros_1.scala | 14 ++++++++++++++ tests/pos-macros/i17606/Test_2.scala | 8 ++++++++ 3 files changed, 23 insertions(+), 1 deletion(-) create mode 100644 tests/pos-macros/i17606/Macros_1.scala create mode 100644 tests/pos-macros/i17606/Test_2.scala diff --git a/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala b/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala index 5477628a30a3..bfa4c1c6d1f2 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala @@ -301,7 +301,7 @@ object QuoteMatcher { /* Match new */ case New(tpt1) => pattern match - case New(tpt2) if tpt1.tpe.typeSymbol == tpt2.tpe.typeSymbol => matched + case New(tpt2) if tpt1.tpe.dealias.typeSymbol == tpt2.tpe.dealias.typeSymbol => matched case _ => notMatched /* Match this */ diff --git a/tests/pos-macros/i17606/Macros_1.scala b/tests/pos-macros/i17606/Macros_1.scala new file mode 100644 index 000000000000..245f2df66e7b --- /dev/null +++ b/tests/pos-macros/i17606/Macros_1.scala @@ -0,0 +1,14 @@ +package example + +import scala.quoted.* + +object A { + inline def f(inline a: Any): Boolean = ${ impl('a) } + + def impl(a: Expr[Any])(using Quotes): Expr[Boolean] = { + a match { + case '{ new String($x: Array[Byte]) } => Expr(true) + case _ => quotes.reflect.report.errorAndAbort("Expected match", a) + } + } +} diff --git a/tests/pos-macros/i17606/Test_2.scala b/tests/pos-macros/i17606/Test_2.scala new file mode 100644 index 000000000000..ebf535bc2ae9 --- /dev/null +++ b/tests/pos-macros/i17606/Test_2.scala @@ -0,0 +1,8 
@@ +package example + +object Main { + def main(args: Array[String]): Unit = { + val x = A.f(new String(Array.empty[Byte])) + println(x) + } +} From 38265fca08ff605edc9df99f9628b2959c22a134 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Wed, 21 Jun 2023 11:10:59 +0200 Subject: [PATCH 102/371] sort language versions to match natural ordering also fixes the bullet-point under 3.3 to actually be a bullet point --- .../reference/language-versions/source-compatibility.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 5cb705a16b82..145c4a84d11b 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -17,16 +17,16 @@ The default Scala language syntax version currently supported by the Dotty compi - in conjunction with `-rewrite`, offer code rewrites from Scala 2.13 to 3.0. - [`3.0`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/0$.html), [`3.1`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/1$.html): the default set of features included in scala versions `3.0.0` to `3.1.3`. +- [`3.2-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2-migration$.html): the same as `3.2`, but in conjunction with `-rewrite`, offer code rewrites from Scala `3.0/3.1` to `3.2`. - [`3.2`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2$.html): the same as `3.0` and `3.1`, but in addition: - [stricter pattern bindings](https://docs.scala-lang.org/scala3/reference/changed-features/pattern-bindings.html) are now enabled (part of `future` in earlier `3.x` releases), producing warnings for refutable patterns. These warnings can be silenced to achieve the same runtime behavior, but in `future` they become errors and refutable patterns will not compile. - [Nonlocal returns](https://docs.scala-lang.org/scala3/reference/dropped-features/nonlocal-returns.html) now produce a warning upon usage (they are still an error under `future`). -- [`3.2-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2-migration$.html): the same as `3.2`, but in conjunction with `-rewrite`, offer code rewrites from Scala `3.0/3.1` to `3.2`. -- [`3.3`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3$.html): the same as `3.2`, but in addition: - -[Fewer braces syntax](https://docs.scala-lang.org/scala3/reference/other-new-features/indentation.html#optional-braces-for-method-arguments-1) is enabled by default. - [`3.3-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3-migration$.html): the same as `3.3` +- [`3.3`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3$.html): the same as `3.2`, but in addition: + - [Fewer braces syntax](https://docs.scala-lang.org/scala3/reference/other-new-features/indentation.html#optional-braces-for-method-arguments-1) is enabled by default. +- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.3`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. 
- [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes that will be introduced in `3.x` versions after `3.3`. Some Scala 2 specific idioms are dropped in this version. The feature set supported by this version may grow over time as features become stabilised for preview. -- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.3`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. There are two ways to specify a language version : From 28d207dd9da5112204269d97d869b64e8cce9828 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Mon, 26 Jun 2023 16:19:22 +0200 Subject: [PATCH 103/371] Update indentation.md --- .../_docs/reference/other-new-features/indentation.md | 11 ++++++----- 1 file changed, 6 insertions(+), 5 deletions(-) diff --git a/docs/_docs/reference/other-new-features/indentation.md b/docs/_docs/reference/other-new-features/indentation.md index 40e2fc6fb38c..75306ec6f90d 100644 --- a/docs/_docs/reference/other-new-features/indentation.md +++ b/docs/_docs/reference/other-new-features/indentation.md @@ -261,7 +261,8 @@ Indentation can be mixed freely with braces `{...}`, as well as brackets `[...]` For instance, consider: ```scala { - val x = f(x: Int, y => + val x = 4 + f(x: Int, y => x * ( y + 1 ) + @@ -270,13 +271,13 @@ For instance, consider: ) } ``` - - Here, the indentation width of the region enclosed by the braces is 3 (i.e. the indentation width of the + - Here, the indentation width of the region enclosed by the braces is 2 (i.e. the indentation width of the statement starting with `val`). - - The indentation width of the region in parentheses that follows `f` is also 3, since the opening + - The indentation width of the region in parentheses that follows `f` is also 2, since the opening parenthesis is not at the end of a line. - - The indentation width of the region in parentheses around `y + 1` is 9 + - The indentation width of the region in parentheses around `y + 1` is 6 (i.e. the indentation width of `y + 1`). - - Finally, the indentation width of the last region in parentheses starting with `(x` is 6 (i.e. the indentation width of the indented region following the `=>`. + - Finally, the indentation width of the last region in parentheses starting with `(x` is 4 (i.e. the indentation width of the indented region following the `=>`. ## Special Treatment of Case Clauses From 49680df3ecead55d5a53b065dbdf3f06b80b3812 Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 25 Jun 2023 20:11:55 +0200 Subject: [PATCH 104/371] Fix accessibleType for package object prefixes Making a package object explicit re-computes the denotations of an overloaded method. So it should not be done after we have pruned down those denotations by an accessibility test. We now do it before checking accessibility. 
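As an aside, an illustrative sketch of the affected shape (all names here are hypothetical; the structure mirrors the regression test tests/pos/i15821.scala added in this patch): an overloaded method defined in a package object, where one alternative is access-restricted and must stay pruned once the prefix is rewritten to the explicit package object.

```scala
// Hypothetical sketch, not the compiler change itself: overloads in a package
// object, one of them visible only inside package p.
package object p {
  def bar(x: Int): Unit = ()
  private[p] def bar(x: Int)(implicit d: DummyImplicit): Unit = ()
}

object Client {
  // Resolving p.bar(1) from outside p prunes the private[p] alternative first;
  // making the prefix the explicit package object must not resurrect it.
  def use(): Unit = p.bar(1)
}
```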
Fixes #15821 --- .../dotty/tools/dotc/core/Denotations.scala | 4 +-- .../dotty/tools/dotc/typer/TypeAssigner.scala | 30 +++++++++++-------- tests/pos/i15821.scala | 9 ++++++ 3 files changed, 28 insertions(+), 15 deletions(-) create mode 100644 tests/pos/i15821.scala diff --git a/compiler/src/dotty/tools/dotc/core/Denotations.scala b/compiler/src/dotty/tools/dotc/core/Denotations.scala index 82368fd4dbf5..e56cc453d34d 100644 --- a/compiler/src/dotty/tools/dotc/core/Denotations.scala +++ b/compiler/src/dotty/tools/dotc/core/Denotations.scala @@ -1269,8 +1269,8 @@ object Denotations { def hasAltWith(p: SingleDenotation => Boolean): Boolean = denot1.hasAltWith(p) || denot2.hasAltWith(p) def accessibleFrom(pre: Type, superAccess: Boolean)(using Context): Denotation = { - val d1 = denot1 accessibleFrom (pre, superAccess) - val d2 = denot2 accessibleFrom (pre, superAccess) + val d1 = denot1.accessibleFrom(pre, superAccess) + val d2 = denot2.accessibleFrom(pre, superAccess) if (!d1.exists) d2 else if (!d2.exists) d1 else derivedUnionDenotation(d1, d2) diff --git a/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala b/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala index 6ac45cbcf04d..be6121e13209 100644 --- a/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala +++ b/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala @@ -77,21 +77,25 @@ trait TypeAssigner { * (2) in Java compilation units, `Object` is replaced by `defn.FromJavaObjectType` */ def accessibleType(tpe: Type, superAccess: Boolean)(using Context): Type = - tpe match + if ctx.isJava && tpe.isAnyRef then + defn.FromJavaObjectType + else tpe match case tpe: NamedType => - val pre = tpe.prefix - val name = tpe.name - def postProcess(d: Denotation) = - if ctx.isJava && tpe.isAnyRef then defn.FromJavaObjectType - else TypeOps.makePackageObjPrefixExplicit(tpe withDenot d) - val d = tpe.denot.accessibleFrom(pre, superAccess) - if d.exists then postProcess(d) + val tpe1 = TypeOps.makePackageObjPrefixExplicit(tpe) + if tpe1 ne tpe then + accessibleType(tpe1, superAccess) else - // it could be that we found an inaccessible private member, but there is - // an inherited non-private member with the same name and signature. - val d2 = pre.nonPrivateMember(name).accessibleFrom(pre, superAccess) - if reallyExists(d2) then postProcess(d2) - else NoType + val pre = tpe.prefix + val name = tpe.name + val d = tpe.denot.accessibleFrom(pre, superAccess) + if d eq tpe.denot then tpe + else if d.exists then tpe.withDenot(d) + else + // it could be that we found an inaccessible private member, but there is + // an inherited non-private member with the same name and signature. + val d2 = pre.nonPrivateMember(name).accessibleFrom(pre, superAccess) + if reallyExists(d2) then tpe.withDenot(d2) + else NoType case tpe => tpe /** Try to make `tpe` accessible, emit error if not possible */ diff --git a/tests/pos/i15821.scala b/tests/pos/i15821.scala new file mode 100644 index 000000000000..a72d13e07bc7 --- /dev/null +++ b/tests/pos/i15821.scala @@ -0,0 +1,9 @@ +def main = + foo.bar(42) + foo.bar + +package object foo { + def bar[F[_]]: Unit = ??? + def bar[F[_]](x: Int): Unit = ??? + private[foo] def bar[F[_]](x: Int)(implicit dummy: DummyImplicit): Unit = ??? +} From 186e4be054c0df229e4a97152635df788432a876 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 26 Jun 2023 14:53:36 +0200 Subject: [PATCH 105/371] Disable specs2 for now. 
--- .../scala/dotty/communitybuild/CommunityBuildTest.scala | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala index 146ad6f4f951..8837f7319117 100644 --- a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala +++ b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala @@ -93,7 +93,12 @@ class CommunityBuildTestC: @Test def sconfig = projects.sconfig.run() @Test def shapeless = projects.shapeless.run() @Test def sourcecode = projects.sourcecode.run() - @Test def specs2 = projects.specs2.run() + + // Disabled. Currently fails in FutureMatchers.scala. The call to + // `checkResultFailure` goes to a protected method which is not accessible. + // I tried to fix it, but get test failures. + // @Test def specs2 = projects.specs2.run() + @Test def stdLib213 = projects.stdLib213.run() @Test def ujson = projects.ujson.run() @Test def upickle = projects.upickle.run() From 1451dc50ccc7bdf52501af02052aa1059487e393 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 28 Jun 2023 15:36:01 +0200 Subject: [PATCH 106/371] Add changelog for 3.3.1-RC2 --- changelogs/3.3.1-RC2.md | 16 ++++++++++++++++ 1 file changed, 16 insertions(+) create mode 100644 changelogs/3.3.1-RC2.md diff --git a/changelogs/3.3.1-RC2.md b/changelogs/3.3.1-RC2.md new file mode 100644 index 000000000000..f21bfa074b66 --- /dev/null +++ b/changelogs/3.3.1-RC2.md @@ -0,0 +1,16 @@ +# Backported fixes + +- Dealias types in `New`` before matching quotes [#17615](https://github.com/lampepfl/dotty/pull/17615) +- Fix `accessibleType` for package object prefixes [#18057](https://github.com/lampepfl/dotty/pull/18057) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1-RC1..3.3.1-RC2` these are: + +``` + 2 Martin Odersky + 2 Paweł Marks + 1 Nicolas Stucki +``` From c9bbcb0f0297b6097eff0dd28f9d5a5cae290e8c Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 28 Jun 2023 15:52:38 +0200 Subject: [PATCH 107/371] Release 3.3.1-RC2 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 910ee7ef4f58..a2ea5ce1a596 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.3.0" - val baseVersion = "3.3.1-RC1" + val baseVersion = "3.3.1-RC2" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.3.0" + val previousDottyVersion = "3.3.1-RC1" object CompatMode { final val BinaryCompatible = 0 From aed47fd78aec49aebf7d2d97b167e58c9c165cc2 Mon Sep 17 00:00:00 2001 From: odersky Date: Wed, 5 Jul 2023 00:14:35 +0200 Subject: [PATCH 108/371] Add clause for protected visibility from package objects We usually have an access rule that the access to a protected member `foo` in class `C` must be from somewhere nested in a subclass of `C`. But that fails if the member is accessed from a package object `p.package`. In that case, the access does not need to be in the same object, it just has to be in package `p`. This clause was previously missing and is now added. 
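For illustration, a minimal sketch of what the added clause permits (hypothetical names; the shape follows the regression test tests/pos/i18124 added in this patch, where the two parts are separate files of the same package):

```scala
// definition.scala: a protected top-level def is owned by p's synthetic package object
package p:
  protected def merge(x: Int, y: Int): Int = x + y

// usage.scala: same package, different file. This code is not nested in a subclass
// of that package object, but the access is made from within package p, so it is allowed.
package p:
  def combined: Int = merge(1, 2)
```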
Why was this only recently discovered? #18057 fixed an issue where toplevel protected members were always accessible because explicit package object prefixes were added after the accessibility check was done, and would re-establish the previous members without doing an accessibility check. The fix was done by adding package objects first, then doing he rest of the checks. But that also means that protected toplevel objects now get checked as members of their synthetic package object instead of as members of their package. The change here also makes specs2 compile again. --- .../dotty/communitybuild/CommunityBuildTest.scala | 6 +----- .../dotty/tools/dotc/core/SymDenotations.scala | 11 +++++++---- tests/pos/i18124/definition.scala | 15 +++++++++++++++ tests/pos/i18124/usage.scala | 8 ++++++++ 4 files changed, 31 insertions(+), 9 deletions(-) create mode 100644 tests/pos/i18124/definition.scala create mode 100644 tests/pos/i18124/usage.scala diff --git a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala index 8837f7319117..bf6b6d431509 100644 --- a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala +++ b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala @@ -93,11 +93,7 @@ class CommunityBuildTestC: @Test def sconfig = projects.sconfig.run() @Test def shapeless = projects.shapeless.run() @Test def sourcecode = projects.sourcecode.run() - - // Disabled. Currently fails in FutureMatchers.scala. The call to - // `checkResultFailure` goes to a protected method which is not accessible. - // I tried to fix it, but get test failures. - // @Test def specs2 = projects.specs2.run() + @Test def specs2 = projects.specs2.run() @Test def stdLib213 = projects.stdLib213.run() @Test def ujson = projects.ujson.run() diff --git a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala index aa97435d64bb..988a37be4388 100644 --- a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala +++ b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala @@ -907,10 +907,13 @@ object SymDenotations { false val cls = owner.enclosingSubClass if !cls.exists then - val encl = if ctx.owner.isConstructor then ctx.owner.enclosingClass.owner.enclosingClass else ctx.owner.enclosingClass - fail(i""" - | Access to protected $this not permitted because enclosing ${encl.showLocated} - | is not a subclass of ${owner.showLocated} where target is defined""") + if pre.termSymbol.isPackageObject && accessWithin(pre.termSymbol.owner) then + true + else + val encl = if ctx.owner.isConstructor then ctx.owner.enclosingClass.owner.enclosingClass else ctx.owner.enclosingClass + fail(i""" + | Access to protected $this not permitted because enclosing ${encl.showLocated} + | is not a subclass of ${owner.showLocated} where target is defined""") else if isType || pre.derivesFrom(cls) || isConstructor || owner.is(ModuleClass) then // allow accesses to types from arbitrary subclasses fixes #4737 // don't perform this check for static members diff --git a/tests/pos/i18124/definition.scala b/tests/pos/i18124/definition.scala new file mode 100644 index 000000000000..1377c94fe7cd --- /dev/null +++ b/tests/pos/i18124/definition.scala @@ -0,0 +1,15 @@ +// definition.scala +package oolong.bson: + + trait BsonValue + protected def merge( + base: BsonValue, + patch: BsonValue, + arraySubvalues: Boolean = false + ): BsonValue = ??? 
+ + private def foo: Int = 1 + + package inner: + protected[bson] def bar = 2 + diff --git a/tests/pos/i18124/usage.scala b/tests/pos/i18124/usage.scala new file mode 100644 index 000000000000..0bc0417c01ad --- /dev/null +++ b/tests/pos/i18124/usage.scala @@ -0,0 +1,8 @@ +// usage.scala +package oolong.bson + +extension (bv: BsonValue) + def :+(other: BsonValue): BsonValue = merge(other, bv, false) + +val x = foo +val y = inner.bar From 9cae4e8af5a54916b9fa128e620bd53cf54d3e0b Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 5 Jul 2023 17:58:03 +0200 Subject: [PATCH 109/371] Add changelog for 3.3.1-RC3 --- changelogs/3.3.1-RC3.md | 15 +++++++++++++++ 1 file changed, 15 insertions(+) create mode 100644 changelogs/3.3.1-RC3.md diff --git a/changelogs/3.3.1-RC3.md b/changelogs/3.3.1-RC3.md new file mode 100644 index 000000000000..006d887c4f49 --- /dev/null +++ b/changelogs/3.3.1-RC3.md @@ -0,0 +1,15 @@ +# Backported fixes + +- Add clause for protected visibility from package objects [#18134](https://github.com/lampepfl/dotty/pull/18134) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1-RC2..3.3.1-RC3` these are: + +``` + 2 Paweł Marks + 1 Martin Odersky + +``` From 161de6e8e7b0e8f4fd59406bc9c1b9c79c6a634b Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 5 Jul 2023 17:59:29 +0200 Subject: [PATCH 110/371] Release 3.3.1-RC3 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index a2ea5ce1a596..1d4f2c7350a5 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.3.0" - val baseVersion = "3.3.1-RC2" + val baseVersion = "3.3.1-RC3" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.3.1-RC1" + val previousDottyVersion = "3.3.1-RC2" object CompatMode { final val BinaryCompatible = 0 From 011e6674f0b79a84c4b68a749c73b0c91d7c036e Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Tue, 9 May 2023 08:54:12 +0200 Subject: [PATCH 111/371] Revert "Include top-level symbols from same file in outer ambiguity error" This reverts commit 7d4e103a941a30306ddde28a11f8bc3a8841acf8. Closes #17433 --- .../src/dotty/tools/dotc/typer/Typer.scala | 19 ++++------------ tests/neg/ambiref.check | 16 -------------- tests/neg/ambiref.scala | 22 +------------------ tests/pos-special/fatal-warnings/i9260.scala | 2 +- tests/run/protectedacc.scala | 2 +- 5 files changed, 7 insertions(+), 54 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 2e7444af8e96..cb23262d1410 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -408,16 +408,11 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // Does reference `tp` refer only to inherited symbols? 
def isInherited(denot: Denotation) = def isCurrent(mbr: SingleDenotation): Boolean = - !mbr.symbol.exists || mbr.symbol.owner == ctx.owner || ctx.owner.is(Package) + !mbr.symbol.exists || mbr.symbol.owner == ctx.owner denot match case denot: SingleDenotation => !isCurrent(denot) case denot => !denot.hasAltWith(isCurrent) - /* It is an error if an identifier x is available as an inherited member in an inner scope - * and the same name x is defined in an outer scope in the same source file, unless - * the inherited member (has an overloaded alternative that) coincides with - * (an overloaded alternative of) the definition x. - */ def checkNoOuterDefs(denot: Denotation, last: Context, prevCtx: Context): Unit = def sameTermOrType(d1: SingleDenotation, d2: Denotation) = d2.containsSym(d1.symbol) || d2.hasUniqueSym && { @@ -434,15 +429,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val owner = outer.owner if (owner eq last.owner) && (outer.scope eq last.scope) then checkNoOuterDefs(denot, outer, prevCtx) - else if !owner.isRoot then - val found = - if owner.is(Package) then - owner.denot.asClass.membersNamed(name) - .filterWithPredicate(d => !d.symbol.is(Package) && d.symbol.source == denot.symbol.source) - else - val scope = if owner.isClass then owner.info.decls else outer.scope - scope.denotsNamed(name) - val competing = found.filterWithFlags(required, excluded | Synthetic) + else if !owner.is(Package) then + val scope = if owner.isClass then owner.info.decls else outer.scope + val competing = scope.denotsNamed(name).filterWithFlags(required, excluded) if competing.exists then val symsMatch = competing .filterWithPredicate(sd => sameTermOrType(sd, denot)) diff --git a/tests/neg/ambiref.check b/tests/neg/ambiref.check index 32b4078f1346..5d701b3b3b71 100644 --- a/tests/neg/ambiref.check +++ b/tests/neg/ambiref.check @@ -30,19 +30,3 @@ | and inherited subsequently in class E | | longer explanation available when compiling with `-explain` --- [E049] Reference Error: tests/neg/ambiref.scala:43:10 --------------------------------------------------------------- -43 | println(global) // error - | ^^^^^^ - | Reference to global is ambiguous. - | It is both defined in package - | and inherited subsequently in object D - | - | longer explanation available when compiling with `-explain` --- [E049] Reference Error: tests/neg/ambiref.scala:49:16 --------------------------------------------------------------- -49 | def t = new T { } // error - | ^ - | Reference to T is ambiguous. 
- | It is both defined in package p - | and inherited subsequently in class C - | - | longer explanation available when compiling with `-explain` diff --git a/tests/neg/ambiref.scala b/tests/neg/ambiref.scala index bb48997cd465..e7a5d5efbd7e 100644 --- a/tests/neg/ambiref.scala +++ b/tests/neg/ambiref.scala @@ -40,24 +40,4 @@ val global = 0 class C: val global = 1 object D extends C: - println(global) // error - -package p: - class T - trait P { trait T } - class C extends P: - def t = new T { } // error - -package scala: - trait P { trait Option[+A] } - class C extends P: - def t = new Option[String] { } // OK, competing scala.Option is not defined in the same compilation unit - -object test5: - class Mu // generates a synthetic companion object with an apply method - trait A { - val Mu = 1 - } - trait B extends A { - def t = Mu // don't warn about synthetic companion - } + println(global) // OK, since global is defined in package \ No newline at end of file diff --git a/tests/pos-special/fatal-warnings/i9260.scala b/tests/pos-special/fatal-warnings/i9260.scala index 0392c1c96fa8..df548f393eea 100644 --- a/tests/pos-special/fatal-warnings/i9260.scala +++ b/tests/pos-special/fatal-warnings/i9260.scala @@ -10,7 +10,7 @@ end AstImpl object untpd extends AstImpl[Null]: - def DefDef(ast: this.Ast): DefDef = ast match + def DefDef(ast: Ast): DefDef = ast match case ast: DefDef => ast end untpd diff --git a/tests/run/protectedacc.scala b/tests/run/protectedacc.scala index 85aa3438faa3..a08e7201fd15 100644 --- a/tests/run/protectedacc.scala +++ b/tests/run/protectedacc.scala @@ -134,7 +134,7 @@ package p { abstract class X[T] extends PolyA[T] { - trait Inner extends this.B { + trait Inner extends B { def self: T; def self2: Node; def getB: Inner; From bf10893d6bc034c5aafccecd47d38b69c0d8f274 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 11 Jul 2023 12:58:07 +0200 Subject: [PATCH 112/371] Add changelog for 3.3.1-RC4 --- changelogs/3.3.1-RC3.md | 2 +- changelogs/3.3.1-RC4.md | 15 +++++++++++++++ 2 files changed, 16 insertions(+), 1 deletion(-) create mode 100644 changelogs/3.3.1-RC4.md diff --git a/changelogs/3.3.1-RC3.md b/changelogs/3.3.1-RC3.md index 006d887c4f49..eb19f40b10dc 100644 --- a/changelogs/3.3.1-RC3.md +++ b/changelogs/3.3.1-RC3.md @@ -10,6 +10,6 @@ According to `git shortlog -sn --no-merges 3.3.1-RC2..3.3.1-RC3` these are: ``` 2 Paweł Marks - 1 Martin Odersky + 1 Nicolas Stucki ``` diff --git a/changelogs/3.3.1-RC4.md b/changelogs/3.3.1-RC4.md new file mode 100644 index 000000000000..7d95e0258fad --- /dev/null +++ b/changelogs/3.3.1-RC4.md @@ -0,0 +1,15 @@ +# Backported fixes + +- Revert "Include top-level symbols from same file in outer ambiguity error" [#17438](https://github.com/lampepfl/dotty/pull/17438) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1-RC3..3.3.1-RC4` these are: + +``` + 2 Paweł Marks + 1 Nicolas Stucki + +``` From 555df5304aa882e67e08c917ff4924ab5947d295 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 11 Jul 2023 13:00:08 +0200 Subject: [PATCH 113/371] Release 3.3.1-RC4 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 1d4f2c7350a5..a60932eb9e30 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.3.0" - val baseVersion = "3.3.1-RC3" + val baseVersion = 
"3.3.1-RC4" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.3.1-RC2" + val previousDottyVersion = "3.3.1-RC3" object CompatMode { final val BinaryCompatible = 0 From c54bf671b0293890a26a21d0b6325ad1a117615d Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Fri, 14 Jul 2023 11:42:23 +0200 Subject: [PATCH 114/371] Update link to point to correct section In the reference, in Erased Definitions, link pointed to the Inline page, even though the content is in Compile Time Operations --- docs/_docs/reference/experimental/erased-defs.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/_docs/reference/experimental/erased-defs.md b/docs/_docs/reference/experimental/erased-defs.md index 28455f26cdc0..548b9c11bc0b 100644 --- a/docs/_docs/reference/experimental/erased-defs.md +++ b/docs/_docs/reference/experimental/erased-defs.md @@ -161,8 +161,8 @@ object Machine: // State must be Off ``` -Note that in [Inline](../metaprogramming/inline.md) we discussed `erasedValue` and inline -matches. `erasedValue` is implemented with `erased`, so the state machine above +Note that in [Compile-time operations](../metaprogramming/compiletime-ops.md#erasedvalue) we discussed `erasedValue` and inline +matches. `erasedValue` is internally implemented with `erased` (and is not experimental), so the state machine above can be encoded as follows: ```scala From e1233d80c5a870ebb51c72e4b26a8fc8004b3774 Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Tue, 18 Jul 2023 14:32:36 +0200 Subject: [PATCH 115/371] Heal stage inconsistent prefixes of type projections Fixes #17293 --- compiler/src/dotty/tools/dotc/staging/HealType.scala | 2 +- tests/pos-macros/i17293.scala | 12 ++++++++++++ tests/pos-macros/i17293b.scala | 12 ++++++++++++ 3 files changed, 25 insertions(+), 1 deletion(-) create mode 100644 tests/pos-macros/i17293.scala create mode 100644 tests/pos-macros/i17293b.scala diff --git a/compiler/src/dotty/tools/dotc/staging/HealType.scala b/compiler/src/dotty/tools/dotc/staging/HealType.scala index 023271960b40..7d3ca0ad2f63 100644 --- a/compiler/src/dotty/tools/dotc/staging/HealType.scala +++ b/compiler/src/dotty/tools/dotc/staging/HealType.scala @@ -46,7 +46,7 @@ class HealType(pos: SrcPos)(using Context) extends TypeMap { case prefix: TermRef if tp.symbol.isTypeSplice => checkNotWildcardSplice(tp) if level == 0 then tp else getTagRef(prefix) - case _: NamedType | _: ThisType | NoPrefix => + case _: TermRef | _: ThisType | NoPrefix => if levelInconsistentRootOfPath(tp).exists then tryHeal(tp) else diff --git a/tests/pos-macros/i17293.scala b/tests/pos-macros/i17293.scala new file mode 100644 index 000000000000..57eba1181903 --- /dev/null +++ b/tests/pos-macros/i17293.scala @@ -0,0 +1,12 @@ +import scala.quoted.* + +trait OuterTrait { + trait X +} + +def exampleMacro[T <: OuterTrait: Type](expr: Expr[T])(using Quotes): Expr[OuterTrait#X] = { + '{ + val prefix: T = ${ expr } + new prefix.X {} + } +} diff --git a/tests/pos-macros/i17293b.scala b/tests/pos-macros/i17293b.scala new file mode 100644 index 000000000000..a8b73ba6176b --- /dev/null +++ b/tests/pos-macros/i17293b.scala @@ -0,0 +1,12 @@ +import scala.quoted.* + +trait OuterTrait { self => + trait X + + def exampleMacro[T <: self.type: Type](expr: Expr[T])(using Quotes): Expr[self.X] = { + '{ + val prefix: T = ${ 
expr } + new prefix.X {} + } + } +} \ No newline at end of file From 5f2450aa8ae3c00f6f52eb6a2dbe2427fe0ae6a8 Mon Sep 17 00:00:00 2001 From: Jan Chyb Date: Tue, 25 Jul 2023 12:56:09 +0200 Subject: [PATCH 116/371] Fix regression with Overloaded methods returning Functions Before the regression, FunctionOf unapply would not try dealiasing, meaning that an aliased function type would be handled by a general case. To fix that, instead of handling Function types separately when filtering overloaded methods in `resolveOverloaded1`, we allow to fallback to the general case if the previous one returns nothing. Along with fixing the regression, this also improves other cases, one of which was added to the test. Readd a separate FunctionOf case, but with a fallback --- .../dotty/tools/dotc/typer/Applications.scala | 52 ++++++++++--------- tests/pos/i17245.scala | 20 +++++++ 2 files changed, 48 insertions(+), 24 deletions(-) create mode 100644 tests/pos/i17245.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index fbed4b77d3fe..cb6aec26406a 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -2062,31 +2062,35 @@ trait Applications extends Compatibility { if isDetermined(alts2) then alts2 else resolveMapped(alts1, _.widen.appliedTo(targs1.tpes), pt1) - case defn.FunctionOf(args, resultType, _) => - narrowByTypes(alts, args, resultType) - case pt => - val compat = alts.filterConserve(normalizedCompatible(_, pt, keepConstraint = false)) - if (compat.isEmpty) - /* - * the case should not be moved to the enclosing match - * since SAM type must be considered only if there are no candidates - * For example, the second f should be chosen for the following code: - * def f(x: String): Unit = ??? - * def f: java.io.OutputStream = ??? - * new java.io.ObjectOutputStream(f) - */ - pt match { - case SAMType(mtp) => - narrowByTypes(alts, mtp.paramInfos, mtp.resultType) - case _ => - // pick any alternatives that are not methods since these might be convertible - // to the expected type, or be used as extension method arguments. - val convertible = alts.filterNot(alt => - normalize(alt, IgnoredProto(pt)).widenSingleton.isInstanceOf[MethodType]) - if convertible.length == 1 then convertible else compat - } - else compat + val compat0 = pt match + case defn.FunctionOf(args, resType, _) => + narrowByTypes(alts, args, resType) + case _ => + Nil + if (compat0.isEmpty) then + val compat = alts.filterConserve(normalizedCompatible(_, pt, keepConstraint = false)) + if (compat.isEmpty) + /* + * the case should not be moved to the enclosing match + * since SAM type must be considered only if there are no candidates + * For example, the second f should be chosen for the following code: + * def f(x: String): Unit = ??? + * def f: java.io.OutputStream = ??? + * new java.io.ObjectOutputStream(f) + */ + pt match { + case SAMType(mtp) => + narrowByTypes(alts, mtp.paramInfos, mtp.resultType) + case _ => + // pick any alternatives that are not methods since these might be convertible + // to the expected type, or be used as extension method arguments. 
+ val convertible = alts.filterNot(alt => + normalize(alt, IgnoredProto(pt)).widenSingleton.isInstanceOf[MethodType]) + if convertible.length == 1 then convertible else compat + } + else compat + else compat0 } /** The type of alternative `alt` after instantiating its first parameter diff --git a/tests/pos/i17245.scala b/tests/pos/i17245.scala new file mode 100644 index 000000000000..3b5b3a74108d --- /dev/null +++ b/tests/pos/i17245.scala @@ -0,0 +1,20 @@ +import scala.reflect.ClassTag + +trait MockSettings + +object Mockito { + def mock[T : ClassTag]: T = ??? + def mock[T : ClassTag](settings: MockSettings): T = ??? +} + +trait Channel +type OnChannel = Channel => Any + +@main def Test = + val case1: OnChannel = Mockito.mock[OnChannel] + val case2: OnChannel = Mockito.mock + val case3 = Mockito.mock[OnChannel] + val case4: OnChannel = Mockito.mock[OnChannel](summon[ClassTag[OnChannel]]) + + // not a regressive case, but an added improvement with the fix for the above + val case5: Channel => Any = Mockito.mock[Channel => Any] From b85cbb5a3a64cd7b21bf6b2cbd3f75c0f11db8fd Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 17 Jul 2023 18:34:16 +0200 Subject: [PATCH 117/371] Disallow taking singleton types of packages again Fixes #18109 --- compiler/src/dotty/tools/dotc/typer/Checking.scala | 13 ++++++++----- tests/neg/i18109.scala | 11 +++++++++++ 2 files changed, 19 insertions(+), 5 deletions(-) create mode 100644 tests/neg/i18109.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Checking.scala b/compiler/src/dotty/tools/dotc/typer/Checking.scala index b2ab5332c3b2..df5639b50302 100644 --- a/compiler/src/dotty/tools/dotc/typer/Checking.scala +++ b/compiler/src/dotty/tools/dotc/typer/Checking.scala @@ -748,13 +748,16 @@ object Checking { if sym.isNoValue && !ctx.isJava then report.error(JavaSymbolIsNotAValue(sym), tree.srcPos) + /** Check that `tree` refers to a value, unless `tree` is selected or applied + * (singleton types x.type don't count as selections). + */ def checkValue(tree: Tree, proto: Type)(using Context): tree.type = tree match - case tree: RefTree - if tree.name.isTermName - && !proto.isInstanceOf[SelectionProto] - && !proto.isInstanceOf[FunOrPolyProto] => - checkValue(tree) + case tree: RefTree if tree.name.isTermName => + proto match + case _: SelectionProto if proto ne SingletonTypeProto => // no value check + case _: FunOrPolyProto => // no value check + case _ => checkValue(tree) case _ => tree diff --git a/tests/neg/i18109.scala b/tests/neg/i18109.scala new file mode 100644 index 000000000000..7df13b0c36ff --- /dev/null +++ b/tests/neg/i18109.scala @@ -0,0 +1,11 @@ +package foo {} + +package bar { + object Test { + def qux[A] = 123 + def main(args: Array[String]): Unit = { + val y = qux[foo.type] // error + val x = valueOf[foo.type] // error + } + } +} \ No newline at end of file From 110c91f5831b24c6751b30ed45f76c436be9db04 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 7 Aug 2023 18:58:54 +0200 Subject: [PATCH 118/371] A slightly more conservative version of #14128 Two changes - Fix `hasUpperBound` to work correctly for higher-kinded types - A more conservative fix in `IsFullyDefinedAccumulator`. We now maintain the symmetry that - if variance < 0, we maximize - if variance > 0 (and Nothing is admissible) we minimize - only if variance = 0, we use the upper bound as a tie breaker Previously, we maximized even if variance > 0 if there was an upper but no lower bound. 
But that was asymmetric since there is no corresponding case where we minimize at variance < 0 if there is a lower but no upper bound. --- compiler/src/dotty/tools/dotc/core/Types.scala | 7 ++++++- compiler/src/dotty/tools/dotc/typer/Inferencing.scala | 6 +++++- 2 files changed, 11 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index 73a64f2c1b8f..bb4fd02816a6 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -246,6 +246,11 @@ object Types { case _ => false } + /** Is this type exactly `Any`, or a type lambda ending in `Any`? */ + def isTopOfSomeKind(using Context): Boolean = dealias match + case tp: TypeLambda => tp.resType.isTopOfSomeKind + case _ => isExactlyAny + def isBottomType(using Context): Boolean = if ctx.mode.is(Mode.SafeNulls) && !ctx.phase.erasedTypes then hasClassSymbol(defn.NothingClass) else isBottomTypeAfterErasure @@ -4813,7 +4818,7 @@ object Types { def hasLowerBound(using Context): Boolean = !currentEntry.loBound.isExactlyNothing /** For uninstantiated type variables: Is the upper bound different from Any? */ - def hasUpperBound(using Context): Boolean = !currentEntry.hiBound.finalResultType.isExactlyAny + def hasUpperBound(using Context): Boolean = !currentEntry.hiBound.isTopOfSomeKind /** Unwrap to instance (if instantiated) or origin (if not), until result * is no longer a TypeVar diff --git a/compiler/src/dotty/tools/dotc/typer/Inferencing.scala b/compiler/src/dotty/tools/dotc/typer/Inferencing.scala index 0e1c41ceef74..4d027b8750e0 100644 --- a/compiler/src/dotty/tools/dotc/typer/Inferencing.scala +++ b/compiler/src/dotty/tools/dotc/typer/Inferencing.scala @@ -187,7 +187,11 @@ object Inferencing { // else hold off instantiating unbounded unconstrained variable else if direction != 0 then instantiate(tvar, fromBelow = direction < 0) - else if variance >= 0 && (force.ifBottom == IfBottom.ok && !tvar.hasUpperBound || tvar.hasLowerBound) then + else if variance >= 0 && tvar.hasLowerBound then + instantiate(tvar, fromBelow = true) + else if (variance > 0 || variance == 0 && !tvar.hasUpperBound) + && force.ifBottom == IfBottom.ok + then // if variance == 0, prefer upper bound if one is given instantiate(tvar, fromBelow = true) else if variance >= 0 && force.ifBottom == IfBottom.fail then fail = true From 232c5f448f49406d6bf68ab4f5b230e4cf6aaf39 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 7 Aug 2023 11:05:44 +0100 Subject: [PATCH 119/371] Show Implicit Candidate & RefAndLevel --- compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala | 7 +++++++ compiler/src/dotty/tools/dotc/printing/Printer.scala | 5 ++++- compiler/src/dotty/tools/dotc/typer/Implicits.scala | 6 ++++-- 3 files changed, 15 insertions(+), 3 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala index f3540502597c..700b3fbf525f 100644 --- a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala +++ b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala @@ -640,6 +640,13 @@ class PlainPrinter(_ctx: Context) extends Printer { else if (pos.source.exists) s"${pos.source.file.name}:${pos.line + 1}" else s"(no source file, offset = ${pos.span.point})" + def toText(cand: Candidate): Text = + "Cand(" + ~ toTextRef(cand.ref) + ~ (if cand.isConversion then " conv" else "") + ~ (if cand.isExtension then " ext" else "") + ~ Str(" L" + cand.level) ~ ")" + 
def toText(result: SearchResult): Text = result match { case result: SearchSuccess => "SearchSuccess: " ~ toText(result.ref) ~ " via " ~ toText(result.tree) diff --git a/compiler/src/dotty/tools/dotc/printing/Printer.scala b/compiler/src/dotty/tools/dotc/printing/Printer.scala index 697ab063a646..ab0c867ec31f 100644 --- a/compiler/src/dotty/tools/dotc/printing/Printer.scala +++ b/compiler/src/dotty/tools/dotc/printing/Printer.scala @@ -7,7 +7,7 @@ import Texts._, ast.Trees._ import Types.{Type, SingletonType, LambdaParam}, Symbols.Symbol, Scopes.Scope, Constants.Constant, Names.Name, Denotations._, Annotations.Annotation, Contexts.Context -import typer.Implicits.SearchResult +import typer.Implicits.* import util.SourcePosition import typer.ImportInfo @@ -153,6 +153,9 @@ abstract class Printer { /** Textual representation of source position */ def toText(pos: SourcePosition): Text + /** Textual representation of implicit candidates. */ + def toText(cand: Candidate): Text + /** Textual representation of implicit search result */ def toText(result: SearchResult): Text diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index c6795ed25a0e..66f400d7eae0 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -49,17 +49,19 @@ object Implicits: } /** Both search candidates and successes are references with a specific nesting level. */ - sealed trait RefAndLevel { + sealed trait RefAndLevel extends Showable { def ref: TermRef def level: Int } /** An eligible implicit candidate, consisting of an implicit reference and a nesting level */ - case class Candidate(implicitRef: ImplicitRef, kind: Candidate.Kind, level: Int) extends RefAndLevel { + case class Candidate(implicitRef: ImplicitRef, kind: Candidate.Kind, level: Int) extends RefAndLevel with Showable { def ref: TermRef = implicitRef.underlyingRef def isExtension = (kind & Candidate.Extension) != 0 def isConversion = (kind & Candidate.Conversion) != 0 + + def toText(printer: Printer): Text = printer.toText(this) } object Candidate { type Kind = Int From 48c994c7e82c2fe4be4e7bfa294ce5afc3148270 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Mon, 24 Jul 2023 15:34:17 +0100 Subject: [PATCH 120/371] Record failures to adapt application arguments --- .../tools/dotc/core/OrderingConstraint.scala | 4 +- .../dotty/tools/dotc/typer/Applications.scala | 2 +- tests/neg-macros/i6762.scala | 2 +- tests/neg/enum-values.check | 8 ++-- tests/neg/enumsAccess.scala | 2 +- tests/neg/i6779.check | 2 +- tests/neg/recursive-lower-constraint.scala | 2 +- tests/neg/syntax-error-recovery.check | 6 --- tests/neg/syntax-error-recovery.scala | 2 +- tests/pos/i18163.orig.scala | 40 +++++++++++++++++++ tests/pos/i18163.scala | 21 ++++++++++ 11 files changed, 74 insertions(+), 17 deletions(-) create mode 100644 tests/pos/i18163.orig.scala create mode 100644 tests/pos/i18163.scala diff --git a/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala b/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala index faea30390d2b..0328cea9b3ca 100644 --- a/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala @@ -344,7 +344,8 @@ class OrderingConstraint(private val boundsMap: ParamBounds, if newSet.isEmpty then deps.remove(referenced) else deps.updated(referenced, newSet) - def traverse(t: Type) = t match + def traverse(t: Type) = try + t match case param: 
TypeParamRef => if hasBounds(param) then if variance >= 0 then coDeps = update(coDeps, param) @@ -356,6 +357,7 @@ class OrderingConstraint(private val boundsMap: ParamBounds, seen += tp traverse(tp.ref) case _ => traverseChildren(t) + catch case ex: Throwable => handleRecursive("adjust", t.show, ex) end Adjuster /** Adjust dependencies to account for the delta of previous entry `prevEntry` diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index fbed4b77d3fe..a9376444a911 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -844,7 +844,7 @@ trait Applications extends Compatibility { var typedArgs = typedArgBuf.toList def app0 = cpy.Apply(app)(normalizedFun, typedArgs) // needs to be a `def` because typedArgs can change later val app1 = - if (!success) app0.withType(UnspecifiedErrorType) + if (!success || typedArgs.exists(_.tpe.isError)) app0.withType(UnspecifiedErrorType) else { if !sameSeq(args, orderedArgs) && !isJavaAnnotConstr(methRef.symbol) diff --git a/tests/neg-macros/i6762.scala b/tests/neg-macros/i6762.scala index a8df289b26c2..054945e213d6 100644 --- a/tests/neg-macros/i6762.scala +++ b/tests/neg-macros/i6762.scala @@ -2,4 +2,4 @@ import scala.quoted.* type G[X] case class Foo[T](x: T) -def f(word: String)(using Quotes): Expr[Foo[G[String]]] = '{Foo(${Expr(word)})} // error // error +def f(word: String)(using Quotes): Expr[Foo[G[String]]] = '{Foo(${Expr(word)})} // error diff --git a/tests/neg/enum-values.check b/tests/neg/enum-values.check index 37990e8f312e..23337de1b2c4 100644 --- a/tests/neg/enum-values.check +++ b/tests/neg/enum-values.check @@ -24,8 +24,8 @@ | | failed with: | - | Found: Array[example.Tag[?]] - | Required: Array[example.ListLike[?]] + | Found: example.ListLike.type + | Required: Nothing -- [E008] Not Found Error: tests/neg/enum-values.scala:34:52 ----------------------------------------------------------- 34 | val typeCtorsK: Array[TypeCtorsK[?]] = TypeCtorsK.values // error | ^^^^^^^^^^^^^^^^^ @@ -38,8 +38,8 @@ | | failed with: | - | Found: Array[example.Tag[?]] - | Required: Array[example.TypeCtorsK[?[_$1]]] + | Found: example.TypeCtorsK.type + | Required: Nothing -- [E008] Not Found Error: tests/neg/enum-values.scala:36:6 ------------------------------------------------------------ 36 | Tag.valueOf("Int") // error | ^^^^^^^^^^^ diff --git a/tests/neg/enumsAccess.scala b/tests/neg/enumsAccess.scala index 18b91b346b6a..8a8e9af8910f 100644 --- a/tests/neg/enumsAccess.scala +++ b/tests/neg/enumsAccess.scala @@ -63,7 +63,7 @@ object test5 { enum E5[T](x: T) { case C3() extends E5[INT](defaultX)// error: illegal reference // error: illegal reference case C4 extends E5[INT](defaultX) // error: illegal reference // error: illegal reference - case C5 extends E5[E5[_]](E5.this) // error: type mismatch + case C5 extends E5[E5[_]](E5.this) // error: cannot be instantiated // error: conflicting base types // error: type mismatch } object E5 { diff --git a/tests/neg/i6779.check b/tests/neg/i6779.check index 8e05c22eb640..f1e1b9d5557b 100644 --- a/tests/neg/i6779.check +++ b/tests/neg/i6779.check @@ -11,7 +11,7 @@ | value f is not a member of T. 
| An extension method was tried, but could not be fully constructed: | - | Test.f[G[T]](x)(given_Stuff) + | Test.f[G[T]](x) | | failed with: | diff --git a/tests/neg/recursive-lower-constraint.scala b/tests/neg/recursive-lower-constraint.scala index 8009ab5fce6e..cf45d8b95171 100644 --- a/tests/neg/recursive-lower-constraint.scala +++ b/tests/neg/recursive-lower-constraint.scala @@ -3,5 +3,5 @@ class Bar extends Foo[Bar] class A { def foo[T <: Foo[T], U >: Foo[T] <: T](x: T): T = x - foo(new Bar) // error + foo(new Bar) // error // error } diff --git a/tests/neg/syntax-error-recovery.check b/tests/neg/syntax-error-recovery.check index 0bf626210fed..18d877833d79 100644 --- a/tests/neg/syntax-error-recovery.check +++ b/tests/neg/syntax-error-recovery.check @@ -94,12 +94,6 @@ | Not found: bam | | longer explanation available when compiling with `-explain` --- [E006] Not Found Error: tests/neg/syntax-error-recovery.scala:61:10 ------------------------------------------------- -61 | println(bam) // error - | ^^^ - | Not found: bam - | - | longer explanation available when compiling with `-explain` -- [E129] Potential Issue Warning: tests/neg/syntax-error-recovery.scala:7:2 ------------------------------------------- 6 | 2 7 | } diff --git a/tests/neg/syntax-error-recovery.scala b/tests/neg/syntax-error-recovery.scala index 775abeb97bdb..b6663cc9c70a 100644 --- a/tests/neg/syntax-error-recovery.scala +++ b/tests/neg/syntax-error-recovery.scala @@ -58,5 +58,5 @@ object Test2: def foo5(x: Int) = foo2(foo2(,) // error // error - println(bam) // error + println(bam) // error \ No newline at end of file diff --git a/tests/pos/i18163.orig.scala b/tests/pos/i18163.orig.scala new file mode 100644 index 000000000000..eb0627254156 --- /dev/null +++ b/tests/pos/i18163.orig.scala @@ -0,0 +1,40 @@ +import scala.language.implicitConversions + +// We do have 2 `contramap` functions, one provided via `LoggerSyntax` other via `Contravariant.Ops` +// `ContravariantMonoidal` given instances are not used, and they do not match our type. Code fails when we have at least 2 instances of them +// Removal of `import catsSyntax._` allow to compile code +// Removal of `import odinSyntax.LoggerSyntax` and remaining `catsSyntax` would fail to compile the `def fails` + +trait Foo[A] +trait Bar[A] + +trait WriterT[F[_]: Contravariant, L, V]: + def contramap[Z](fn: Z => V): WriterT[F, L, Z] = ??? +trait Logger[F[_]] +class WriterTLogger[F[_]] extends Logger[[G] =>> WriterT[F, List[String], G]] + +trait ContravariantMonoidal[F[_]] extends Invariant[F] with Contravariant[F] +trait Invariant[F[_]] +object Invariant: + given ContravariantMonoidal[Foo] = ??? + given ContravariantMonoidal[Bar] = ??? + +trait Contravariant[F[_]] extends Invariant[F] +object Contravariant: + trait Ops[F[_], A]: + def contramap[B](f: B => A): F[B] = ??? + +object catsSyntax: + implicit def toContravariantOps[F[_]: Contravariant, A](target: F[A]): Contravariant.Ops[F, A] = ??? + +object odinSyntax: + implicit class LoggerSyntax[F[_]](logger: Logger[F]): + def contramap(f: String => String): Logger[F] = ??? 
+ +import catsSyntax._ +import odinSyntax.LoggerSyntax + +class Test: + def fails = new WriterTLogger[Option].contramap(identity) + def works = LoggerSyntax(new WriterTLogger[Option]).contramap(identity) + diff --git a/tests/pos/i18163.scala b/tests/pos/i18163.scala new file mode 100644 index 000000000000..5c364a50dd57 --- /dev/null +++ b/tests/pos/i18163.scala @@ -0,0 +1,21 @@ +import scala.language.implicitConversions + +trait Foo[A] +trait Bar[B] +trait Qux[C] +class Log[K[_]] + +trait Inv[F[_]] +object Inv: + given monFoo: Inv[Foo] = ??? + given monBar: Inv[Bar] = ??? + +trait InvOps[H[_], D] { def desc(s: String): H[D] = ??? } +trait LogOps[L[_]] { def desc(s: String): Log[L] = ??? } + +class Test: + implicit def LogOps[Q[_]](l: Log[Q]): LogOps[Q] = ??? + implicit def InvOps[J[_], E](j11: J[E])(implicit z: Inv[J]): InvOps[J, E] = ??? + + def fails = new Log[Qux].desc("fails") + def works = LogOps[Qux](new Log[Qux]).desc("works") From c569a4f4c691c8eaf5536cd90e3935553932b8fd Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 1 Aug 2023 14:27:06 +0100 Subject: [PATCH 121/371] Space: Fix intersectUnrelatedAtomicTypes tracing --- compiler/src/dotty/tools/dotc/transform/patmat/Space.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala index 7238756454b3..002e3646c663 100644 --- a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala +++ b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala @@ -330,7 +330,7 @@ object SpaceEngine { * The types should be atomic (non-decomposable) and unrelated (neither * should be a subtype of the other). */ - def intersectUnrelatedAtomicTypes(tp1: Type, tp2: Type)(sp: Space)(using Context): Space = trace(i"atomic intersection: ${AndType(tp1, tp2)}", debug) { + def intersectUnrelatedAtomicTypes(tp1: Type, tp2: Type)(sp: Space)(using Context): Space = trace(i"atomic intersection: ${AndType(tp1, tp2)}", debug, show) { // Precondition: !isSubType(tp1, tp2) && !isSubType(tp2, tp1). if !ctx.mode.is(Mode.SafeNulls) && (tp1.isNullType || tp2.isNullType) then // Since projections of types don't include null, intersection with null is empty. From 518c02055f3addd2b4ea08ebaa6ac9c3ae65392e Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 1 Aug 2023 14:47:31 +0100 Subject: [PATCH 122/371] Space: Make isDecomposableToChildren ignore type constructors --- .../src/dotty/tools/dotc/transform/patmat/Space.scala | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala index 002e3646c663..467b36df3805 100644 --- a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala +++ b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala @@ -642,7 +642,7 @@ object SpaceEngine { // For instance, from i15029, `decompose((X | Y).Field[T]) = [X.Field[T], Y.Field[T]]`. parts.map(tp.derivedAppliedType(_, targs)) - case tp if tp.classSymbol.isDecomposableToChildren => + case tp if tp.isDecomposableToChildren => def getChildren(sym: Symbol): List[Symbol] = sym.children.flatMap { child => if child eq sym then List(sym) // i3145: sealed trait Baz, val x = new Baz {}, Baz.children returns Baz... 
@@ -678,8 +678,8 @@ object SpaceEngine { rec(tp, Nil) } - extension (cls: Symbol) - /** A type is decomposable to children if it's sealed, + extension (tp: Type) + /** A type is decomposable to children if it has a simple kind, it's sealed, * abstract (or a trait) - so its not a sealed concrete class that can be instantiated on its own, * has no anonymous children, which we wouldn't be able to name as counter-examples, * but does have children. @@ -688,7 +688,8 @@ object SpaceEngine { * A sealed trait with subclasses that then get removed after `refineUsingParent`, decomposes to the empty list. * So that's why we consider whether a type has children. */ def isDecomposableToChildren(using Context): Boolean = - cls.is(Sealed) && cls.isOneOf(AbstractOrTrait) && !cls.hasAnonymousChild && cls.children.nonEmpty + val cls = tp.classSymbol + tp.hasSimpleKind && cls.is(Sealed) && cls.isOneOf(AbstractOrTrait) && !cls.hasAnonymousChild && cls.children.nonEmpty val ListOfNoType = List(NoType) val ListOfTypNoType = ListOfNoType.map(Typ(_, decomposed = true)) From 86782076c45e88fc16c3abb0ca8646ce4a2dd417 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Tue, 1 Aug 2023 16:00:31 +0100 Subject: [PATCH 123/371] Space: Revert how invariant targs are erased to fix regression The motivating case (i16451) is complicated, because it involves unchecked type arguments. To fix the regression, I'm reverting the fix. --- .../tools/dotc/transform/patmat/Space.scala | 13 ++---------- .../suppressed-type-test-warnings.scala | 2 ++ .../isInstanceOf/enum-approx2.scala | 2 ++ .../neg-custom-args/isInstanceOf/i11178.scala | 1 + .../neg-custom-args/isInstanceOf/i8932.scala | 1 + tests/{ => pending}/neg/i16451.check | 0 tests/{ => pending}/neg/i16451.scala | 4 ---- tests/pos/i17230.bootstrap.scala | 16 +++++++++++++++ tests/pos/i17230.min1.scala | 15 ++++++++++++++ tests/pos/i17230.orig.scala | 20 +++++++++++++++++++ 10 files changed, 59 insertions(+), 15 deletions(-) rename tests/{ => pending}/neg/i16451.check (100%) rename tests/{ => pending}/neg/i16451.scala (93%) create mode 100644 tests/pos/i17230.bootstrap.scala create mode 100644 tests/pos/i17230.min1.scala create mode 100644 tests/pos/i17230.orig.scala diff --git a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala index 467b36df3805..eab65890c227 100644 --- a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala +++ b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala @@ -468,17 +468,8 @@ object SpaceEngine { WildcardType case tp @ AppliedType(tycon, args) => - val args2 = - if tycon.isRef(defn.ArrayClass) then - args.map(arg => erase(arg, inArray = true, isValue = false)) - else tycon.typeParams.lazyZip(args).map { (tparam, arg) => - if isValue && tparam.paramVarianceSign == 0 then - // when matching against a value, - // any type argument for an invariant type parameter will be unchecked, - // meaning it won't fail to match against anything; thus the wildcard replacement - WildcardType - else erase(arg, inArray = false, isValue = false) - } + val inArray = tycon.isRef(defn.ArrayClass) + val args2 = args.map(arg => erase(arg, inArray = inArray, isValue = false)) tp.derivedAppliedType(erase(tycon, inArray, isValue = false), args2) case tp @ OrType(tp1, tp2) => diff --git a/tests/neg-custom-args/fatal-warnings/suppressed-type-test-warnings.scala b/tests/neg-custom-args/fatal-warnings/suppressed-type-test-warnings.scala index 175096fc6b21..92d86b3307e5 100644 --- 
a/tests/neg-custom-args/fatal-warnings/suppressed-type-test-warnings.scala +++ b/tests/neg-custom-args/fatal-warnings/suppressed-type-test-warnings.scala @@ -18,10 +18,12 @@ object Test { def err2[A, B](value: Foo[A, B], a: A => Int): B = value match { case b: Bar[B] => // spurious // error b.x + case _ => ??? // avoid fatal inexhaustivity warnings suppressing the uncheckable warning } def fail[A, B](value: Foo[A, B], a: A => Int): B = value match { case b: Bar[Int] => // error b.x + case _ => ??? // avoid fatal inexhaustivity warnings suppressing the uncheckable warning } } diff --git a/tests/neg-custom-args/isInstanceOf/enum-approx2.scala b/tests/neg-custom-args/isInstanceOf/enum-approx2.scala index 5e3bdef7553d..c7c8a6c4e1fb 100644 --- a/tests/neg-custom-args/isInstanceOf/enum-approx2.scala +++ b/tests/neg-custom-args/isInstanceOf/enum-approx2.scala @@ -4,5 +4,7 @@ case class Fun[A, B](f: Exp[A => B]) extends Exp[A => B] class Test { def eval(e: Fun[Int, Int]) = e match { case Fun(x: Fun[Int, Double]) => ??? // error + case Fun(x: Exp[Int => String]) => ??? // error + case _ => } } diff --git a/tests/neg-custom-args/isInstanceOf/i11178.scala b/tests/neg-custom-args/isInstanceOf/i11178.scala index 71bc346e5743..47e8b4c3acab 100644 --- a/tests/neg-custom-args/isInstanceOf/i11178.scala +++ b/tests/neg-custom-args/isInstanceOf/i11178.scala @@ -12,6 +12,7 @@ object Test1 { def test[A](bar: Bar[A]) = bar match { case _: Bar[Boolean] => ??? // error + case _ => ??? } } diff --git a/tests/neg-custom-args/isInstanceOf/i8932.scala b/tests/neg-custom-args/isInstanceOf/i8932.scala index e070fdae518c..84d2f7d4990a 100644 --- a/tests/neg-custom-args/isInstanceOf/i8932.scala +++ b/tests/neg-custom-args/isInstanceOf/i8932.scala @@ -6,6 +6,7 @@ class Dummy extends Bar[Nothing] with Foo[String] def bugReport[A](foo: Foo[A]): Foo[A] = foo match { case bar: Bar[A] => bar // error + case dummy: Dummy => ??? } def test = bugReport(new Dummy: Foo[String]) diff --git a/tests/neg/i16451.check b/tests/pending/neg/i16451.check similarity index 100% rename from tests/neg/i16451.check rename to tests/pending/neg/i16451.check diff --git a/tests/neg/i16451.scala b/tests/pending/neg/i16451.scala similarity index 93% rename from tests/neg/i16451.scala rename to tests/pending/neg/i16451.scala index 685b79477bbe..49997d2bcf92 100644 --- a/tests/neg/i16451.scala +++ b/tests/pending/neg/i16451.scala @@ -1,10 +1,6 @@ // scalac: -Werror enum Color: case Red, Green -//sealed trait Color -//object Color: -// case object Red extends Color -// case object Green extends Color case class Wrapper[A](value: A) diff --git a/tests/pos/i17230.bootstrap.scala b/tests/pos/i17230.bootstrap.scala new file mode 100644 index 000000000000..ef2d98d8f55b --- /dev/null +++ b/tests/pos/i17230.bootstrap.scala @@ -0,0 +1,16 @@ +type Untyped = Type | Null + +class Type +abstract class SearchFailureType extends Type + +abstract class Tree[+T <: Untyped]: + def tpe: T = null.asInstanceOf[T] + +class SearchFailureIdent[+T <: Untyped] extends Tree[T] + +class Test_i17230_bootstrap: + def t1(arg: Tree[Type]) = arg match + case arg: SearchFailureIdent[?] 
=> arg.tpe match + case x: SearchFailureType => + case _ => + case _ => diff --git a/tests/pos/i17230.min1.scala b/tests/pos/i17230.min1.scala new file mode 100644 index 000000000000..e2df63e168c1 --- /dev/null +++ b/tests/pos/i17230.min1.scala @@ -0,0 +1,15 @@ +// scalac: -Werror +trait Foo: + type Bar[_] + +object Foo: + type Aux[B[_]] = Foo { type Bar[A] = B[A] } + +class Test: + def t1[B[_]](self: Option[Foo.Aux[B]]) = self match + case Some(_) => 1 + case None => 2 + + def t2[B[_]](self: Option[Foo.Aux[B]]) = self match + case Some(f) => 1 + case None => 2 diff --git a/tests/pos/i17230.orig.scala b/tests/pos/i17230.orig.scala new file mode 100644 index 000000000000..d72a0082a116 --- /dev/null +++ b/tests/pos/i17230.orig.scala @@ -0,0 +1,20 @@ +// scalac: -Werror +import scala.util.* + +trait Transaction { + type State[_] +} +object Transaction { + type of[S[_]] = Transaction { type State[A] = S[A] } +} +trait DynamicScope[State[_]] + +case class ScopeSearch[State[_]](self: Either[Transaction.of[State], DynamicScope[State]]) { + + def embedTransaction[T](f: Transaction.of[State] => T): T = + self match { + case Left(integrated) => ??? + case Right(ds) => ??? + } +} + From 5d6891fe3de921a825d53b369cc9b3e805275753 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 10 Aug 2023 09:17:12 +0200 Subject: [PATCH 124/371] Add changelog for 3.3.1-RC5 --- changelogs/3.3.1-RC5.md | 22 ++++++++++++++++++++++ 1 file changed, 22 insertions(+) create mode 100644 changelogs/3.3.1-RC5.md diff --git a/changelogs/3.3.1-RC5.md b/changelogs/3.3.1-RC5.md new file mode 100644 index 000000000000..e0bfc2a7fea8 --- /dev/null +++ b/changelogs/3.3.1-RC5.md @@ -0,0 +1,22 @@ +# Backported fixes + +- Heal stage inconsistent prefixes of type projections [#18239](https://github.com/lampepfl/dotty/pull/18239) +- Fix regression #17245: Overloaded methods with ClassTags [#18286](http://github.com/lampepfl/dotty/pull/18286) +- Disallow taking singleton types of packages again [#18232](http://github.com/lampepfl/dotty/pull/18232) +- A slightly more conservative version of #14218 [#18352](http://github.com/lampepfl/dotty/pull/18352) +- Record failures to adapt application arguments [#18269](http://github.com/lampepfl/dotty/pull/18269) +- Fix regression in exhaustivity of HK types [#18303](http://github.com/lampepfl/dotty/pull/18303) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1-RC4..3.3.1-RC5` these are: + +``` + 5 Dale Wijnand + 2 Martin Odersky + 2 Paweł Marks + 1 Jan Chyb + 1 Nicolas Stucki +``` From 059748245f9e0816a8f9d837b0b2625956853aa5 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 10 Aug 2023 09:19:25 +0200 Subject: [PATCH 125/371] Release 3.3.1-RC5 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index a60932eb9e30..b1c6e63cef9d 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.3.0" - val baseVersion = "3.3.1-RC4" + val baseVersion = "3.3.1-RC5" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. 
*/ - val previousDottyVersion = "3.3.1-RC3" + val previousDottyVersion = "3.3.1-RC4" object CompatMode { final val BinaryCompatible = 0 From 8e9b7182f41d03f412b2f2db0c8414f1411aa70c Mon Sep 17 00:00:00 2001 From: odersky Date: Thu, 13 Jul 2023 20:42:43 +0200 Subject: [PATCH 126/371] Refine infoDependsOnPrefix infoDependsOnPrefix now also considers non-final term members. Before 8d65f19 it only considered abstract types. Constructors were classified as non-final, which caused regression. We now exclude constructors specifically. Maybe we should instead classify them as effectively final. Fixes #18160 --- .../src/dotty/tools/dotc/core/Types.scala | 1 + .../tools/dotc/transform/TreeChecker.scala | 2 +- tests/pos/i18160/Test_2.scala | 11 ++++++++ tests/pos/i18160/repro_1.scala | 25 +++++++++++++++++++ 4 files changed, 38 insertions(+), 1 deletion(-) create mode 100644 tests/pos/i18160/Test_2.scala create mode 100644 tests/pos/i18160/repro_1.scala diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index bb4fd02816a6..81fc28a32fec 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -2537,6 +2537,7 @@ object Types { (symd.isAbstractType || symd.isTerm && !symd.flagsUNSAFE.isOneOf(Module | Final | Param) + && !symd.isConstructor && !symd.maybeOwner.isEffectivelyFinal) && prefix.sameThis(symd.maybeOwner.thisType) && refines(givenSelfTypeOrCompleter(prefix.cls), symd.name) diff --git a/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala b/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala index e50fb9d8b09c..34b3183a6b15 100644 --- a/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala +++ b/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala @@ -544,7 +544,7 @@ object TreeChecker { val TypeDef(_, impl @ Template(constr, _, _, _)) = cdef: @unchecked assert(cdef.symbol == cls) assert(impl.symbol.owner == cls) - assert(constr.symbol.owner == cls) + assert(constr.symbol.owner == cls, i"constr ${constr.symbol} in $cdef has wrong owner; should be $cls but is ${constr.symbol.owner}") assert(cls.primaryConstructor == constr.symbol, i"mismatch, primary constructor ${cls.primaryConstructor}, in tree = ${constr.symbol}") checkOwner(impl) checkOwner(impl.constr) diff --git a/tests/pos/i18160/Test_2.scala b/tests/pos/i18160/Test_2.scala new file mode 100644 index 000000000000..9ee40c3d37f9 --- /dev/null +++ b/tests/pos/i18160/Test_2.scala @@ -0,0 +1,11 @@ +class SynchronizedReevaluation +class SynchronizedReevaluationApi[Api <: RescalaInterface](val api: Api){ + import api._ + + def SynchronizedReevaluation[A](evt: Event[A])(implicit + turnSource: CreationTicket + ): (SynchronizedReevaluation, Event[A]) = { + val sync = new SynchronizedReevaluation + (sync, evt.map(identity)(turnSource)) + } +} diff --git a/tests/pos/i18160/repro_1.scala b/tests/pos/i18160/repro_1.scala new file mode 100644 index 000000000000..060f2d325d2d --- /dev/null +++ b/tests/pos/i18160/repro_1.scala @@ -0,0 +1,25 @@ +object core { + final class CreationTicket[State[_]] +} + +trait ReadAs[S[_], +A] { type State[V] = S[V] } + +trait EventCompatBundle { + bundle: Operators => + + trait EventCompat[+T] extends ReadAs[State, Option[T]] { + selfType: Event[T] => + final inline def map[B](inline expression: T => B)(implicit ticket: CreationTicket): Event[B] = ??? 
+ } +} + +trait EventBundle extends EventCompatBundle { self: Operators => + trait Event[+T] extends EventCompat[T]: + final override type State[V] = self.State[V] +} +trait Operators extends EventBundle { + type State[_] + type CreationTicket = core.CreationTicket[State] +} +trait RescalaInterface extends Operators + From d2a0b3cd298ec45ef34942b23829f305cfca6d96 Mon Sep 17 00:00:00 2001 From: odersky Date: Thu, 13 Jul 2023 20:53:37 +0200 Subject: [PATCH 127/371] Make constructors effectively final This is mainly a cleanup. --- compiler/src/dotty/tools/dotc/core/SymDenotations.scala | 1 + compiler/src/dotty/tools/dotc/transform/init/Util.scala | 6 ++---- compiler/src/dotty/tools/dotc/typer/RefChecks.scala | 2 +- 3 files changed, 4 insertions(+), 5 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala index 988a37be4388..b8c17ff61e9e 100644 --- a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala +++ b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala @@ -1196,6 +1196,7 @@ object SymDenotations { isOneOf(EffectivelyFinalFlags) || is(Inline, butNot = Deferred) || is(JavaDefinedVal, butNot = Method) + || isConstructor || !owner.isExtensibleClass /** A class is effectively sealed if has the `final` or `sealed` modifier, or it diff --git a/compiler/src/dotty/tools/dotc/transform/init/Util.scala b/compiler/src/dotty/tools/dotc/transform/init/Util.scala index 4e60c1325b09..ba2216504aef 100644 --- a/compiler/src/dotty/tools/dotc/transform/init/Util.scala +++ b/compiler/src/dotty/tools/dotc/transform/init/Util.scala @@ -75,11 +75,9 @@ object Util: case _ => None - def resolve(cls: ClassSymbol, sym: Symbol)(using Context): Symbol = log("resove " + cls + ", " + sym, printer, (_: Symbol).show) { - if (sym.isEffectivelyFinal || sym.isConstructor) sym + def resolve(cls: ClassSymbol, sym: Symbol)(using Context): Symbol = log("resove " + cls + ", " + sym, printer, (_: Symbol).show): + if sym.isEffectivelyFinal then sym else sym.matchingMember(cls.appliedRef) - } - extension (sym: Symbol) def hasSource(using Context): Boolean = !sym.defTree.isEmpty diff --git a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala index fe28a8b18833..025eae3606af 100644 --- a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala @@ -1689,7 +1689,7 @@ class RefChecks extends MiniPhase { thisPhase => // if (settings.warnNullaryUnit) // checkNullaryMethodReturnType(sym) // if (settings.warnInaccessible) { - // if (!sym.isConstructor && !sym.isEffectivelyFinal && !sym.isSynthetic) + // if (!sym.isEffectivelyFinal && !sym.isSynthetic) // checkAccessibilityOfReferencedTypes(tree) // } // tree match { From 0305d8878a7dbbbc0ff398cdec0fc079f067280f Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Tue, 25 Jul 2023 15:29:29 +0200 Subject: [PATCH 128/371] Do not compute `protoFormal` if `param.tpt` is empty This was accidentally moved before of the `if (!param.tpt.isEmpty)` guard in https://github.com/lampepfl/dotty/commit/0f7c3abc3706b2054c48f3b16991741edb3a4610#diff-8c9ece1772bd78160fc1c31e988664586c9df566a1d22ff99ef99dd6d5627a90R1534 Fixes #18276 --- .../src/dotty/tools/dotc/typer/Typer.scala | 49 +++++++++---------- tests/pos/i18276a.scala | 15 ++++++ tests/pos/i18276b.scala | 9 ++++ 3 files changed, 48 insertions(+), 25 deletions(-) create mode 100644 tests/pos/i18276a.scala create mode 100644 tests/pos/i18276b.scala 
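For context, the restored guard matters because the expected formal type (`protoFormal`) of a lambda parameter is only needed when that parameter omits its type ascription and the type must be inferred from the expected function type; with an explicit ascription the parameter type is already known. The following is a minimal illustrative sketch of the two shapes, with made-up names — it is not an excerpt from this patch or from the i18276 test cases added below:

  // Explicit parameter type: `param.tpt` is non-empty here, so the
  // expected function type never needs to supply the parameter type.
  val annotated: Int => Int = (x: Int) => x + 1

  // Omitted parameter type: it must be inferred from the expected
  // function type `Int => Int`, i.e. from the proto formal.
  val inferred: Int => Int = x => x + 1
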
diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index cb23262d1410..74be1dee9a9b 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -1596,32 +1596,31 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if desugared.isEmpty then val inferredParams: List[untpd.ValDef] = for ((param, i) <- params.zipWithIndex) yield - val (formalBounds, isErased) = protoFormal(i) - val param0 = - if (!param.tpt.isEmpty) param - else - val formal = formalBounds.loBound - val isBottomFromWildcard = (formalBounds ne formal) && formal.isExactlyNothing - val knownFormal = isFullyDefined(formal, ForceDegree.failBottom) - // If the expected formal is a TypeBounds wildcard argument with Nothing as lower bound, - // try to prioritize inferring from target. See issue 16405 (tests/run/16405.scala) - val paramType = - // Strip inferred erased annotation, to avoid accidentally inferring erasedness - val formal0 = if !isErased then formal.stripAnnots(_.symbol != defn.ErasedParamAnnot) else formal - if knownFormal && !isBottomFromWildcard then - formal0 - else - inferredFromTarget(param, formal, calleeType, isErased, paramIndex).orElse( - if knownFormal then formal0 - else errorType(AnonymousFunctionMissingParamType(param, tree, formal), param.srcPos) - ) - val paramTpt = untpd.TypedSplice( - (if knownFormal then InferredTypeTree() else untpd.TypeTree()) - .withType(paramType.translateFromRepeated(toArray = false)) - .withSpan(param.span.endPos) + if (!param.tpt.isEmpty) param + else + val (formalBounds, isErased) = protoFormal(i) + val formal = formalBounds.loBound + val isBottomFromWildcard = (formalBounds ne formal) && formal.isExactlyNothing + val knownFormal = isFullyDefined(formal, ForceDegree.failBottom) + // If the expected formal is a TypeBounds wildcard argument with Nothing as lower bound, + // try to prioritize inferring from target. See issue 16405 (tests/run/16405.scala) + val paramType = + // Strip inferred erased annotation, to avoid accidentally inferring erasedness + val formal0 = if !isErased then formal.stripAnnots(_.symbol != defn.ErasedParamAnnot) else formal + if knownFormal && !isBottomFromWildcard then + formal0 + else + inferredFromTarget(param, formal, calleeType, isErased, paramIndex).orElse( + if knownFormal then formal0 + else errorType(AnonymousFunctionMissingParamType(param, tree, formal), param.srcPos) ) - cpy.ValDef(param)(tpt = paramTpt) - if isErased then param0.withAddedFlags(Flags.Erased) else param0 + val paramTpt = untpd.TypedSplice( + (if knownFormal then InferredTypeTree() else untpd.TypeTree()) + .withType(paramType.translateFromRepeated(toArray = false)) + .withSpan(param.span.endPos) + ) + val param0 = cpy.ValDef(param)(tpt = paramTpt) + if isErased then param0.withAddedFlags(Flags.Erased) else param0 desugared = desugar.makeClosure(inferredParams, fnBody, resultTpt, isContextual, tree.span) typed(desugared, pt) diff --git a/tests/pos/i18276a.scala b/tests/pos/i18276a.scala new file mode 100644 index 000000000000..46c2722fd8be --- /dev/null +++ b/tests/pos/i18276a.scala @@ -0,0 +1,15 @@ +import scala.language.implicitConversions + +case class Assign(left: String, right: String) +class SyntaxAnalyser extends ParsersBase { + val x: Parser[String ~ String] = ??? + val y: Parser[Assign] = x.map(Assign.apply) +} + +class ParsersBase { + trait ~[+T, +U] + abstract class Parser[+T]: + def map[U](f: T => U): Parser[U] = ??? 
+ + given [A, B, X]: Conversion[(A, B) => X, (A ~ B) => X] = ??? +} diff --git a/tests/pos/i18276b.scala b/tests/pos/i18276b.scala new file mode 100644 index 000000000000..a4d905293472 --- /dev/null +++ b/tests/pos/i18276b.scala @@ -0,0 +1,9 @@ +import scala.language.implicitConversions + +def foo(a: Int): Int = ??? +def bar(f: () => Int): Int = ??? + +given f: Conversion[Int => Int, () => Int] = ??? + +def test1: Int = bar(foo) // implicit conversion applied to foo +def test2: Int = bar(f(foo)) From 6bf8ac95d677bda402cb5ef44f15400b82481e69 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 21 Aug 2023 17:20:00 +0200 Subject: [PATCH 129/371] Revert "Normalize match type usage during implicit lookup" This reverts commit 5bafff7cc96f1f31f6e77620ca509dfa55d816b4. --- .../dotty/tools/dotc/core/TypeComparer.scala | 2 +- .../dotty/tools/dotc/core/TypeErrors.scala | 3 --- .../dotty/tools/dotc/typer/Implicits.scala | 7 ------ tests/pos/i17395.scala | 25 ------------------- 4 files changed, 1 insertion(+), 36 deletions(-) delete mode 100644 tests/pos/i17395.scala diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index b84af998ffb6..6857e3da38ed 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -3180,7 +3180,7 @@ class TrackingTypeComparer(initctx: Context) extends TypeComparer(initctx) { tp case Nil => val casesText = MatchTypeTrace.noMatchesText(scrut, cases) - throw MatchTypeReductionError(em"Match type reduction $casesText") + throw TypeError(em"Match type reduction $casesText") inFrozenConstraint { // Empty types break the basic assumption that if a scrutinee and a diff --git a/compiler/src/dotty/tools/dotc/core/TypeErrors.scala b/compiler/src/dotty/tools/dotc/core/TypeErrors.scala index f59bd08da779..24a207da6836 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeErrors.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeErrors.scala @@ -46,9 +46,6 @@ object TypeError: def toMessage(using Context) = msg end TypeError -class MatchTypeReductionError(msg: Message)(using Context) extends TypeError: - def toMessage(using Context) = msg - class MalformedType(pre: Type, denot: Denotation, absMembers: Set[Name])(using Context) extends TypeError: def toMessage(using Context) = em"malformed type: $pre is not a legal prefix for $denot because it contains abstract type member${if (absMembers.size == 1) "" else "s"} ${absMembers.mkString(", ")}" diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 66f400d7eae0..4bbd6ee080b6 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -636,13 +636,6 @@ trait ImplicitRunInfo: case t: TypeLambda => for p <- t.paramRefs do partSeen += p traverseChildren(t) - case t: MatchType => - traverseChildren(t) - traverse(try t.normalized catch case _: MatchTypeReductionError => t) - case MatchType.InDisguise(mt) - if !t.isInstanceOf[LazyRef] // skip recursive applications (eg. 
Tuple.Map) - => - traverse(mt) case t => traverseChildren(t) diff --git a/tests/pos/i17395.scala b/tests/pos/i17395.scala deleted file mode 100644 index 87c0a45a9ff5..000000000000 --- a/tests/pos/i17395.scala +++ /dev/null @@ -1,25 +0,0 @@ -trait TC[T] - -object TC { - def optionTCForPart[T](implicit tc: TC[ExtractPart[T]]): TC[Option[ExtractPart[T]]] = new TC[Option[ExtractPart[T]]] {} -} - -type ExtractPart[T] = T match { - case PartField[t] => t -} -type PartField[T] = Any { type Part = T } - -class ValuePartHolder { - type Part = Value -} - -class Value -object Value { - implicit val tcValue: TC[Value] = new {} -} - -@main def main(): Unit = { -// import Value.tcValue // explicit import works around the issue, but shouldn't be necessary - val tc = TC.optionTCForPart[ValuePartHolder] - println(tc) -} From 340303fb983405cd6b123458f3573b3cf1505c23 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 22 Aug 2023 13:18:45 +0200 Subject: [PATCH 130/371] Add changelog for 3.3.1-RC6 --- changelogs/3.3.1-RC6.md | 17 +++++++++++++++++ 1 file changed, 17 insertions(+) create mode 100644 changelogs/3.3.1-RC6.md diff --git a/changelogs/3.3.1-RC6.md b/changelogs/3.3.1-RC6.md new file mode 100644 index 000000000000..f74ab7fe7e18 --- /dev/null +++ b/changelogs/3.3.1-RC6.md @@ -0,0 +1,17 @@ +# Backported fixes + +- Refine `infoDependsOnPrefix` [#18204](https://github.com/lampepfl/dotty/pull/18204) +- FDo not compute `protoFormal` if `param.tpt` is empty [#18288](http://github.com/lampepfl/dotty/pull/18288) +- Revert "Normalize match type usage during implicit lookup" [#18440](http://github.com/lampepfl/dotty/pull/18440) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1-RC4..3.3.1-RC5` these are: + +``` + 3 Paweł Marks + 2 Martin Odersky + 1 Nicolas Stucki +``` From 5f8485e13b66b6d64b8e52161b5de10699900543 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 22 Aug 2023 13:14:01 +0200 Subject: [PATCH 131/371] Release 3.3.1-RC6 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index b1c6e63cef9d..047310df0a6b 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.3.0" - val baseVersion = "3.3.1-RC5" + val baseVersion = "3.3.1-RC6" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.3.1-RC4" + val previousDottyVersion = "3.3.1-RC5" object CompatMode { final val BinaryCompatible = 0 From 88e6725df9fc73d9894d8ae8205a54ca8d47851d Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 28 Aug 2023 16:55:10 +0200 Subject: [PATCH 132/371] Tweak selection from self types Previously, we rejected the case where a symbol of a self type selection was private if was not of the enclosing class. But that symbol could shadow a non-private symbol in a base class, so have to treat that case as well. 
Fixes #18631 --- compiler/src/dotty/tools/dotc/core/Types.scala | 7 ++++++- tests/pos/i18361.scala | 15 +++++++++++++++ 2 files changed, 21 insertions(+), 1 deletion(-) create mode 100644 tests/pos/i18361.scala diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index 81fc28a32fec..fb66d133c0ba 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -808,9 +808,14 @@ object Types { // is made to save execution time in the common case. See i9844.scala for test cases. def qualifies(sd: SingleDenotation) = !sd.symbol.is(Private) || sd.symbol.owner == tp.cls - d match + d.match case d: SingleDenotation => if qualifies(d) then d else NoDenotation case d => d.filterWithPredicate(qualifies) + .orElse: + // Only inaccessible private symbols were found. But there could still be + // shadowed non-private symbols, so as a fallback search for those. + // Test case is i18361.scala. + findMember(name, pre, required, excluded | Private) else d else // There is a special case to handle: diff --git a/tests/pos/i18361.scala b/tests/pos/i18361.scala new file mode 100644 index 000000000000..a84d5f0a09db --- /dev/null +++ b/tests/pos/i18361.scala @@ -0,0 +1,15 @@ +package test1: + class Service(val name: String) + class CrudService(name: String) extends Service(name) + + trait Foo { self: CrudService => + val x = self.name + } + +package test2: + abstract class Service[F[_]](val name: String) + abstract class CrudService[F[_]](name: String) extends Service[F](name) + + trait Foo[F[_]] { self: CrudService[?] => + val x = self.name + } From fb6545872968e90798c5a411e69b79670df8e0fe Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 29 Aug 2023 11:34:58 +0200 Subject: [PATCH 133/371] Revert "Add reflect `defn.FunctionClass` overloads" This reverts commit 9571b42a2ff63e897cb566c09259f5f8d1e7a021. --- .../quoted/runtime/impl/QuotesImpl.scala | 4 --- library/src/scala/quoted/Quotes.scala | 19 -------------- .../stdlibExperimentalDefinitions.scala | 2 -- tests/run-macros/tasty-definitions-1.check | 26 ------------------- .../tasty-definitions-1/quoted_1.scala | 1 - 5 files changed, 52 deletions(-) diff --git a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala index e178dc81b1a3..db4e3e6c6a05 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala @@ -2811,10 +2811,6 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler if isErased then throw new Exception("Erased function classes are not supported. Use a refined `scala.runtime.ErasedFunction`") else dotc.core.Symbols.defn.FunctionSymbol(arity, isImplicit) - def FunctionClass(arity: Int): Symbol = - FunctionClass(arity, false, false) - def FunctionClass(arity: Int, isContextual: Boolean): Symbol = - FunctionClass(arity, isContextual, false) def ErasedFunctionClass = dotc.core.Symbols.defn.ErasedFunctionClass def TupleClass(arity: Int): Symbol = dotc.core.Symbols.defn.TupleType(arity).nn.classSymbol.asClass diff --git a/library/src/scala/quoted/Quotes.scala b/library/src/scala/quoted/Quotes.scala index c7d5719b0e1f..b6e5a12da2d8 100644 --- a/library/src/scala/quoted/Quotes.scala +++ b/library/src/scala/quoted/Quotes.scala @@ -4295,27 +4295,8 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => * - ... 
* - Nth element is `FunctionN` */ - // TODO: deprecate in 3.4 and stabilize FunctionClass(Int)/FunctionClass(Int,Boolean) - // @deprecated("Use overload of `FunctionClass` with 1 or 2 arguments","3.4") def FunctionClass(arity: Int, isImplicit: Boolean = false, isErased: Boolean = false): Symbol - /** Class symbol of a function class `scala.FunctionN`. - * - * @param arity the arity of the function where `0 <= arity` - * @return class symbol of `scala.FunctionN` where `N == arity` - */ - @experimental - def FunctionClass(arity: Int): Symbol - - /** Class symbol of a context function class `scala.FunctionN` or `scala.ContextFunctionN`. - * - * @param arity the arity of the function where `0 <= arity` - * @param isContextual if it is a `scala.ContextFunctionN` - * @return class symbol of `scala.FunctionN` or `scala.ContextFunctionN` where `N == arity` - */ - @experimental - def FunctionClass(arity: Int, isContextual: Boolean): Symbol - /** The `scala.runtime.ErasedFunction` built-in trait. */ @experimental def ErasedFunctionClass: Symbol diff --git a/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala b/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala index 644efb54c32e..5ccdb753e9b3 100644 --- a/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala +++ b/tests/run-custom-args/tasty-inspector/stdlibExperimentalDefinitions.scala @@ -63,8 +63,6 @@ val experimentalDefinitionInLibrary = Set( "scala.annotation.MacroAnnotation", //// New APIs: Quotes - // Should be stabilized in 3.4.0 - "scala.quoted.Quotes.reflectModule.defnModule.FunctionClass", "scala.quoted.Quotes.reflectModule.FlagsModule.AbsOverride", // Can be stabilized in 3.4.0 (unsure) or later "scala.quoted.Quotes.reflectModule.CompilationInfoModule.XmacroSettings", diff --git a/tests/run-macros/tasty-definitions-1.check b/tests/run-macros/tasty-definitions-1.check index ce7251d7d3ee..4ac0e6267028 100644 --- a/tests/run-macros/tasty-definitions-1.check +++ b/tests/run-macros/tasty-definitions-1.check @@ -57,57 +57,31 @@ Function23 Function24 Function25 ContextFunction0 -ContextFunction0 -ContextFunction1 ContextFunction1 ContextFunction2 -ContextFunction2 -ContextFunction3 ContextFunction3 ContextFunction4 -ContextFunction4 -ContextFunction5 ContextFunction5 ContextFunction6 -ContextFunction6 ContextFunction7 -ContextFunction7 -ContextFunction8 ContextFunction8 ContextFunction9 -ContextFunction9 ContextFunction10 -ContextFunction10 -ContextFunction11 ContextFunction11 ContextFunction12 -ContextFunction12 ContextFunction13 -ContextFunction13 -ContextFunction14 ContextFunction14 ContextFunction15 -ContextFunction15 -ContextFunction16 ContextFunction16 ContextFunction17 -ContextFunction17 -ContextFunction18 ContextFunction18 ContextFunction19 -ContextFunction19 ContextFunction20 -ContextFunction20 -ContextFunction21 ContextFunction21 ContextFunction22 -ContextFunction22 ContextFunction23 -ContextFunction23 -ContextFunction24 ContextFunction24 ContextFunction25 -ContextFunction25 class java.lang.Exception: Erased function classes are not supported. 
Use a refined `scala.runtime.ErasedFunction` ErasedFunction Tuple2 diff --git a/tests/run-macros/tasty-definitions-1/quoted_1.scala b/tests/run-macros/tasty-definitions-1/quoted_1.scala index bf9e28288486..ed210706f567 100644 --- a/tests/run-macros/tasty-definitions-1/quoted_1.scala +++ b/tests/run-macros/tasty-definitions-1/quoted_1.scala @@ -60,7 +60,6 @@ object Macros { printout(defn.FunctionClass(i).name) for (i <- 0 to 25) - printout(defn.FunctionClass(i, isContextual = true).name) printout(defn.FunctionClass(i, isImplicit = true).name) // should fail From 24cd50d2d66caeb0930032d3e23bb1a60fe02528 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 29 Aug 2023 17:56:55 +0200 Subject: [PATCH 134/371] Add changelog for 3.3.1-RC7 --- changelogs/3.3.1-RC7.md | 16 ++++++++++++++++ 1 file changed, 16 insertions(+) create mode 100644 changelogs/3.3.1-RC7.md diff --git a/changelogs/3.3.1-RC7.md b/changelogs/3.3.1-RC7.md new file mode 100644 index 000000000000..f8f093a18d11 --- /dev/null +++ b/changelogs/3.3.1-RC7.md @@ -0,0 +1,16 @@ +# Backported fixes + +- Tweak selection from self types [#18467](https://github.com/lampepfl/dotty/pull/18467) +- Revert "Add reflect `defn.FunctionClass` overloads" [#18473](http://github.com/lampepfl/dotty/pull/18473) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1-RC6..3.3.1-RC7` these are: + +``` + 3 Paweł Marks + 1 Martin Odersky + +``` From ca005766359c27ea06adc18409b6e53f79732bd6 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 29 Aug 2023 17:58:04 +0200 Subject: [PATCH 135/371] Release 3.3.1-RC7 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 047310df0a6b..f748ce44d4ca 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.3.0" - val baseVersion = "3.3.1-RC6" + val baseVersion = "3.3.1-RC7" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. 
*/ - val previousDottyVersion = "3.3.1-RC5" + val previousDottyVersion = "3.3.1-RC6" object CompatMode { final val BinaryCompatible = 0 From 9b4ea8eb22dd96c94eb5d5c62cd0c0d277342a53 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 5 Sep 2023 14:33:24 +0200 Subject: [PATCH 136/371] Add changelog for 3.3.1 Also fix typos in older changelogs --- changelogs/3.3.1-RC1.md | 15 ++- changelogs/3.3.1-RC6.md | 4 +- changelogs/3.3.1.md | 287 ++++++++++++++++++++++++++++++++++++++++ 3 files changed, 302 insertions(+), 4 deletions(-) create mode 100644 changelogs/3.3.1.md diff --git a/changelogs/3.3.1-RC1.md b/changelogs/3.3.1-RC1.md index 4e52eb874891..e7d9f8f87ea9 100644 --- a/changelogs/3.3.1-RC1.md +++ b/changelogs/3.3.1-RC1.md @@ -42,6 +42,7 @@ - Harden tpd.Apply/TypeApply in case of errors [#16887](https://github.com/lampepfl/dotty/pull/16887) - Try to be more subtle when inferring type parameters of class parents [#16896](https://github.com/lampepfl/dotty/pull/16896) - Include `P` in the implicit scope of `P.this.type` [#17088](https://github.com/lampepfl/dotty/pull/17088) +- Do not compute `protoFormal` if `param.tpt` is empty [#18288](http://github.com/lampepfl/dotty/pull/18288) ## Incremental Compilation @@ -71,7 +72,6 @@ ## Match Types -- Normalize match type usage during implicit lookup [#17457](https://github.com/lampepfl/dotty/pull/17457) - Fix #13757: Explicitly disallow higher-kinded scrutinees of match types. [#17322](https://github.com/lampepfl/dotty/pull/17322) - Fix match type reduction with wildcard type arguments [#17065](https://github.com/lampepfl/dotty/pull/17065) - Fix check whether classtag can be generated for match types [#16708](https://github.com/lampepfl/dotty/pull/16708) @@ -86,6 +86,7 @@ - Check outer class prefixes in type projections when pattern matching [#17136](https://github.com/lampepfl/dotty/pull/17136) - Make unchecked cases non-`@unchecked` and non-unreachable [#16958](https://github.com/lampepfl/dotty/pull/16958) - Fix #16899: Better handle X instanceOf P where X is T1 | T2 [#17382](https://github.com/lampepfl/dotty/pull/17382) +- Fix regression in exhaustivity of HK types [#18303](http://github.com/lampepfl/dotty/pull/18303) ## Pickling @@ -121,6 +122,7 @@ - Only transform the body of the quote with QuoteTransformer [#17451](https://github.com/lampepfl/dotty/pull/17451) - Place staged type captures in Quote AST [#17424](https://github.com/lampepfl/dotty/pull/17424) - Add SplicePattern AST to parse and type quote pattern splices [#17396](https://github.com/lampepfl/dotty/pull/17396) +- Dealias types in `New`` before matching quotes [#17615](https://github.com/lampepfl/dotty/pull/17615) ## Reflection @@ -129,7 +131,6 @@ - Fix reflect.LambdaType type test [#16972](https://github.com/lampepfl/dotty/pull/16972) - Improve `New`/`Select` -Ycheck message [#16746](https://github.com/lampepfl/dotty/pull/16746) - Improve error message for CyclicReference in macros [#16749](https://github.com/lampepfl/dotty/pull/16749) -- Add reflect `defn.FunctionClass` overloads [#16849](https://github.com/lampepfl/dotty/pull/16849) ## REPL @@ -222,6 +223,16 @@ - Fix #16405 ctd - wildcards prematurely resolving to Nothing [#16764](https://github.com/lampepfl/dotty/pull/16764) - Test: add regression test for #7790 [#17473](https://github.com/lampepfl/dotty/pull/17473) - Properly handle `AnyVal`s as refinement members of `Selectable`s 
[#16286](https://github.com/lampepfl/dotty/pull/16286) +- Fix `accessibleType` for package object prefixes [#18057](https://github.com/lampepfl/dotty/pull/18057) +- Add clause for protected visibility from package objects [#18134](https://github.com/lampepfl/dotty/pull/18134) +- Revert "Include top-level symbols from same file in outer ambiguity error" [#17438](https://github.com/lampepfl/dotty/pull/17438) +- Heal stage inconsistent prefixes of type projections [#18239](https://github.com/lampepfl/dotty/pull/18239) +- Fix regression #17245: Overloaded methods with ClassTags [#18286](http://github.com/lampepfl/dotty/pull/18286) +- Disallow taking singleton types of packages again [#18232](http://github.com/lampepfl/dotty/pull/18232) +- A slightly more conservative version of #14218 [#18352](http://github.com/lampepfl/dotty/pull/18352) +- Record failures to adapt application arguments [#18269](http://github.com/lampepfl/dotty/pull/18269) +- Refine `infoDependsOnPrefix` [#18204](httpsF://github.com/lampepfl/dotty/pull/18204) +- Tweak selection from self types [#18467](https://github.com/lampepfl/dotty/pull/18467) # Contributors diff --git a/changelogs/3.3.1-RC6.md b/changelogs/3.3.1-RC6.md index f74ab7fe7e18..96181855f1a0 100644 --- a/changelogs/3.3.1-RC6.md +++ b/changelogs/3.3.1-RC6.md @@ -1,14 +1,14 @@ # Backported fixes - Refine `infoDependsOnPrefix` [#18204](https://github.com/lampepfl/dotty/pull/18204) -- FDo not compute `protoFormal` if `param.tpt` is empty [#18288](http://github.com/lampepfl/dotty/pull/18288) +- Do not compute `protoFormal` if `param.tpt` is empty [#18288](http://github.com/lampepfl/dotty/pull/18288) - Revert "Normalize match type usage during implicit lookup" [#18440](http://github.com/lampepfl/dotty/pull/18440) # Contributors Thank you to all the contributors who made this release possible 🎉 -According to `git shortlog -sn --no-merges 3.3.1-RC4..3.3.1-RC5` these are: +According to `git shortlog -sn --no-merges 3.3.1-RC5..3.3.1-RC6` these are: ``` 3 Paweł Marks diff --git a/changelogs/3.3.1.md b/changelogs/3.3.1.md new file mode 100644 index 000000000000..5bbd6eb2861c --- /dev/null +++ b/changelogs/3.3.1.md @@ -0,0 +1,287 @@ +# Highlights of the release + +- Support records in JavaParsers [#16762](https://github.com/lampepfl/dotty/pull/16762) +- Port JVM backend refactor from Scala 2 [#15322](https://github.com/lampepfl/dotty/pull/15322) + +# Other changes and fixes + +## Backend + +- Disallow mixins where super calls bind to vals [#16908](https://github.com/lampepfl/dotty/pull/16908) +- Fix #15107: Avoid re-emitting a LineNumber after only LabelNodes. [#16813](https://github.com/lampepfl/dotty/pull/16813) + +## Coverage + +- Fix #17042: Preserve the shape of secondary ctors in instrumentCoverage. 
[#17111](https://github.com/lampepfl/dotty/pull/17111) + +## Default parameters + +- Dupe fix when finding default arg getters [#17058](https://github.com/lampepfl/dotty/pull/17058) + +## Documentation + +- Fix: ensure syntax blocks for ebnf are marked as such [#16837](https://github.com/lampepfl/dotty/pull/16837) + +## Erasure + +- Handle `@companionClass` and `@companionMethod` meta-annotations [#17091](https://github.com/lampepfl/dotty/pull/17091) + +## Extension Methods + +- Support extension methods imported from different objects [#17050](https://github.com/lampepfl/dotty/pull/17050) + +## GADTs + +- Fix tuple member selection so it works with GADT healing [#16766](https://github.com/lampepfl/dotty/pull/16766) +- Fix upper bound constraints, that are higher-kinded [#16744](https://github.com/lampepfl/dotty/pull/16744) +- Split out immutable GadtConstraint [#16602](https://github.com/lampepfl/dotty/pull/16602) + +## Implicits + +- Improve subtyping check for not yet eta-expanded higher kinded types [#17139](https://github.com/lampepfl/dotty/pull/17139) +- Harden tpd.Apply/TypeApply in case of errors [#16887](https://github.com/lampepfl/dotty/pull/16887) +- Try to be more subtle when inferring type parameters of class parents [#16896](https://github.com/lampepfl/dotty/pull/16896) +- Include `P` in the implicit scope of `P.this.type` [#17088](https://github.com/lampepfl/dotty/pull/17088) + +## Incremental Compilation + +- Fix under-compilation when the method type in a SAM changes [#16996](https://github.com/lampepfl/dotty/pull/16996) + +## Infrastructure + +- Set reference version to 3.3.0-RC6 [#17504](https://github.com/lampepfl/dotty/pull/17504) +- Fix #17119: Download Coursier from GitHub directly [#17141](https://github.com/lampepfl/dotty/pull/17141) + +## Inline + +- Remove NamedArg from inlined arguments [#17228](https://github.com/lampepfl/dotty/pull/17228) +- Don't generate a Select for a TermRef with NoPrefix [#16754](https://github.com/lampepfl/dotty/pull/16754) +- Prepare bodies of inline forwarders eagerly [#16757](https://github.com/lampepfl/dotty/pull/16757) +- Do not remove inline method implementations until PruneErasedDefs [#17408](https://github.com/lampepfl/dotty/pull/17408) + +## Java Interop + +- ClassfileParser: allow missing param names (for JDK 21) [#17536](https://github.com/lampepfl/dotty/pull/17536) + +## Linting + +- Improve -Wunused: locals, privates with unset vars warning #16639 [#17160](https://github.com/lampepfl/dotty/pull/17160) +- Fix wunused false positive when deriving alias type [#17157](https://github.com/lampepfl/dotty/pull/17157) +- Port `-Wnonunit-statement` setting for dotty [#16936](https://github.com/lampepfl/dotty/pull/16936) + +## Match Types + +- Normalize match type usage during implicit lookup [#17457](https://github.com/lampepfl/dotty/pull/17457) +- Fix #13757: Explicitly disallow higher-kinded scrutinees of match types. 
[#17322](https://github.com/lampepfl/dotty/pull/17322) +- Fix match type reduction with wildcard type arguments [#17065](https://github.com/lampepfl/dotty/pull/17065) +- Fix check whether classtag can be generated for match types [#16708](https://github.com/lampepfl/dotty/pull/16708) + +## Parser + +- Allow lines starting with `.` to fall outside previous indentation widths [#17056](https://github.com/lampepfl/dotty/pull/17056) + +## Pattern Matching + +- Fix #11541: Specialize ClassTag[T] in exhaustivity check [#17385](https://github.com/lampepfl/dotty/pull/17385) +- Check outer class prefixes in type projections when pattern matching [#17136](https://github.com/lampepfl/dotty/pull/17136) +- Make unchecked cases non-`@unchecked` and non-unreachable [#16958](https://github.com/lampepfl/dotty/pull/16958) +- Fix #16899: Better handle X instanceOf P where X is T1 | T2 [#17382](https://github.com/lampepfl/dotty/pull/17382) + +## Pickling + +- ClassfileParser: Avoid cycle when accessing companion in inner class lookup [#16882](https://github.com/lampepfl/dotty/pull/16882) + +## Polyfunctions + +- Fix type aliases in beta-reduction of polyfunctions [#17054](https://github.com/lampepfl/dotty/pull/17054) + +## Quotes + +- Register `paramProxy` and `thisProxy` in `Quote` type [#17541](https://github.com/lampepfl/dotty/pull/17541) +- Only check newVal/newMethod privateWithin on -Xcheck-macros [#17437](https://github.com/lampepfl/dotty/pull/17437) +- Unencode quote and splice trees [#17342](https://github.com/lampepfl/dotty/pull/17342) +- Correctly type Expr.ofTupleFromSeq for arity > 22 [#17261](https://github.com/lampepfl/dotty/pull/17261) +- Use TermRef to distinguish distinct Type[T] instances [#17205](https://github.com/lampepfl/dotty/pull/17205) +- Check level consistency of SingletonTypeTree as a type [#17209](https://github.com/lampepfl/dotty/pull/17209) +- Fix splice type variable pattern detection [#17048](https://github.com/lampepfl/dotty/pull/17048) +- Avoid creation of `@SplicedType` quote local refrences [#17051](https://github.com/lampepfl/dotty/pull/17051) +- Dealias type references when healing types in quotes [#17049](https://github.com/lampepfl/dotty/pull/17049) +- Replace quoted type variables in signature of HOAS pattern result [#16951](https://github.com/lampepfl/dotty/pull/16951) +- Beta-reduce directly applied PolymorphicFunction [#16623](https://github.com/lampepfl/dotty/pull/16623) +- Use `Object.toString` for `quoted.{Expr, Type}` [#16663](https://github.com/lampepfl/dotty/pull/16663) +- Fix Splicer.isEscapedVariable [#16838](https://github.com/lampepfl/dotty/pull/16838) +- Fix references to class members defined in quotes [#17107](https://github.com/lampepfl/dotty/pull/17107) +- Handle pickled forward references in pickled expressions [#16855](https://github.com/lampepfl/dotty/pull/16855) +- Fix #16615 - crashes of path dependent types in spliced Type.of [#16773](https://github.com/lampepfl/dotty/pull/16773) +- Disallow local term references in staged types [#16362](https://github.com/lampepfl/dotty/pull/16362) +- Refactor level checking / type healing logic [#17082](https://github.com/lampepfl/dotty/pull/17082) +- Dealias quoted types when staging [#17059](https://github.com/lampepfl/dotty/pull/17059) +- Fix quotes with references to path dependent types [#17081](https://github.com/lampepfl/dotty/pull/17081) +- Make 
arguments order in quote hole deterministic [#17405](https://github.com/lampepfl/dotty/pull/17405) +- Only transform the body of the quote with QuoteTransformer [#17451](https://github.com/lampepfl/dotty/pull/17451) +- Place staged type captures in Quote AST [#17424](https://github.com/lampepfl/dotty/pull/17424) +- Add SplicePattern AST to parse and type quote pattern splices [#17396](https://github.com/lampepfl/dotty/pull/17396) + +## Reflection + +- -Xcheck-macros: add hint when a symbol in created twice [#16733](https://github.com/lampepfl/dotty/pull/16733) +- Assert that symbols created using reflect API have correct privateWithin symbols [#17352](https://github.com/lampepfl/dotty/pull/17352) +- Fix reflect.LambdaType type test [#16972](https://github.com/lampepfl/dotty/pull/16972) +- Improve `New`/`Select` -Ycheck message [#16746](https://github.com/lampepfl/dotty/pull/16746) +- Improve error message for CyclicReference in macros [#16749](https://github.com/lampepfl/dotty/pull/16749) +- Add reflect `defn.FunctionClass` overloads [#16849](https://github.com/lampepfl/dotty/pull/16849) + +## REPL + +- Always load REPL classes in macros including the output directory [#16866](https://github.com/lampepfl/dotty/pull/16866) + +## Reporting + +- Improve missing argument list error [#17126](https://github.com/lampepfl/dotty/pull/17126) +- Improve implicit parameter error message with aliases [#17125](https://github.com/lampepfl/dotty/pull/17125) +- Improve "constructor proxy shadows outer" handling [#17154](https://github.com/lampepfl/dotty/pull/17154) +- Clarify ambiguous reference error message [#16137](https://github.com/lampepfl/dotty/pull/16137) +- Hint about forbidden combination of implicit values and conversions [#16735](https://github.com/lampepfl/dotty/pull/16735) +- Attach explanation message to diagnostic message [#16787](https://github.com/lampepfl/dotty/pull/16787) +- Propagate implicit search errors from implicit macros [#16840](https://github.com/lampepfl/dotty/pull/16840) +- Detail UnapplyInvalidReturnType error message [#17167](https://github.com/lampepfl/dotty/pull/17167) +- Add way to debug -Xcheck-macros tree checking [#16973](https://github.com/lampepfl/dotty/pull/16973) +- Enrich and finesse compiler crash reporting [#17031](https://github.com/lampepfl/dotty/pull/17031) +- Allow @implicitNotFound messages as explanations [#16893](https://github.com/lampepfl/dotty/pull/16893) +- Include top-level symbols from same file in outer ambiguity error [#17033](https://github.com/lampepfl/dotty/pull/17033) +- Do not issue deprecation warnings when declaring deprecated case classes [#17165](https://github.com/lampepfl/dotty/pull/17165) + +## Scala-JS + +- Fix #17344: Make implicit references to this above dynamic imports explicit. [#17357](https://github.com/lampepfl/dotty/pull/17357) +- Fix #12621: Better error message for JS trait ctor param. [#16811](https://github.com/lampepfl/dotty/pull/16811) +- Fix #16801: Handle Closure's of s.r.FunctionXXL. [#16809](https://github.com/lampepfl/dotty/pull/16809) +- Fix #17549: Unify how Memoize and Constructors decide what fields need storing. 
[#17560](https://github.com/lampepfl/dotty/pull/17560) + +## Scaladoc + +- Feat: Add a blog configuration with yaml [#17214](https://github.com/lampepfl/dotty/pull/17214) +- Don't render the "$" for module [#17302](https://github.com/lampepfl/dotty/pull/17302) +- Fix: Add scrollbar to the sidebar [#17203](https://github.com/lampepfl/dotty/pull/17203) +- Scaladoc: fix crash when processing extends call [#17260](https://github.com/lampepfl/dotty/pull/17260) +- Fix: Modify the CSS so that the logo of the generated documentation is adaptive [#17172](https://github.com/lampepfl/dotty/pull/17172) +- Fix: Remove the duplicate parameter when generating the scaladoc. [#17097](https://github.com/lampepfl/dotty/pull/17097) +- Fix: padding top in mobile version [#17019](https://github.com/lampepfl/dotty/pull/17019) +- Fix: tap target of the menu in Mobile version [#17018](https://github.com/lampepfl/dotty/pull/17018) +- Scaladoc: Fix expand icon not changing on anchor link [#17053](https://github.com/lampepfl/dotty/pull/17053) +- Scaladoc: fix inkuire generation for PolyTypes [#17129](https://github.com/lampepfl/dotty/pull/17129) +- Re port scroll bar [#17463](https://github.com/lampepfl/dotty/pull/17463) +- Handle empty files and truncated YAML front matter [#17527](https://github.com/lampepfl/dotty/pull/17527) + +## SemanticDB + +- Make sure symbol exists before calling owner [#16860](https://github.com/lampepfl/dotty/pull/16860) +- Support LambdaType (convert from HKTypeLambda) [#16056](https://github.com/lampepfl/dotty/pull/16056) + +## Specification + +- Apply `class-shadowing.md` to the Spec [#16839](https://github.com/lampepfl/dotty/pull/16839) +- Adding base for future Spec into the compiler repo [#16825](https://github.com/lampepfl/dotty/pull/16825) + +## Standard Library + +- Optimization: avoid NotGiven allocations [#17090](https://github.com/lampepfl/dotty/pull/17090) + +## Tooling + +- Disable `ExtractSemanticDB` phase when writing to output directory defined as JAR. 
[#16790](https://github.com/lampepfl/dotty/pull/16790) +- Print owner of bind symbol with -Yprint-debug-owners [#16854](https://github.com/lampepfl/dotty/pull/16854) +- Small fixes to allow using Metals with scaladoc with sbt [#16816](https://github.com/lampepfl/dotty/pull/16816) + +## Transform + +- Move CrossVersionChecks before FirstTransform [#17301](https://github.com/lampepfl/dotty/pull/17301) +- Fix needsOuterIfReferenced [#17159](https://github.com/lampepfl/dotty/pull/17159) +- Drop incorrect super accessor in trait subclass [#17062](https://github.com/lampepfl/dotty/pull/17062) +- Generate toString only for synthetic companions of case classes [#16890](https://github.com/lampepfl/dotty/pull/16890) +- Check trait constructor for accessibility even if not called at Typer [#17094](https://github.com/lampepfl/dotty/pull/17094) +- Fix #17435: A simpler fix [#17436](https://github.com/lampepfl/dotty/pull/17436) + +## Typer + +- Preserve type bounds for inlined definitions in posttyper [#17190](https://github.com/lampepfl/dotty/pull/17190) +- Change logic to find members of recursive types [#17386](https://github.com/lampepfl/dotty/pull/17386) +- Recognize named arguments in isFunctionWithUnknownParamType [#17161](https://github.com/lampepfl/dotty/pull/17161) +- Better comparisons for type projections [#17092](https://github.com/lampepfl/dotty/pull/17092) +- Allow selectDynamic and applyDynamic to be extension methods [#17106](https://github.com/lampepfl/dotty/pull/17106) +- Fix use of accessibleFrom when finding default arg getters [#16977](https://github.com/lampepfl/dotty/pull/16977) +- Map class literal constant types [#16988](https://github.com/lampepfl/dotty/pull/16988) +- Always use adapted type in withDenotation [#16901](https://github.com/lampepfl/dotty/pull/16901) +- Restrict captureWildcards to only be used if needed [#16799](https://github.com/lampepfl/dotty/pull/16799) +- Don't capture wildcards if in closure or by-name [#16732](https://github.com/lampepfl/dotty/pull/16732) +- Infer: Don't minimise to Nothing if there's an upper bound [#16786](https://github.com/lampepfl/dotty/pull/16786) +- Perform Matchable check only if type test is needed [#16824](https://github.com/lampepfl/dotty/pull/16824) +- Don't eta expand unary varargs methods [#16892](https://github.com/lampepfl/dotty/pull/16892) +- Fix beta-reduction with `Nothing` and `null` args [#16938](https://github.com/lampepfl/dotty/pull/16938) +- Generate kind-correct wildcards when selecting from a wildcard [#17025](https://github.com/lampepfl/dotty/pull/17025) +- Fix #16405 ctd - wildcards prematurely resolving to Nothing [#16764](https://github.com/lampepfl/dotty/pull/16764) +- Test: add regression test for #7790 [#17473](https://github.com/lampepfl/dotty/pull/17473) +- Properly handle `AnyVal`s as refinement members of `Selectable`s [#16286](https://github.com/lampepfl/dotty/pull/16286) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0..3.3.1` these are: + +``` + 152 Nicolas Stucki + 73 Martin Odersky + 54 Dale Wijnand + 51 Szymon Rodziewicz + 49 Quentin Bernet + 38 Chris Kipp + 31 Paweł Marks + 19 David Hua + 18 Lucas + 18 ysthakur + 15 Fengyun Liu + 14 Guillaume Martres + 14 Jamie Thompson + 11 Sébastien Doeraene + 9 Timothée Andres + 8 Kacper Korban + 7 Matt Bovel + 7 Som Snytt + 6 Julien 
Richard-Foy + 6 Lucas Leblanc + 5 Michał Pałka + 4 Anatolii Kmetiuk + 4 Guillaume Raffin + 4 Jan Chyb + 4 Paul Coral + 4 Wojciech Mazur + 4 Yichen Xu + 3 Decel + 2 Adrien Piquerez + 2 Arman Bilge + 2 Carl + 2 Florian3k + 2 Kenji Yoshida + 2 Michael Pilquist + 2 Natsu Kagami + 2 Seth Tisue + 2 Tomasz Godzik + 2 Vasil Vasilev + 2 Yadu Krishnan + 1 Bersier + 1 Flavio Brasil + 1 Jan-Pieter van den Heuvel + 1 Lukas Rytz + 1 Miles Yucht + 1 Mohammad Yousuf Minhaj Zia + 1 Ondra Pelech + 1 Philippus + 1 Rikito Taniguchi + 1 Simon R + 1 brandonspark + 1 github-actions[bot] + 1 liang3zy22 + 1 s.bazarsadaev + 1 Łukasz Wroński +``` From 721e7c87ee95b811984b7b992728729d7094c4c4 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 5 Sep 2023 14:35:21 +0200 Subject: [PATCH 137/371] Release 3.3.1 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index f748ce44d4ca..f3ec6bb54548 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -82,7 +82,7 @@ object Build { val referenceVersion = "3.3.0" - val baseVersion = "3.3.1-RC7" + val baseVersion = "3.3.1" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.3.1-RC6" + val previousDottyVersion = "3.3.0" object CompatMode { final val BinaryCompatible = 0 From 3eb354bc1e9a47c2e92ba709ef83cb898d60fe83 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Adam=20D=C4=85browski?= Date: Sun, 1 Oct 2023 21:56:51 +0200 Subject: [PATCH 138/371] Fix open-classes.md --- docs/_docs/reference/other-new-features/open-classes.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/_docs/reference/other-new-features/open-classes.md b/docs/_docs/reference/other-new-features/open-classes.md index 764c234df599..10af6ead669e 100644 --- a/docs/_docs/reference/other-new-features/open-classes.md +++ b/docs/_docs/reference/other-new-features/open-classes.md @@ -77,4 +77,4 @@ A class that is neither `abstract` nor `open` is similar to a `sealed` class: it ## Migration -`open` is a new modifier in Scala 3. To allow cross compilation between Scala 2.13 and Scala 3.0 without warnings, the feature warning for ad-hoc extensions is produced only under `-source future`. It will be produced by default from Scala 3.1 on. +`open` is a new modifier in Scala 3. To allow cross compilation between Scala 2.13 and Scala 3.0 without warnings, the feature warning for ad-hoc extensions is produced only under `-source future`. It will be produced by default [from Scala 3.4 on](https://github.com/lampepfl/dotty/issues/16334). From bb5be18e11b412bc7fe39d76f0b4785297f49b86 Mon Sep 17 00:00:00 2001 From: Bersier Date: Wed, 6 Dec 2023 08:38:22 -0500 Subject: [PATCH 139/371] Update package-objects.md Package objects are not deprecated in Scala 3.3.1 yet. --- docs/_docs/reference/dropped-features/package-objects.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/_docs/reference/dropped-features/package-objects.md b/docs/_docs/reference/dropped-features/package-objects.md index d8149e460bf5..9fe5bbd2de41 100644 --- a/docs/_docs/reference/dropped-features/package-objects.md +++ b/docs/_docs/reference/dropped-features/package-objects.md @@ -11,7 +11,7 @@ package object p { def b = ... } ``` -will be dropped. 
They are still available in Scala 3.0 and 3.1, but will be deprecated and removed afterwards. +will be dropped. They are still available, but will be deprecated and removed at some point in the future. Package objects are no longer needed since all kinds of definitions can now be written at the top-level. Example: ```scala From 41f5990b4b1857d5a10fbf8d5409a77015eeacc1 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 20 Dec 2023 17:29:56 +0100 Subject: [PATCH 140/371] Add changelog for 3.4.0-RC1 --- changelogs/3.4.0-RC1.md | 466 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 466 insertions(+) create mode 100644 changelogs/3.4.0-RC1.md diff --git a/changelogs/3.4.0-RC1.md b/changelogs/3.4.0-RC1.md new file mode 100644 index 000000000000..79695cad83f9 --- /dev/null +++ b/changelogs/3.4.0-RC1.md @@ -0,0 +1,466 @@ +# Highlights of the release + +- Make polymorphic functions more efficient and expressive [#17548](https://github.com/lampepfl/dotty/pull/17548) +- SIP-56: Better foundations for match types [#18262](https://github.com/lampepfl/dotty/pull/18262) +- Make SIP 54 (Multi-Source Extension Overloads) a standard feature [#17441](https://github.com/lampepfl/dotty/pull/17441) +- Value parameter inference for polymorphic lambdas [#18041](https://github.com/lampepfl/dotty/pull/18041) +- Add `@publicInBinary` annotation and `-WunstableInlineAccessors` linting flag [#18402](https://github.com/lampepfl/dotty/pull/18402) +- Stabilize Quotes `defn.PolyFunction` [#18480](https://github.com/lampepfl/dotty/pull/18480) +- Stabilize Quotes `Flags.AbsOverride` [#18482](https://github.com/lampepfl/dotty/pull/18482) +- Add `-experimental` compiler flags [#18571](https://github.com/lampepfl/dotty/pull/18571) +- Stabilize SIP-53 (quote pattern explicit type variable syntax) [#18574](https://github.com/lampepfl/dotty/pull/18574) +- Add reflect TypeRepr.dealiasKeepOpaques [#18583](https://github.com/lampepfl/dotty/pull/18583) +- Add attributes section to TASTy and use it for Stdlib TASTy [#18599](https://github.com/lampepfl/dotty/pull/18599) +- Error when reading class file with unknown newer jdk version [#18618](https://github.com/lampepfl/dotty/pull/18618) +- Add support for xsbti.compile.CompileProgress [#18739](https://github.com/lampepfl/dotty/pull/18739) +- Improve type inference for functions like fold [#18780](https://github.com/lampepfl/dotty/pull/18780) +- Improve error message for mismatched tasty versions, allow configuration of header unpickler [#18828](https://github.com/lampepfl/dotty/pull/18828) +- In 3.4 make refutable patterns in a for comprehension an error [#18842](https://github.com/lampepfl/dotty/pull/18842) +- Disallow use of PolyFunction in user code [#18920](https://github.com/lampepfl/dotty/pull/18920) +- Store source file in TASTY attributes [#18948](https://github.com/lampepfl/dotty/pull/18948) +- First step to pipelining support - enable reading Java symbols from TASTy [#19074](https://github.com/lampepfl/dotty/pull/19074) +- Activate constrainResult fix in 3.4 [#19253](https://github.com/lampepfl/dotty/pull/19253) +- Parallelise JVM backend - Scala 2 port [#15392](https://github.com/lampepfl/dotty/pull/15392) + +## Deprecation warnings for old syntax + +- `_` type wildcards [#18813](https://github.com/lampepfl/dotty/pull/18813) +- `private[this]` [#18819](https://github.com/lampepfl/dotty/pull/18819) +- `var x = _` 
[#18821](https://github.com/lampepfl/dotty/pull/18821) +- `with` as a type operator [#18837](https://github.com/lampepfl/dotty/pull/18837) +- `xs: _*` varargs [#18872](https://github.com/lampepfl/dotty/pull/18872) +- trailing `_` to force eta expansion [#18926](https://github.com/lampepfl/dotty/pull/18926) + +# Other changes and fixes + +## Backend + +- Count size of parameters for platform limit check [#18464](https://github.com/lampepfl/dotty/pull/18464) +- Don't emit line number for synthetic unit value [#18717](https://github.com/lampepfl/dotty/pull/18717) +- Avoid too eager transform of $outer for lhs & accessor rhs [#18949](https://github.com/lampepfl/dotty/pull/18949) +- Make more anonymous functions static [#19251](https://github.com/lampepfl/dotty/pull/19251) +- Fix deadlock in initialization of CoreBTypes using Lazy container [#19298](https://github.com/lampepfl/dotty/pull/19298) +- Fix #18769: Allow HK type args in Java signatures. [#18883](https://github.com/lampepfl/dotty/pull/18883) +- Loading symbols from TASTy files directly [#17594](https://github.com/lampepfl/dotty/pull/17594) +- Use dedicated equals method for univerval equality of chars [#18770](https://github.com/lampepfl/dotty/pull/18770) + +## Erasure + +- Get generic signature of fields entered after erasure from their accessor [#19207](https://github.com/lampepfl/dotty/pull/19207) +- Detect case where two alternatives are the same after widening ExprTypes [#18787](https://github.com/lampepfl/dotty/pull/18787) +- Improve erased params logic [#18433](https://github.com/lampepfl/dotty/pull/18433) + +## Experimental: Capture Checking + +- Fix capture set variable installation in Setup [#18885](https://github.com/lampepfl/dotty/pull/18885) +- Don't follow opaque aliases when transforming sym info for cc [#18929](https://github.com/lampepfl/dotty/pull/18929) +- Reset `comparersInUse` to zero in `ContextState.reset` [#18915](https://github.com/lampepfl/dotty/pull/18915) +- Special handling of experimental.captureChecking import [#17427](https://github.com/lampepfl/dotty/pull/17427) +- Change handling of curried function types in capture checking [#18131](https://github.com/lampepfl/dotty/pull/18131) +- Fix #18246: correctly compute capture sets in `TypeComparer.glb` [#18254](https://github.com/lampepfl/dotty/pull/18254) +- New capture escape checking based on levels [#18463](https://github.com/lampepfl/dotty/pull/18463) +- A more robust scheme for resetting denotations after Recheck [#18534](https://github.com/lampepfl/dotty/pull/18534) +- A more flexible scheme for handling the universal capability [#18699](https://github.com/lampepfl/dotty/pull/18699) +- Fix potential soundness hole when adding references to a mapped capture set [#18758](https://github.com/lampepfl/dotty/pull/18758) +- Alternative scheme for cc encapsulation [#18899](https://github.com/lampepfl/dotty/pull/18899) +- Make reach refinement shallow [#19171](https://github.com/lampepfl/dotty/pull/19171) + +## F-bounds + +- Don't check bounds of Java applications in Java units [#18054](https://github.com/lampepfl/dotty/pull/18054) + +## GADTs + +- Avoid embedding SelectionProtos in Conversions [#17755](https://github.com/lampepfl/dotty/pull/17755) +- Freeze constraints while calculating GADT full bounds [#18222](https://github.com/lampepfl/dotty/pull/18222) + +## Implicits + +- Followup fix to transparent inline 
conversion [#18130](https://github.com/lampepfl/dotty/pull/18130) +- Select local implicits over name-imported over wildcard imported [#18203](https://github.com/lampepfl/dotty/pull/18203) +- Fix how implicit candidates are combined [#18321](https://github.com/lampepfl/dotty/pull/18321) +- Improve error message about missing type of context function parameter [#18788](https://github.com/lampepfl/dotty/pull/18788) +- Support implicit arguments before extractor method [#18671](https://github.com/lampepfl/dotty/pull/18671) +- Tweak convertible implicits fix [#18727](https://github.com/lampepfl/dotty/pull/18727) + +## Incremental Compilation + +- Make incremental compilation aware of synthesized mirrors [#18310](https://github.com/lampepfl/dotty/pull/18310) + +## Inference + +- Honour hard unions in lubbing and param replacing [#18680](https://github.com/lampepfl/dotty/pull/18680) + +## Infrastructure + +- Use -Yscala2-library-tasty to add Scala 2 lib TASTY to scalac (internal only) [#18613](https://github.com/lampepfl/dotty/pull/18613) +- Rename `stdlib-bootstrapped-tasty` to `scala2-library-tasty` [#18615](https://github.com/lampepfl/dotty/pull/18615) +- Fix #19286: Freeze rubygems-update at < 3.5.0. [#19288](https://github.com/lampepfl/dotty/pull/19288) + +## Initialization + +- Fix #17997: Handle intersection type as this type of super type [#18069](https://github.com/lampepfl/dotty/pull/18069) +- Add test for issue #17997 affecting the global object initialization checker [#18141](https://github.com/lampepfl/dotty/pull/18141) +- Fix i18624 and add test case for it [#18859](https://github.com/lampepfl/dotty/pull/18859) +- Treat new Array(0) as immutable [#19192](https://github.com/lampepfl/dotty/pull/19192) +- Fix #18407: Ignore Quote/Slice in init checker [#18848](https://github.com/lampepfl/dotty/pull/18848) +- Check safe initialization of static objects [#16970](https://github.com/lampepfl/dotty/pull/16970) +- Pattern match support in checking global objects [#18127](https://github.com/lampepfl/dotty/pull/18127) +- Fix crash in global object initialization checker when select target has no source [#18627](https://github.com/lampepfl/dotty/pull/18627) +- Fix warning underlining in global init checker [#18668](https://github.com/lampepfl/dotty/pull/18668) +- Fix i18629 [#18839](https://github.com/lampepfl/dotty/pull/18839) +- I18628 [#18841](https://github.com/lampepfl/dotty/pull/18841) +- Make safe init checker skip global objects [#18906](https://github.com/lampepfl/dotty/pull/18906) +- Handle local lazy vals properly [#18998](https://github.com/lampepfl/dotty/pull/18998) + +## Inline + +- Fix regression: inline match crash when rhs uses private inlined methods [#18595](https://github.com/lampepfl/dotty/pull/18595) +- Add structural classes of dynamicApply before inlining [#18766](https://github.com/lampepfl/dotty/pull/18766) +- Set missing expansion span for copied inlined node [#18229](https://github.com/lampepfl/dotty/pull/18229) +- Fix `callTrace` of inlined methods [#18738](https://github.com/lampepfl/dotty/pull/18738) + +## Linting + +- Keep tree of type ascriptions of quote pattern splices [#18412](https://github.com/lampepfl/dotty/pull/18412) +- Fix false positive in WUnused for renamed path-dependent imports [#18468](https://github.com/lampepfl/dotty/pull/18468) +- Fix false positive in WUnused for renamed path-dependent imports 
(2) [#18617](https://github.com/lampepfl/dotty/pull/18617) +- Fix wunused false positive on CanEqual [#18641](https://github.com/lampepfl/dotty/pull/18641) +- Implement -Xlint:private-shadow, type-parameter-shadow [#17622](https://github.com/lampepfl/dotty/pull/17622) +- Fix: reversed wconf parsing order to mirror scala 2 [#18503](https://github.com/lampepfl/dotty/pull/18503) +- Revert Fix false positive in WUnused for renamed path-dependent imports [#18514](https://github.com/lampepfl/dotty/pull/18514) + +## Macro Annotations + +- Enter missing symbols generated by the MacroAnnotation expansion [#18826](https://github.com/lampepfl/dotty/pull/18826) + +## Match Types + +- Allow Tuple.Head and Tuple.Tail to work with EmptyTuple [#17189](https://github.com/lampepfl/dotty/pull/17189) +- Fix match type reduction with avoided types [#18043](https://github.com/lampepfl/dotty/pull/18043) +- Strip LazyRef before calling simplified, in MT reduction [#18218](https://github.com/lampepfl/dotty/pull/18218) +- Fix MT separate compilation bug [#18398](https://github.com/lampepfl/dotty/pull/18398) +- Do not show deprecation warning for `_` in type match case [#18887](https://github.com/lampepfl/dotty/pull/18887) + +## Nullability + +- Improve logic when to emit pattern type error [#18093](https://github.com/lampepfl/dotty/pull/18093) +- Allow nullability flow typing even in presence of pattern match [#18206](https://github.com/lampepfl/dotty/pull/18206) +- Fix #11967: flow typing nullability in pattern matches [#18212](https://github.com/lampepfl/dotty/pull/18212) +- Fix #18282: consider Predef.eq/ne in nullability flow typing [#18299](https://github.com/lampepfl/dotty/pull/18299) +- Make `this.type` nullable again (unless under -Yexplicit-nulls). 
[#18399](https://github.com/lampepfl/dotty/pull/18399) + +## Opaque Types + +- Type ascribe trees that require opaque type usage [#18101](https://github.com/lampepfl/dotty/pull/18101) + +## Parser + +- Fix selecting terms using _root_ [#18335](https://github.com/lampepfl/dotty/pull/18335) +- Tweak java getlitch not to skip zero [#18491](https://github.com/lampepfl/dotty/pull/18491) +- Fix i18518 [#18520](https://github.com/lampepfl/dotty/pull/18520) +- Only apply `future` patches on `future-migration` [#18820](https://github.com/lampepfl/dotty/pull/18820) +- Parser simple expression error recovery change from `null` to `???` [#19103](https://github.com/lampepfl/dotty/pull/19103) + +## Pattern Matching + +- Fix syntax and parsing of vararg patterns [#18055](https://github.com/lampepfl/dotty/pull/18055) +- Avoid over widening in SpaceEngine [#18252](https://github.com/lampepfl/dotty/pull/18252) +- Fix regression in exhaustivity of HK types [#18303](https://github.com/lampepfl/dotty/pull/18303) +- Fix missing case in isSubspace, which broke reachablility [#18326](https://github.com/lampepfl/dotty/pull/18326) +- Unsuppress unchecked warnings [#18377](https://github.com/lampepfl/dotty/pull/18377) +- Consider extension methods in Space isSameUnapply [#18642](https://github.com/lampepfl/dotty/pull/18642) +- Fix unreachable warning in deeply nested sealed hierarchy [#18706](https://github.com/lampepfl/dotty/pull/18706) +- Remove unnecessary and recursive Space decomposition [#19216](https://github.com/lampepfl/dotty/pull/19216) +- Prioritise sequence-matches over product-sequence-matches [#19260](https://github.com/lampepfl/dotty/pull/19260) +- Propagate constant in result of inline match [#18455](https://github.com/lampepfl/dotty/pull/18455) +- Disable match anaylsis in inlined trees [#19190](https://github.com/lampepfl/dotty/pull/19190) +- Teach provablyDisjoint about AnyKind [#18510](https://github.com/lampepfl/dotty/pull/18510) +- Warn about unchecked type tests in primitive catch cases [#19206](https://github.com/lampepfl/dotty/pull/19206) +- Reprioritise seq-match over product-seq-match [#19277](https://github.com/lampepfl/dotty/pull/19277) +- Fix exhaustivity due to separate TypeVar lambdas [#18616](https://github.com/lampepfl/dotty/pull/18616) + +## Presentation Compiler + +- Support completions for extension definition parameter [#18331](https://github.com/lampepfl/dotty/pull/18331) +- Fix: Don't collect map, flatMap, withFilter in for-comprehension [#18430](https://github.com/lampepfl/dotty/pull/18430) +- Bugfix: Catch exception from the compiler for broken shadowed pickles [#18502](https://github.com/lampepfl/dotty/pull/18502) +- Bugfix: highlight for enum type params [#18528](https://github.com/lampepfl/dotty/pull/18528) +- Bugfix: No signature help for local methods [#18594](https://github.com/lampepfl/dotty/pull/18594) +- Bugfix: add `moduleClass` imported symbols in `IndexedContext` [#18620](https://github.com/lampepfl/dotty/pull/18620) +- Bugfix: Named args completions with default values [#18633](https://github.com/lampepfl/dotty/pull/18633) +- Fix: match completions for type aliases [#18667](https://github.com/lampepfl/dotty/pull/18667) +- Bugfix: add multiline comment completion [#18703](https://github.com/lampepfl/dotty/pull/18703) +- Bugfix: Backticked named arguments 
[#18704](https://github.com/lampepfl/dotty/pull/18704) +- Bugfix: [metals] Case completions for tuple type [#18751](https://github.com/lampepfl/dotty/pull/18751) +- Completions should prepend, not replace as it is for Scala 2 [#18803](https://github.com/lampepfl/dotty/pull/18803) +- Bugfix: rename end marker [#18838](https://github.com/lampepfl/dotty/pull/18838) +- Presentation compiler: Bugfix for semantic tokens and synthetic decorations [#18955](https://github.com/lampepfl/dotty/pull/18955) +- Show documentation for value forwarders in completions [#19200](https://github.com/lampepfl/dotty/pull/19200) +- Bugfix: Document highlight on class constructors [#19209](https://github.com/lampepfl/dotty/pull/19209) +- Bugfix: Completions for extension methods with name conflict [#19225](https://github.com/lampepfl/dotty/pull/19225) + +## Polyfunctions + +- Check user defined PolyFunction refinements [#18457](https://github.com/lampepfl/dotty/pull/18457) +- Support polymorphic functions with erased parameters [#18293](https://github.com/lampepfl/dotty/pull/18293) +- Use `PolyFunction` instead of `ErasedFunction` [#18295](https://github.com/lampepfl/dotty/pull/18295) + +## Quotes + +- Support type variable with bounds in quoted pattern [#16910](https://github.com/lampepfl/dotty/pull/16910) +- Add new EXPLICITtpt to TASTy format [#17298](https://github.com/lampepfl/dotty/pull/17298) +- Inhibit typer to insert contextual arguments when it is inside arguments of HOAS patterns [#18040](https://github.com/lampepfl/dotty/pull/18040) +- Compile quote patterns directly into QuotePattern AST [#18133](https://github.com/lampepfl/dotty/pull/18133) +- Add missing span to synthesized product mirror [#18354](https://github.com/lampepfl/dotty/pull/18354) +- Improve non-static macro implementation error message [#18405](https://github.com/lampepfl/dotty/pull/18405) +- Fix scala 2 macros in traits with type parameters [#18663](https://github.com/lampepfl/dotty/pull/18663) +- Patch `underlyingArgument` to avoid mapping into modules [#18923](https://github.com/lampepfl/dotty/pull/18923) +- Fallback erasing term references [#18731](https://github.com/lampepfl/dotty/pull/18731) +- Fix ignored type variable bound warning in type quote pattern [#18199](https://github.com/lampepfl/dotty/pull/18199) +- Splice hole with singleton captures [#18357](https://github.com/lampepfl/dotty/pull/18357) +- Fix macros with erased arguments [#18431](https://github.com/lampepfl/dotty/pull/18431) +- Deprecate 3-arg `FunctionClass` constructor [#18472](https://github.com/lampepfl/dotty/pull/18472) +- Deprecate `Quotes` `{MethodType,TermParamClause}.isErased` [#18479](https://github.com/lampepfl/dotty/pull/18479) +- Avoid crashes on missing positions [#19250](https://github.com/lampepfl/dotty/pull/19250) + +## Reflection + +- Add reflect.ValOrDefDef [#16974](https://github.com/lampepfl/dotty/pull/16974) +- Check New tree for ill-formed module instantiations [#17553](https://github.com/lampepfl/dotty/pull/17553) +- Add reflect `TypeLambda.paramVariances` [#17568](https://github.com/lampepfl/dotty/pull/17568) +- Make check flags for `newMethod`, `newVal` and `newBind` in Quotes API less restrictive [#18217](https://github.com/lampepfl/dotty/pull/18217) +- Normalise mirrorType for mirror Synthesis [#19199](https://github.com/lampepfl/dotty/pull/19199) +- Add reflect `defn.FunctionClass` 
overloads [#16849](https://github.com/lampepfl/dotty/pull/16849) +- Stabilize reflect flag `JavaAnnotation` [#19267](https://github.com/lampepfl/dotty/pull/19267) +- Stabilize reflect `paramVariance` [#19268](https://github.com/lampepfl/dotty/pull/19268) + +## Reporting + +- Take into account the result type of inline implicit conversions unless they are transparent [#17924](https://github.com/lampepfl/dotty/pull/17924) +- Check if a fatal warning issued in typer is silenced, before converting it into an error [#18089](https://github.com/lampepfl/dotty/pull/18089) +- Elide companion defs to a `object` extending `AnyVal` [#18451](https://github.com/lampepfl/dotty/pull/18451) +- Add regression test for issue i18493 [#18497](https://github.com/lampepfl/dotty/pull/18497) +- Add better explanation to error message [#18665](https://github.com/lampepfl/dotty/pull/18665) +- Better error message when accessing private members [#18690](https://github.com/lampepfl/dotty/pull/18690) +- Improve message for discarded pure non-Unit values [#18723](https://github.com/lampepfl/dotty/pull/18723) +- Better error message when a pattern match extractor is not found. [#18725](https://github.com/lampepfl/dotty/pull/18725) +- Give "did you mean ...?" hints also for simple identifiers [#18747](https://github.com/lampepfl/dotty/pull/18747) +- Better error for definition followed by keyword [#18752](https://github.com/lampepfl/dotty/pull/18752) +- Better explain message for 'pattern expected' [#18753](https://github.com/lampepfl/dotty/pull/18753) +- Improve failure message of enum `fromOrdinal`/`valueOf` [#19182](https://github.com/lampepfl/dotty/pull/19182) +- Fix type mismatch error confusion between types with same simple name [#19204](https://github.com/lampepfl/dotty/pull/19204) +- Add hint for nested quotes missing staged `Quotes` [#18755](https://github.com/lampepfl/dotty/pull/18755) +- Better error messages for missing commas and more [#18785](https://github.com/lampepfl/dotty/pull/18785) +- Fix imported twice error messages [#18102](https://github.com/lampepfl/dotty/pull/18102) +- Improve error message for inaccessible types [#18406](https://github.com/lampepfl/dotty/pull/18406) +- Future migration warning for `with` type operator [#18818](https://github.com/lampepfl/dotty/pull/18818) +- Improve assertion error message for `Apply` and `TypeApply` [#18700](https://github.com/lampepfl/dotty/pull/18700) +- Shorten traces for TypeMismatch errors under -explain [#18742](https://github.com/lampepfl/dotty/pull/18742) +- Improve `with` in type migration warning [#18852](https://github.com/lampepfl/dotty/pull/18852) +- Future migration warning for alphanumeric infix operator [#18908](https://github.com/lampepfl/dotty/pull/18908) +- Make sure that trace is shown correctly in the presence of invalid line numbers [#18930](https://github.com/lampepfl/dotty/pull/18930) +- Add migration warning for XML literals in language future [#19101](https://github.com/lampepfl/dotty/pull/19101) +- Avoid diagnostic message forcing crashing the compiler [#19113](https://github.com/lampepfl/dotty/pull/19113) +- Make sure that the stacktrace is shown with `-Ydebug-unpickling` [#19115](https://github.com/lampepfl/dotty/pull/19115) +- Improve `asExprOf` cast error formatting [#19195](https://github.com/lampepfl/dotty/pull/19195) +- Do not warn on underscore wildcard type in pattern 
[#19249](https://github.com/lampepfl/dotty/pull/19249) + +## Scala-JS + +- Fix #18658: Handle varargs of generic types in `JSExportsGen`. [#18659](https://github.com/lampepfl/dotty/pull/18659) + +## Scaladoc + +- Fix incorrect comment parser used in nightly scaladoc [#18523](https://github.com/lampepfl/dotty/pull/18523) + +## SemanticDB + +- Export diagnostics (including unused warnings) to SemanticDB [#17835](https://github.com/lampepfl/dotty/pull/17835) +- Bugfix: Incorrect semanticdb span on Selectable [#18576](https://github.com/lampepfl/dotty/pull/18576) +- Bugfix: in semanticdb make synthetic apply disambiguator consistent w/ Scala 2 implicit [#17341](https://github.com/lampepfl/dotty/pull/17341) + +## Standard Library + +- Intrinsify `constValueTuple` and `summonAll` [#18013](https://github.com/lampepfl/dotty/pull/18013) +- Fix #18609: Add language.`3.4` and language.`3.4-migration`. [#18610](https://github.com/lampepfl/dotty/pull/18610) + +## TASTy format + +- Eliminate FromJavaObject from TASTy of Java sources [#19259](https://github.com/lampepfl/dotty/pull/19259) +- Add new HOLETYPES to TASTy format [#17225](https://github.com/lampepfl/dotty/pull/17225) +- Add capture checking attributes to TASTy [#19033](https://github.com/lampepfl/dotty/pull/19033) +- Add TASTyInfo abstraction [#19089](https://github.com/lampepfl/dotty/pull/19089) +- Add UTF8 abstraction in the TASTy format [#19090](https://github.com/lampepfl/dotty/pull/19090) + +## Tooling + +- Don't add explanation twice [#18779](https://github.com/lampepfl/dotty/pull/18779) +- ExtractDependencies uses more efficient caching [#18403](https://github.com/lampepfl/dotty/pull/18403) +- Introduce the SourceVersions 3.4 and 3.4-migration; make 3.4 the default. [#18501](https://github.com/lampepfl/dotty/pull/18501) +- Bugfix: Completions for named args in wrong order [#18702](https://github.com/lampepfl/dotty/pull/18702) +- Align unpickled Scala 2 accessors encoding with Scala 3 [#18874](https://github.com/lampepfl/dotty/pull/18874) +- Reinterpret Scala 2 case accessors `xyz$access$idx` [#18907](https://github.com/lampepfl/dotty/pull/18907) +- Presentation-compiler: Add synthetic decorations [#18951](https://github.com/lampepfl/dotty/pull/18951) +- Add compilation unit info to `ClassSymbol` [#19010](https://github.com/lampepfl/dotty/pull/19010) +- Make sure that patches for 3.0 are also applied in later versions [#19018](https://github.com/lampepfl/dotty/pull/19018) + +## Transform + +- Also consider @targetName when checking private overrides [#18361](https://github.com/lampepfl/dotty/pull/18361) +- Teach PostTyper to handle untupled context closures [#17739](https://github.com/lampepfl/dotty/pull/17739) +- Properly dealias tuple types when specializing [#18724](https://github.com/lampepfl/dotty/pull/18724) +- Fix condition in prefixIsElidable to prevent compiler crash [#18924](https://github.com/lampepfl/dotty/pull/18924) +- Fix #18816: Transfer the span of rewired `This` nodes in `fullyParameterizedDef`. [#18840](https://github.com/lampepfl/dotty/pull/18840) +- List(...) 
optimization to avoid intermediate array [#17166](https://github.com/lampepfl/dotty/pull/17166) +- Make Array.apply an intrinsic [#18537](https://github.com/lampepfl/dotty/pull/18537) +- Add missing span to extension method select [#18557](https://github.com/lampepfl/dotty/pull/18557) + +## Tuples + +- Handle TupleXXL in match analysis [#19212](https://github.com/lampepfl/dotty/pull/19212) +- Add `reverse` method to `NonEmptyTuple` [#13752](https://github.com/lampepfl/dotty/pull/13752) +- Refine handling of pattern binders for large tuples [#19085](https://github.com/lampepfl/dotty/pull/19085) +- Introduce `Tuple.ReverseOnto` and use it in `Tuple.reverse` [#19183](https://github.com/lampepfl/dotty/pull/19183) + +## Typeclass Derivation + +- Consider all parents when checking access to the children of a sum [#19083](https://github.com/lampepfl/dotty/pull/19083) + +## Typer + +- Fix logic when comparing var/def bindings with val refinements [#18049](https://github.com/lampepfl/dotty/pull/18049) +- Fix variance checking in refinements [#18053](https://github.com/lampepfl/dotty/pull/18053) +- Fix accessibleType for package object prefixes [#18057](https://github.com/lampepfl/dotty/pull/18057) +- Refix avoid GADT casting with ProtoTypes [#18085](https://github.com/lampepfl/dotty/pull/18085) +- Avoid shadowing by private definitions in more situations [#18142](https://github.com/lampepfl/dotty/pull/18142) +- Refine infoDependsOnPrefix [#18204](https://github.com/lampepfl/dotty/pull/18204) +- Fix spurious subtype check pruning when both sides have unions [#18213](https://github.com/lampepfl/dotty/pull/18213) +- Reimplement support for type aliases in SAM types [#18317](https://github.com/lampepfl/dotty/pull/18317) +- Fix adaptation of constants to constant type aliases [#18360](https://github.com/lampepfl/dotty/pull/18360) +- Issue "positional after named argument" errors [#18363](https://github.com/lampepfl/dotty/pull/18363) +- Deprecate `ops.long.S` [#18426](https://github.com/lampepfl/dotty/pull/18426) +- Tweak selection from self types [#18467](https://github.com/lampepfl/dotty/pull/18467) +- Use the unwidened type when casting structural calls [#18527](https://github.com/lampepfl/dotty/pull/18527) +- Fix #18649: Use loBound of param types when materializing a context function. 
[#18651](https://github.com/lampepfl/dotty/pull/18651) +- Identify structural trees on Match Type qualifiers [#18765](https://github.com/lampepfl/dotty/pull/18765) +- Tweak approximation of type variables when computing default types [#18798](https://github.com/lampepfl/dotty/pull/18798) +- Admit parametric aliases of classes in parent typing [#18849](https://github.com/lampepfl/dotty/pull/18849) +- Also add privateWithin when creating constructor proxies [#18893](https://github.com/lampepfl/dotty/pull/18893) +- Revert part of `Simplify defn.FunctionOf.unapply` [#19012](https://github.com/lampepfl/dotty/pull/19012) +- Check @targetName when subtyping Refined Types [#19081](https://github.com/lampepfl/dotty/pull/19081) +- Record failures to adapt application arguments [#18269](https://github.com/lampepfl/dotty/pull/18269) +- Improve handling of AndTypes on the LHS of subtype comparisons [#18235](https://github.com/lampepfl/dotty/pull/18235) +- Allow inferred parameter types always, when eta-expanding [#18771](https://github.com/lampepfl/dotty/pull/18771) +- Fix failing bounds check on default getter [#18419](https://github.com/lampepfl/dotty/pull/18419) +- Use constructor's default getters in case class synthetic `apply` methods [#18716](https://github.com/lampepfl/dotty/pull/18716) +- Keep qualifier of Ident when selecting setter [#18714](https://github.com/lampepfl/dotty/pull/18714) +- Retract SynthesizeExtMethodReceiver mode when when going deeper in overloading resolution [#18759](https://github.com/lampepfl/dotty/pull/18759) +- Constant fold all the number conversion methods [#17446](https://github.com/lampepfl/dotty/pull/17446) +- Refine criterion when to widen types [#17180](https://github.com/lampepfl/dotty/pull/17180) +- Run all MatchType reduction under Mode.Type [#17937](https://github.com/lampepfl/dotty/pull/17937) +- Force consistent MT post-redux normalisation, disallow infinite match types [#18073](https://github.com/lampepfl/dotty/pull/18073) +- Fix #17467: Limit isNullable widening to stable TermRefs; remove under explicit nulls. [#17470](https://github.com/lampepfl/dotty/pull/17470) +- Disallow naming the root package, except for selections [#18187](https://github.com/lampepfl/dotty/pull/18187) +- Contextual varargs parameters [#18186](https://github.com/lampepfl/dotty/pull/18186) +- Encode the name of the attribute in Selectable.selectDynamic [#18928](https://github.com/lampepfl/dotty/pull/18928) +- Remove linearization requirement for override ref checks from java classes [#18953](https://github.com/lampepfl/dotty/pull/18953) +- Fix type inferencing (constraining) regressions [#19189](https://github.com/lampepfl/dotty/pull/19189) +- Repeated params must correspond in override [#16836](https://github.com/lampepfl/dotty/pull/16836) +- Convert SAM result types to function types [#17740](https://github.com/lampepfl/dotty/pull/17740) +- Disallow `infix` objects [#17966](https://github.com/lampepfl/dotty/pull/17966) +- Fix hasMatchingMember handling NoDenotation [#17977](https://github.com/lampepfl/dotty/pull/17977) +- Fix: disallow toplevel infix definitions for vals, vars, givens, methods and implicits [#17994](https://github.com/lampepfl/dotty/pull/17994) +- Curried methods are not valid SAM methods [#18110](https://github.com/lampepfl/dotty/pull/18110) +- Fix #17115: Try to normalize while computing `typeSize`. 
[#18386](https://github.com/lampepfl/dotty/pull/18386) +- Add default arguments to derived refined type [#18435](https://github.com/lampepfl/dotty/pull/18435) +- Handle dependent context functions [#18443](https://github.com/lampepfl/dotty/pull/18443) +- Fix variance loophole for private vars [#18693](https://github.com/lampepfl/dotty/pull/18693) +- Avoid crash arising from trying to find conversions from polymorphic singleton types [#18760](https://github.com/lampepfl/dotty/pull/18760) +- Allow inner classes of universal traits [#18796](https://github.com/lampepfl/dotty/pull/18796) +- Prevent crash when extension not found [#18830](https://github.com/lampepfl/dotty/pull/18830) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1..3.4.0-RC1` these are: + +``` + 458 Martin Odersky + 291 Nicolas Stucki + 132 Fengyun Liu + 118 Dale Wijnand + 77 Jamie Thompson + 69 Sébastien Doeraene + 49 Paweł Marks + 32 Chris Kipp + 27 Guillaume Martres + 26 Rikito Taniguchi + 21 Yichen Xu + 19 EnzeXing + 14 Szymon Rodziewicz + 13 Lucas Leblanc + 12 Jakub Ciesluk + 12 Jędrzej Rochala + 12 Katarzyna Marek + 11 Carl + 10 David Hua + 9 Florian3k + 9 Wojciech Mazur + 8 Eugene Flesselle + 8 ghostbuster91 + 7 Hamza Remmal + 7 Ondrej Lhotak + 7 Quentin Bernet + 6 Jan Chyb + 6 Julien Richard-Foy + 6 Kacper Korban + 6 Seth Tisue + 5 Lorenzo Gabriele + 5 Matt Bovel + 5 Som Snytt + 5 Yuito Murase + 5 dependabot[bot] + 3 David + 3 Lucas + 3 Pascal Weisenburger + 3 Tomasz Godzik + 2 Aleksander Rainko + 2 Decel + 2 Guillaume Raffin + 2 Ondřej Lhoták + 2 Oron Port + 2 danecek + 2 rochala + 1 Adam Dąbrowski + 1 Aleksey Troitskiy + 1 Arnout Engelen + 1 Ausmarton Zarino Fernandes + 1 Bjorn Regnell + 1 Daniel Esik + 1 Eugene Yokota + 1 François Monniot + 1 Jakub Cieśluk + 1 John Duffell + 1 John M. Higgins + 1 Justin Reardon + 1 Kai + 1 Kisaragi + 1 Lucas Nouguier + 1 Lukas Rytz + 1 LydiaSkuse + 1 Martin Kucera + 1 Martin Kučera + 1 Matthew Rooney + 1 Matthias Kurz + 1 Mikołaj Fornal + 1 Nicolas Almerge + 1 Preveen P + 1 Shardul Chiplunkar + 1 Stefan Wachter + 1 philippus + 1 q-ata + 1 slim +``` From 939ba35b59a7ed3126228a3173729f59fee60b44 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 20 Dec 2023 17:33:01 +0100 Subject: [PATCH 141/371] Release 3.4.0-RC1 From 074128b9f2d6bce066e7137042b19dd401353ecc Mon Sep 17 00:00:00 2001 From: Bersier Date: Thu, 28 Dec 2023 11:21:51 -0500 Subject: [PATCH 142/371] Update wildcards.md The documentation gives incorrect Scala versions for the transitions. --- .../reference/changed-features/wildcards.md | 16 ++++++++-------- 1 file changed, 8 insertions(+), 8 deletions(-) diff --git a/docs/_docs/reference/changed-features/wildcards.md b/docs/_docs/reference/changed-features/wildcards.md index 0d3e13c3d7e0..ac7235770e36 100644 --- a/docs/_docs/reference/changed-features/wildcards.md +++ b/docs/_docs/reference/changed-features/wildcards.md @@ -4,7 +4,7 @@ title: Wildcard Arguments in Types nightlyOf: https://docs.scala-lang.org/scala3/reference/changed-features/wildcards.html --- -The syntax of wildcard arguments in types has changed from `_` to `?`. Example: +The syntax of wildcard arguments in types is changing from `_` to `?`. Example: ```scala List[?] Map[? <: AnyRef, ? >: Null] @@ -14,8 +14,8 @@ Map[? <: AnyRef, ? 
>: Null] We would like to use the underscore syntax `_` to stand for an anonymous type parameter, aligning it with its meaning in value parameter lists. So, just as `f(_)` is a shorthand for the lambda `x => f(x)`, in the future `C[_]` will be a shorthand -for the type lambda `[X] =>> C[X]`. This makes higher-kinded types easier to use. It also removes the wart that, used as a type -parameter, `F[_]` means `F` is a type constructor whereas used as a type, `F[_]` means it is a wildcard (i.e. existential) type. +for the type lambda `[X] =>> C[X]`. This will make higher-kinded types easier to use. It will also remove the wart that, used as a type +parameter, `F[_]` means `F` is a type constructor, whereas used as a type, `F[_]` means it is a wildcard (i.e. existential) type. In the future, `F[_]` will mean the same thing, no matter where it is used. We pick `?` as a replacement syntax for wildcard types, since it aligns with @@ -28,11 +28,11 @@ compiler plugin still uses the reverse convention, with `?` meaning parameter pl A step-by-step migration is made possible with the following measures: - 1. In Scala 3.0, both `_` and `?` are legal names for wildcards. - 2. In Scala 3.1, `_` is deprecated in favor of `?` as a name for a wildcard. A `-rewrite` option is + 1. In earlier versions of Scala 3, both `_` and `?` are legal names for wildcards. + 2. In Scala 3.4, `_` will be deprecated in favor of `?` as a name for wildcards. A `-rewrite` option is available to rewrite one to the other. - 3. In Scala 3.2, the meaning of `_` changes from wildcard to placeholder for type parameter. - 4. The Scala 3.1 behavior is already available today under the `-source future` setting. + 3. At some later point in the future, the meaning of `_` will change from wildcard to placeholder for type parameters. + 4. Some deprecation warnings are already available under the `-source future` setting. To smooth the transition for codebases that use kind-projector, we adopt the following measures under the command line option `-Ykind-projector`: @@ -42,7 +42,7 @@ option `-Ykind-projector`: available to rewrite one to the other. 3. In Scala 3.3, `*` is removed again, and all type parameter placeholders will be expressed with `_`. -These rules make it possible to cross build between Scala 2 using the kind projector plugin and Scala 3.0 - 3.2 using the compiler option `-Ykind-projector`. +These rules make it possible to cross-build between Scala 2 using the kind projector plugin and Scala 3.0 - 3.2 using the compiler option `-Ykind-projector`. There is also a migration path for users that want a one-time transition to syntax with `_` as a type parameter placeholder. With option `-Ykind-projector:underscores` Scala 3 will regard `_` as a type parameter placeholder, leaving `?` as the only syntax for wildcards. From 20d0d59b8a59214ea33fb304f262b492803cc4e7 Mon Sep 17 00:00:00 2001 From: Bersier Date: Tue, 2 Jan 2024 05:46:47 -0500 Subject: [PATCH 143/371] Update context-functions.md Mainly fixed indentation --- .../reference/contextual/context-functions.md | 124 ++++++++++-------- 1 file changed, 67 insertions(+), 57 deletions(-) diff --git a/docs/_docs/reference/contextual/context-functions.md b/docs/_docs/reference/contextual/context-functions.md index 0ad3c8757782..0d174583f230 100644 --- a/docs/_docs/reference/contextual/context-functions.md +++ b/docs/_docs/reference/contextual/context-functions.md @@ -8,27 +8,29 @@ _Context functions_ are functions with (only) context parameters. Their types are _context function types_. 
Here is an example of a context function type: ```scala +import scala.concurrent.ExecutionContext + type Executable[T] = ExecutionContext ?=> T ``` Context functions are written using `?=>` as the "arrow" sign. They are applied to synthesized arguments, in the same way methods with context parameters are applied. For instance: ```scala - given ec: ExecutionContext = ... +given ec: ExecutionContext = ... - def f(x: Int): ExecutionContext ?=> Int = ... +def f(x: Int): ExecutionContext ?=> Int = ... - // could be written as follows with the type alias from above - // def f(x: Int): Executable[Int] = ... +// could be written as follows with the type alias from above +// def f(x: Int): Executable[Int] = ... - f(2)(using ec) // explicit argument - f(2) // argument is inferred +f(2)(using ec) // explicit argument +f(2) // argument is inferred ``` Conversely, if the expected type of an expression `E` is a context function type `(T_1, ..., T_n) ?=> U` and `E` is not already an context function literal, `E` is converted to a context function literal by rewriting it to ```scala - (x_1: T1, ..., x_n: Tn) ?=> E +(x_1: T1, ..., x_n: Tn) ?=> E ``` where the names `x_1`, ..., `x_n` are arbitrary. This expansion is performed before the expression `E` is typechecked, which means that `x_1`, ..., `x_n` @@ -38,14 +40,14 @@ Like their types, context function literals are written using `?=>` as the arrow For example, continuing with the previous definitions, ```scala - def g(arg: Executable[Int]) = ... +def g(arg: Executable[Int]) = ... - g(22) // is expanded to g((ev: ExecutionContext) ?=> 22) +g(22) // is expanded to g((ev: ExecutionContext) ?=> 22) - g(f(2)) // is expanded to g((ev: ExecutionContext) ?=> f(2)(using ev)) +g(f(2)) // is expanded to g((ev: ExecutionContext) ?=> f(2)(using ev)) - g((ctx: ExecutionContext) ?=> f(3)) // is expanded to g((ctx: ExecutionContext) ?=> f(3)(using ctx)) - g((ctx: ExecutionContext) ?=> f(3)(using ctx)) // is left as it is +g((ctx: ExecutionContext) ?=> f(3)) // is expanded to g((ctx: ExecutionContext) ?=> f(3)(using ctx)) +g((ctx: ExecutionContext) ?=> f(3)(using ctx)) // is left as it is ``` ## Example: Builder Pattern @@ -54,63 +56,65 @@ Context function types have considerable expressive power. 
For instance, here is how they can support the "builder pattern", where the aim is to construct tables like this: ```scala - table { - row { - cell("top left") - cell("top right") - } - row { - cell("bottom left") - cell("bottom right") - } +table { + row { + cell("top left") + cell("top right") + } + row { + cell("bottom left") + cell("bottom right") } +} ``` The idea is to define classes for `Table` and `Row` that allow the addition of elements via `add`: ```scala - class Table: - val rows = new ArrayBuffer[Row] - def add(r: Row): Unit = rows += r - override def toString = rows.mkString("Table(", ", ", ")") +import scala.collection.mutable.ArrayBuffer + +class Table: + val rows = new ArrayBuffer[Row] + def add(r: Row): Unit = rows += r + override def toString = rows.mkString("Table(", ", ", ")") - class Row: - val cells = new ArrayBuffer[Cell] - def add(c: Cell): Unit = cells += c - override def toString = cells.mkString("Row(", ", ", ")") +class Row: + val cells = new ArrayBuffer[Cell] + def add(c: Cell): Unit = cells += c + override def toString = cells.mkString("Row(", ", ", ")") - case class Cell(elem: String) +case class Cell(elem: String) ``` Then, the `table`, `row` and `cell` constructor methods can be defined with context function types as parameters to avoid the plumbing boilerplate that would otherwise be necessary. ```scala - def table(init: Table ?=> Unit) = - given t: Table = Table() - init - t - - def row(init: Row ?=> Unit)(using t: Table) = - given r: Row = Row() - init - t.add(r) - - def cell(str: String)(using r: Row) = - r.add(new Cell(str)) +def table(init: Table ?=> Unit) = + given t: Table = Table() + init + t + +def row(init: Row ?=> Unit)(using t: Table) = + given r: Row = Row() + init + t.add(r) + +def cell(str: String)(using r: Row) = + r.add(new Cell(str)) ``` With that setup, the table construction code above compiles and expands to: ```scala - table { ($t: Table) ?=> - - row { ($r: Row) ?=> - cell("top left")(using $r) - cell("top right")(using $r) - }(using $t) - - row { ($r: Row) ?=> - cell("bottom left")(using $r) - cell("bottom right")(using $r) - }(using $t) - } +table { ($t: Table) ?=> + + row { ($r: Row) ?=> + cell("top left")(using $r) + cell("top right")(using $r) + }(using $t) + + row { ($r: Row) ?=> + cell("bottom left")(using $r) + cell("bottom right")(using $r) + }(using $t) +} ``` ## Example: Postconditions @@ -131,12 +135,18 @@ import PostConditions.{ensuring, result} val s = List(1, 2, 3).sum.ensuring(result == 6) ``` -**Explanations**: We use a context function type `WrappedResult[T] ?=> Boolean` +### Explanation + +We use a context function type `WrappedResult[T] ?=> Boolean` as the type of the condition of `ensuring`. An argument to `ensuring` such as `(result == 6)` will therefore have a given of type `WrappedResult[T]` in -scope to pass along to the `result` method. `WrappedResult` is a fresh type, to make sure +scope to pass along to the `result` method. + +`WrappedResult` is a fresh type, to make sure that we do not get unwanted givens in scope (this is good practice in all cases -where context parameters are involved). Since `WrappedResult` is an opaque type alias, its +where context parameters are involved). + +Since `WrappedResult` is an opaque type alias, its values need not be boxed, and since `ensuring` is added as an extension method, its argument does not need boxing either. 
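For reference, here is a sketch of a `PostConditions` object consistent with the pieces described above (its full definition is not shown in the hunks of this patch, so treat this as a reconstruction rather than the page's verbatim code):

```scala
object PostConditions:
  opaque type WrappedResult[T] = T

  def result[T](using r: WrappedResult[T]): T = r

  extension [T](x: T)
    def ensuring(condition: WrappedResult[T] ?=> Boolean): T =
      assert(condition(using x)) // x is passed as the WrappedResult[T] context argument
      x
end PostConditions
```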
Hence, the implementation of `ensuring` is close in efficiency to the best possible code one could write by hand: From acdbcfa1921a4f7110b5f5743479e69b611f6160 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 10 Jan 2024 15:12:16 +0000 Subject: [PATCH 144/371] Fix expandParam's use of argForParam/isArgPrefixOf. --- compiler/src/dotty/tools/dotc/core/Types.scala | 2 +- tests/pos/i19354.orig.scala | 16 ++++++++++++++++ tests/pos/i19354.scala | 7 +++++++ 3 files changed, 24 insertions(+), 1 deletion(-) create mode 100644 tests/pos/i19354.orig.scala create mode 100644 tests/pos/i19354.scala diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index 5e867e3eee97..fba5f3f56648 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -6276,7 +6276,7 @@ object Types extends TypeUtils { */ def expandParam(tp: NamedType, pre: Type): Type = tp.argForParam(pre) match { - case arg @ TypeRef(pre, _) if pre.isArgPrefixOf(arg.symbol) => + case arg @ TypeRef(`pre`, _) if pre.isArgPrefixOf(arg.symbol) => arg.info match { case argInfo: TypeBounds => expandBounds(argInfo) case argInfo => useAlternate(arg) diff --git a/tests/pos/i19354.orig.scala b/tests/pos/i19354.orig.scala new file mode 100644 index 000000000000..0443bcb06836 --- /dev/null +++ b/tests/pos/i19354.orig.scala @@ -0,0 +1,16 @@ +import javax.annotation.processing.{ AbstractProcessor, RoundEnvironment } +import javax.lang.model.element.{ ElementKind, PackageElement, TypeElement } + +import java.util as ju + +class P extends AbstractProcessor { + override def process(annotations: ju.Set[? <: TypeElement], roundEnv: RoundEnvironment): Boolean = { + annotations + .stream() + .flatMap(annotation => roundEnv.getElementsAnnotatedWith(annotation).stream()) + .filter(element => element.getKind == ElementKind.PACKAGE) + .map(element => element.asInstanceOf[PackageElement]) + .toList() + true + } +} diff --git a/tests/pos/i19354.scala b/tests/pos/i19354.scala new file mode 100644 index 000000000000..db1d4961e79f --- /dev/null +++ b/tests/pos/i19354.scala @@ -0,0 +1,7 @@ +class Foo; class Bar +class Test: + def t1(xs: java.util.stream.Stream[? 
<: Foo]) = + xs.map(x => take(x)) + + def take(x: Foo) = "" + def take(x: Bar) = "" From 9ded26365178f008fe008a4a24cd7d79c47ba3d7 Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Wed, 3 Jan 2024 13:01:16 +0100 Subject: [PATCH 145/371] Update copyright year in cmdScaladocTests Fixes #19360 --- project/scripts/cmdScaladocTests | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/scripts/cmdScaladocTests b/project/scripts/cmdScaladocTests index 26e39e13796f..7ffa2506b1dd 100755 --- a/project/scripts/cmdScaladocTests +++ b/project/scripts/cmdScaladocTests @@ -37,7 +37,7 @@ dist/target/pack/bin/scaladoc \ "-snippet-compiler:scaladoc-testcases/docs=compile" \ "-comment-syntax:scaladoc-testcases/src/example/comment-md=markdown,scaladoc-testcases/src/example/comment-wiki=wiki" \ -siteroot scaladoc-testcases/docs \ - -project-footer "Copyright (c) 2002-2023, LAMP/EPFL" \ + -project-footer "Copyright (c) 2002-2024, LAMP/EPFL" \ -default-template static-site-main \ -author -groups -revision main -project-version "${DOTTY_BOOTSTRAPPED_VERSION}" \ "-quick-links:Learn::https://docs.scala-lang.org/,Install::https://www.scala-lang.org/download/,Playground::https://scastie.scala-lang.org,Find A Library::https://index.scala-lang.org,Community::https://www.scala-lang.org/community/,Blog::https://www.scala-lang.org/blog/," \ From e63f71f685c066be2339feda363787760ec98aba Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Wed, 3 Jan 2024 13:03:53 +0100 Subject: [PATCH 146/371] Get current year for copyright footer --- project/scripts/cmdScaladocTests | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/scripts/cmdScaladocTests b/project/scripts/cmdScaladocTests index 7ffa2506b1dd..e9403d988b98 100755 --- a/project/scripts/cmdScaladocTests +++ b/project/scripts/cmdScaladocTests @@ -37,7 +37,7 @@ dist/target/pack/bin/scaladoc \ "-snippet-compiler:scaladoc-testcases/docs=compile" \ "-comment-syntax:scaladoc-testcases/src/example/comment-md=markdown,scaladoc-testcases/src/example/comment-wiki=wiki" \ -siteroot scaladoc-testcases/docs \ - -project-footer "Copyright (c) 2002-2024, LAMP/EPFL" \ + -project-footer "Copyright (c) 2002-$(date +%Y), LAMP/EPFL" \ -default-template static-site-main \ -author -groups -revision main -project-version "${DOTTY_BOOTSTRAPPED_VERSION}" \ "-quick-links:Learn::https://docs.scala-lang.org/,Install::https://www.scala-lang.org/download/,Playground::https://scastie.scala-lang.org,Find A Library::https://index.scala-lang.org,Community::https://www.scala-lang.org/community/,Blog::https://www.scala-lang.org/blog/," \ From ce9e6993447b0391d8a56ac6e73645d2d7529a2c Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Mon, 8 Jan 2024 15:51:42 +0100 Subject: [PATCH 147/371] Remove `ascriptionVarargsUnpacking` as we never used it --- compiler/src/dotty/tools/dotc/config/Feature.scala | 1 - library/src/scala/runtime/stdLibPatches/language.scala | 3 --- 2 files changed, 4 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index fa262a5880ff..2798828ad9a7 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -26,7 +26,6 @@ object Feature: val dependent = experimental("dependent") val erasedDefinitions = experimental("erasedDefinitions") val symbolLiterals = deprecated("symbolLiterals") - val ascriptionVarargsUnpacking = deprecated("ascriptionVarargsUnpacking") val fewerBraces = experimental("fewerBraces") val 
saferExceptions = experimental("saferExceptions") val clauseInterleaving = experimental("clauseInterleaving") diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index c2a12cec2ecc..6018f537613b 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -115,9 +115,6 @@ object language: @compileTimeOnly("`symbolLiterals` can only be used at compile time in import statements") object symbolLiterals - /** TODO */ - @compileTimeOnly("`ascriptionVarargsUnpacking` can only be used at compile time in import statements") - object ascriptionVarargsUnpacking end deprecated /** Where imported, auto-tupling is disabled. From e3cbb63eab9225034cc7cc747835fa4fcecb4883 Mon Sep 17 00:00:00 2001 From: odersky Date: Wed, 20 Dec 2023 20:08:57 +0100 Subject: [PATCH 148/371] Make explicit arguments for context bounds an error from 3.5 --- .../src/dotty/tools/dotc/ast/Desugar.scala | 3 +- .../dotty/tools/dotc/typer/Migrations.scala | 61 +++++++++++++++++++ .../src/dotty/tools/dotc/typer/Typer.scala | 29 ++------- tests/neg/context-bounds-migration.scala | 10 +++ tests/neg/hidden-type-errors.check | 26 ++++++++ 5 files changed, 105 insertions(+), 24 deletions(-) create mode 100644 compiler/src/dotty/tools/dotc/typer/Migrations.scala create mode 100644 tests/neg/context-bounds-migration.scala diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index 3386dc7d7a6c..36f2d593de1c 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -674,7 +674,8 @@ object desugar { val nu = vparamss.foldLeft(makeNew(classTypeRef)) { (nu, vparams) => val app = Apply(nu, vparams.map(refOfDef)) vparams match { - case vparam :: _ if vparam.mods.is(Given) => app.setApplyKind(ApplyKind.Using) + case vparam :: _ if vparam.mods.is(Given) || vparam.name.is(ContextBoundParamName) => + app.setApplyKind(ApplyKind.Using) case _ => app } } diff --git a/compiler/src/dotty/tools/dotc/typer/Migrations.scala b/compiler/src/dotty/tools/dotc/typer/Migrations.scala new file mode 100644 index 000000000000..284ec1d18799 --- /dev/null +++ b/compiler/src/dotty/tools/dotc/typer/Migrations.scala @@ -0,0 +1,61 @@ +package dotty.tools +package dotc +package typer + +import core.* +import ast.* +import Contexts.* +import Types.* +import Flags.* +import Names.* +import StdNames.* +import Symbols.* +import Trees.* +import ProtoTypes.* +import Decorators.* +import config.MigrationVersion +import config.Feature.{sourceVersion, migrateTo3} +import config.SourceVersion.* +import reporting.* +import NameKinds.ContextBoundParamName +import rewrites.Rewrites.patch +import util.Spans.Span + +/** A utility module containing source-dependent deprecation messages + * and migrations + */ +object Migrations: + + import tpd.* + + /** Flag & migrate `?` used as a higher-kinded type parameter + * Warning in 3.0-migration, error from 3.0 + */ + def migrateKindProjectorQMark(tree: untpd.TypeDef, sym: Symbol)(using Context): Unit = + if tree.name eq tpnme.? 
then + val addendum = if sym.owner.is(TypeParam) + then ", use `_` to denote a higher-kinded type parameter" + else "" + val namePos = tree.sourcePos.withSpan(tree.nameSpan) + report.errorOrMigrationWarning( + em"`?` is not a valid type name$addendum", namePos, MigrationVersion.Scala2to3) + + /** Flag & migrate explicit normal arguments to parameters coming from context bounds + * Warning in 3.4, error in 3.5, rewrite in 3.5-migration. + */ + def migrateContextBoundParams(tree: Tree, tp: Type, pt: FunProto)(using Context): Unit = + def isContextBoundParams = tp.stripPoly match + case MethodType(ContextBoundParamName(_) :: _) => true + case _ => false + if sourceVersion.isAtLeast(`3.4`) + && isContextBoundParams + && pt.applyKind != ApplyKind.Using + then + def rewriteMsg = Message.rewriteNotice("This code", `3.5-migration`) + report.errorOrMigrationWarning( + em"""Context bounds will map to context parameters. + |A `using` clause is needed to pass explicit arguments to them.$rewriteMsg""", + tree.srcPos, MigrationVersion(`3.4`, `3.5`)) + if sourceVersion.isAtLeast(`3.5-migration`) then + patch(Span(pt.args.head.span.start), "using ") + end migrateContextBoundParams diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 2f03c79754e8..50e5b22fabe0 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -51,6 +51,7 @@ import NullOpsDecorator.* import cc.CheckCaptures import config.Config import config.MigrationVersion +import Migrations.* import scala.annotation.constructorOnly import dotty.tools.dotc.rewrites.Rewrites @@ -3137,13 +3138,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case tree: untpd.TypeDef => // separate method to keep dispatching method `typedNamed` short which might help the JIT def typedTypeOrClassDef: Tree = - if tree.name eq tpnme.? then - val addendum = if sym.owner.is(TypeParam) - then ", use `_` to denote a higher-kinded type parameter" - else "" - val namePos = tree.sourcePos.withSpan(tree.nameSpan) - report.errorOrMigrationWarning( - em"`?` is not a valid type name$addendum", namePos, MigrationVersion.Scala2to3) + migrateKindProjectorQMark(tree, sym) if tree.isClassDef then typedClassDef(tree, sym.asClass)(using ctx.localContext(tree, sym)) else @@ -3818,24 +3813,12 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def adaptToArgs(wtp: Type, pt: FunProto): Tree = wtp match { case wtp: MethodOrPoly => def methodStr = methPart(tree).symbol.showLocated - if (matchingApply(wtp, pt)) + if matchingApply(wtp, pt) then + migrateContextBoundParams(tree, wtp, pt) if needsTupledDual(wtp, pt) then adapt(tree, pt.tupledDual, locked) else tree else if wtp.isContextualMethod then - def isContextBoundParams = wtp.stripPoly match - case MethodType(ContextBoundParamName(_) :: _) => true - case _ => false - if sourceVersion == `future-migration` && isContextBoundParams && pt.args.nonEmpty - then // Under future-migration, don't infer implicit arguments yet for parameters - // coming from context bounds. Issue a warning instead and offer a patch. - def rewriteMsg = Message.rewriteNotice("This code", `future-migration`) - report.migrationWarning( - em"""Context bounds will map to context parameters. 
- |A `using` clause is needed to pass explicit arguments to them.$rewriteMsg""", tree.srcPos) - patch(Span(pt.args.head.span.start), "using ") - tree - else - adaptNoArgs(wtp) // insert arguments implicitly + adaptNoArgs(wtp) // insert arguments implicitly else if (tree.symbol.isPrimaryConstructor && tree.symbol.info.firstParamTypes.isEmpty) readapt(tree.appliedToNone) // insert () to primary constructors else @@ -4441,7 +4424,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer protected def matchingApply(methType: MethodOrPoly, pt: FunProto)(using Context): Boolean = val isUsingApply = pt.applyKind == ApplyKind.Using methType.isContextualMethod == isUsingApply - || methType.isImplicitMethod && isUsingApply // for a transition allow `with` arguments for regular implicit parameters + || methType.isImplicitMethod && isUsingApply // for a transition allow `using` arguments for regular implicit parameters /** Check that `tree == x: pt` is typeable. Used when checking a pattern * against a selector of type `pt`. This implementation accounts for diff --git a/tests/neg/context-bounds-migration.scala b/tests/neg/context-bounds-migration.scala new file mode 100644 index 000000000000..b27dc884692c --- /dev/null +++ b/tests/neg/context-bounds-migration.scala @@ -0,0 +1,10 @@ +//> using options -Xfatal-warnings + +class C[T] +def foo[X: C] = () + +given [T]: C[T] = C[T]() + +def Test = + foo(C[Int]()) // error + foo(using C[Int]()) // ok diff --git a/tests/neg/hidden-type-errors.check b/tests/neg/hidden-type-errors.check index 2f4a1748dc67..2cf77134c2c5 100644 --- a/tests/neg/hidden-type-errors.check +++ b/tests/neg/hidden-type-errors.check @@ -1,3 +1,16 @@ +-- Warning: tests/neg/hidden-type-errors/Test.scala:8:24 --------------------------------------------------------------- + 8 | val x = X.doSomething("XXX") // error + | ^^^^^^^^^^^^^^^^^^^^ + | Context bounds will map to context parameters. + | A `using` clause is needed to pass explicit arguments to them. + | This code can be rewritten automatically under -rewrite -source 3.5-migration. + |-------------------------------------------------------------------------------------------------------------------- + |Inline stack trace + |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + |This location contains code that was inlined from Macro.scala:15 +15 | doSomethingImpl('x) + | ^^^^^^^^^^^^^^^^^^^ + -------------------------------------------------------------------------------------------------------------------- -- [E007] Type Mismatch Error: tests/neg/hidden-type-errors/Test.scala:8:24 -------------------------------------------- 8 | val x = X.doSomething("XXX") // error | ^^^^^^^^^^^^^^^^^^^^ @@ -18,3 +31,16 @@ | | The tests were made under the empty constraint --------------------------------------------------------------------------------------------------------------------- +-- Warning: tests/neg/hidden-type-errors/Test.scala:8:24 --------------------------------------------------------------- + 8 | val x = X.doSomething("XXX") // error + | ^^^^^^^^^^^^^^^^^^^^ + | Context bounds will map to context parameters. + | A `using` clause is needed to pass explicit arguments to them. + | This code can be rewritten automatically under -rewrite -source 3.5-migration. 
+ |-------------------------------------------------------------------------------------------------------------------- + |Inline stack trace + |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + |This location contains code that was inlined from Macro.scala:15 +15 | doSomethingImpl('x) + | ^^^^^^^^^^^^^^^^^^^ + -------------------------------------------------------------------------------------------------------------------- From 04d1cf1636610f9d11165d72db380af8703eb93d Mon Sep 17 00:00:00 2001 From: odersky Date: Thu, 21 Dec 2023 16:49:30 +0100 Subject: [PATCH 149/371] Don't run migration operations in ReTypers --- .../dotty/tools/dotc/typer/Migrations.scala | 59 +++++++++++++++++-- .../src/dotty/tools/dotc/typer/ReTyper.scala | 1 + .../src/dotty/tools/dotc/typer/Typer.scala | 58 ++++-------------- 3 files changed, 64 insertions(+), 54 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Migrations.scala b/compiler/src/dotty/tools/dotc/typer/Migrations.scala index 284ec1d18799..9c038abbd851 100644 --- a/compiler/src/dotty/tools/dotc/typer/Migrations.scala +++ b/compiler/src/dotty/tools/dotc/typer/Migrations.scala @@ -20,18 +20,20 @@ import reporting.* import NameKinds.ContextBoundParamName import rewrites.Rewrites.patch import util.Spans.Span +import rewrites.Rewrites -/** A utility module containing source-dependent deprecation messages - * and migrations +/** A utility trait containing source-dependent deprecation messages + * and migrations. */ -object Migrations: +trait Migrations: + this: Typer => import tpd.* /** Flag & migrate `?` used as a higher-kinded type parameter * Warning in 3.0-migration, error from 3.0 */ - def migrateKindProjectorQMark(tree: untpd.TypeDef, sym: Symbol)(using Context): Unit = + def kindProjectorQMark(tree: untpd.TypeDef, sym: Symbol)(using Context): Unit = if tree.name eq tpnme.? 
then val addendum = if sym.owner.is(TypeParam) then ", use `_` to denote a higher-kinded type parameter" @@ -40,10 +42,53 @@ object Migrations: report.errorOrMigrationWarning( em"`?` is not a valid type name$addendum", namePos, MigrationVersion.Scala2to3) + def typedAsFunction(tree: untpd.PostfixOp, pt: Type)(using Context): Tree = { + val untpd.PostfixOp(qual, Ident(nme.WILDCARD)) = tree: @unchecked + val pt1 = if (defn.isFunctionNType(pt)) pt else AnyFunctionProto + val nestedCtx = ctx.fresh.setNewTyperState() + val res = typed(qual, pt1)(using nestedCtx) + res match { + case closure(_, _, _) => + case _ => + val recovered = typed(qual)(using ctx.fresh.setExploreTyperState()) + val msg = OnlyFunctionsCanBeFollowedByUnderscore(recovered.tpe.widen, tree) + report.errorOrMigrationWarning(msg, tree.srcPos, MigrationVersion.Scala2to3) + if MigrationVersion.Scala2to3.needsPatch then + // Under -rewrite, patch `x _` to `(() => x)` + msg.actions + .headOption + .foreach(Rewrites.applyAction) + return typed(untpd.Function(Nil, qual), pt) + } + nestedCtx.typerState.commit() + + lazy val (prefix, suffix) = res match { + case Block(mdef @ DefDef(_, vparams :: Nil, _, _) :: Nil, _: Closure) => + val arity = vparams.length + if (arity > 0) ("", "") else ("(() => ", "())") + case _ => + ("(() => ", ")") + } + def remedy = + if ((prefix ++ suffix).isEmpty) "simply leave out the trailing ` _`" + else s"use `$prefix$suffix` instead" + def rewrite = Message.rewriteNotice("This construct", `3.4-migration`) + report.errorOrMigrationWarning( + em"""The syntax ` _` is no longer supported; + |you can $remedy$rewrite""", + tree.srcPos, + MigrationVersion.FunctionUnderscore) + if MigrationVersion.FunctionUnderscore.needsPatch then + patch(Span(tree.span.start), prefix) + patch(Span(qual.span.end, tree.span.end), suffix) + + res + } + /** Flag & migrate explicit normal arguments to parameters coming from context bounds * Warning in 3.4, error in 3.5, rewrite in 3.5-migration. 
*/ - def migrateContextBoundParams(tree: Tree, tp: Type, pt: FunProto)(using Context): Unit = + def contextBoundParams(tree: Tree, tp: Type, pt: FunProto)(using Context): Unit = def isContextBoundParams = tp.stripPoly match case MethodType(ContextBoundParamName(_) :: _) => true case _ => false @@ -58,4 +103,6 @@ object Migrations: tree.srcPos, MigrationVersion(`3.4`, `3.5`)) if sourceVersion.isAtLeast(`3.5-migration`) then patch(Span(pt.args.head.span.start), "using ") - end migrateContextBoundParams + end contextBoundParams + +end Migrations diff --git a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala index e152b5e6b9c7..253c4fda9396 100644 --- a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala +++ b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala @@ -189,4 +189,5 @@ class ReTyper(nestingLevel: Int = 0) extends Typer(nestingLevel) with ReChecking override protected def checkEqualityEvidence(tree: tpd.Tree, pt: Type)(using Context): Unit = () override protected def matchingApply(methType: MethodOrPoly, pt: FunProto)(using Context): Boolean = true override protected def typedScala2MacroBody(call: untpd.Tree)(using Context): Tree = promote(call) + override protected def migrate[T](migration: => T, disabled: => T = ()): T = disabled } diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 50e5b22fabe0..5cb42f659551 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -43,7 +43,7 @@ import config.Printers.{gadts, typr} import config.Feature import config.Feature.{sourceVersion, migrateTo3} import config.SourceVersion.* -import rewrites.Rewrites.patch +import rewrites.Rewrites, Rewrites.patch import staging.StagingLevel import reporting.* import Nullables.* @@ -51,10 +51,8 @@ import NullOpsDecorator.* import cc.CheckCaptures import config.Config import config.MigrationVersion -import Migrations.* import scala.annotation.constructorOnly -import dotty.tools.dotc.rewrites.Rewrites object Typer { @@ -128,7 +126,8 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer with Dynamic with Checking with QuotesAndSplices - with Deriving { + with Deriving + with Migrations { import Typer.* import tpd.{cpy => _, _} @@ -159,6 +158,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // Overridden in derived typers def newLikeThis(nestingLevel: Int): Typer = new Typer(nestingLevel) + // Overridden to do nothing in derived typers + protected def migrate[T](migration: => T, disabled: => T = ()): T = migration + /** Find the type of an identifier with given `name` in given context `ctx`. 
* @param name the name of the identifier * @param pt the expected type @@ -2979,48 +2981,8 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer else tree1 } - def typedAsFunction(tree: untpd.PostfixOp, pt: Type)(using Context): Tree = { - val untpd.PostfixOp(qual, Ident(nme.WILDCARD)) = tree: @unchecked - val pt1 = if (defn.isFunctionNType(pt)) pt else AnyFunctionProto - val nestedCtx = ctx.fresh.setNewTyperState() - val res = typed(qual, pt1)(using nestedCtx) - res match { - case closure(_, _, _) => - case _ => - val recovered = typed(qual)(using ctx.fresh.setExploreTyperState()) - val msg = OnlyFunctionsCanBeFollowedByUnderscore(recovered.tpe.widen, tree) - report.errorOrMigrationWarning(msg, tree.srcPos, MigrationVersion.Scala2to3) - if MigrationVersion.Scala2to3.needsPatch then - // Under -rewrite, patch `x _` to `(() => x)` - msg.actions - .headOption - .foreach(Rewrites.applyAction) - return typed(untpd.Function(Nil, qual), pt) - } - nestedCtx.typerState.commit() - - lazy val (prefix, suffix) = res match { - case Block(mdef @ DefDef(_, vparams :: Nil, _, _) :: Nil, _: Closure) => - val arity = vparams.length - if (arity > 0) ("", "") else ("(() => ", "())") - case _ => - ("(() => ", ")") - } - def remedy = - if ((prefix ++ suffix).isEmpty) "simply leave out the trailing ` _`" - else s"use `$prefix$suffix` instead" - def rewrite = Message.rewriteNotice("This construct", `3.4-migration`) - report.errorOrMigrationWarning( - em"""The syntax ` _` is no longer supported; - |you can $remedy$rewrite""", - tree.srcPos, - MigrationVersion.FunctionUnderscore) - if MigrationVersion.FunctionUnderscore.needsPatch then - patch(Span(tree.span.start), prefix) - patch(Span(qual.span.end, tree.span.end), suffix) - - res - } + override def typedAsFunction(tree: untpd.PostfixOp, pt: Type)(using Context): Tree = + migrate(super.typedAsFunction(tree, pt), throw new AssertionError("can't retype a PostfixOp")) /** Translate infix operation expression `l op r` to * @@ -3138,7 +3100,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case tree: untpd.TypeDef => // separate method to keep dispatching method `typedNamed` short which might help the JIT def typedTypeOrClassDef: Tree = - migrateKindProjectorQMark(tree, sym) + migrate(kindProjectorQMark(tree, sym)) if tree.isClassDef then typedClassDef(tree, sym.asClass)(using ctx.localContext(tree, sym)) else @@ -3814,7 +3776,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case wtp: MethodOrPoly => def methodStr = methPart(tree).symbol.showLocated if matchingApply(wtp, pt) then - migrateContextBoundParams(tree, wtp, pt) + migrate(contextBoundParams(tree, wtp, pt)) if needsTupledDual(wtp, pt) then adapt(tree, pt.tupledDual, locked) else tree else if wtp.isContextualMethod then From 04dc82335500d4f526f25b08b9a44c01824ac08e Mon Sep 17 00:00:00 2001 From: odersky Date: Sat, 23 Dec 2023 16:14:56 +0100 Subject: [PATCH 150/371] Fix test --- tests/neg/hidden-type-errors.check | 26 -------------------------- 1 file changed, 26 deletions(-) diff --git a/tests/neg/hidden-type-errors.check b/tests/neg/hidden-type-errors.check index 2cf77134c2c5..2f4a1748dc67 100644 --- a/tests/neg/hidden-type-errors.check +++ b/tests/neg/hidden-type-errors.check @@ -1,16 +1,3 @@ --- Warning: tests/neg/hidden-type-errors/Test.scala:8:24 --------------------------------------------------------------- - 8 | val x = X.doSomething("XXX") // error - | ^^^^^^^^^^^^^^^^^^^^ - | Context bounds will map to context parameters. 
- | A `using` clause is needed to pass explicit arguments to them. - | This code can be rewritten automatically under -rewrite -source 3.5-migration. - |-------------------------------------------------------------------------------------------------------------------- - |Inline stack trace - |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - |This location contains code that was inlined from Macro.scala:15 -15 | doSomethingImpl('x) - | ^^^^^^^^^^^^^^^^^^^ - -------------------------------------------------------------------------------------------------------------------- -- [E007] Type Mismatch Error: tests/neg/hidden-type-errors/Test.scala:8:24 -------------------------------------------- 8 | val x = X.doSomething("XXX") // error | ^^^^^^^^^^^^^^^^^^^^ @@ -31,16 +18,3 @@ | | The tests were made under the empty constraint --------------------------------------------------------------------------------------------------------------------- --- Warning: tests/neg/hidden-type-errors/Test.scala:8:24 --------------------------------------------------------------- - 8 | val x = X.doSomething("XXX") // error - | ^^^^^^^^^^^^^^^^^^^^ - | Context bounds will map to context parameters. - | A `using` clause is needed to pass explicit arguments to them. - | This code can be rewritten automatically under -rewrite -source 3.5-migration. - |-------------------------------------------------------------------------------------------------------------------- - |Inline stack trace - |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - |This location contains code that was inlined from Macro.scala:15 -15 | doSomethingImpl('x) - | ^^^^^^^^^^^^^^^^^^^ - -------------------------------------------------------------------------------------------------------------------- From 59c5391146b576f9bc7af8a40124d3d87f2e126d Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Wed, 3 Jan 2024 09:27:56 +0100 Subject: [PATCH 151/371] Fix context bound application in enum desugaring --- compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala b/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala index 98873dba85c7..7268ec720ce2 100644 --- a/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala +++ b/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala @@ -99,7 +99,7 @@ object DesugarEnums { val clazzOf = TypeApply(ref(defn.Predef_classOf.termRef), tpt :: Nil) val ctag = Apply(TypeApply(ref(defn.ClassTagModule_apply.termRef), tpt :: Nil), clazzOf :: Nil) val apply = Select(ref(defn.ArrayModule.termRef), nme.apply) - Apply(Apply(TypeApply(apply, tpt :: Nil), values), ctag :: Nil) + Apply(Apply(TypeApply(apply, tpt :: Nil), values), ctag :: Nil).setApplyKind(ApplyKind.Using) /** The following lists of definitions for an enum type E and known value cases e_0, ..., e_n: * From dc60bcc3468b1ad31cf096f502a4cfdb03e13fec Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 8 Jan 2024 19:34:33 +0100 Subject: [PATCH 152/371] Address review comments --- .../tools/dotc/config/MigrationVersion.scala | 14 ++++++-- .../tools/dotc/config/SourceVersion.scala | 3 ++ .../dotty/tools/dotc/typer/Migrations.scala | 32 ++++++++++++------- .../src/dotty/tools/dotc/typer/ReTyper.scala | 1 - .../src/dotty/tools/dotc/typer/Typer.scala | 5 +-- .../context-bounds-migration.scala | 4 +-- 6 files changed, 38 
insertions(+), 21 deletions(-) rename tests/{neg => warn}/context-bounds-migration.scala (61%) diff --git a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala index d4afc599896c..4dd9d065395b 100644 --- a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala @@ -6,10 +6,16 @@ import SourceVersion.* import Feature.* import core.Contexts.Context -class MigrationVersion(val warnFrom: SourceVersion, val errorFrom: SourceVersion): - assert(warnFrom.ordinal <= errorFrom.ordinal) +class MigrationVersion( + val warnFrom: SourceVersion, + val errorFrom: SourceVersion): + require(warnFrom.ordinal <= errorFrom.ordinal) + def needsPatch(using Context): Boolean = - sourceVersion.isMigrating && sourceVersion.isAtLeast(errorFrom) + sourceVersion.isMigrating && sourceVersion.isAtLeast(warnFrom) + + def patchFrom: SourceVersion = + warnFrom.prevMigrating object MigrationVersion: @@ -27,6 +33,8 @@ object MigrationVersion: val AscriptionAfterPattern = MigrationVersion(`3.3`, future) + val ExplicitContextBoundArgument = MigrationVersion(`3.4`, `3.5`) + val AlphanumericInfix = MigrationVersion(`3.4`, future) val RemoveThisQualifier = MigrationVersion(`3.4`, future) val UninitializedVars = MigrationVersion(`3.4`, future) diff --git a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala index f6db0bac0452..f59c4670a29a 100644 --- a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala @@ -18,6 +18,9 @@ enum SourceVersion: def stable: SourceVersion = if isMigrating then SourceVersion.values(ordinal + 1) else this + def prevMigrating: SourceVersion = + if isMigrating then this else SourceVersion.values(ordinal - 1).prevMigrating + def isAtLeast(v: SourceVersion) = stable.ordinal >= v.ordinal def isAtMost(v: SourceVersion) = stable.ordinal <= v.ordinal diff --git a/compiler/src/dotty/tools/dotc/typer/Migrations.scala b/compiler/src/dotty/tools/dotc/typer/Migrations.scala index 9c038abbd851..84db91f9dee9 100644 --- a/compiler/src/dotty/tools/dotc/typer/Migrations.scala +++ b/compiler/src/dotty/tools/dotc/typer/Migrations.scala @@ -13,7 +13,7 @@ import Symbols.* import Trees.* import ProtoTypes.* import Decorators.* -import config.MigrationVersion +import config.MigrationVersion as mv import config.Feature.{sourceVersion, migrateTo3} import config.SourceVersion.* import reporting.* @@ -30,6 +30,15 @@ trait Migrations: import tpd.* + /** Run `migration`, asserting we are in the proper Typer (not a ReTyper) */ + inline def migrate[T](inline migration: T): T = + assert(!this.isInstanceOf[ReTyper]) + migration + + /** Run `migration`, provided we are in the proper Typer (not a ReTyper) */ + inline def migrate(inline migration: Unit): Unit = + if !this.isInstanceOf[ReTyper] then migration + /** Flag & migrate `?` used as a higher-kinded type parameter * Warning in 3.0-migration, error from 3.0 */ @@ -40,7 +49,7 @@ trait Migrations: else "" val namePos = tree.sourcePos.withSpan(tree.nameSpan) report.errorOrMigrationWarning( - em"`?` is not a valid type name$addendum", namePos, MigrationVersion.Scala2to3) + em"`?` is not a valid type name$addendum", namePos, mv.Scala2to3) def typedAsFunction(tree: untpd.PostfixOp, pt: Type)(using Context): Tree = { val untpd.PostfixOp(qual, Ident(nme.WILDCARD)) = tree: @unchecked @@ -52,8 +61,8 @@ trait Migrations: case _ 
=> val recovered = typed(qual)(using ctx.fresh.setExploreTyperState()) val msg = OnlyFunctionsCanBeFollowedByUnderscore(recovered.tpe.widen, tree) - report.errorOrMigrationWarning(msg, tree.srcPos, MigrationVersion.Scala2to3) - if MigrationVersion.Scala2to3.needsPatch then + report.errorOrMigrationWarning(msg, tree.srcPos, mv.Scala2to3) + if mv.Scala2to3.needsPatch then // Under -rewrite, patch `x _` to `(() => x)` msg.actions .headOption @@ -69,16 +78,16 @@ trait Migrations: case _ => ("(() => ", ")") } + val mversion = mv.FunctionUnderscore def remedy = if ((prefix ++ suffix).isEmpty) "simply leave out the trailing ` _`" else s"use `$prefix$suffix` instead" - def rewrite = Message.rewriteNotice("This construct", `3.4-migration`) + def rewrite = Message.rewriteNotice("This construct", mversion.patchFrom) report.errorOrMigrationWarning( em"""The syntax ` _` is no longer supported; |you can $remedy$rewrite""", - tree.srcPos, - MigrationVersion.FunctionUnderscore) - if MigrationVersion.FunctionUnderscore.needsPatch then + tree.srcPos, mversion) + if mversion.needsPatch then patch(Span(tree.span.start), prefix) patch(Span(qual.span.end, tree.span.end), suffix) @@ -89,6 +98,7 @@ trait Migrations: * Warning in 3.4, error in 3.5, rewrite in 3.5-migration. */ def contextBoundParams(tree: Tree, tp: Type, pt: FunProto)(using Context): Unit = + val mversion = mv.ExplicitContextBoundArgument def isContextBoundParams = tp.stripPoly match case MethodType(ContextBoundParamName(_) :: _) => true case _ => false @@ -96,12 +106,12 @@ trait Migrations: && isContextBoundParams && pt.applyKind != ApplyKind.Using then - def rewriteMsg = Message.rewriteNotice("This code", `3.5-migration`) + def rewriteMsg = Message.rewriteNotice("This code", mversion.patchFrom) report.errorOrMigrationWarning( em"""Context bounds will map to context parameters. 
|A `using` clause is needed to pass explicit arguments to them.$rewriteMsg""", - tree.srcPos, MigrationVersion(`3.4`, `3.5`)) - if sourceVersion.isAtLeast(`3.5-migration`) then + tree.srcPos, mversion) + if mversion.needsPatch then patch(Span(pt.args.head.span.start), "using ") end contextBoundParams diff --git a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala index 253c4fda9396..e152b5e6b9c7 100644 --- a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala +++ b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala @@ -189,5 +189,4 @@ class ReTyper(nestingLevel: Int = 0) extends Typer(nestingLevel) with ReChecking override protected def checkEqualityEvidence(tree: tpd.Tree, pt: Type)(using Context): Unit = () override protected def matchingApply(methType: MethodOrPoly, pt: FunProto)(using Context): Boolean = true override protected def typedScala2MacroBody(call: untpd.Tree)(using Context): Tree = promote(call) - override protected def migrate[T](migration: => T, disabled: => T = ()): T = disabled } diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 5cb42f659551..1303b64cbd12 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -158,9 +158,6 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // Overridden in derived typers def newLikeThis(nestingLevel: Int): Typer = new Typer(nestingLevel) - // Overridden to do nothing in derived typers - protected def migrate[T](migration: => T, disabled: => T = ()): T = migration - /** Find the type of an identifier with given `name` in given context `ctx`. * @param name the name of the identifier * @param pt the expected type @@ -2982,7 +2979,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer } override def typedAsFunction(tree: untpd.PostfixOp, pt: Type)(using Context): Tree = - migrate(super.typedAsFunction(tree, pt), throw new AssertionError("can't retype a PostfixOp")) + migrate(super.typedAsFunction(tree, pt)) /** Translate infix operation expression `l op r` to * diff --git a/tests/neg/context-bounds-migration.scala b/tests/warn/context-bounds-migration.scala similarity index 61% rename from tests/neg/context-bounds-migration.scala rename to tests/warn/context-bounds-migration.scala index b27dc884692c..1094db68f41b 100644 --- a/tests/neg/context-bounds-migration.scala +++ b/tests/warn/context-bounds-migration.scala @@ -1,4 +1,4 @@ -//> using options -Xfatal-warnings +//> using options -source 3.4 class C[T] def foo[X: C] = () @@ -6,5 +6,5 @@ def foo[X: C] = () given [T]: C[T] = C[T]() def Test = - foo(C[Int]()) // error + foo(C[Int]()) // warn foo(using C[Int]()) // ok From 5342719fe6745ab2c745a56b2fa61b967af5196b Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 17 Dec 2023 13:45:43 +0100 Subject: [PATCH 153/371] Avoid generating given definitions that loop --- .../dotty/tools/dotc/typer/Implicits.scala | 73 ++++++++++++++++--- .../changed-features/implicit-resolution.md | 21 +++++- tests/neg/i15474.check | 6 ++ tests/neg/i15474.scala | 8 +- tests/neg/i6716.check | 6 ++ tests/neg/i6716.scala | 18 +++++ tests/pos/i15474.scala | 20 +++++ tests/pos/i6716.scala | 15 ---- tests/run/i17115.check | 2 + tests/{pos => run}/i17115.scala | 0 tests/run/i6716.check | 2 + tests/run/i6716.scala | 20 +++++ 12 files changed, 161 insertions(+), 30 deletions(-) create mode 100644 tests/neg/i15474.check create mode 100644 tests/neg/i6716.check create 
mode 100644 tests/neg/i6716.scala create mode 100644 tests/pos/i15474.scala delete mode 100644 tests/pos/i6716.scala create mode 100644 tests/run/i17115.check rename tests/{pos => run}/i17115.scala (100%) create mode 100644 tests/run/i6716.check create mode 100644 tests/run/i6716.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index ff23e8180f1c..bb35306f696c 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -26,8 +26,8 @@ import Scopes.newScope import Typer.BindingPrec, BindingPrec.* import Hashable.* import util.{EqHashMap, Stats} -import config.{Config, Feature} -import Feature.migrateTo3 +import config.{Config, Feature, SourceVersion} +import Feature.{migrateTo3, sourceVersion} import config.Printers.{implicits, implicitsDetailed} import collection.mutable import reporting.* @@ -324,7 +324,7 @@ object Implicits: /** Is this the outermost implicits? This is the case if it either the implicits * of NoContext, or the last one before it. */ - private def isOuterMost = { + private def isOutermost = { val finalImplicits = NoContext.implicits (this eq finalImplicits) || (outerImplicits eqn finalImplicits) } @@ -356,7 +356,7 @@ object Implicits: Stats.record("uncached eligible") if monitored then record(s"check uncached eligible refs in irefCtx", refs.length) val ownEligible = filterMatching(tp) - if isOuterMost then ownEligible + if isOutermost then ownEligible else combineEligibles(ownEligible, outerImplicits.nn.uncachedEligible(tp)) /** The implicit references that are eligible for type `tp`. */ @@ -383,7 +383,7 @@ object Implicits: private def computeEligible(tp: Type): List[Candidate] = /*>|>*/ trace(i"computeEligible $tp in $refs%, %", implicitsDetailed) /*<|<*/ { if (monitored) record(s"check eligible refs in irefCtx", refs.length) val ownEligible = filterMatching(tp) - if isOuterMost then ownEligible + if isOutermost then ownEligible else combineEligibles(ownEligible, outerImplicits.nn.eligible(tp)) } @@ -392,7 +392,7 @@ object Implicits: override def toString: String = { val own = i"(implicits: $refs%, %)" - if (isOuterMost) own else own + "\n " + outerImplicits + if (isOutermost) own else own + "\n " + outerImplicits } /** This context, or a copy, ensuring root import from symbol `root` @@ -1550,11 +1550,15 @@ trait Implicits: case _ => tp.isAny || tp.isAnyRef - private def searchImplicit(contextual: Boolean): SearchResult = + /** Search implicit in context `ctxImplicits` or else in implicit scope + * of expected type if `ctxImplicits == null`. + */ + private def searchImplicit(ctxImplicits: ContextualImplicits | Null): SearchResult = if isUnderspecified(wildProto) then SearchFailure(TooUnspecific(pt), span) else - val eligible = + val contextual = ctxImplicits != null + val preEligible = // the eligible candidates, ignoring positions if contextual then if ctx.gadt.isNarrowing then withoutMode(Mode.ImplicitsEnabled) { @@ -1562,6 +1566,43 @@ trait Implicits: } else ctx.implicits.eligible(wildProto) else implicitScope(wildProto).eligible + + /** Does candidate `cand` come too late for it to be considered as an + * eligible candidate? This is the case if `cand` appears in the same + * scope as a given definition enclosing the search point and comes + * later in the source or coincides with that given definition. 
+ */ + def comesTooLate(cand: Candidate): Boolean = + val candSym = cand.ref.symbol + def candSucceedsGiven(sym: Symbol): Boolean = + if sym.owner == candSym.owner then + if sym.is(ModuleClass) then candSucceedsGiven(sym.sourceModule) + else sym.is(Given) && sym.span.exists && sym.span.start <= candSym.span.start + else if sym.is(Package) then false + else candSucceedsGiven(sym.owner) + + ctx.isTyper + && !candSym.isOneOf(TermParamOrAccessor | Synthetic) + && candSym.span.exists + && candSucceedsGiven(ctx.owner) + end comesTooLate + + val eligible = if contextual then preEligible.filterNot(comesTooLate) else preEligible + + def checkResolutionChange(result: SearchResult) = result match + case result: SearchSuccess + if (eligible ne preEligible) && !sourceVersion.isAtLeast(SourceVersion.`future`) => + searchImplicit(preEligible.diff(eligible), contextual) match + case prevResult: SearchSuccess => + report.error( + em"""Warning: result of implicit search for $pt will change. + |current result: ${prevResult.ref.symbol.showLocated} + |result with -source future: ${result.ref.symbol.showLocated}""", + srcPos + ) + case _ => + case _ => + searchImplicit(eligible, contextual) match case result: SearchSuccess => result @@ -1570,14 +1611,24 @@ trait Implicits: case _: AmbiguousImplicits => failure case reason => if contextual then - searchImplicit(contextual = false).recoverWith { + // If we filtered out some candidates for being too late, we should + // do another contextual search further out, since the dropped candidates + // might have shadowed an eligible candidate in an outer level. + // Otherwise, proceed with a search of the implicit scope. + val newCtxImplicits = + if eligible eq preEligible then null + else ctxImplicits.nn.outerImplicits: ContextualImplicits | Null + // !!! Dotty problem: without the ContextualImplicits | Null type ascription + // we get a Ycheck failure after arrayConstructors due to "Types differ" + val result = searchImplicit(newCtxImplicits).recoverWith: failure2 => failure2.reason match case _: AmbiguousImplicits => failure2 case _ => reason match case (_: DivergingImplicit) => failure case _ => List(failure, failure2).maxBy(_.tree.treeSize) - } + checkResolutionChange(result) + result else failure end searchImplicit @@ -1595,7 +1646,7 @@ trait Implicits: case ref: TermRef => SearchSuccess(tpd.ref(ref).withSpan(span.startPos), ref, 0)(ctx.typerState, ctx.gadt) case _ => - searchImplicit(contextual = true) + searchImplicit(ctx.implicits) end bestImplicit def implicitScope(tp: Type): OfTypeImplicits = ctx.run.nn.implicitScope(tp) diff --git a/docs/_docs/reference/changed-features/implicit-resolution.md b/docs/_docs/reference/changed-features/implicit-resolution.md index 6a898690b565..ab8293724a4e 100644 --- a/docs/_docs/reference/changed-features/implicit-resolution.md +++ b/docs/_docs/reference/changed-features/implicit-resolution.md @@ -163,8 +163,27 @@ The new rules are as follows: An implicit `a` defined in `A` is more specific th Condition (*) is new. It is necessary to ensure that the defined relation is transitive. +[//]: # todo: expand with precise rules +**9.** Implicit resolution now tries to avoid recursive givens that can lead to an infinite loop at runtime. 
Here is an example: +```scala +object Prices { + opaque type Price = BigDecimal + object Price{ + given Ordering[Price] = summon[Ordering[BigDecimal]] // was error, now avoided + } +} +``` + +Previously, implicit resolution would resolve the `summon` to the given in `Price`, leading to an infinite loop (a warning was issued in that case). We now use the underlying given in `BigDecimal` instead. We achieve that by adding the following rule for implicit search: + + - When doing an implicit search while checking the implementation of a `given` definition `G`, discard all search results that lead back to `G` or to a given +with the same owner as `G` that comes later in the source than `G`. + +The new behavior is enabled under `-source future`. In earlier versions, a +warning is issued where that behavior will change. + +Old-style implicit definitions are unaffected by this change. -[//]: # todo: expand with precise rules diff --git a/tests/neg/i15474.check b/tests/neg/i15474.check new file mode 100644 index 000000000000..267a02a80786 --- /dev/null +++ b/tests/neg/i15474.check @@ -0,0 +1,6 @@ +-- Error: tests/neg/i15474.scala:16:56 --------------------------------------------------------------------------------- +16 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error + | ^ + | Warning: result of implicit search for Ordering[BigDecimal] will change. + | current result: given instance given_Ordering_Price in object Price + | result with -source future: object BigDecimal in object Ordering diff --git a/tests/neg/i15474.scala b/tests/neg/i15474.scala index 8edf97a1e55a..c5cf934bdd7a 100644 --- a/tests/neg/i15474.scala +++ b/tests/neg/i15474.scala @@ -4,10 +4,10 @@ import scala.language.implicitConversions object Test1: given c: Conversion[ String, Int ] with - def apply(from: String): Int = from.toInt // error + def apply(from: String): Int = from.toInt // was error, now avoided object Test2: - given c: Conversion[ String, Int ] = _.toInt // loop not detected, could be used as a fallback to avoid the warning. + given c: Conversion[ String, Int ] = _.toInt // now avoided, was loop not detected, could be used as a fallback to avoid the warning. object Prices { opaque type Price = BigDecimal @@ -15,4 +15,6 @@ object Prices { object Price{ given Ordering[Price] = summon[Ordering[BigDecimal]] // error } -} \ No newline at end of file +} + + diff --git a/tests/neg/i6716.check b/tests/neg/i6716.check new file mode 100644 index 000000000000..1e1359442bec --- /dev/null +++ b/tests/neg/i6716.check @@ -0,0 +1,6 @@ +-- Error: tests/neg/i6716.scala:12:39 ---------------------------------------------------------------------------------- +12 | given Monad[Bar] = summon[Monad[Foo]] // error + | ^ + | Warning: result of implicit search for Monad[Foo] will change. 
+ | current result: given instance given_Monad_Bar in object Bar + | result with -source future: object given_Monad_Foo in object Foo diff --git a/tests/neg/i6716.scala b/tests/neg/i6716.scala new file mode 100644 index 000000000000..bbbd9d6d6cd0 --- /dev/null +++ b/tests/neg/i6716.scala @@ -0,0 +1,18 @@ +//> using options -Xfatal-warnings + +trait Monad[T]: + def id: String +class Foo +object Foo { + given Monad[Foo] with { def id = "Foo" } +} + +opaque type Bar = Foo +object Bar { + given Monad[Bar] = summon[Monad[Foo]] // error +} + +object Test extends App { + println(summon[Monad[Foo]].id) + println(summon[Monad[Bar]].id) +} \ No newline at end of file diff --git a/tests/pos/i15474.scala b/tests/pos/i15474.scala new file mode 100644 index 000000000000..e40e11d84581 --- /dev/null +++ b/tests/pos/i15474.scala @@ -0,0 +1,20 @@ +//> using options -Xfatal-warnings +import scala.language.implicitConversions +import language.future + +object Test1: + given c: Conversion[ String, Int ] with + def apply(from: String): Int = from.toInt // was error, now avoided + +object Test2: + given c: Conversion[ String, Int ] = _.toInt // now avoided, was loop not detected, could be used as a fallback to avoid the warning. + +object Prices { + opaque type Price = BigDecimal + + object Price{ + given Ordering[Price] = summon[Ordering[BigDecimal]] // was error, now avoided + } +} + + diff --git a/tests/pos/i6716.scala b/tests/pos/i6716.scala deleted file mode 100644 index 446cd49c9214..000000000000 --- a/tests/pos/i6716.scala +++ /dev/null @@ -1,15 +0,0 @@ -trait Monad[T] -class Foo -object Foo { - given Monad[Foo] with {} -} - -opaque type Bar = Foo -object Bar { - given Monad[Bar] = summon[Monad[Foo]] -} - -object Test { - val mf = summon[Monad[Foo]] - val mb = summon[Monad[Bar]] -} \ No newline at end of file diff --git a/tests/run/i17115.check b/tests/run/i17115.check new file mode 100644 index 000000000000..61c83cba41ce --- /dev/null +++ b/tests/run/i17115.check @@ -0,0 +1,2 @@ +4 +5 diff --git a/tests/pos/i17115.scala b/tests/run/i17115.scala similarity index 100% rename from tests/pos/i17115.scala rename to tests/run/i17115.scala diff --git a/tests/run/i6716.check b/tests/run/i6716.check new file mode 100644 index 000000000000..bb85bd267288 --- /dev/null +++ b/tests/run/i6716.check @@ -0,0 +1,2 @@ +Foo +Foo diff --git a/tests/run/i6716.scala b/tests/run/i6716.scala new file mode 100644 index 000000000000..7c4e7fe394d8 --- /dev/null +++ b/tests/run/i6716.scala @@ -0,0 +1,20 @@ +//> using options -Xfatal-warnings + +import language.future + +trait Monad[T]: + def id: String +class Foo +object Foo { + given Monad[Foo] with { def id = "Foo" } +} + +opaque type Bar = Foo +object Bar { + given Monad[Bar] = summon[Monad[Foo]] // error +} + +object Test extends App { + println(summon[Monad[Foo]].id) + println(summon[Monad[Bar]].id) +} \ No newline at end of file From 4547e1b82c6ae04fce01e173983ef32bdc3018ab Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 18 Dec 2023 14:10:56 +0100 Subject: [PATCH 154/371] Use experimental language import to enable the new behavior --- .../src/dotty/tools/dotc/config/Feature.scala | 1 + .../dotty/tools/dotc/semanticdb/Scala3.scala | 3 ++ .../dotty/tools/dotc/typer/Implicits.scala | 47 ++++++++++++++----- .../quoted/runtime/impl/QuotesImpl.scala | 4 ++ .../runtime/stdLibPatches/language.scala | 9 ++++ tests/neg/i15474.check | 12 +++-- tests/neg/i6716.check | 10 +++- tests/neg/i7294-a.check | 23 +++++++++ tests/neg/i7294-a.scala | 2 +- tests/neg/i7294-b.scala | 2 +- 
tests/pos/i15474.scala | 2 +- tests/run/i6716.scala | 4 +- 12 files changed, 96 insertions(+), 23 deletions(-) create mode 100644 tests/neg/i7294-a.check diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index fa262a5880ff..e8ca30ecb243 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -33,6 +33,7 @@ object Feature: val pureFunctions = experimental("pureFunctions") val captureChecking = experimental("captureChecking") val into = experimental("into") + val avoidLoopingGivens = experimental("avoidLoopingGivens") val globalOnlyImports: Set[TermName] = Set(pureFunctions, captureChecking) diff --git a/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala b/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala index f49b00089712..fdb9251951e5 100644 --- a/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala +++ b/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala @@ -77,6 +77,9 @@ object Scala3: type SemanticSymbol = Symbol | FakeSymbol given SemanticSymbolOps : AnyRef with + import SymbolOps.* + import StringOps.* + extension (sym: SemanticSymbol) def name(using Context): Name = sym match case s: Symbol => s.name diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index bb35306f696c..7cdc3d4a2508 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1589,19 +1589,40 @@ trait Implicits: val eligible = if contextual then preEligible.filterNot(comesTooLate) else preEligible - def checkResolutionChange(result: SearchResult) = result match - case result: SearchSuccess - if (eligible ne preEligible) && !sourceVersion.isAtLeast(SourceVersion.`future`) => - searchImplicit(preEligible.diff(eligible), contextual) match - case prevResult: SearchSuccess => - report.error( - em"""Warning: result of implicit search for $pt will change. - |current result: ${prevResult.ref.symbol.showLocated} - |result with -source future: ${result.ref.symbol.showLocated}""", - srcPos - ) - case _ => - case _ => + def checkResolutionChange(result: SearchResult) = + if (eligible ne preEligible) + && !Feature.enabled(Feature.avoidLoopingGivens) + then searchImplicit(preEligible.diff(eligible), contextual) match + case prevResult: SearchSuccess => + def remedy = pt match + case _: SelectionProto => + "conversion,\n - use an import to get extension method into scope" + case _: ViewProto => + "conversion" + case _ => + "argument" + + def showResult(r: SearchResult) = r match + case r: SearchSuccess => ctx.printer.toTextRef(r.ref).show + case r => r.show + + result match + case result: SearchSuccess if prevResult.ref frozen_=:= result.ref => + // OK + case _ => + report.error( + em"""Warning: result of implicit search for $pt will change. + |Current result ${showResult(prevResult)} will be no longer eligible + | because it is not defined before the search position. + |Result with new rules: ${showResult(result)}. 
+ |To opt into the new rules, use the `avoidLoopingGivens` language import, + | + |To fix the problem you could try one of the following: + | - rearrange definitions, + | - use an explicit $remedy.""", + srcPos) + case _ => + end checkResolutionChange searchImplicit(eligible, contextual) match case result: SearchSuccess => diff --git a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala index 51f133c972b4..1203e309c484 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala @@ -1781,6 +1781,8 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler end TypeRepr given TypeReprMethods: TypeReprMethods with + import SymbolMethods.* + extension (self: TypeRepr) def show(using printer: Printer[TypeRepr]): String = printer.show(self) @@ -2608,6 +2610,8 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler end Symbol given SymbolMethods: SymbolMethods with + import FlagsMethods.* + extension (self: Symbol) def owner: Symbol = self.denot.owner def maybeOwner: Symbol = self.denot.maybeOwner diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index c2a12cec2ecc..9fa8bff120af 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -91,6 +91,15 @@ object language: @compileTimeOnly("`into` can only be used at compile time in import statements") object into + /** Experimental support for new given resolution rules that avoid looping + * givens. By the new rules, a given may not implicitly use itself or givens + * defined after it. + * + * @see [[https://dotty.epfl.ch/docs/reference/experimental/avoid-looping-givens]] + */ + @compileTimeOnly("`avoidLoopingGivens` can only be used at compile time in import statements") + object avoidLoopingGivens + /** Was needed to add support for relaxed imports of extension methods. * The language import is no longer needed as this is now a standard feature since SIP was accepted. * @see [[http://dotty.epfl.ch/docs/reference/contextual/extension-methods]] diff --git a/tests/neg/i15474.check b/tests/neg/i15474.check index 267a02a80786..4bf344dc5a71 100644 --- a/tests/neg/i15474.check +++ b/tests/neg/i15474.check @@ -1,6 +1,12 @@ -- Error: tests/neg/i15474.scala:16:56 --------------------------------------------------------------------------------- 16 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error | ^ - | Warning: result of implicit search for Ordering[BigDecimal] will change. - | current result: given instance given_Ordering_Price in object Price - | result with -source future: object BigDecimal in object Ordering + | Warning: result of implicit search for Ordering[BigDecimal] will change. + | Current result Prices.Price.given_Ordering_Price will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: scala.math.Ordering.BigDecimal. + | To opt into the new rules, use the `avoidLoopingGivens` language import, + | + | To fix the problem you could try one of the following: + | - rearrange definitions, + | - use an explicit argument. 
diff --git a/tests/neg/i6716.check b/tests/neg/i6716.check index 1e1359442bec..3746eaafad50 100644 --- a/tests/neg/i6716.check +++ b/tests/neg/i6716.check @@ -2,5 +2,11 @@ 12 | given Monad[Bar] = summon[Monad[Foo]] // error | ^ | Warning: result of implicit search for Monad[Foo] will change. - | current result: given instance given_Monad_Bar in object Bar - | result with -source future: object given_Monad_Foo in object Foo + | Current result Bar.given_Monad_Bar will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: Foo.given_Monad_Foo. + | To opt into the new rules, use the `avoidLoopingGivens` language import, + | + | To fix the problem you could try one of the following: + | - rearrange definitions, + | - use an explicit argument. diff --git a/tests/neg/i7294-a.check b/tests/neg/i7294-a.check new file mode 100644 index 000000000000..9541f7979a7a --- /dev/null +++ b/tests/neg/i7294-a.check @@ -0,0 +1,23 @@ +-- Error: tests/neg/i7294-a.scala:6:10 --------------------------------------------------------------------------------- +6 | case x: T => x.g(10) // error // error + | ^ + | Warning: result of implicit search for scala.reflect.TypeTest[Nothing, T] will change. + | Current result foo.i7294-a$package.f will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: No Matching Implicit. + | To opt into the new rules, use the `avoidLoopingGivens` language import, + | + | To fix the problem you could try one of the following: + | - rearrange definitions, + | - use an explicit argument. + | + | where: T is a type in given instance f with bounds <: foo.Foo +-- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:6:15 ------------------------------------------------------------ +6 | case x: T => x.g(10) // error // error + | ^ + | Found: (x : Nothing) + | Required: ?{ g: ? } + | Note that implicit conversions were not tried because the result of an implicit conversion + | must be more specific than ?{ g: [applied to (10) returning T] } + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i7294-a.scala b/tests/neg/i7294-a.scala index 13981fa4d375..538dc3159fb8 100644 --- a/tests/neg/i7294-a.scala +++ b/tests/neg/i7294-a.scala @@ -3,7 +3,7 @@ package foo trait Foo { def g(x: Int): Any } inline given f[T <: Foo]: T = ??? match { - case x: T => x.g(10) // error + case x: T => x.g(10) // error // error } @main def Test = f diff --git a/tests/neg/i7294-b.scala b/tests/neg/i7294-b.scala index 423d5037db96..b06d814444e8 100644 --- a/tests/neg/i7294-b.scala +++ b/tests/neg/i7294-b.scala @@ -3,7 +3,7 @@ package foo trait Foo { def g(x: Any): Any } inline given f[T <: Foo]: T = ??? 
match { - case x: T => x.g(10) // error + case x: T => x.g(10) // error // error } @main def Test = f diff --git a/tests/pos/i15474.scala b/tests/pos/i15474.scala index e40e11d84581..8adc5ad7233d 100644 --- a/tests/pos/i15474.scala +++ b/tests/pos/i15474.scala @@ -1,6 +1,6 @@ //> using options -Xfatal-warnings import scala.language.implicitConversions -import language.future +import scala.language.experimental.avoidLoopingGivens object Test1: given c: Conversion[ String, Int ] with diff --git a/tests/run/i6716.scala b/tests/run/i6716.scala index 7c4e7fe394d8..6208a52190fe 100644 --- a/tests/run/i6716.scala +++ b/tests/run/i6716.scala @@ -1,6 +1,6 @@ //> using options -Xfatal-warnings -import language.future +import scala.language.experimental.avoidLoopingGivens trait Monad[T]: def id: String @@ -11,7 +11,7 @@ object Foo { opaque type Bar = Foo object Bar { - given Monad[Bar] = summon[Monad[Foo]] // error + given Monad[Bar] = summon[Monad[Foo]] // was error fixed by avoidLoopingGivens } object Test extends App { From 4bb0aaaf39d3598ce0f443188a4c1f6b6c42fc10 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 18 Dec 2023 17:52:30 +0100 Subject: [PATCH 155/371] Make the new behavior dependent on source >= 3.4 --- .../dotty/tools/dotc/typer/Implicits.scala | 91 ++++++++++--------- tests/neg/i15474.check | 42 +++++++-- tests/neg/i15474.scala | 4 +- tests/neg/i6716.check | 16 ++-- tests/neg/i7294-a.check | 18 ++-- tests/neg/looping-givens.scala | 9 ++ tests/pos/looping-givens.scala | 11 +++ 7 files changed, 122 insertions(+), 69 deletions(-) create mode 100644 tests/neg/looping-givens.scala create mode 100644 tests/pos/looping-givens.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 7cdc3d4a2508..77328cdbb33d 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1587,46 +1587,54 @@ trait Implicits: && candSucceedsGiven(ctx.owner) end comesTooLate - val eligible = if contextual then preEligible.filterNot(comesTooLate) else preEligible + val eligible = // the eligible candidates that come before the search point + if contextual && sourceVersion.isAtLeast(SourceVersion.`3.4`) + then preEligible.filterNot(comesTooLate) + else preEligible def checkResolutionChange(result: SearchResult) = if (eligible ne preEligible) && !Feature.enabled(Feature.avoidLoopingGivens) - then searchImplicit(preEligible.diff(eligible), contextual) match - case prevResult: SearchSuccess => - def remedy = pt match - case _: SelectionProto => - "conversion,\n - use an import to get extension method into scope" - case _: ViewProto => - "conversion" - case _ => - "argument" - - def showResult(r: SearchResult) = r match - case r: SearchSuccess => ctx.printer.toTextRef(r.ref).show - case r => r.show - - result match - case result: SearchSuccess if prevResult.ref frozen_=:= result.ref => - // OK - case _ => - report.error( - em"""Warning: result of implicit search for $pt will change. - |Current result ${showResult(prevResult)} will be no longer eligible - | because it is not defined before the search position. - |Result with new rules: ${showResult(result)}. 
- |To opt into the new rules, use the `avoidLoopingGivens` language import, - | - |To fix the problem you could try one of the following: - | - rearrange definitions, - | - use an explicit $remedy.""", - srcPos) - case _ => + then + val prevResult = searchImplicit(preEligible, contextual) + prevResult match + case prevResult: SearchSuccess => + def remedy = pt match + case _: SelectionProto => + "conversion,\n - use an import to get extension method into scope" + case _: ViewProto => + "conversion" + case _ => + "argument" + + def showResult(r: SearchResult) = r match + case r: SearchSuccess => ctx.printer.toTextRef(r.ref).show + case r => r.show + + result match + case result: SearchSuccess if prevResult.ref frozen_=:= result.ref => + // OK + case _ => + report.error( + em"""Warning: result of implicit search for $pt will change. + |Current result ${showResult(prevResult)} will be no longer eligible + | because it is not defined before the search position. + |Result with new rules: ${showResult(result)}. + |To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + | + |To fix the problem without the language import, you could try one of the following: + | - rearrange definitions so that ${showResult(prevResult)} comes earlier, + | - use an explicit $remedy.""", + srcPos) + case _ => + prevResult + else result end checkResolutionChange - searchImplicit(eligible, contextual) match + val result = searchImplicit(eligible, contextual) + result match case result: SearchSuccess => - result + checkResolutionChange(result) case failure: SearchFailure => failure.reason match case _: AmbiguousImplicits => failure @@ -1641,15 +1649,14 @@ trait Implicits: else ctxImplicits.nn.outerImplicits: ContextualImplicits | Null // !!! Dotty problem: without the ContextualImplicits | Null type ascription // we get a Ycheck failure after arrayConstructors due to "Types differ" - val result = searchImplicit(newCtxImplicits).recoverWith: - failure2 => failure2.reason match - case _: AmbiguousImplicits => failure2 - case _ => - reason match - case (_: DivergingImplicit) => failure - case _ => List(failure, failure2).maxBy(_.tree.treeSize) - checkResolutionChange(result) - result + checkResolutionChange: + searchImplicit(newCtxImplicits).recoverWith: + failure2 => failure2.reason match + case _: AmbiguousImplicits => failure2 + case _ => + reason match + case (_: DivergingImplicit) => failure + case _ => List(failure, failure2).maxBy(_.tree.treeSize) else failure end searchImplicit diff --git a/tests/neg/i15474.check b/tests/neg/i15474.check index 4bf344dc5a71..0b23628d3051 100644 --- a/tests/neg/i15474.check +++ b/tests/neg/i15474.check @@ -1,12 +1,38 @@ +-- Error: tests/neg/i15474.scala:7:35 ---------------------------------------------------------------------------------- +7 | def apply(from: String): Int = from.toInt // error + | ^^^^ + | Warning: result of implicit search for ?{ toInt: ? } will change. + | Current result Test1.c will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: augmentString. + | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - rearrange definitions so that Test1.c comes earlier, + | - use an explicit conversion, + | - use an import to get extension method into scope. 
+-- Error: tests/neg/i15474.scala:10:39 --------------------------------------------------------------------------------- +10 | given c: Conversion[ String, Int ] = _.toInt // error + | ^ + | Warning: result of implicit search for ?{ toInt: ? } will change. + | Current result Test2.c will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: augmentString. + | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - rearrange definitions so that Test2.c comes earlier, + | - use an explicit conversion, + | - use an import to get extension method into scope. -- Error: tests/neg/i15474.scala:16:56 --------------------------------------------------------------------------------- 16 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error | ^ - | Warning: result of implicit search for Ordering[BigDecimal] will change. - | Current result Prices.Price.given_Ordering_Price will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: scala.math.Ordering.BigDecimal. - | To opt into the new rules, use the `avoidLoopingGivens` language import, + | Warning: result of implicit search for Ordering[BigDecimal] will change. + | Current result Prices.Price.given_Ordering_Price will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: scala.math.Ordering.BigDecimal. + | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. | - | To fix the problem you could try one of the following: - | - rearrange definitions, - | - use an explicit argument. + | To fix the problem without the language import, you could try one of the following: + | - rearrange definitions so that Prices.Price.given_Ordering_Price comes earlier, + | - use an explicit argument. diff --git a/tests/neg/i15474.scala b/tests/neg/i15474.scala index c5cf934bdd7a..5a66ea016630 100644 --- a/tests/neg/i15474.scala +++ b/tests/neg/i15474.scala @@ -4,10 +4,10 @@ import scala.language.implicitConversions object Test1: given c: Conversion[ String, Int ] with - def apply(from: String): Int = from.toInt // was error, now avoided + def apply(from: String): Int = from.toInt // error object Test2: - given c: Conversion[ String, Int ] = _.toInt // now avoided, was loop not detected, could be used as a fallback to avoid the warning. + given c: Conversion[ String, Int ] = _.toInt // error object Prices { opaque type Price = BigDecimal diff --git a/tests/neg/i6716.check b/tests/neg/i6716.check index 3746eaafad50..6771d736b6af 100644 --- a/tests/neg/i6716.check +++ b/tests/neg/i6716.check @@ -1,12 +1,12 @@ -- Error: tests/neg/i6716.scala:12:39 ---------------------------------------------------------------------------------- 12 | given Monad[Bar] = summon[Monad[Foo]] // error | ^ - | Warning: result of implicit search for Monad[Foo] will change. - | Current result Bar.given_Monad_Bar will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: Foo.given_Monad_Foo. - | To opt into the new rules, use the `avoidLoopingGivens` language import, + | Warning: result of implicit search for Monad[Foo] will change. + | Current result Bar.given_Monad_Bar will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: Foo.given_Monad_Foo. 
+ | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. | - | To fix the problem you could try one of the following: - | - rearrange definitions, - | - use an explicit argument. + | To fix the problem without the language import, you could try one of the following: + | - rearrange definitions so that Bar.given_Monad_Bar comes earlier, + | - use an explicit argument. diff --git a/tests/neg/i7294-a.check b/tests/neg/i7294-a.check index 9541f7979a7a..887635d89a35 100644 --- a/tests/neg/i7294-a.check +++ b/tests/neg/i7294-a.check @@ -5,19 +5,19 @@ | Current result foo.i7294-a$package.f will be no longer eligible | because it is not defined before the search position. | Result with new rules: No Matching Implicit. - | To opt into the new rules, use the `avoidLoopingGivens` language import, + | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. | - | To fix the problem you could try one of the following: - | - rearrange definitions, + | To fix the problem without the language import, you could try one of the following: + | - rearrange definitions so that foo.i7294-a$package.f comes earlier, | - use an explicit argument. | | where: T is a type in given instance f with bounds <: foo.Foo --- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:6:15 ------------------------------------------------------------ +-- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:6:18 ------------------------------------------------------------ 6 | case x: T => x.g(10) // error // error - | ^ - | Found: (x : Nothing) - | Required: ?{ g: ? } - | Note that implicit conversions were not tried because the result of an implicit conversion - | must be more specific than ?{ g: [applied to (10) returning T] } + | ^^^^^^^ + | Found: Any + | Required: T + | + | where: T is a type in given instance f with bounds <: foo.Foo | | longer explanation available when compiling with `-explain` diff --git a/tests/neg/looping-givens.scala b/tests/neg/looping-givens.scala new file mode 100644 index 000000000000..572f1707861f --- /dev/null +++ b/tests/neg/looping-givens.scala @@ -0,0 +1,9 @@ +class A +class B + +given joint(using a: A, b: B): (A & B) = ??? + +def foo(using a: A, b: B) = + given aa: A = summon // error + given bb: B = summon // error + given ab: (A & B) = summon // error diff --git a/tests/pos/looping-givens.scala b/tests/pos/looping-givens.scala new file mode 100644 index 000000000000..ed11981c1bf6 --- /dev/null +++ b/tests/pos/looping-givens.scala @@ -0,0 +1,11 @@ +import language.experimental.avoidLoopingGivens + +class A +class B + +given joint(using a: A, b: B): (A & B) = ??? + +def foo(using a: A, b: B) = + given aa: A = summon // error + given bb: B = summon // error + given ab: (A & B) = summon // error From a48a759f00ffe4864b20b2f8b9b70288b8e3fb0a Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 18 Dec 2023 21:51:19 +0100 Subject: [PATCH 156/371] Restrict the new rules to givens that don't define a new class `given ... with` or `given ... = new { ... }` kinds of definitions now follow the old rules. This allows recursive `given...with` definitions as they are found in protoQuill. We still have the old check in a later phase against directly recursive methods. Of the three loops in the original i15474 we now detect #2 and #3 with new new restrictions. #1 slips through since it is a loop involving a `given...with` instance of `Conversion`, but is caught later with the recursive method check. 
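(Editorial aside, not part of the original commit message: the three loops referred to above come from the i15474 test sources touched by this patch. Below is a rough Scala sketch of what #1, #2 and #3 look like; the object names Loop1/Loop2/Loop3 are illustrative only. Each definition compiles but is flagged by the compiler, which is the point of this change.)

```scala
import scala.language.implicitConversions

// #1: a `given ... with` instance of Conversion. It slips through the new
// rules but is still caught later by the recursive-method check.
object Loop1:
  given c: Conversion[String, Int] with
    def apply(from: String): Int = from.toInt

// #2: a simple alias given. Now detected by the new rules.
object Loop2:
  given c: Conversion[String, Int] = _.toInt

// #3: an alias given that resolves back to itself through an opaque type.
// Also detected by the new rules.
object Loop3:
  opaque type Price = BigDecimal
  object Price:
    given Ordering[Price] = summon[Ordering[BigDecimal]]
```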
Previously tests #1 and #3 were detected with the recursive methods check and #2 slipped through altogether. The new rules are enough for defining simple givens with `=` without fear of looping. --- .../dotty/tools/dotc/semanticdb/Scala3.scala | 3 --- .../dotty/tools/dotc/typer/Implicits.scala | 8 +++--- .../quoted/runtime/impl/QuotesImpl.scala | 4 --- tests/neg/i15474.check | 27 +++++-------------- tests/neg/i15474.scala | 4 --- tests/neg/i15474b.check | 5 ++++ tests/neg/i15474b.scala | 8 ++++++ tests/pos/i15474.scala | 4 --- 8 files changed, 25 insertions(+), 38 deletions(-) create mode 100644 tests/neg/i15474b.check create mode 100644 tests/neg/i15474b.scala diff --git a/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala b/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala index fdb9251951e5..f49b00089712 100644 --- a/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala +++ b/compiler/src/dotty/tools/dotc/semanticdb/Scala3.scala @@ -77,9 +77,6 @@ object Scala3: type SemanticSymbol = Symbol | FakeSymbol given SemanticSymbolOps : AnyRef with - import SymbolOps.* - import StringOps.* - extension (sym: SemanticSymbol) def name(using Context): Name = sym match case s: Symbol => s.name diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 77328cdbb33d..3d3320eb7589 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1569,8 +1569,10 @@ trait Implicits: /** Does candidate `cand` come too late for it to be considered as an * eligible candidate? This is the case if `cand` appears in the same - * scope as a given definition enclosing the search point and comes - * later in the source or coincides with that given definition. + * scope as a given definition enclosing the search point (with no + * class methods between the given definition and the search point) + * and `cand` comes later in the source or coincides with that given + * definition. 
*/ def comesTooLate(cand: Candidate): Boolean = val candSym = cand.ref.symbol @@ -1578,7 +1580,7 @@ trait Implicits: if sym.owner == candSym.owner then if sym.is(ModuleClass) then candSucceedsGiven(sym.sourceModule) else sym.is(Given) && sym.span.exists && sym.span.start <= candSym.span.start - else if sym.is(Package) then false + else if sym.owner.isClass then false else candSucceedsGiven(sym.owner) ctx.isTyper diff --git a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala index 1203e309c484..51f133c972b4 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala @@ -1781,8 +1781,6 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler end TypeRepr given TypeReprMethods: TypeReprMethods with - import SymbolMethods.* - extension (self: TypeRepr) def show(using printer: Printer[TypeRepr]): String = printer.show(self) @@ -2610,8 +2608,6 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler end Symbol given SymbolMethods: SymbolMethods with - import FlagsMethods.* - extension (self: Symbol) def owner: Symbol = self.denot.owner def maybeOwner: Symbol = self.denot.maybeOwner diff --git a/tests/neg/i15474.check b/tests/neg/i15474.check index 0b23628d3051..f99c6778d1ae 100644 --- a/tests/neg/i15474.check +++ b/tests/neg/i15474.check @@ -1,31 +1,18 @@ --- Error: tests/neg/i15474.scala:7:35 ---------------------------------------------------------------------------------- -7 | def apply(from: String): Int = from.toInt // error - | ^^^^ +-- Error: tests/neg/i15474.scala:6:39 ---------------------------------------------------------------------------------- +6 | given c: Conversion[ String, Int ] = _.toInt // error + | ^ | Warning: result of implicit search for ?{ toInt: ? } will change. - | Current result Test1.c will be no longer eligible + | Current result Test2.c will be no longer eligible | because it is not defined before the search position. | Result with new rules: augmentString. | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. | | To fix the problem without the language import, you could try one of the following: - | - rearrange definitions so that Test1.c comes earlier, + | - rearrange definitions so that Test2.c comes earlier, | - use an explicit conversion, | - use an import to get extension method into scope. --- Error: tests/neg/i15474.scala:10:39 --------------------------------------------------------------------------------- -10 | given c: Conversion[ String, Int ] = _.toInt // error - | ^ - | Warning: result of implicit search for ?{ toInt: ? } will change. - | Current result Test2.c will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: augmentString. - | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. - | - | To fix the problem without the language import, you could try one of the following: - | - rearrange definitions so that Test2.c comes earlier, - | - use an explicit conversion, - | - use an import to get extension method into scope. 
--- Error: tests/neg/i15474.scala:16:56 --------------------------------------------------------------------------------- -16 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error +-- Error: tests/neg/i15474.scala:12:56 --------------------------------------------------------------------------------- +12 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error | ^ | Warning: result of implicit search for Ordering[BigDecimal] will change. | Current result Prices.Price.given_Ordering_Price will be no longer eligible diff --git a/tests/neg/i15474.scala b/tests/neg/i15474.scala index 5a66ea016630..b196d1b400ef 100644 --- a/tests/neg/i15474.scala +++ b/tests/neg/i15474.scala @@ -2,10 +2,6 @@ import scala.language.implicitConversions -object Test1: - given c: Conversion[ String, Int ] with - def apply(from: String): Int = from.toInt // error - object Test2: given c: Conversion[ String, Int ] = _.toInt // error diff --git a/tests/neg/i15474b.check b/tests/neg/i15474b.check new file mode 100644 index 000000000000..73ef720af7e3 --- /dev/null +++ b/tests/neg/i15474b.check @@ -0,0 +1,5 @@ +-- Error: tests/neg/i15474b.scala:7:40 --------------------------------------------------------------------------------- +7 | def apply(from: String): Int = from.toInt // error: infinite loop in function body + | ^^^^^^^^^^ + | Infinite loop in function body + | Test1.c.apply(from).toInt diff --git a/tests/neg/i15474b.scala b/tests/neg/i15474b.scala new file mode 100644 index 000000000000..9d496c37ef00 --- /dev/null +++ b/tests/neg/i15474b.scala @@ -0,0 +1,8 @@ +//> using options -Xfatal-warnings + +import scala.language.implicitConversions + +object Test1: + given c: Conversion[ String, Int ] with + def apply(from: String): Int = from.toInt // error: infinite loop in function body + diff --git a/tests/pos/i15474.scala b/tests/pos/i15474.scala index 8adc5ad7233d..6b9e55806ae3 100644 --- a/tests/pos/i15474.scala +++ b/tests/pos/i15474.scala @@ -2,10 +2,6 @@ import scala.language.implicitConversions import scala.language.experimental.avoidLoopingGivens -object Test1: - given c: Conversion[ String, Int ] with - def apply(from: String): Int = from.toInt // was error, now avoided - object Test2: given c: Conversion[ String, Int ] = _.toInt // now avoided, was loop not detected, could be used as a fallback to avoid the warning. 
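(Editorial note between patches, not part of the series itself: the warning text above suggests rearranging definitions or supplying an explicit argument. Here is a minimal sketch of the explicit-argument remedy for the `Ordering[Price]` loop, assuming only the standard library; the `Prices` names mirror the test case.)

```scala
object Prices:
  opaque type Price = BigDecimal
  object Price:
    // Name the underlying instance directly instead of summoning it, so no
    // implicit search is triggered inside this given's right-hand side.
    given Ordering[Price] = Ordering.BigDecimal
```

The other suggested remedy, rearranging definitions, amounts to moving the given you actually want ahead of the one being defined, so that it stays eligible under the new rules.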
From b0fecc5be60b2b090c51622bc06cdeb727cd2a5f Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 18 Dec 2023 22:58:34 +0100 Subject: [PATCH 157/371] Add tweaks and docs --- .../src/dotty/tools/dotc/config/Feature.scala | 2 +- .../tools/dotc/config/SourceVersion.scala | 1 + .../dotty/tools/dotc/typer/Implicits.scala | 32 ++++++++-------- .../changed-features/implicit-resolution.md | 23 ----------- .../experimental/given-loop-prevention.md | 31 +++++++++++++++ docs/sidebar.yml | 1 + .../runtime/stdLibPatches/language.scala | 6 +-- tests/neg/i15474.check | 38 ++++++++++--------- tests/neg/i6716.check | 18 +++++---- tests/neg/i7294-a.check | 28 +++++++------- tests/neg/i7294-a.scala | 2 + tests/neg/i7294-b.scala | 2 + tests/neg/looping-givens.scala | 2 + tests/pos/i15474.scala | 2 +- tests/pos/looping-givens.scala | 2 +- tests/run/i6716.scala | 4 +- 16 files changed, 110 insertions(+), 84 deletions(-) create mode 100644 docs/_docs/reference/experimental/given-loop-prevention.md diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index e8ca30ecb243..cdd83b15f4fc 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -33,7 +33,7 @@ object Feature: val pureFunctions = experimental("pureFunctions") val captureChecking = experimental("captureChecking") val into = experimental("into") - val avoidLoopingGivens = experimental("avoidLoopingGivens") + val givenLoopPrevention = experimental("givenLoopPrevention") val globalOnlyImports: Set[TermName] = Set(pureFunctions, captureChecking) diff --git a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala index f6db0bac0452..33b946ed173f 100644 --- a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala @@ -10,6 +10,7 @@ enum SourceVersion: case `3.2-migration`, `3.2` case `3.3-migration`, `3.3` case `3.4-migration`, `3.4` + case `3.5-migration`, `3.5` // !!! Keep in sync with scala.runtime.stdlibPatches.language !!! case `future-migration`, `future` diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 3d3320eb7589..1672c94fd969 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1569,19 +1569,18 @@ trait Implicits: /** Does candidate `cand` come too late for it to be considered as an * eligible candidate? This is the case if `cand` appears in the same - * scope as a given definition enclosing the search point (with no - * class methods between the given definition and the search point) - * and `cand` comes later in the source or coincides with that given - * definition. + * scope as a given definition of the form `given ... = ...` that + * encloses the search point and `cand` comes later in the source or + * coincides with that given definition. 
*/ def comesTooLate(cand: Candidate): Boolean = val candSym = cand.ref.symbol def candSucceedsGiven(sym: Symbol): Boolean = - if sym.owner == candSym.owner then - if sym.is(ModuleClass) then candSucceedsGiven(sym.sourceModule) - else sym.is(Given) && sym.span.exists && sym.span.start <= candSym.span.start - else if sym.owner.isClass then false - else candSucceedsGiven(sym.owner) + val owner = sym.owner + if owner == candSym.owner then + sym.is(GivenVal) && sym.span.exists && sym.span.start <= candSym.span.start + else if owner.isClass then false + else candSucceedsGiven(owner) ctx.isTyper && !candSym.isOneOf(TermParamOrAccessor | Synthetic) @@ -1596,7 +1595,7 @@ trait Implicits: def checkResolutionChange(result: SearchResult) = if (eligible ne preEligible) - && !Feature.enabled(Feature.avoidLoopingGivens) + && !Feature.enabled(Feature.givenLoopPrevention) then val prevResult = searchImplicit(preEligible, contextual) prevResult match @@ -1617,17 +1616,20 @@ trait Implicits: case result: SearchSuccess if prevResult.ref frozen_=:= result.ref => // OK case _ => - report.error( - em"""Warning: result of implicit search for $pt will change. + val msg = + em"""Result of implicit search for $pt will change. |Current result ${showResult(prevResult)} will be no longer eligible | because it is not defined before the search position. |Result with new rules: ${showResult(result)}. - |To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + |To opt into the new rules, use the `experimental.givenLoopPrevention` language import. | |To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, | - rearrange definitions so that ${showResult(prevResult)} comes earlier, - | - use an explicit $remedy.""", - srcPos) + | - use an explicit $remedy.""" + if sourceVersion.isAtLeast(SourceVersion.`3.5`) + then report.error(msg, srcPos) + else report.warning(msg.append("\nThis will be an error in Scala 3.5 and later."), srcPos) case _ => prevResult else result diff --git a/docs/_docs/reference/changed-features/implicit-resolution.md b/docs/_docs/reference/changed-features/implicit-resolution.md index ab8293724a4e..861a63bd4a05 100644 --- a/docs/_docs/reference/changed-features/implicit-resolution.md +++ b/docs/_docs/reference/changed-features/implicit-resolution.md @@ -164,26 +164,3 @@ The new rules are as follows: An implicit `a` defined in `A` is more specific th Condition (*) is new. It is necessary to ensure that the defined relation is transitive. [//]: # todo: expand with precise rules - -**9.** Implicit resolution now tries to avoid recursive givens that can lead to an infinite loop at runtime. Here is an example: - -```scala -object Prices { - opaque type Price = BigDecimal - - object Price{ - given Ordering[Price] = summon[Ordering[BigDecimal]] // was error, now avoided - } -} -``` - -Previously, implicit resolution would resolve the `summon` to the given in `Price`, leading to an infinite loop (a warning was issued in that case). We now use the underlying given in `BigDecimal` instead. We achieve that by adding the following rule for implicit search: - - - When doing an implicit search while checking the implementation of a `given` definition `G`, discard all search results that lead back to `G` or to a given -with the same owner as `G` that comes later in the source than `G`. - -The new behavior is enabled under `-source future`. 
In earlier versions, a -warning is issued where that behavior will change. - -Old-style implicit definitions are unaffected by this change. - diff --git a/docs/_docs/reference/experimental/given-loop-prevention.md b/docs/_docs/reference/experimental/given-loop-prevention.md new file mode 100644 index 000000000000..e306ba977d45 --- /dev/null +++ b/docs/_docs/reference/experimental/given-loop-prevention.md @@ -0,0 +1,31 @@ +--- +layout: doc-page +title: Given Loop Prevention +redirectFrom: /docs/reference/other-new-features/into-modifier.html +nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/into-modifier.html +--- + +Implicit resolution now avoids generating recursive givens that can lead to an infinite loop at runtime. Here is an example: + +```scala +object Prices { + opaque type Price = BigDecimal + + object Price{ + given Ordering[Price] = summon[Ordering[BigDecimal]] // was error, now avoided + } +} +``` + +Previously, implicit resolution would resolve the `summon` to the given in `Price`, leading to an infinite loop (a warning was issued in that case). We now use the underlying given in `BigDecimal` instead. We achieve that by adding the following rule for implicit search: + + - When doing an implicit search while checking the implementation of a `given` definition `G` of the form + ``` + given ... = .... + ``` + discard all search results that lead back to `G` or to a given with the same owner as `G` that comes later in the source than `G`. + +The new behavior is enabled with the `experimental.givenLoopPrevention` language import. If no such import or setting is given, a warning is issued where the behavior would change under that import (for source version 3.4 and later). + +Old-style implicit definitions are unaffected by this change. + diff --git a/docs/sidebar.yml b/docs/sidebar.yml index 65d7ac2f9ee4..d9f86d5141c3 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -153,6 +153,7 @@ subsection: - page: reference/experimental/cc.md - page: reference/experimental/purefuns.md - page: reference/experimental/tupled-function.md + - page: reference/experimental/given-loop-prevention.md - page: reference/syntax.md - title: Language Versions index: reference/language-versions/language-versions.md diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index 9fa8bff120af..9a2a034c6b7d 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -95,10 +95,10 @@ object language: * givens. By the new rules, a given may not implicitly use itself or givens * defined after it. * - * @see [[https://dotty.epfl.ch/docs/reference/experimental/avoid-looping-givens]] + * @see [[https://dotty.epfl.ch/docs/reference/experimental/given-loop-prevention]] */ - @compileTimeOnly("`avoidLoopingGivens` can only be used at compile time in import statements") - object avoidLoopingGivens + @compileTimeOnly("`givenLoopPrevention` can only be used at compile time in import statements") + object givenLoopPrevention /** Was needed to add support for relaxed imports of extension methods. * The language import is no longer needed as this is now a standard feature since SIP was accepted. 
diff --git a/tests/neg/i15474.check b/tests/neg/i15474.check index f99c6778d1ae..6a60aed304aa 100644 --- a/tests/neg/i15474.check +++ b/tests/neg/i15474.check @@ -1,25 +1,29 @@ -- Error: tests/neg/i15474.scala:6:39 ---------------------------------------------------------------------------------- 6 | given c: Conversion[ String, Int ] = _.toInt // error | ^ - | Warning: result of implicit search for ?{ toInt: ? } will change. - | Current result Test2.c will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: augmentString. - | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + | Result of implicit search for ?{ toInt: ? } will change. + | Current result Test2.c will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: augmentString. + | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. | - | To fix the problem without the language import, you could try one of the following: - | - rearrange definitions so that Test2.c comes earlier, - | - use an explicit conversion, - | - use an import to get extension method into scope. + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that Test2.c comes earlier, + | - use an explicit conversion, + | - use an import to get extension method into scope. + | This will be an error in Scala 3.5 and later. -- Error: tests/neg/i15474.scala:12:56 --------------------------------------------------------------------------------- 12 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error | ^ - | Warning: result of implicit search for Ordering[BigDecimal] will change. - | Current result Prices.Price.given_Ordering_Price will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: scala.math.Ordering.BigDecimal. - | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + | Result of implicit search for Ordering[BigDecimal] will change. + | Current result Prices.Price.given_Ordering_Price will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: scala.math.Ordering.BigDecimal. + | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. | - | To fix the problem without the language import, you could try one of the following: - | - rearrange definitions so that Prices.Price.given_Ordering_Price comes earlier, - | - use an explicit argument. + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that Prices.Price.given_Ordering_Price comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. diff --git a/tests/neg/i6716.check b/tests/neg/i6716.check index 6771d736b6af..e70ac4b15f9c 100644 --- a/tests/neg/i6716.check +++ b/tests/neg/i6716.check @@ -1,12 +1,14 @@ -- Error: tests/neg/i6716.scala:12:39 ---------------------------------------------------------------------------------- 12 | given Monad[Bar] = summon[Monad[Foo]] // error | ^ - | Warning: result of implicit search for Monad[Foo] will change. - | Current result Bar.given_Monad_Bar will be no longer eligible - | because it is not defined before the search position. 
- | Result with new rules: Foo.given_Monad_Foo. - | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + | Result of implicit search for Monad[Foo] will change. + | Current result Bar.given_Monad_Bar will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: Foo.given_Monad_Foo. + | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. | - | To fix the problem without the language import, you could try one of the following: - | - rearrange definitions so that Bar.given_Monad_Bar comes earlier, - | - use an explicit argument. + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that Bar.given_Monad_Bar comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. diff --git a/tests/neg/i7294-a.check b/tests/neg/i7294-a.check index 887635d89a35..319ed8e1c9d0 100644 --- a/tests/neg/i7294-a.check +++ b/tests/neg/i7294-a.check @@ -1,23 +1,25 @@ --- Error: tests/neg/i7294-a.scala:6:10 --------------------------------------------------------------------------------- -6 | case x: T => x.g(10) // error // error +-- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:8:18 ------------------------------------------------------------ +8 | case x: T => x.g(10) // error // error + | ^^^^^^^ + | Found: Any + | Required: T + | + | where: T is a type in given instance f with bounds <: foo.Foo + | + | longer explanation available when compiling with `-explain` +-- Error: tests/neg/i7294-a.scala:8:10 --------------------------------------------------------------------------------- +8 | case x: T => x.g(10) // error // error | ^ - | Warning: result of implicit search for scala.reflect.TypeTest[Nothing, T] will change. + | Result of implicit search for scala.reflect.TypeTest[Nothing, T] will change. | Current result foo.i7294-a$package.f will be no longer eligible | because it is not defined before the search position. | Result with new rules: No Matching Implicit. - | To opt into the new rules, use the `experimental.avoidLoopingGivens` language import. + | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. | | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, | - rearrange definitions so that foo.i7294-a$package.f comes earlier, | - use an explicit argument. + | This will be an error in Scala 3.5 and later. 
| | where: T is a type in given instance f with bounds <: foo.Foo --- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:6:18 ------------------------------------------------------------ -6 | case x: T => x.g(10) // error // error - | ^^^^^^^ - | Found: Any - | Required: T - | - | where: T is a type in given instance f with bounds <: foo.Foo - | - | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i7294-a.scala b/tests/neg/i7294-a.scala index 538dc3159fb8..b0710413eefd 100644 --- a/tests/neg/i7294-a.scala +++ b/tests/neg/i7294-a.scala @@ -1,3 +1,5 @@ +//> using options -Xfatal-warnings + package foo trait Foo { def g(x: Int): Any } diff --git a/tests/neg/i7294-b.scala b/tests/neg/i7294-b.scala index b06d814444e8..8c6f9328cc20 100644 --- a/tests/neg/i7294-b.scala +++ b/tests/neg/i7294-b.scala @@ -1,3 +1,5 @@ +//> using options -Xfatal-warnings + package foo trait Foo { def g(x: Any): Any } diff --git a/tests/neg/looping-givens.scala b/tests/neg/looping-givens.scala index 572f1707861f..357a417f0ed9 100644 --- a/tests/neg/looping-givens.scala +++ b/tests/neg/looping-givens.scala @@ -1,3 +1,5 @@ +//> using options -Xfatal-warnings + class A class B diff --git a/tests/pos/i15474.scala b/tests/pos/i15474.scala index 6b9e55806ae3..b006f8b61cf4 100644 --- a/tests/pos/i15474.scala +++ b/tests/pos/i15474.scala @@ -1,6 +1,6 @@ //> using options -Xfatal-warnings import scala.language.implicitConversions -import scala.language.experimental.avoidLoopingGivens +import scala.language.experimental.givenLoopPrevention object Test2: given c: Conversion[ String, Int ] = _.toInt // now avoided, was loop not detected, could be used as a fallback to avoid the warning. diff --git a/tests/pos/looping-givens.scala b/tests/pos/looping-givens.scala index ed11981c1bf6..1b620b5c113e 100644 --- a/tests/pos/looping-givens.scala +++ b/tests/pos/looping-givens.scala @@ -1,4 +1,4 @@ -import language.experimental.avoidLoopingGivens +import language.experimental.givenLoopPrevention class A class B diff --git a/tests/run/i6716.scala b/tests/run/i6716.scala index 6208a52190fe..4cca37f96a6f 100644 --- a/tests/run/i6716.scala +++ b/tests/run/i6716.scala @@ -1,6 +1,6 @@ //> using options -Xfatal-warnings -import scala.language.experimental.avoidLoopingGivens +import scala.language.experimental.givenLoopPrevention trait Monad[T]: def id: String @@ -11,7 +11,7 @@ object Foo { opaque type Bar = Foo object Bar { - given Monad[Bar] = summon[Monad[Foo]] // was error fixed by avoidLoopingGivens + given Monad[Bar] = summon[Monad[Foo]] // was error, fixed by givenLoopPrevention } object Test extends App { From 6b621a3bb4770801dab06a140636f72df2bf2faa Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 19 Dec 2023 13:38:33 +0100 Subject: [PATCH 158/371] Swap two givens in Specs2 to satisfy new restriuction --- community-build/community-projects/specs2 | 2 +- tests/neg/i7294-a.check | 50 +++++++++++------------ tests/neg/i7294-a.scala | 10 +++-- 3 files changed, 32 insertions(+), 30 deletions(-) diff --git a/community-build/community-projects/specs2 b/community-build/community-projects/specs2 index e42f7987b4ce..ba01cca013d9 160000 --- a/community-build/community-projects/specs2 +++ b/community-build/community-projects/specs2 @@ -1 +1 @@ -Subproject commit e42f7987b4ce30d95fca3f30b9d508021f2fdac7 +Subproject commit ba01cca013d9d99e390d17619664bdedd716e0d7 diff --git a/tests/neg/i7294-a.check b/tests/neg/i7294-a.check index 319ed8e1c9d0..6fac76a9faa5 100644 --- a/tests/neg/i7294-a.check +++ 
b/tests/neg/i7294-a.check @@ -1,25 +1,25 @@ --- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:8:18 ------------------------------------------------------------ -8 | case x: T => x.g(10) // error // error - | ^^^^^^^ - | Found: Any - | Required: T - | - | where: T is a type in given instance f with bounds <: foo.Foo - | - | longer explanation available when compiling with `-explain` --- Error: tests/neg/i7294-a.scala:8:10 --------------------------------------------------------------------------------- -8 | case x: T => x.g(10) // error // error - | ^ - | Result of implicit search for scala.reflect.TypeTest[Nothing, T] will change. - | Current result foo.i7294-a$package.f will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: No Matching Implicit. - | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. - | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that foo.i7294-a$package.f comes earlier, - | - use an explicit argument. - | This will be an error in Scala 3.5 and later. - | - | where: T is a type in given instance f with bounds <: foo.Foo +-- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:10:20 ----------------------------------------------------------- +10 | case x: T => x.g(10) // error // error + | ^^^^^^^ + | Found: Any + | Required: T + | + | where: T is a type in given instance f with bounds <: foo.Foo + | + | longer explanation available when compiling with `-explain` +-- Error: tests/neg/i7294-a.scala:10:12 -------------------------------------------------------------------------------- +10 | case x: T => x.g(10) // error // error + | ^ + | Result of implicit search for scala.reflect.TypeTest[Nothing, T] will change. + | Current result foo.Test.f will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: No Matching Implicit. + | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that foo.Test.f comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. + | + | where: T is a type in given instance f with bounds <: foo.Foo diff --git a/tests/neg/i7294-a.scala b/tests/neg/i7294-a.scala index b0710413eefd..3453e88cf741 100644 --- a/tests/neg/i7294-a.scala +++ b/tests/neg/i7294-a.scala @@ -4,8 +4,10 @@ package foo trait Foo { def g(x: Int): Any } -inline given f[T <: Foo]: T = ??? match { - case x: T => x.g(10) // error // error -} +object Test: -@main def Test = f + inline given f[T <: Foo]: T = ??? match { + case x: T => x.g(10) // error // error + } + + @main def Test = f From 6f4770227cdce8f1cf7cc1619e143cd76afaeea3 Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 7 Jan 2024 12:39:09 +0100 Subject: [PATCH 159/371] Turn given loop prevention on for -source future Drop the experimental language import. I believe everybody agrees that this is a desirable improvement, and type inference and implicit search have traditionally been in the realm of the compiler implementers. Experimental language imports have to live forever (albeit as deprecated once the feature is accepted as standard). 
So they are rather heavyweight and it is unergonomic to require them for smallish improvements to type inference. The new road map is as follows: - In 3.4: warning if behavior would change in the future. - In 3.5: error if behavior would change in the future - In 3.future (at the earliest 3.6): new behavior. --- .../src/dotty/tools/dotc/config/Feature.scala | 1 - .../dotty/tools/dotc/typer/Implicits.scala | 5 ++- .../runtime/stdLibPatches/language.scala | 9 ---- tests/neg/i15474.check | 44 ++++++++++--------- tests/neg/i6716.check | 21 ++++----- tests/neg/i7294-a.check | 3 +- tests/pos/i15474.scala | 2 +- tests/pos/looping-givens.scala | 2 +- tests/run/i6716.scala | 6 +-- 9 files changed, 43 insertions(+), 50 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index cdd83b15f4fc..fa262a5880ff 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -33,7 +33,6 @@ object Feature: val pureFunctions = experimental("pureFunctions") val captureChecking = experimental("captureChecking") val into = experimental("into") - val givenLoopPrevention = experimental("givenLoopPrevention") val globalOnlyImports: Set[TermName] = Set(pureFunctions, captureChecking) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 1672c94fd969..37086cff0b4c 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1595,7 +1595,7 @@ trait Implicits: def checkResolutionChange(result: SearchResult) = if (eligible ne preEligible) - && !Feature.enabled(Feature.givenLoopPrevention) + && !sourceVersion.isAtLeast(SourceVersion.future) then val prevResult = searchImplicit(preEligible, contextual) prevResult match @@ -1621,7 +1621,8 @@ trait Implicits: |Current result ${showResult(prevResult)} will be no longer eligible | because it is not defined before the search position. |Result with new rules: ${showResult(result)}. - |To opt into the new rules, use the `experimental.givenLoopPrevention` language import. + |To opt into the new rules, compile with `-source future` or use + |the `scala.language.future` language import. | |To fix the problem without the language import, you could try one of the following: | - use a `given ... with` clause as the enclosing given, diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index 9a2a034c6b7d..c2a12cec2ecc 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -91,15 +91,6 @@ object language: @compileTimeOnly("`into` can only be used at compile time in import statements") object into - /** Experimental support for new given resolution rules that avoid looping - * givens. By the new rules, a given may not implicitly use itself or givens - * defined after it. - * - * @see [[https://dotty.epfl.ch/docs/reference/experimental/given-loop-prevention]] - */ - @compileTimeOnly("`givenLoopPrevention` can only be used at compile time in import statements") - object givenLoopPrevention - /** Was needed to add support for relaxed imports of extension methods. * The language import is no longer needed as this is now a standard feature since SIP was accepted. 
* @see [[http://dotty.epfl.ch/docs/reference/contextual/extension-methods]] diff --git a/tests/neg/i15474.check b/tests/neg/i15474.check index 6a60aed304aa..3205f703cd50 100644 --- a/tests/neg/i15474.check +++ b/tests/neg/i15474.check @@ -1,29 +1,31 @@ -- Error: tests/neg/i15474.scala:6:39 ---------------------------------------------------------------------------------- 6 | given c: Conversion[ String, Int ] = _.toInt // error | ^ - | Result of implicit search for ?{ toInt: ? } will change. - | Current result Test2.c will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: augmentString. - | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. + | Result of implicit search for ?{ toInt: ? } will change. + | Current result Test2.c will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: augmentString. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that Test2.c comes earlier, - | - use an explicit conversion, - | - use an import to get extension method into scope. - | This will be an error in Scala 3.5 and later. + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that Test2.c comes earlier, + | - use an explicit conversion, + | - use an import to get extension method into scope. + | This will be an error in Scala 3.5 and later. -- Error: tests/neg/i15474.scala:12:56 --------------------------------------------------------------------------------- 12 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error | ^ - | Result of implicit search for Ordering[BigDecimal] will change. - | Current result Prices.Price.given_Ordering_Price will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: scala.math.Ordering.BigDecimal. - | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. + | Result of implicit search for Ordering[BigDecimal] will change. + | Current result Prices.Price.given_Ordering_Price will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: scala.math.Ordering.BigDecimal. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that Prices.Price.given_Ordering_Price comes earlier, - | - use an explicit argument. - | This will be an error in Scala 3.5 and later. + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that Prices.Price.given_Ordering_Price comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. 
diff --git a/tests/neg/i6716.check b/tests/neg/i6716.check index e70ac4b15f9c..cdf655710452 100644 --- a/tests/neg/i6716.check +++ b/tests/neg/i6716.check @@ -1,14 +1,15 @@ -- Error: tests/neg/i6716.scala:12:39 ---------------------------------------------------------------------------------- 12 | given Monad[Bar] = summon[Monad[Foo]] // error | ^ - | Result of implicit search for Monad[Foo] will change. - | Current result Bar.given_Monad_Bar will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: Foo.given_Monad_Foo. - | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. + | Result of implicit search for Monad[Foo] will change. + | Current result Bar.given_Monad_Bar will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: Foo.given_Monad_Foo. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that Bar.given_Monad_Bar comes earlier, - | - use an explicit argument. - | This will be an error in Scala 3.5 and later. + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that Bar.given_Monad_Bar comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. diff --git a/tests/neg/i7294-a.check b/tests/neg/i7294-a.check index 6fac76a9faa5..2fe260fcf99a 100644 --- a/tests/neg/i7294-a.check +++ b/tests/neg/i7294-a.check @@ -14,7 +14,8 @@ | Current result foo.Test.f will be no longer eligible | because it is not defined before the search position. | Result with new rules: No Matching Implicit. - | To opt into the new rules, use the `experimental.givenLoopPrevention` language import. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. | | To fix the problem without the language import, you could try one of the following: | - use a `given ... with` clause as the enclosing given, diff --git a/tests/pos/i15474.scala b/tests/pos/i15474.scala index b006f8b61cf4..f2c85120e4b2 100644 --- a/tests/pos/i15474.scala +++ b/tests/pos/i15474.scala @@ -1,6 +1,6 @@ //> using options -Xfatal-warnings import scala.language.implicitConversions -import scala.language.experimental.givenLoopPrevention +import scala.language.future object Test2: given c: Conversion[ String, Int ] = _.toInt // now avoided, was loop not detected, could be used as a fallback to avoid the warning. 
diff --git a/tests/pos/looping-givens.scala b/tests/pos/looping-givens.scala index 1b620b5c113e..0e615c8251df 100644 --- a/tests/pos/looping-givens.scala +++ b/tests/pos/looping-givens.scala @@ -1,4 +1,4 @@ -import language.experimental.givenLoopPrevention +import language.future class A class B diff --git a/tests/run/i6716.scala b/tests/run/i6716.scala index 4cca37f96a6f..3bef45ac7465 100644 --- a/tests/run/i6716.scala +++ b/tests/run/i6716.scala @@ -1,6 +1,4 @@ -//> using options -Xfatal-warnings - -import scala.language.experimental.givenLoopPrevention +//> using options -Xfatal-warnings -source future trait Monad[T]: def id: String @@ -11,7 +9,7 @@ object Foo { opaque type Bar = Foo object Bar { - given Monad[Bar] = summon[Monad[Foo]] // was error, fixed by givenLoopPrevention + given Monad[Bar] = summon[Monad[Foo]] // was error, fixed by given loop prevention } object Test extends App { From 65d9a0daca737c4c7d223df173bd84c0fa859113 Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 7 Jan 2024 18:46:43 +0100 Subject: [PATCH 160/371] Adapt docs --- .../changed-features/implicit-resolution.md | 32 +++++++++++++++++++ docs/sidebar.yml | 1 - 2 files changed, 32 insertions(+), 1 deletion(-) diff --git a/docs/_docs/reference/changed-features/implicit-resolution.md b/docs/_docs/reference/changed-features/implicit-resolution.md index 861a63bd4a05..1396ed04b6d3 100644 --- a/docs/_docs/reference/changed-features/implicit-resolution.md +++ b/docs/_docs/reference/changed-features/implicit-resolution.md @@ -164,3 +164,35 @@ The new rules are as follows: An implicit `a` defined in `A` is more specific th Condition (*) is new. It is necessary to ensure that the defined relation is transitive. [//]: # todo: expand with precise rules + +**9.** The following change is currently enabled in `-source future`: + +Implicit resolution now avoids generating recursive givens that can lead to an infinite loop at runtime. Here is an example: + +```scala +object Prices { + opaque type Price = BigDecimal + + object Price{ + given Ordering[Price] = summon[Ordering[BigDecimal]] // was error, now avoided + } +} +``` + +Previously, implicit resolution would resolve the `summon` to the given in `Price`, leading to an infinite loop (a warning was issued in that case). We now use the underlying given in `BigDecimal` instead. We achieve that by adding the following rule for implicit search: + + - When doing an implicit search while checking the implementation of a `given` definition `G` of the form + ``` + given ... = .... + ``` + discard all search results that lead back to `G` or to a given with the same owner as `G` that comes later in the source than `G`. + +The new behavior is currently enabled in `source.future` and will be enabled at the earliest in Scala 3.6. For earlier source versions, the behavior is as +follows: + + - Scala 3.3: no change + - Scala 3.4: A warning is issued where the behavior will change in 3.future. + - Scala 3.5: An error is issued where the behavior will change in 3.future. + +Old-style implicit definitions are unaffected by this change. 
+ diff --git a/docs/sidebar.yml b/docs/sidebar.yml index d9f86d5141c3..65d7ac2f9ee4 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -153,7 +153,6 @@ subsection: - page: reference/experimental/cc.md - page: reference/experimental/purefuns.md - page: reference/experimental/tupled-function.md - - page: reference/experimental/given-loop-prevention.md - page: reference/syntax.md - title: Language Versions index: reference/language-versions/language-versions.md From 6dc38d9ae928178f6758cfbbbe32f3075dbf8c22 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 8 Jan 2024 09:38:11 +0100 Subject: [PATCH 161/371] Add test showing use of @nowarn --- tests/pos/given-loop-prevention.scala | 14 ++++++++++++++ tests/pos/i6716.scala | 14 ++++++++++++++ 2 files changed, 28 insertions(+) create mode 100644 tests/pos/given-loop-prevention.scala create mode 100644 tests/pos/i6716.scala diff --git a/tests/pos/given-loop-prevention.scala b/tests/pos/given-loop-prevention.scala new file mode 100644 index 000000000000..0bae0bb24fed --- /dev/null +++ b/tests/pos/given-loop-prevention.scala @@ -0,0 +1,14 @@ +//> using options -Xfatal-warnings + +class Foo + +object Bar { + given Foo with {} + given List[Foo] = List(summon[Foo]) // ok +} + +object Baz { + @annotation.nowarn + given List[Foo] = List(summon[Foo]) // gives a warning, which is suppressed + given Foo with {} +} diff --git a/tests/pos/i6716.scala b/tests/pos/i6716.scala new file mode 100644 index 000000000000..f02559af1e82 --- /dev/null +++ b/tests/pos/i6716.scala @@ -0,0 +1,14 @@ +//> using options -Xfatal-warnings -source 3.4 + +class Foo + +object Bar { + given Foo with {} + given List[Foo] = List(summon[Foo]) // ok +} + +object Baz { + @annotation.nowarn + given List[Foo] = List(summon[Foo]) // gives a warning, which is suppressed + given Foo with {} +} From d449f0f3955b40950b8540de015e477de0f0fa58 Mon Sep 17 00:00:00 2001 From: odersky Date: Wed, 10 Jan 2024 17:29:35 +0100 Subject: [PATCH 162/371] Fix algorithm to prevent recursive givens Fixes #19404 Fixes #19407 --- .../dotty/tools/dotc/typer/Implicits.scala | 77 ++++++++++--------- tests/pos/i19404.scala | 13 ++++ tests/pos/i19407.scala | 11 +++ 3 files changed, 65 insertions(+), 36 deletions(-) create mode 100644 tests/pos/i19404.scala create mode 100644 tests/pos/i19407.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 37086cff0b4c..389669beff01 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -93,7 +93,7 @@ object Implicits: if (initctx eq NoContext) initctx else initctx.retractMode(Mode.ImplicitsEnabled) protected given Context = irefCtx - /** The nesting level of this context. Non-zero only in ContextialImplicits */ + /** The nesting level of this context. 
Non-zero only in ContextualImplicits */ def level: Int = 0 /** The implicit references */ @@ -408,6 +408,13 @@ object Implicits: } } + /** Search mode to use for possibly avoiding looping givens */ + enum SearchMode: + case Old, // up to 3.3, old mode w/o protection + CompareWarn, // from 3.4, old mode, warn if new mode would change result + CompareErr, // from 3.5, old mode, error if new mode would change result + New // from future, new mode where looping givens are avoided + /** The result of an implicit search */ sealed abstract class SearchResult extends Showable { def tree: Tree @@ -1553,18 +1560,18 @@ trait Implicits: /** Search implicit in context `ctxImplicits` or else in implicit scope * of expected type if `ctxImplicits == null`. */ - private def searchImplicit(ctxImplicits: ContextualImplicits | Null): SearchResult = + private def searchImplicit(ctxImplicits: ContextualImplicits | Null, mode: SearchMode): SearchResult = if isUnderspecified(wildProto) then SearchFailure(TooUnspecific(pt), span) else val contextual = ctxImplicits != null val preEligible = // the eligible candidates, ignoring positions - if contextual then + if ctxImplicits != null then if ctx.gadt.isNarrowing then withoutMode(Mode.ImplicitsEnabled) { - ctx.implicits.uncachedEligible(wildProto) + ctxImplicits.uncachedEligible(wildProto) } - else ctx.implicits.eligible(wildProto) + else ctxImplicits.eligible(wildProto) else implicitScope(wildProto).eligible /** Does candidate `cand` come too late for it to be considered as an @@ -1589,16 +1596,13 @@ trait Implicits: end comesTooLate val eligible = // the eligible candidates that come before the search point - if contextual && sourceVersion.isAtLeast(SourceVersion.`3.4`) + if contextual && mode != SearchMode.Old then preEligible.filterNot(comesTooLate) else preEligible def checkResolutionChange(result: SearchResult) = - if (eligible ne preEligible) - && !sourceVersion.isAtLeast(SourceVersion.future) - then - val prevResult = searchImplicit(preEligible, contextual) - prevResult match + if (eligible ne preEligible) && mode != SearchMode.New then + searchImplicit(preEligible, contextual) match case prevResult: SearchSuccess => def remedy = pt match case _: SelectionProto => @@ -1628,41 +1632,38 @@ trait Implicits: | - use a `given ... with` clause as the enclosing given, | - rearrange definitions so that ${showResult(prevResult)} comes earlier, | - use an explicit $remedy.""" - if sourceVersion.isAtLeast(SourceVersion.`3.5`) + if mode == SearchMode.CompareErr then report.error(msg, srcPos) else report.warning(msg.append("\nThis will be an error in Scala 3.5 and later."), srcPos) - case _ => - prevResult + prevResult + case prevResult: SearchFailure => result else result end checkResolutionChange - val result = searchImplicit(eligible, contextual) - result match - case result: SearchSuccess => - checkResolutionChange(result) - case failure: SearchFailure => - failure.reason match - case _: AmbiguousImplicits => failure - case reason => - if contextual then - // If we filtered out some candidates for being too late, we should - // do another contextual search further out, since the dropped candidates - // might have shadowed an eligible candidate in an outer level. - // Otherwise, proceed with a search of the implicit scope. - val newCtxImplicits = - if eligible eq preEligible then null - else ctxImplicits.nn.outerImplicits: ContextualImplicits | Null - // !!! 
Dotty problem: without the ContextualImplicits | Null type ascription - // we get a Ycheck failure after arrayConstructors due to "Types differ" - checkResolutionChange: - searchImplicit(newCtxImplicits).recoverWith: + checkResolutionChange: + searchImplicit(eligible, contextual).recoverWith: + case failure: SearchFailure => + failure.reason match + case _: AmbiguousImplicits => failure + case reason => + if contextual then + // If we filtered out some candidates for being too late, we should + // do another contextual search further out, since the dropped candidates + // might have shadowed an eligible candidate in an outer level. + // Otherwise, proceed with a search of the implicit scope. + val newCtxImplicits = + if eligible eq preEligible then null + else ctxImplicits.nn.outerImplicits: ContextualImplicits | Null + // !!! Dotty problem: without the ContextualImplicits | Null type ascription + // we get a Ycheck failure after arrayConstructors due to "Types differ" + searchImplicit(newCtxImplicits, SearchMode.New).recoverWith: failure2 => failure2.reason match case _: AmbiguousImplicits => failure2 case _ => reason match case (_: DivergingImplicit) => failure case _ => List(failure, failure2).maxBy(_.tree.treeSize) - else failure + else failure end searchImplicit /** Find a unique best implicit reference */ @@ -1679,7 +1680,11 @@ trait Implicits: case ref: TermRef => SearchSuccess(tpd.ref(ref).withSpan(span.startPos), ref, 0)(ctx.typerState, ctx.gadt) case _ => - searchImplicit(ctx.implicits) + searchImplicit(ctx.implicits, + if sourceVersion.isAtLeast(SourceVersion.future) then SearchMode.New + else if sourceVersion.isAtLeast(SourceVersion.`3.5`) then SearchMode.CompareErr + else if sourceVersion.isAtLeast(SourceVersion.`3.4`) then SearchMode.CompareWarn + else SearchMode.Old) end bestImplicit def implicitScope(tp: Type): OfTypeImplicits = ctx.run.nn.implicitScope(tp) diff --git a/tests/pos/i19404.scala b/tests/pos/i19404.scala new file mode 100644 index 000000000000..8d6d4406ebb2 --- /dev/null +++ b/tests/pos/i19404.scala @@ -0,0 +1,13 @@ +given ipEncoder[IP <: IpAddress]: Encoder[IP] = Encoder[String].contramap(_.toString) + +class Encoder[A] { + final def contramap[B](f: B => A): Encoder[B] = new Encoder[B] +} + +object Encoder { + final def apply[A](implicit instance: Encoder[A]): Encoder[A] = instance + implicit final val encodeString: Encoder[String] = new Encoder[String] +} + +trait Json +trait IpAddress \ No newline at end of file diff --git a/tests/pos/i19407.scala b/tests/pos/i19407.scala new file mode 100644 index 000000000000..b7440a53540d --- /dev/null +++ b/tests/pos/i19407.scala @@ -0,0 +1,11 @@ +trait GeneratedEnum +trait Decoder[A] + +object Decoder: + given Decoder[Int] = ??? + +object GeneratedEnumDecoder: + + given [A <: GeneratedEnum]: Decoder[A] = + summon[Decoder[Int]] + ??? 
\ No newline at end of file From b9857ef417b9a93432c73059818ba811eb5be6bb Mon Sep 17 00:00:00 2001 From: odersky Date: Thu, 11 Jan 2024 19:58:05 +0100 Subject: [PATCH 163/371] Regression test for 19417 --- tests/pos/i19417/defs_1.scala | 5 +++++ tests/pos/i19417/usage_2.scala | 2 ++ 2 files changed, 7 insertions(+) create mode 100644 tests/pos/i19417/defs_1.scala create mode 100644 tests/pos/i19417/usage_2.scala diff --git a/tests/pos/i19417/defs_1.scala b/tests/pos/i19417/defs_1.scala new file mode 100644 index 000000000000..92dc10990d90 --- /dev/null +++ b/tests/pos/i19417/defs_1.scala @@ -0,0 +1,5 @@ +trait QueryParamDecoder[E]: + def emap[T](fn: E => Either[Throwable, T]): QueryParamDecoder[T] +object QueryParamDecoder: + def apply[T](implicit ev: QueryParamDecoder[T]): QueryParamDecoder[T] = ev + implicit lazy val stringQueryParamDecoder: QueryParamDecoder[String] = ??? \ No newline at end of file diff --git a/tests/pos/i19417/usage_2.scala b/tests/pos/i19417/usage_2.scala new file mode 100644 index 000000000000..c686f46280d7 --- /dev/null +++ b/tests/pos/i19417/usage_2.scala @@ -0,0 +1,2 @@ +given[E](using e: EnumOf[E]): QueryParamDecoder[E] = QueryParamDecoder[String].emap(_ => Right(???)) +trait EnumOf[E] \ No newline at end of file From 9037df25ed0b29f890bc45009263c6ccb5ada8c8 Mon Sep 17 00:00:00 2001 From: Jan Chyb Date: Thu, 11 Jan 2024 14:38:36 +0100 Subject: [PATCH 164/371] Remove an incompatible java api method in java 8 from a test --- tests/pos/i19354.orig.scala | 1 - 1 file changed, 1 deletion(-) diff --git a/tests/pos/i19354.orig.scala b/tests/pos/i19354.orig.scala index 0443bcb06836..0301974a9e59 100644 --- a/tests/pos/i19354.orig.scala +++ b/tests/pos/i19354.orig.scala @@ -10,7 +10,6 @@ class P extends AbstractProcessor { .flatMap(annotation => roundEnv.getElementsAnnotatedWith(annotation).stream()) .filter(element => element.getKind == ElementKind.PACKAGE) .map(element => element.asInstanceOf[PackageElement]) - .toList() true } } From 48e03408cce4dd351a312b2f5a5e1078d6fc59c9 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 18 Jan 2024 19:44:32 +0100 Subject: [PATCH 165/371] Replace 3.5 with future --- compiler/src/dotty/tools/dotc/config/MigrationVersion.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala index 4dd9d065395b..89fd9b6de715 100644 --- a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala @@ -33,7 +33,7 @@ object MigrationVersion: val AscriptionAfterPattern = MigrationVersion(`3.3`, future) - val ExplicitContextBoundArgument = MigrationVersion(`3.4`, `3.5`) + val ExplicitContextBoundArgument = MigrationVersion(`3.4`, future) val AlphanumericInfix = MigrationVersion(`3.4`, future) val RemoveThisQualifier = MigrationVersion(`3.4`, future) From 88d97603cdb1cb6c680619385eedb9af4ff91291 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 19 Jan 2024 12:37:12 +0100 Subject: [PATCH 166/371] Add changelog for 3.4.0-RC2 --- changelogs/3.4.0-RC2.md | 22 ++++++++++++++++++++++ 1 file changed, 22 insertions(+) create mode 100644 changelogs/3.4.0-RC2.md diff --git a/changelogs/3.4.0-RC2.md b/changelogs/3.4.0-RC2.md new file mode 100644 index 000000000000..84d85e19efb0 --- /dev/null +++ b/changelogs/3.4.0-RC2.md @@ -0,0 +1,22 @@ +# Backported fixes + +- Fix expandParam's use of argForParam/isArgPrefixOf. 
[#19412](https://github.com/lampepfl/dotty/pull/19412) +- Remove ascriptionVarargsUnpacking as we never used it [#19399](https://github.com/lampepfl/dotty/pull/19399) +- Make explicit arguments for context bounds an error from 3.5 [#19316](https://github.com/lampepfl/dotty/pull/19316) +- Avoid generating given definitions that loop [#19282](https://github.com/lampepfl/dotty/pull/19282) +- Turn given loop prevention on for -source future [#19392](https://github.com/lampepfl/dotty/pull/19392) +- Fix algorithm to prevent recursive givens [#19411](https://github.com/lampepfl/dotty/pull/19411) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.0-RC1..3.4.0-RC2` these are: + +``` + 15 Martin Odersky + 4 Nicolas Stucki + 3 Paweł Marks + 1 Dale Wijnand + 1 Jan Chyb +``` From a22062179cceea49d2927503531f55dde85cb8ee Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 19 Jan 2024 12:38:08 +0100 Subject: [PATCH 167/371] Release 3.4.0-RC2 --- project/Build.scala | 4 ++-- .../test/scala/dotty/tools/dotc/ScalaJSLink.scala | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index b799c32eb65a..2b5c18abe2ff 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,9 +83,9 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.3.1" + val referenceVersion = "3.4.0-RC1" - val baseVersion = "3.4.0-RC1" + val baseVersion = "3.4.0-RC2" // Versions used by the vscode extension to create a new project // This should be the latest published releases. diff --git a/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSLink.scala b/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSLink.scala index 54e92b1559d6..2560021aec99 100644 --- a/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSLink.scala +++ b/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSLink.scala @@ -45,7 +45,7 @@ object ScalaJSLink: val result = PathIRContainer .fromClasspath(cpEntries.toSeq.map(entry => new File(entry).toPath())) .map(_._1) - .flatMap(cache.cached _) + .flatMap(cache.cached) .flatMap(linker.link(_, moduleInitializers, PathOutputDirectory(dir), logger)) val report = Await.result(result, Duration.Inf) From e9fa840bec1d7b67e2e24f1b5e70e679dba92147 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 23 Jan 2024 11:02:38 +0100 Subject: [PATCH 168/371] Sync language.scala with main --- .../src/scala/runtime/stdLibPatches/language.scala | 14 ++++++++++++++ 1 file changed, 14 insertions(+) diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index 6018f537613b..c0dd8a74419e 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -246,6 +246,20 @@ object language: @compileTimeOnly("`3.4` can only be used at compile time in import statements") object `3.4` + /** Set source version to 3.5-migration. 
+ * + * @see [[https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html]] + */ + @compileTimeOnly("`3.5-migration` can only be used at compile time in import statements") + object `3.5-migration` + + /** Set source version to 3.5 + * + * @see [[https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html]] + */ + @compileTimeOnly("`3.5` can only be used at compile time in import statements") + object `3.5` + // !!! Keep in sync with dotty.tools.dotc.config.SourceVersion !!! // Also add tests in `tests/pos/source-import-3-x.scala` and `tests/pos/source-import-3-x-migration.scala` From 40fab5146b18667941e84d4d5ad47a990b020c62 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 23 Jan 2024 11:03:37 +0100 Subject: [PATCH 169/371] Replace future with 3.5 This reverts commit 48e03408cce4dd351a312b2f5a5e1078d6fc59c9. --- compiler/src/dotty/tools/dotc/config/MigrationVersion.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala index 89fd9b6de715..4dd9d065395b 100644 --- a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala @@ -33,7 +33,7 @@ object MigrationVersion: val AscriptionAfterPattern = MigrationVersion(`3.3`, future) - val ExplicitContextBoundArgument = MigrationVersion(`3.4`, future) + val ExplicitContextBoundArgument = MigrationVersion(`3.4`, `3.5`) val AlphanumericInfix = MigrationVersion(`3.4`, future) val RemoveThisQualifier = MigrationVersion(`3.4`, future) From c76806c30ca5afb419b047de0869e59d151fafa7 Mon Sep 17 00:00:00 2001 From: Nicolas Stucki Date: Mon, 22 Jan 2024 16:58:34 +0100 Subject: [PATCH 170/371] Add tests for context bounds migration Improves tests of #19316 --- tests/neg/context-bounds-migration-3.5.check | 6 ++++++ tests/neg/context-bounds-migration-3.5.scala | 10 ++++++++++ tests/neg/context-bounds-migration-future.check | 6 ++++++ tests/neg/context-bounds-migration-future.scala | 10 ++++++++++ tests/warn/context-bounds-migration-3.4.check | 6 ++++++ ...ration.scala => context-bounds-migration-3.4.scala} | 0 6 files changed, 38 insertions(+) create mode 100644 tests/neg/context-bounds-migration-3.5.check create mode 100644 tests/neg/context-bounds-migration-3.5.scala create mode 100644 tests/neg/context-bounds-migration-future.check create mode 100644 tests/neg/context-bounds-migration-future.scala create mode 100644 tests/warn/context-bounds-migration-3.4.check rename tests/warn/{context-bounds-migration.scala => context-bounds-migration-3.4.scala} (100%) diff --git a/tests/neg/context-bounds-migration-3.5.check b/tests/neg/context-bounds-migration-3.5.check new file mode 100644 index 000000000000..dd8a2aeefbf3 --- /dev/null +++ b/tests/neg/context-bounds-migration-3.5.check @@ -0,0 +1,6 @@ +-- Error: tests/neg/context-bounds-migration-3.5.scala:9:2 ------------------------------------------------------------- +9 | foo(C[Int]()) // error + | ^^^ + | Context bounds will map to context parameters. + | A `using` clause is needed to pass explicit arguments to them. + | This code can be rewritten automatically under -rewrite -source 3.4-migration. 
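As background for the `.check` output above: under `-source 3.4` and later, a context bound on a type parameter desugars to a `using` parameter, so an explicit instance can only be passed in a `using` clause. Below is a minimal sketch of that desugaring, assuming a plain Scala 3.4+ compiler and reusing the `C`/`foo` shapes from the test files; the `demo` entry point is illustrative only and is not part of any patch in this series.

```scala
// A type class and a given instance, as in the tests above.
class C[T]
given [T]: C[T] = C[T]()

// `def foo[X: C] = ()` is compiled roughly as if it took a `using` parameter:
def foo[X](using C[X]): Unit = ()

@main def demo(): Unit =
  foo[Int]                  // resolved implicitly from the given above
  foo[Int](using C[Int]())  // from 3.4 on, an explicit argument needs `using`
  // foo(C[Int]())          // rejected: `foo` takes no plain value parameters
```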
diff --git a/tests/neg/context-bounds-migration-3.5.scala b/tests/neg/context-bounds-migration-3.5.scala new file mode 100644 index 000000000000..e5c571d0e22e --- /dev/null +++ b/tests/neg/context-bounds-migration-3.5.scala @@ -0,0 +1,10 @@ +//> using options -source 3.5 + +class C[T] +def foo[X: C] = () + +given [T]: C[T] = C[T]() + +def Test = + foo(C[Int]()) // error + foo(using C[Int]()) // ok diff --git a/tests/neg/context-bounds-migration-future.check b/tests/neg/context-bounds-migration-future.check new file mode 100644 index 000000000000..f56da5d6b28d --- /dev/null +++ b/tests/neg/context-bounds-migration-future.check @@ -0,0 +1,6 @@ +-- [E050] Type Error: tests/neg/context-bounds-migration-future.scala:9:2 ---------------------------------------------- +9 | foo(C[Int]()) // error + | ^^^ + | method foo does not take more parameters + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/context-bounds-migration-future.scala b/tests/neg/context-bounds-migration-future.scala new file mode 100644 index 000000000000..6d0e94c0b434 --- /dev/null +++ b/tests/neg/context-bounds-migration-future.scala @@ -0,0 +1,10 @@ +//> using options -source future + +class C[T] +def foo[X: C] = () + +given [T]: C[T] = C[T]() + +def Test = + foo(C[Int]()) // error + foo(using C[Int]()) // ok diff --git a/tests/warn/context-bounds-migration-3.4.check b/tests/warn/context-bounds-migration-3.4.check new file mode 100644 index 000000000000..5341cfbe3ea5 --- /dev/null +++ b/tests/warn/context-bounds-migration-3.4.check @@ -0,0 +1,6 @@ +-- Warning: tests/warn/context-bounds-migration-3.4.scala:9:2 ---------------------------------------------------------- +9 | foo(C[Int]()) // warn + | ^^^ + | Context bounds will map to context parameters. + | A `using` clause is needed to pass explicit arguments to them. + | This code can be rewritten automatically under -rewrite -source 3.4-migration. diff --git a/tests/warn/context-bounds-migration.scala b/tests/warn/context-bounds-migration-3.4.scala similarity index 100% rename from tests/warn/context-bounds-migration.scala rename to tests/warn/context-bounds-migration-3.4.scala From bc20aa6b568874f787b1b2cd7e54c3b6b1791cc4 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 22 Jan 2024 19:14:57 +0100 Subject: [PATCH 171/371] Handle default implicits to context parameters under -3.4-migration Synthesized calls for default implicits needed a using clause when the method was an implicit method, but had a context bound parameter in 3.4-migration. Also, we can't rewrite adding a `using` clause if the argument list is empty, since we are lacking precise position info. Fixes #19506 --- .../src/dotty/tools/dotc/printing/RefinedPrinter.scala | 2 +- compiler/src/dotty/tools/dotc/typer/Migrations.scala | 6 ++++-- compiler/src/dotty/tools/dotc/typer/Typer.scala | 5 ++++- tests/neg/i19506.scala | 8 ++++++++ 4 files changed, 17 insertions(+), 4 deletions(-) create mode 100644 tests/neg/i19506.scala diff --git a/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala b/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala index a556a87173f5..de9e21aa4146 100644 --- a/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala +++ b/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala @@ -309,7 +309,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { toText(tp.argType) ~ " ?=>? 
" ~ toText(tp.resultType) case tp @ FunProto(args, resultType) => "[applied to (" - ~ keywordText("using ").provided(tp.isContextualMethod) + ~ keywordText("using ").provided(tp.applyKind == ApplyKind.Using) ~ argsTreeText(args) ~ ") returning " ~ toText(resultType) diff --git a/compiler/src/dotty/tools/dotc/typer/Migrations.scala b/compiler/src/dotty/tools/dotc/typer/Migrations.scala index 84db91f9dee9..8d468fd68bba 100644 --- a/compiler/src/dotty/tools/dotc/typer/Migrations.scala +++ b/compiler/src/dotty/tools/dotc/typer/Migrations.scala @@ -106,12 +106,14 @@ trait Migrations: && isContextBoundParams && pt.applyKind != ApplyKind.Using then - def rewriteMsg = Message.rewriteNotice("This code", mversion.patchFrom) + def rewriteMsg = + if pt.args.isEmpty then "" + else Message.rewriteNotice("This code", mversion.patchFrom) report.errorOrMigrationWarning( em"""Context bounds will map to context parameters. |A `using` clause is needed to pass explicit arguments to them.$rewriteMsg""", tree.srcPos, mversion) - if mversion.needsPatch then + if mversion.needsPatch && pt.args.nonEmpty then patch(Span(pt.args.head.span.start), "using ") end contextBoundParams diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 1303b64cbd12..fe2a6f92eb97 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -3906,7 +3906,10 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if (arg.tpe.isError) Nil else untpd.NamedArg(pname, untpd.TypedSplice(arg)) :: Nil } val app = cpy.Apply(tree)(untpd.TypedSplice(tree), namedArgs) - if (wtp.isContextualMethod) app.setApplyKind(ApplyKind.Using) + val needsUsing = wtp.isContextualMethod || wtp.match + case MethodType(ContextBoundParamName(_) :: _) => sourceVersion.isAtLeast(`3.4`) + case _ => false + if needsUsing then app.setApplyKind(ApplyKind.Using) typr.println(i"try with default implicit args $app") typed(app, pt, locked) else issueErrors() diff --git a/tests/neg/i19506.scala b/tests/neg/i19506.scala new file mode 100644 index 000000000000..4e139fed07d0 --- /dev/null +++ b/tests/neg/i19506.scala @@ -0,0 +1,8 @@ +//> using options "-source 3.4-migration", + +trait Reader[T] +def read[T: Reader](s: String, trace: Boolean = false): T = ??? 
+ +def Test = + read[Object]("") // error + read[Object]("")() // error From b1339d720329667ed1c1a26cb5e477e7a335e11b Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 24 Jan 2024 17:09:52 +0100 Subject: [PATCH 172/371] Add changelog for 3.4.0-RC3 --- changelogs/3.4.0-RC3.md | 17 +++++++++++++++++ 1 file changed, 17 insertions(+) create mode 100644 changelogs/3.4.0-RC3.md diff --git a/changelogs/3.4.0-RC3.md b/changelogs/3.4.0-RC3.md new file mode 100644 index 000000000000..57b360d2399c --- /dev/null +++ b/changelogs/3.4.0-RC3.md @@ -0,0 +1,17 @@ +# Backported fixes + +- Sync language.scala with main and backport "Add tests for context bounds migration" [#19515] (https://github.com/lampepfl/dotty/pull/19515) +- Handle default implicits to context parameters under -3.4-migration [#19512] (https://github.com/lampepfl/dotty/pull/19512) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.0-RC2..3.4.0-RC3` these are: + +``` + 4 Paweł Marks + 1 Martin Odersky + 1 Nicolas Stucki + +``` From 64f8c806cbb64f7f30f3930cf7588b83df6f6d81 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 24 Jan 2024 17:10:40 +0100 Subject: [PATCH 173/371] Release 3.4.0-RC3 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 2b5c18abe2ff..515cf04b8cf3 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,9 +83,9 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.4.0-RC1" + val referenceVersion = "3.4.0-RC2" - val baseVersion = "3.4.0-RC2" + val baseVersion = "3.4.0-RC3" // Versions used by the vscode extension to create a new project // This should be the latest published releases. From ec4299464bdbabad83892b442cc9533d2a4cf6a7 Mon Sep 17 00:00:00 2001 From: Stephane Bersier Date: Mon, 29 Jan 2024 05:45:20 -0500 Subject: [PATCH 174/371] Update derivation.md Attempted small grammar improvements --- docs/_docs/reference/contextual/derivation.md | 42 +++++++++---------- 1 file changed, 21 insertions(+), 21 deletions(-) diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index 66d0cf3fdf38..5157bb57699c 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -104,7 +104,7 @@ given TC[DerivingType] = TC.derived // simplified form of: given TC[ [A_1, ..., A_K] =>> DerivingType[A_1, ..., A_K] ] = TC.derived ``` -If `DerivingType` takes less arguments than `F` (`N < K`), we use only the rightmost parameters from the type lambda: +If `DerivingType` takes fewer arguments than `F` (`N < K`), we use only the rightmost parameters from the type lambda: ```scala given TC[ [A_1, ..., A_K] =>> DerivingType[A_(K-N+1), ..., A_K] ] = TC.derived @@ -112,7 +112,7 @@ given TC[ [A_1, ..., A_K] =>> DerivingType[A_(K-N+1), ..., A_K] ] = TC.derived given TC[ [A_1, ..., A_K] =>> DerivingType ] = TC.derived ``` -If `F` takes less arguments than `DerivingType` (`K < N`), we fill in the remaining leftmost slots with type parameters of the given: +If `F` takes fewer arguments than `DerivingType` (`K < N`), we fill in the remaining leftmost slots with type parameters of the given: ```scala given [T_1, ... T_(N-K)]: TC[[A_1, ..., A_K] =>> DerivingType[T_1, ... 
T_(N-K), A_1, ..., A_K]] = TC.derived ``` @@ -158,7 +158,7 @@ of the `Mirror` type class available. ## `Mirror` `scala.deriving.Mirror` type class instances provide information at the type level about the components and labelling of the type. -They also provide minimal term level infrastructure to allow higher level libraries to provide comprehensive +They also provide minimal term-level infrastructure to allow higher-level libraries to provide comprehensive derivation support. Instances of the `Mirror` type class are generated automatically by the compiler @@ -269,14 +269,14 @@ No given instance of type deriving.Mirror.Of[A] was found for parameter x of met Note the following properties of `Mirror` types, + Properties are encoded using types rather than terms. This means that they have no runtime footprint unless used and - also that they are a compile time feature for use with Scala 3's metaprogramming facilities. + also that they are a compile-time feature for use with Scala 3's metaprogramming facilities. + There is no restriction against the mirrored type being a local or inner class. + The kinds of `MirroredType` and `MirroredElemTypes` match the kind of the data type the mirror is an instance for. This allows `Mirror`s to support ADTs of all kinds. + There is no distinct representation type for sums or products (ie. there is no `HList` or `Coproduct` type as in Scala 2 versions of Shapeless). Instead the collection of child types of a data type is represented by an ordinary, possibly parameterized, tuple type. Scala 3's metaprogramming facilities can be used to work with these tuple types - as-is, and higher level libraries can be built on top of them. + as-is, and higher-level libraries can be built on top of them. + For both product and sum types, the elements of `MirroredElemTypes` are arranged in definition order (i.e. `Branch[T]` precedes `Leaf[T]` in `MirroredElemTypes` for `Tree` because `Branch` is defined before `Leaf` in the source file). This means that `Mirror.Sum` differs in this respect from Shapeless's generic representation for ADTs in Scala 2, @@ -303,16 +303,16 @@ has a context `Mirror` parameter, or not at all (e.g. they might use some comple instance using Scala 3 macros or runtime reflection). We expect that (direct or indirect) `Mirror` based implementations will be the most common and that is what this document emphasises. -Type class authors will most likely use higher level derivation or generic programming libraries to implement -`derived` methods. An example of how a `derived` method might be implemented using _only_ the low level facilities +Type class authors will most likely use higher-level derivation or generic programming libraries to implement +`derived` methods. An example of how a `derived` method might be implemented using _only_ the low-level facilities described above and Scala 3's general metaprogramming features is provided below. It is not anticipated that type class authors would normally implement a `derived` method in this way, however this walkthrough can be taken as a guide for -authors of the higher level derivation libraries that we expect typical type class authors will use (for a fully +authors of the higher-level derivation libraries that we expect typical type class authors will use (for a fully worked out example of such a library, see [Shapeless 3](https://github.com/milessabin/shapeless/tree/shapeless-3)). 
-## How to write a type class `derived` method using low level mechanisms +## How to write a type class `derived` method using low-level mechanisms -The low-level method we will use to implement a type class `derived` method in this example exploits three new type-level constructs in Scala 3: inline methods, inline matches, and implicit searches via `summonInline` or `summonFrom`. +The low-level technique we will use to implement a type class `derived` method in this example exploits three new type-level constructs in Scala 3: inline methods, inline matches, and implicit searches via `summonInline` or `summonFrom`. Given this definition of the `Eq` type class, ```scala @@ -335,13 +335,13 @@ inline def derived[T](using m: Mirror.Of[T]): Eq[T] = ``` Note that `derived` is defined as an `inline def`. -This means that the method will be inlined at all call sites (for instance the compiler generated instance definitions in the companion objects of ADTs which have a `deriving Eq` clause). +This means that the method will be inlined at all call sites (for instance the compiler-generated instance definitions in the companion objects of ADTs which have a `deriving Eq` clause). > Inlining of complex code is potentially expensive if overused (meaning slower compile times) so we should be careful to limit how many times `derived` is called for the same type. -> For example, when computing an instance for a sum type, it may be necessary to call `derived` recursively to compute an instance for a one of its child cases. +> For example, when computing an instance for a sum type, it may be necessary to call `derived` recursively to compute an instance for each one of its child cases. > That child case may in turn be a product type, that declares a field referring back to the parent sum type. > To compute the instance for this field, we should not call `derived` recursively, but instead summon from the context. -> Typically the found given instance will be the root given instance that initially called `derived`. +> Typically, the found given instance will be the root given instance that initially called `derived`. The body of `derived` (1) first materializes the `Eq` instances for all the child types of type the instance is being derived for. This is either all the branches of a sum type or all the fields of a product type. @@ -380,7 +380,7 @@ def eqSum[T](s: Mirror.SumOf[T], elems: => List[Eq[?]]): Eq[T] = (s.ordinal(y) == ordx) && check(x, y, elems(ordx)) // (4) ``` -In the product case, `eqProduct` we test the runtime values of the arguments to `eqv` for equality as products based on the `Eq` instances for the fields of the data type (5), +In the product case, `eqProduct`, we test the runtime values of the arguments to `eqv` for equality as products based on the `Eq` instances for the fields of the data type (5), ```scala import scala.deriving.Mirror @@ -486,7 +486,7 @@ Alternative approaches can be taken to the way that `derived` methods can be def inlined variants using Scala 3 macros, whilst being more involved for type class authors to write than the example above, can produce code for type classes like `Eq` which eliminate all the abstraction artefacts (eg. the `Lists` of child instances in the above) and generate code which is indistinguishable from what a programmer might write by hand. 
-As a third example, using a higher level library such as Shapeless the type class author could define an equivalent +As a third example, using a higher-level library such as Shapeless, the type class author could define an equivalent `derived` method as, ```scala @@ -508,7 +508,7 @@ inline def derived[A](using gen: K0.Generic[A]): Eq[A] = The framework described here enables all three of these approaches without mandating any of them. For a brief discussion on how to use macros to write a type class `derived` -method please read more at [How to write a type class `derived` method using macros](./derivation-macro.md). +method, please read more at [How to write a type class `derived` method using macros](./derivation-macro.md). ## Syntax @@ -539,22 +539,22 @@ This type class derivation framework is intentionally very small and low-level. infrastructure in compiler-generated `Mirror` instances, + type members encoding properties of the mirrored types. -+ a minimal value level mechanism for working generically with terms of the mirrored types. ++ a minimal value-level mechanism for working generically with terms of the mirrored types. The `Mirror` infrastructure can be seen as an extension of the existing `Product` infrastructure for case classes: -typically `Mirror` types will be implemented by the ADTs companion object, hence the type members and the `ordinal` or +typically, `Mirror` types will be implemented by the ADTs companion object, hence the type members and the `ordinal` or `fromProduct` methods will be members of that object. The primary motivation for this design decision, and the decision to encode properties via types rather than terms was to keep the bytecode and runtime footprint of the feature small enough to make it possible to provide `Mirror` instances _unconditionally_. -Whilst `Mirrors` encode properties precisely via type members, the value level `ordinal` and `fromProduct` are +Whilst `Mirrors` encode properties precisely via type members, the value-level `ordinal` and `fromProduct` are somewhat weakly typed (because they are defined in terms of `MirroredMonoType`) just like the members of `Product`. This means that code for generic type classes has to ensure that type exploration and value selection proceed in lockstep and it has to assert this conformance in some places using casts. If generic type classes are correctly written these casts will never fail. -As mentioned, however, the compiler-provided mechanism is intentionally very low level and it is anticipated that -higher level type class derivation and generic programming libraries will build on this and Scala 3's other +As mentioned, however, the compiler-provided mechanism is intentionally very low-level and it is anticipated that +higher-level type class derivation and generic programming libraries will build on this and Scala 3's other metaprogramming facilities to hide these low-level details from type class authors and general users. 
Type class derivation in the style of both Shapeless and Magnolia are possible (a prototype of Shapeless 3, which combines aspects of both Shapeless 2 and Magnolia has been developed alongside this language feature) as is a more aggressively From 1627f0586de8d95ab938e02a483435f2e9faf279 Mon Sep 17 00:00:00 2001 From: Stephane Bersier Date: Mon, 29 Jan 2024 06:45:44 -0500 Subject: [PATCH 175/371] Update derivation.md Added missing import --- docs/_docs/reference/contextual/derivation.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index 5157bb57699c..ed0e005c1bd4 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -396,8 +396,9 @@ Both `eqSum` and `eqProduct` have a by-name parameter `elems`, because the argum Pulling this all together we have the following complete implementation, ```scala +import scala.collection.AbstractIterable +import scala.compiletime.{erasedValue, error, summonInline} import scala.deriving.* -import scala.compiletime.{error, erasedValue, summonInline} inline def summonInstances[T, Elems <: Tuple]: List[Eq[?]] = inline erasedValue[Elems] match From 4789d09045489b78e92fa8c6a4fedfaf50fcf734 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Fabi=C3=A1n=20Heredia=20Montiel?= Date: Mon, 29 Jan 2024 12:49:56 -0600 Subject: [PATCH 176/371] =?UTF-8?q?jsoup:=201.14.3=20=E2=86=92=201.17.2?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit From MVNRepository: Direct vulnerabilities: - CVE-2022-36033 Vulnerabilities from dependencies: - CVE-2023-26049 - CVE-2023-26048 - CVE-2022-25647 https://mvnrepository.com/artifact/org.jsoup/jsoup/1.14.3 (cherry picked from commit 27fbeaf85965a9f0345459730a744b2f9ee51698) --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 515cf04b8cf3..601da1981129 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -1711,7 +1711,7 @@ object Build { ), libraryDependencies ++= Dependencies.flexmarkDeps ++ Seq( "nl.big-o" % "liqp" % "0.8.2", - "org.jsoup" % "jsoup" % "1.14.3", // Needed to process .html files for static site + "org.jsoup" % "jsoup" % "1.17.2", // Needed to process .html files for static site Dependencies.`jackson-dataformat-yaml`, "com.github.sbt" % "junit-interface" % "0.13.3" % Test, From 5ece528b8282e0616bfd381c49c2265340eca8a9 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 1 Feb 2024 15:21:23 +0100 Subject: [PATCH 177/371] Add changelog for 3.4.0-RC4 --- changelogs/3.4.0-RC4.md | 14 ++++++++++++++ 1 file changed, 14 insertions(+) create mode 100644 changelogs/3.4.0-RC4.md diff --git a/changelogs/3.4.0-RC4.md b/changelogs/3.4.0-RC4.md new file mode 100644 index 000000000000..ecbcdabdd586 --- /dev/null +++ b/changelogs/3.4.0-RC4.md @@ -0,0 +1,14 @@ +# Backported fixes + +- Update jsoup dependency of Scaladoc to 7.2 [#19584](https://github.com/lampepfl/dotty/pull/19584) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.0-RC3..3.4.0-RC4` these are: + +``` + 2 Paweł Marks + 1 Fabián Heredia Montiel +``` From 97a4238ad0de85ede2f49c0ca65d6a62e1c7eeb1 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 1 Feb 2024 15:22:08 +0100 Subject: [PATCH 178/371] Release 3.4.0-RC4 --- project/Build.scala | 4 ++-- 1 file 
changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 601da1981129..79570fdbd401 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,9 +83,9 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.4.0-RC2" + val referenceVersion = "3.4.0-RC3" - val baseVersion = "3.4.0-RC3" + val baseVersion = "3.4.0-RC4" // Versions used by the vscode extension to create a new project // This should be the latest published releases. From 3997e79413b232d7d7c50b0abd5385ae8174b0f8 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 14 Feb 2024 12:41:16 +0100 Subject: [PATCH 179/371] Add changelog for 3.4.0 --- changelogs/3.4.0.md | 474 ++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 474 insertions(+) create mode 100644 changelogs/3.4.0.md diff --git a/changelogs/3.4.0.md b/changelogs/3.4.0.md new file mode 100644 index 000000000000..cf6ee8d010d5 --- /dev/null +++ b/changelogs/3.4.0.md @@ -0,0 +1,474 @@ +# Highlights of the release + +- Make polymorphic functions more efficient and expressive [#17548](https://github.com/lampepfl/dotty/pull/17548) +- SIP-56: Better foundations for match types [#18262](https://github.com/lampepfl/dotty/pull/18262) +- Make SIP 54 (Multi-Source Extension Overloads) a standard feature [#17441](https://github.com/lampepfl/dotty/pull/17441) +- Value parameter inference for polymorphic lambdas [#18041](https://github.com/lampepfl/dotty/pull/18041) +- Add `@publicInBinary` annotation and `-WunstableInlineAccessors` linting flag [#18402](https://github.com/lampepfl/dotty/pull/18402) +- Stabilize Quotes `defn.PolyFunction` [#18480](https://github.com/lampepfl/dotty/pull/18480) +- Stabilize Quotes `Flags.AbsOverride` [#18482](https://github.com/lampepfl/dotty/pull/18482) +- Add `-experimental` compiler flags [#18571](https://github.com/lampepfl/dotty/pull/18571) +- Stabilize SIP-53 (quote pattern explicit type variable syntax) [#18574](https://github.com/lampepfl/dotty/pull/18574) +- Add reflect TypeRepr.dealiasKeepOpaques [#18583](https://github.com/lampepfl/dotty/pull/18583) +- Add attributes section to TASTy and use it for Stdlib TASTy [#18599](https://github.com/lampepfl/dotty/pull/18599) +- Error when reading class file with unknown newer jdk version [#18618](https://github.com/lampepfl/dotty/pull/18618) +- Add support for xsbti.compile.CompileProgress [#18739](https://github.com/lampepfl/dotty/pull/18739) +- Improve type inference for functions like fold [#18780](https://github.com/lampepfl/dotty/pull/18780) +- Improve error message for mismatched tasty versions, allow configuration of header unpickler [#18828](https://github.com/lampepfl/dotty/pull/18828) +- In 3.4 make refutable patterns in a for comprehension an error [#18842](https://github.com/lampepfl/dotty/pull/18842) +- Disallow use of PolyFunction in user code [#18920](https://github.com/lampepfl/dotty/pull/18920) +- Store source file in TASTY attributes [#18948](https://github.com/lampepfl/dotty/pull/18948) +- First step to pipelining support - enable reading Java symbols from TASTy [#19074](https://github.com/lampepfl/dotty/pull/19074) +- Activate constrainResult fix in 3.4 [#19253](https://github.com/lampepfl/dotty/pull/19253) +- Parallelise JVM backend - Scala 2 port [#15392](https://github.com/lampepfl/dotty/pull/15392) +- Avoid generating given definitions that 
loop [#19282](https://github.com/lampepfl/dotty/pull/19282) + +## Deprecation warnings for old syntax + +- `_` type wildcards [#18813](https://github.com/lampepfl/dotty/pull/18813) +- `private[this]` [#18819](https://github.com/lampepfl/dotty/pull/18819) +- `var x = _` [#18821](https://github.com/lampepfl/dotty/pull/18821) +- `with` as a type operator [#18837](https://github.com/lampepfl/dotty/pull/18837) +- `xs: _*` varargs [#18872](https://github.com/lampepfl/dotty/pull/18872) +- trailing `_` to force eta expansion [#18926](https://github.com/lampepfl/dotty/pull/18926) +- Make explicit arguments for context bounds an error from 3.5 [#19316](https://github.com/lampepfl/dotty/pull/19316) + +# Other changes and fixes + +## Backend + +- Count size of parameters for platform limit check [#18464](https://github.com/lampepfl/dotty/pull/18464) +- Don't emit line number for synthetic unit value [#18717](https://github.com/lampepfl/dotty/pull/18717) +- Avoid too eager transform of $outer for lhs & accessor rhs [#18949](https://github.com/lampepfl/dotty/pull/18949) +- Make more anonymous functions static [#19251](https://github.com/lampepfl/dotty/pull/19251) +- Fix deadlock in initialization of CoreBTypes using Lazy container [#19298](https://github.com/lampepfl/dotty/pull/19298) +- Fix #18769: Allow HK type args in Java signatures. [#18883](https://github.com/lampepfl/dotty/pull/18883) +- Loading symbols from TASTy files directly [#17594](https://github.com/lampepfl/dotty/pull/17594) +- Use dedicated equals method for univerval equality of chars [#18770](https://github.com/lampepfl/dotty/pull/18770) + +## Erasure + +- Get generic signature of fields entered after erasure from their accessor [#19207](https://github.com/lampepfl/dotty/pull/19207) +- Detect case where two alternatives are the same after widening ExprTypes [#18787](https://github.com/lampepfl/dotty/pull/18787) +- Improve erased params logic [#18433](https://github.com/lampepfl/dotty/pull/18433) + +## Experimental: Capture Checking + +- Fix capture set variable installation in Setup [#18885](https://github.com/lampepfl/dotty/pull/18885) +- Don't follow opaque aliases when transforming sym info for cc [#18929](https://github.com/lampepfl/dotty/pull/18929) +- Reset `comparersInUse` to zero in `ContextState.reset` [#18915](https://github.com/lampepfl/dotty/pull/18915) +- Special handling of experimental.captureChecking import [#17427](https://github.com/lampepfl/dotty/pull/17427) +- Change handling of curried function types in capture checking [#18131](https://github.com/lampepfl/dotty/pull/18131) +- Fix #18246: correctly compute capture sets in `TypeComparer.glb` [#18254](https://github.com/lampepfl/dotty/pull/18254) +- New capture escape checking based on levels [#18463](https://github.com/lampepfl/dotty/pull/18463) +- A more robust scheme for resetting denotations after Recheck [#18534](https://github.com/lampepfl/dotty/pull/18534) +- A more flexible scheme for handling the universal capability [#18699](https://github.com/lampepfl/dotty/pull/18699) +- Fix potential soundness hole when adding references to a mapped capture set [#18758](https://github.com/lampepfl/dotty/pull/18758) +- Alternative scheme for cc encapsulation [#18899](https://github.com/lampepfl/dotty/pull/18899) +- Make reach refinement shallow [#19171](https://github.com/lampepfl/dotty/pull/19171) + +## F-bounds + +- 
Don't check bounds of Java applications in Java units [#18054](https://github.com/lampepfl/dotty/pull/18054)
+
+## GADTs
+
+- Avoid embedding SelectionProtos in Conversions [#17755](https://github.com/lampepfl/dotty/pull/17755)
+- Freeze constraints while calculating GADT full bounds [#18222](https://github.com/lampepfl/dotty/pull/18222)
+
+## Implicits
+
+- Followup fix to transparent inline conversion [#18130](https://github.com/lampepfl/dotty/pull/18130)
+- Select local implicits over name-imported over wildcard imported [#18203](https://github.com/lampepfl/dotty/pull/18203)
+- Fix how implicit candidates are combined [#18321](https://github.com/lampepfl/dotty/pull/18321)
+- Improve error message about missing type of context function parameter [#18788](https://github.com/lampepfl/dotty/pull/18788)
+- Support implicit arguments before extractor method [#18671](https://github.com/lampepfl/dotty/pull/18671)
+- Tweak convertible implicits fix [#18727](https://github.com/lampepfl/dotty/pull/18727)
+- Turn given loop prevention on for -source future [#19392](https://github.com/lampepfl/dotty/pull/19392)
+- Fix algorithm to prevent recursive givens [#19411](https://github.com/lampepfl/dotty/pull/19411)
+- Handle default implicits to context parameters under -3.4-migration [#19512](https://github.com/lampepfl/dotty/pull/19512)
+
+## Incremental Compilation
+
+- Make incremental compilation aware of synthesized mirrors [#18310](https://github.com/lampepfl/dotty/pull/18310)
+
+## Inference
+
+- Honour hard unions in lubbing and param replacing [#18680](https://github.com/lampepfl/dotty/pull/18680)
+
+## Infrastructure
+
+- Use -Yscala2-library-tasty to add Scala 2 lib TASTY to scalac (internal only) [#18613](https://github.com/lampepfl/dotty/pull/18613)
+- Rename `stdlib-bootstrapped-tasty` to `scala2-library-tasty` [#18615](https://github.com/lampepfl/dotty/pull/18615)
+- Fix #19286: Freeze rubygems-update at < 3.5.0.
[#19288](https://github.com/lampepfl/dotty/pull/19288) + +## Initialization + +- Fix #17997: Handle intersection type as this type of super type [#18069](https://github.com/lampepfl/dotty/pull/18069) +- Add test for issue #17997 affecting the global object initialization checker [#18141](https://github.com/lampepfl/dotty/pull/18141) +- Fix i18624 and add test case for it [#18859](https://github.com/lampepfl/dotty/pull/18859) +- Treat new Array(0) as immutable [#19192](https://github.com/lampepfl/dotty/pull/19192) +- Fix #18407: Ignore Quote/Slice in init checker [#18848](https://github.com/lampepfl/dotty/pull/18848) +- Check safe initialization of static objects [#16970](https://github.com/lampepfl/dotty/pull/16970) +- Pattern match support in checking global objects [#18127](https://github.com/lampepfl/dotty/pull/18127) +- Fix crash in global object initialization checker when select target has no source [#18627](https://github.com/lampepfl/dotty/pull/18627) +- Fix warning underlining in global init checker [#18668](https://github.com/lampepfl/dotty/pull/18668) +- Fix i18629 [#18839](https://github.com/lampepfl/dotty/pull/18839) +- I18628 [#18841](https://github.com/lampepfl/dotty/pull/18841) +- Make safe init checker skip global objects [#18906](https://github.com/lampepfl/dotty/pull/18906) +- Handle local lazy vals properly [#18998](https://github.com/lampepfl/dotty/pull/18998) + +## Inline + +- Fix regression: inline match crash when rhs uses private inlined methods [#18595](https://github.com/lampepfl/dotty/pull/18595) +- Add structural classes of dynamicApply before inlining [#18766](https://github.com/lampepfl/dotty/pull/18766) +- Set missing expansion span for copied inlined node [#18229](https://github.com/lampepfl/dotty/pull/18229) +- Fix `callTrace` of inlined methods [#18738](https://github.com/lampepfl/dotty/pull/18738) + +## Linting + +- Keep tree of type ascriptions of quote pattern splices [#18412](https://github.com/lampepfl/dotty/pull/18412) +- Fix false positive in WUnused for renamed path-dependent imports [#18468](https://github.com/lampepfl/dotty/pull/18468) +- Fix false positive in WUnused for renamed path-dependent imports (2) [#18617](https://github.com/lampepfl/dotty/pull/18617) +- Fix wunused false positive on CanEqual [#18641](https://github.com/lampepfl/dotty/pull/18641) +- Implement -Xlint:private-shadow, type-parameter-shadow [#17622](https://github.com/lampepfl/dotty/pull/17622) +- Fix: reversed wconf parsing order to mirror scala 2 [#18503](https://github.com/lampepfl/dotty/pull/18503) +- Revert Fix false positive in WUnused for renamed path-dependent imports [#18514](https://github.com/lampepfl/dotty/pull/18514) + +## Macro Annotations + +- Enter missing symbols generated by the MacroAnnotation expansion [#18826](https://github.com/lampepfl/dotty/pull/18826) + +## Match Types + +- Allow Tuple.Head and Tuple.Tail to work with EmptyTuple [#17189](https://github.com/lampepfl/dotty/pull/17189) +- Fix match type reduction with avoided types [#18043](https://github.com/lampepfl/dotty/pull/18043) +- Strip LazyRef before calling simplified, in MT reduction [#18218](https://github.com/lampepfl/dotty/pull/18218) +- Fix MT separate compilation bug [#18398](https://github.com/lampepfl/dotty/pull/18398) +- Do not show deprecation warning for `_` in type match case 
[#18887](https://github.com/lampepfl/dotty/pull/18887) + +## Nullability + +- Improve logic when to emit pattern type error [#18093](https://github.com/lampepfl/dotty/pull/18093) +- Allow nullability flow typing even in presence of pattern match [#18206](https://github.com/lampepfl/dotty/pull/18206) +- Fix #11967: flow typing nullability in pattern matches [#18212](https://github.com/lampepfl/dotty/pull/18212) +- Fix #18282: consider Predef.eq/ne in nullability flow typing [#18299](https://github.com/lampepfl/dotty/pull/18299) +- Make `this.type` nullable again (unless under -Yexplicit-nulls). [#18399](https://github.com/lampepfl/dotty/pull/18399) + +## Opaque Types + +- Type ascribe trees that require opaque type usage [#18101](https://github.com/lampepfl/dotty/pull/18101) + +## Parser + +- Fix selecting terms using _root_ [#18335](https://github.com/lampepfl/dotty/pull/18335) +- Tweak java getlitch not to skip zero [#18491](https://github.com/lampepfl/dotty/pull/18491) +- Fix i18518 [#18520](https://github.com/lampepfl/dotty/pull/18520) +- Only apply `future` patches on `future-migration` [#18820](https://github.com/lampepfl/dotty/pull/18820) +- Parser simple expression error recovery change from `null` to `???` [#19103](https://github.com/lampepfl/dotty/pull/19103) + +## Pattern Matching + +- Fix syntax and parsing of vararg patterns [#18055](https://github.com/lampepfl/dotty/pull/18055) +- Avoid over widening in SpaceEngine [#18252](https://github.com/lampepfl/dotty/pull/18252) +- Fix regression in exhaustivity of HK types [#18303](https://github.com/lampepfl/dotty/pull/18303) +- Fix missing case in isSubspace, which broke reachablility [#18326](https://github.com/lampepfl/dotty/pull/18326) +- Unsuppress unchecked warnings [#18377](https://github.com/lampepfl/dotty/pull/18377) +- Consider extension methods in Space isSameUnapply [#18642](https://github.com/lampepfl/dotty/pull/18642) +- Fix unreachable warning in deeply nested sealed hierarchy [#18706](https://github.com/lampepfl/dotty/pull/18706) +- Remove unnecessary and recursive Space decomposition [#19216](https://github.com/lampepfl/dotty/pull/19216) +- Prioritise sequence-matches over product-sequence-matches [#19260](https://github.com/lampepfl/dotty/pull/19260) +- Propagate constant in result of inline match [#18455](https://github.com/lampepfl/dotty/pull/18455) +- Disable match anaylsis in inlined trees [#19190](https://github.com/lampepfl/dotty/pull/19190) +- Teach provablyDisjoint about AnyKind [#18510](https://github.com/lampepfl/dotty/pull/18510) +- Warn about unchecked type tests in primitive catch cases [#19206](https://github.com/lampepfl/dotty/pull/19206) +- Reprioritise seq-match over product-seq-match [#19277](https://github.com/lampepfl/dotty/pull/19277) +- Fix exhaustivity due to separate TypeVar lambdas [#18616](https://github.com/lampepfl/dotty/pull/18616) + +## Presentation Compiler + +- Support completions for extension definition parameter [#18331](https://github.com/lampepfl/dotty/pull/18331) +- Fix: Don't collect map, flatMap, withFilter in for-comprehension [#18430](https://github.com/lampepfl/dotty/pull/18430) +- Bugfix: Catch exception from the compiler for broken shadowed pickles [#18502](https://github.com/lampepfl/dotty/pull/18502) +- Bugfix: highlight for enum type params [#18528](https://github.com/lampepfl/dotty/pull/18528) +- Bugfix: No signature 
help for local methods [#18594](https://github.com/lampepfl/dotty/pull/18594) +- Bugfix: add `moduleClass` imported symbols in `IndexedContext` [#18620](https://github.com/lampepfl/dotty/pull/18620) +- Bugfix: Named args completions with default values [#18633](https://github.com/lampepfl/dotty/pull/18633) +- Fix: match completions for type aliases [#18667](https://github.com/lampepfl/dotty/pull/18667) +- Bugfix: add multiline comment completion [#18703](https://github.com/lampepfl/dotty/pull/18703) +- Bugfix: Backticked named arguments [#18704](https://github.com/lampepfl/dotty/pull/18704) +- Bugfix: [metals] Case completions for tuple type [#18751](https://github.com/lampepfl/dotty/pull/18751) +- Completions should prepend, not replace as it is for Scala 2 [#18803](https://github.com/lampepfl/dotty/pull/18803) +- Bugfix: rename end marker [#18838](https://github.com/lampepfl/dotty/pull/18838) +- Presentation compiler: Bugfix for semantic tokens and synthetic decorations [#18955](https://github.com/lampepfl/dotty/pull/18955) +- Show documentation for value forwarders in completions [#19200](https://github.com/lampepfl/dotty/pull/19200) +- Bugfix: Document highlight on class constructors [#19209](https://github.com/lampepfl/dotty/pull/19209) +- Bugfix: Completions for extension methods with name conflict [#19225](https://github.com/lampepfl/dotty/pull/19225) + +## Polyfunctions + +- Check user defined PolyFunction refinements [#18457](https://github.com/lampepfl/dotty/pull/18457) +- Support polymorphic functions with erased parameters [#18293](https://github.com/lampepfl/dotty/pull/18293) +- Use `PolyFunction` instead of `ErasedFunction` [#18295](https://github.com/lampepfl/dotty/pull/18295) + +## Quotes + +- Support type variable with bounds in quoted pattern [#16910](https://github.com/lampepfl/dotty/pull/16910) +- Add new EXPLICITtpt to TASTy format [#17298](https://github.com/lampepfl/dotty/pull/17298) +- Inhibit typer to insert contextual arguments when it is inside arguments of HOAS patterns [#18040](https://github.com/lampepfl/dotty/pull/18040) +- Compile quote patterns directly into QuotePattern AST [#18133](https://github.com/lampepfl/dotty/pull/18133) +- Add missing span to synthesized product mirror [#18354](https://github.com/lampepfl/dotty/pull/18354) +- Improve non-static macro implementation error message [#18405](https://github.com/lampepfl/dotty/pull/18405) +- Fix scala 2 macros in traits with type parameters [#18663](https://github.com/lampepfl/dotty/pull/18663) +- Patch `underlyingArgument` to avoid mapping into modules [#18923](https://github.com/lampepfl/dotty/pull/18923) +- Fallback erasing term references [#18731](https://github.com/lampepfl/dotty/pull/18731) +- Fix ignored type variable bound warning in type quote pattern [#18199](https://github.com/lampepfl/dotty/pull/18199) +- Splice hole with singleton captures [#18357](https://github.com/lampepfl/dotty/pull/18357) +- Fix macros with erased arguments [#18431](https://github.com/lampepfl/dotty/pull/18431) +- Deprecate 3-arg `FunctionClass` constructor [#18472](https://github.com/lampepfl/dotty/pull/18472) +- Deprecate `Quotes` `{MethodType,TermParamClause}.isErased` [#18479](https://github.com/lampepfl/dotty/pull/18479) +- Avoid crashes on missing positions [#19250](https://github.com/lampepfl/dotty/pull/19250) + +## Reflection + +- Add reflect.ValOrDefDef 
[#16974](https://github.com/lampepfl/dotty/pull/16974) +- Check New tree for ill-formed module instantiations [#17553](https://github.com/lampepfl/dotty/pull/17553) +- Add reflect `TypeLambda.paramVariances` [#17568](https://github.com/lampepfl/dotty/pull/17568) +- Make check flags for `newMethod`, `newVal` and `newBind` in Quotes API less restrictive [#18217](https://github.com/lampepfl/dotty/pull/18217) +- Normalise mirrorType for mirror Synthesis [#19199](https://github.com/lampepfl/dotty/pull/19199) +- Add reflect `defn.FunctionClass` overloads [#16849](https://github.com/lampepfl/dotty/pull/16849) +- Stabilize reflect flag `JavaAnnotation` [#19267](https://github.com/lampepfl/dotty/pull/19267) +- Stabilize reflect `paramVariance` [#19268](https://github.com/lampepfl/dotty/pull/19268) + +## Reporting + +- Take into account the result type of inline implicit conversions unless they are transparent [#17924](https://github.com/lampepfl/dotty/pull/17924) +- Check if a fatal warning issued in typer is silenced, before converting it into an error [#18089](https://github.com/lampepfl/dotty/pull/18089) +- Elide companion defs to a `object` extending `AnyVal` [#18451](https://github.com/lampepfl/dotty/pull/18451) +- Add regression test for issue i18493 [#18497](https://github.com/lampepfl/dotty/pull/18497) +- Add better explanation to error message [#18665](https://github.com/lampepfl/dotty/pull/18665) +- Better error message when accessing private members [#18690](https://github.com/lampepfl/dotty/pull/18690) +- Improve message for discarded pure non-Unit values [#18723](https://github.com/lampepfl/dotty/pull/18723) +- Better error message when a pattern match extractor is not found. [#18725](https://github.com/lampepfl/dotty/pull/18725) +- Give "did you mean ...?" 
hints also for simple identifiers [#18747](https://github.com/lampepfl/dotty/pull/18747)
+- Better error for definition followed by keyword [#18752](https://github.com/lampepfl/dotty/pull/18752)
+- Better explain message for 'pattern expected' [#18753](https://github.com/lampepfl/dotty/pull/18753)
+- Improve failure message of enum `fromOrdinal`/`valueOf` [#19182](https://github.com/lampepfl/dotty/pull/19182)
+- Fix type mismatch error confusion between types with same simple name [#19204](https://github.com/lampepfl/dotty/pull/19204)
+- Add hint for nested quotes missing staged `Quotes` [#18755](https://github.com/lampepfl/dotty/pull/18755)
+- Better error messages for missing commas and more [#18785](https://github.com/lampepfl/dotty/pull/18785)
+- Fix imported twice error messages [#18102](https://github.com/lampepfl/dotty/pull/18102)
+- Improve error message for inaccessible types [#18406](https://github.com/lampepfl/dotty/pull/18406)
+- Future migration warning for `with` type operator [#18818](https://github.com/lampepfl/dotty/pull/18818)
+- Improve assertion error message for `Apply` and `TypeApply` [#18700](https://github.com/lampepfl/dotty/pull/18700)
+- Shorten traces for TypeMismatch errors under -explain [#18742](https://github.com/lampepfl/dotty/pull/18742)
+- Improve `with` in type migration warning [#18852](https://github.com/lampepfl/dotty/pull/18852)
+- Future migration warning for alphanumeric infix operator [#18908](https://github.com/lampepfl/dotty/pull/18908)
+- Make sure that trace is shown correctly in the presence of invalid line numbers [#18930](https://github.com/lampepfl/dotty/pull/18930)
+- Add migration warning for XML literals in language future [#19101](https://github.com/lampepfl/dotty/pull/19101)
+- Avoid diagnostic message forcing crashing the compiler [#19113](https://github.com/lampepfl/dotty/pull/19113)
+- Make sure that the stacktrace is shown with `-Ydebug-unpickling` [#19115](https://github.com/lampepfl/dotty/pull/19115)
+- Improve `asExprOf` cast error formatting [#19195](https://github.com/lampepfl/dotty/pull/19195)
+- Do not warn on underscore wildcard type in pattern [#19249](https://github.com/lampepfl/dotty/pull/19249)
+
+## Scala-JS
+
+- Fix #18658: Handle varargs of generic types in `JSExportsGen`. [#18659](https://github.com/lampepfl/dotty/pull/18659)
+
+## Scaladoc
+
+- Fix incorrect comment parser used in nightly scaladoc [#18523](https://github.com/lampepfl/dotty/pull/18523)
+- Update jsoup dependency of Scaladoc to 7.2 [#19584](https://github.com/lampepfl/dotty/pull/19584)
+
+## SemanticDB
+
+- Export diagnostics (including unused warnings) to SemanticDB [#17835](https://github.com/lampepfl/dotty/pull/17835)
+- Bugfix: Incorrect semanticdb span on Selectable [#18576](https://github.com/lampepfl/dotty/pull/18576)
+- Bugfix: in semanticdb make synthetic apply disambiguator consistent w/ Scala 2 implicit [#17341](https://github.com/lampepfl/dotty/pull/17341)
+
+## Standard Library
+
+- Intrinsify `constValueTuple` and `summonAll` [#18013](https://github.com/lampepfl/dotty/pull/18013)
+- Fix #18609: Add language.`3.4` and language.`3.4-migration`. 
[#18610](https://github.com/lampepfl/dotty/pull/18610) + +## TASTy format + +- Eliminate FromJavaObject from TASTy of Java sources [#19259](https://github.com/lampepfl/dotty/pull/19259) +- Add new HOLETYPES to TASTy format [#17225](https://github.com/lampepfl/dotty/pull/17225) +- Add capture checking attributes to TASTy [#19033](https://github.com/lampepfl/dotty/pull/19033) +- Add TASTyInfo abstraction [#19089](https://github.com/lampepfl/dotty/pull/19089) +- Add UTF8 abstraction in the TASTy format [#19090](https://github.com/lampepfl/dotty/pull/19090) + +## Tooling + +- Don't add explanation twice [#18779](https://github.com/lampepfl/dotty/pull/18779) +- ExtractDependencies uses more efficient caching [#18403](https://github.com/lampepfl/dotty/pull/18403) +- Introduce the SourceVersions 3.4 and 3.4-migration; make 3.4 the default. [#18501](https://github.com/lampepfl/dotty/pull/18501) +- Bugfix: Completions for named args in wrong order [#18702](https://github.com/lampepfl/dotty/pull/18702) +- Align unpickled Scala 2 accessors encoding with Scala 3 [#18874](https://github.com/lampepfl/dotty/pull/18874) +- Reinterpret Scala 2 case accessors `xyz$access$idx` [#18907](https://github.com/lampepfl/dotty/pull/18907) +- Presentation-compiler: Add synthetic decorations [#18951](https://github.com/lampepfl/dotty/pull/18951) +- Add compilation unit info to `ClassSymbol` [#19010](https://github.com/lampepfl/dotty/pull/19010) +- Make sure that patches for 3.0 are also applied in later versions [#19018](https://github.com/lampepfl/dotty/pull/19018) + +## Transform + +- Also consider @targetName when checking private overrides [#18361](https://github.com/lampepfl/dotty/pull/18361) +- Teach PostTyper to handle untupled context closures [#17739](https://github.com/lampepfl/dotty/pull/17739) +- Properly dealias tuple types when specializing [#18724](https://github.com/lampepfl/dotty/pull/18724) +- Fix condition in prefixIsElidable to prevent compiler crash [#18924](https://github.com/lampepfl/dotty/pull/18924) +- Fix #18816: Transfer the span of rewired `This` nodes in `fullyParameterizedDef`. [#18840](https://github.com/lampepfl/dotty/pull/18840) +- List(...) 
optimization to avoid intermediate array [#17166](https://github.com/lampepfl/dotty/pull/17166) +- Make Array.apply an intrinsic [#18537](https://github.com/lampepfl/dotty/pull/18537) +- Add missing span to extension method select [#18557](https://github.com/lampepfl/dotty/pull/18557) + +## Tuples + +- Handle TupleXXL in match analysis [#19212](https://github.com/lampepfl/dotty/pull/19212) +- Add `reverse` method to `NonEmptyTuple` [#13752](https://github.com/lampepfl/dotty/pull/13752) +- Refine handling of pattern binders for large tuples [#19085](https://github.com/lampepfl/dotty/pull/19085) +- Introduce `Tuple.ReverseOnto` and use it in `Tuple.reverse` [#19183](https://github.com/lampepfl/dotty/pull/19183) + +## Typeclass Derivation + +- Consider all parents when checking access to the children of a sum [#19083](https://github.com/lampepfl/dotty/pull/19083) + +## Typer + +- Fix logic when comparing var/def bindings with val refinements [#18049](https://github.com/lampepfl/dotty/pull/18049) +- Fix variance checking in refinements [#18053](https://github.com/lampepfl/dotty/pull/18053) +- Fix accessibleType for package object prefixes [#18057](https://github.com/lampepfl/dotty/pull/18057) +- Refix avoid GADT casting with ProtoTypes [#18085](https://github.com/lampepfl/dotty/pull/18085) +- Avoid shadowing by private definitions in more situations [#18142](https://github.com/lampepfl/dotty/pull/18142) +- Refine infoDependsOnPrefix [#18204](https://github.com/lampepfl/dotty/pull/18204) +- Fix spurious subtype check pruning when both sides have unions [#18213](https://github.com/lampepfl/dotty/pull/18213) +- Reimplement support for type aliases in SAM types [#18317](https://github.com/lampepfl/dotty/pull/18317) +- Fix adaptation of constants to constant type aliases [#18360](https://github.com/lampepfl/dotty/pull/18360) +- Issue "positional after named argument" errors [#18363](https://github.com/lampepfl/dotty/pull/18363) +- Deprecate `ops.long.S` [#18426](https://github.com/lampepfl/dotty/pull/18426) +- Tweak selection from self types [#18467](https://github.com/lampepfl/dotty/pull/18467) +- Use the unwidened type when casting structural calls [#18527](https://github.com/lampepfl/dotty/pull/18527) +- Fix #18649: Use loBound of param types when materializing a context function. 
[#18651](https://github.com/lampepfl/dotty/pull/18651) +- Identify structural trees on Match Type qualifiers [#18765](https://github.com/lampepfl/dotty/pull/18765) +- Tweak approximation of type variables when computing default types [#18798](https://github.com/lampepfl/dotty/pull/18798) +- Admit parametric aliases of classes in parent typing [#18849](https://github.com/lampepfl/dotty/pull/18849) +- Also add privateWithin when creating constructor proxies [#18893](https://github.com/lampepfl/dotty/pull/18893) +- Revert part of `Simplify defn.FunctionOf.unapply` [#19012](https://github.com/lampepfl/dotty/pull/19012) +- Check @targetName when subtyping Refined Types [#19081](https://github.com/lampepfl/dotty/pull/19081) +- Record failures to adapt application arguments [#18269](https://github.com/lampepfl/dotty/pull/18269) +- Improve handling of AndTypes on the LHS of subtype comparisons [#18235](https://github.com/lampepfl/dotty/pull/18235) +- Allow inferred parameter types always, when eta-expanding [#18771](https://github.com/lampepfl/dotty/pull/18771) +- Fix failing bounds check on default getter [#18419](https://github.com/lampepfl/dotty/pull/18419) +- Use constructor's default getters in case class synthetic `apply` methods [#18716](https://github.com/lampepfl/dotty/pull/18716) +- Keep qualifier of Ident when selecting setter [#18714](https://github.com/lampepfl/dotty/pull/18714) +- Retract SynthesizeExtMethodReceiver mode when when going deeper in overloading resolution [#18759](https://github.com/lampepfl/dotty/pull/18759) +- Constant fold all the number conversion methods [#17446](https://github.com/lampepfl/dotty/pull/17446) +- Refine criterion when to widen types [#17180](https://github.com/lampepfl/dotty/pull/17180) +- Run all MatchType reduction under Mode.Type [#17937](https://github.com/lampepfl/dotty/pull/17937) +- Force consistent MT post-redux normalisation, disallow infinite match types [#18073](https://github.com/lampepfl/dotty/pull/18073) +- Fix #17467: Limit isNullable widening to stable TermRefs; remove under explicit nulls. [#17470](https://github.com/lampepfl/dotty/pull/17470) +- Disallow naming the root package, except for selections [#18187](https://github.com/lampepfl/dotty/pull/18187) +- Contextual varargs parameters [#18186](https://github.com/lampepfl/dotty/pull/18186) +- Encode the name of the attribute in Selectable.selectDynamic [#18928](https://github.com/lampepfl/dotty/pull/18928) +- Remove linearization requirement for override ref checks from java classes [#18953](https://github.com/lampepfl/dotty/pull/18953) +- Fix type inferencing (constraining) regressions [#19189](https://github.com/lampepfl/dotty/pull/19189) +- Repeated params must correspond in override [#16836](https://github.com/lampepfl/dotty/pull/16836) +- Convert SAM result types to function types [#17740](https://github.com/lampepfl/dotty/pull/17740) +- Disallow `infix` objects [#17966](https://github.com/lampepfl/dotty/pull/17966) +- Fix hasMatchingMember handling NoDenotation [#17977](https://github.com/lampepfl/dotty/pull/17977) +- Fix: disallow toplevel infix definitions for vals, vars, givens, methods and implicits [#17994](https://github.com/lampepfl/dotty/pull/17994) +- Curried methods are not valid SAM methods [#18110](https://github.com/lampepfl/dotty/pull/18110) +- Fix #17115: Try to normalize while computing `typeSize`. 
[#18386](https://github.com/lampepfl/dotty/pull/18386) +- Add default arguments to derived refined type [#18435](https://github.com/lampepfl/dotty/pull/18435) +- Handle dependent context functions [#18443](https://github.com/lampepfl/dotty/pull/18443) +- Fix variance loophole for private vars [#18693](https://github.com/lampepfl/dotty/pull/18693) +- Avoid crash arising from trying to find conversions from polymorphic singleton types [#18760](https://github.com/lampepfl/dotty/pull/18760) +- Allow inner classes of universal traits [#18796](https://github.com/lampepfl/dotty/pull/18796) +- Prevent crash when extension not found [#18830](https://github.com/lampepfl/dotty/pull/18830) +- Fix expandParam's use of argForParam/isArgPrefixOf. [#19412](https://github.com/lampepfl/dotty/pull/19412) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.1..3.4.0` these are: + +``` + 474 Martin Odersky + 296 Nicolas Stucki + 132 Fengyun Liu + 119 Dale Wijnand + 77 Jamie Thompson + 69 Sébastien Doeraene + 60 Paweł Marks + 32 Chris Kipp + 27 Guillaume Martres + 26 Rikito Taniguchi + 21 Yichen Xu + 19 EnzeXing + 14 Szymon Rodziewicz + 13 Lucas Leblanc + 12 Jakub Ciesluk + 12 Jędrzej Rochala + 12 Katarzyna Marek + 11 Carl + 10 David Hua + 9 Florian3k + 9 Wojciech Mazur + 8 Eugene Flesselle + 8 ghostbuster91 + 7 Hamza Remmal + 7 Jan Chyb + 7 Ondrej Lhotak + 7 Quentin Bernet + 6 Julien Richard-Foy + 6 Kacper Korban + 6 Seth Tisue + 5 Lorenzo Gabriele + 5 Matt Bovel + 5 Som Snytt + 5 Yuito Murase + 5 dependabot[bot] + 3 David + 3 Lucas + 3 Pascal Weisenburger + 3 Tomasz Godzik + 2 Aleksander Rainko + 2 Decel + 2 Guillaume Raffin + 2 Ondřej Lhoták + 2 Oron Port + 2 danecek + 2 rochala + 1 Adam Dąbrowski + 1 Aleksey Troitskiy + 1 Arnout Engelen + 1 Ausmarton Zarino Fernandes + 1 Bjorn Regnell + 1 Daniel Esik + 1 Eugene Yokota + 1 Fabián Heredia Montiel + 1 François Monniot + 1 Jakub Cieśluk + 1 John Duffell + 1 John M. Higgins + 1 Justin Reardon + 1 Kai + 1 Kisaragi + 1 Lucas Nouguier + 1 Lukas Rytz + 1 LydiaSkuse + 1 Martin Kucera + 1 Martin Kučera + 1 Matthew Rooney + 1 Matthias Kurz + 1 Mikołaj Fornal + 1 Nicolas Almerge + 1 Preveen P + 1 Shardul Chiplunkar + 1 Stefan Wachter + 1 philippus + 1 q-ata + 1 slim +``` From a92a4639e1db7a1ad55633a436650a348dffa152 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 14 Feb 2024 12:43:27 +0100 Subject: [PATCH 180/371] Release 3.4.0 --- project/Build.scala | 4 ++-- tasty/src/dotty/tools/tasty/TastyFormat.scala | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 79570fdbd401..1640bf0fac1b 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,9 +83,9 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.4.0-RC3" + val referenceVersion = "3.3.1" - val baseVersion = "3.4.0-RC4" + val baseVersion = "3.4.0" // Versions used by the vscode extension to create a new project // This should be the latest published releases. 
diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index ce3e1a852c74..aa14904b6889 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -334,7 +334,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. * From 6310999de63287ed769e73317694cea4b5aa5fe8 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 14 Feb 2024 08:34:42 -0800 Subject: [PATCH 181/371] Revert "Implement SIP-42 - Support for binary integer literals" --- .../dotty/tools/dotc/parsing/Scanners.scala | 2 +- .../other-new-features/binary-literals.md | 19 ----- docs/_spec/01-lexical-syntax.md | 5 +- docs/sidebar.yml | 1 - .../referenceReplacements/sidebar.yml | 1 - tests/neg/binaryLiterals.scala | 8 --- tests/run/binaryLiterals.scala | 72 ------------------- 7 files changed, 3 insertions(+), 105 deletions(-) delete mode 100644 docs/_docs/reference/other-new-features/binary-literals.md delete mode 100644 tests/neg/binaryLiterals.scala delete mode 100644 tests/run/binaryLiterals.scala diff --git a/compiler/src/dotty/tools/dotc/parsing/Scanners.scala b/compiler/src/dotty/tools/dotc/parsing/Scanners.scala index 3f9e8ca6532e..ea43706e9fdb 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Scanners.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Scanners.scala @@ -884,7 +884,7 @@ object Scanners { nextChar() ch match { case 'x' | 'X' => base = 16 ; nextChar() - case 'b' | 'B' => base = 2 ; nextChar() + //case 'b' | 'B' => base = 2 ; nextChar() case _ => base = 10 ; putChar('0') } if (base != 10 && !isNumberSeparator(ch) && digit2int(ch, base) < 0) diff --git a/docs/_docs/reference/other-new-features/binary-literals.md b/docs/_docs/reference/other-new-features/binary-literals.md deleted file mode 100644 index ba19fdd3d7f7..000000000000 --- a/docs/_docs/reference/other-new-features/binary-literals.md +++ /dev/null @@ -1,19 +0,0 @@ ---- -layout: doc-page -title: "Binary Integer Literals" -nightlyOf: https://docs.scala-lang.org/scala3/reference/changed-features/binary-integer-literals.html ---- - -A new syntax for integer literals has been added, it is now possible to do the following: -```scala -val bitmask = 0b0010_0000 // equivalent to 32, 0x20 -``` - -Binary integer literals behave similarly to hex integer literals (`0x...`), for example: -* Both `0b...` and `0B...` are allowed -* `0b`/`0B` on its own is disallowed, possible alternatives: `0`, `0b0`, `0B0` -* Only `0` and `1` are allowed after the b (`b`/`B`) -* Underscores `_` are allowed anywhere between digits, and are ignored: `0b__1 == 0b1` - - -Note: This change has been backported to Scala 2.13.13, it is therefore not technically a changed feature diff --git a/docs/_spec/01-lexical-syntax.md b/docs/_spec/01-lexical-syntax.md index e1686204116e..7dfcea87bd2d 100644 --- a/docs/_spec/01-lexical-syntax.md +++ b/docs/_spec/01-lexical-syntax.md @@ -332,10 +332,9 @@ Literal ::= [‘-’] integerLiteral ### Integer Literals ```ebnf -integerLiteral ::= (decimalNumeral | hexNumeral | binaryNumeral) [‘L’ | ‘l’] +integerLiteral ::= (decimalNumeral | hexNumeral) [‘L’ | ‘l’] decimalNumeral ::= ‘0’ | digit [{digit | ‘_’} digit] hexNumeral ::= ‘0’ (‘x’ | ‘X’) hexDigit [{hexDigit | ‘_’} hexDigit] -binaryNumeral ::= 
‘0’ (‘b’ | ‘B’) binaryDigit [{binaryDigit | ‘_’} binaryDigit] ``` Values of type `Int` are all integer numbers between $-2\^{31}$ and $2\^{31}-1$, inclusive. @@ -358,7 +357,7 @@ The numeric ranges given by these types are: The digits of a numeric literal may be separated by arbitrarily many underscores for purposes of legibility. > ```scala -> 0 21_000 0x7F -42L 0xFFFF_FFFF 0b0100_0010 +> 0 21_000 0x7F -42L 0xFFFF_FFFF > ``` ### Floating Point Literals diff --git a/docs/sidebar.yml b/docs/sidebar.yml index 16678e682dd6..65d7ac2f9ee4 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -81,7 +81,6 @@ subsection: - page: reference/other-new-features/safe-initialization.md - page: reference/other-new-features/type-test.md - page: reference/other-new-features/experimental-defs.md - - page: reference/other-new-features/binary-literals.md - title: Other Changed Features directory: changed-features index: reference/changed-features/changed-features.md diff --git a/project/resources/referenceReplacements/sidebar.yml b/project/resources/referenceReplacements/sidebar.yml index 240085b681f2..de0f3d7bec2c 100644 --- a/project/resources/referenceReplacements/sidebar.yml +++ b/project/resources/referenceReplacements/sidebar.yml @@ -77,7 +77,6 @@ subsection: - page: reference/other-new-features/safe-initialization.md - page: reference/other-new-features/type-test.md - page: reference/other-new-features/experimental-defs.md - - page: reference/other-new-features/binary-literals.md - title: Other Changed Features directory: changed-features index: reference/changed-features/changed-features.md diff --git a/tests/neg/binaryLiterals.scala b/tests/neg/binaryLiterals.scala deleted file mode 100644 index 5d5f0b4986fc..000000000000 --- a/tests/neg/binaryLiterals.scala +++ /dev/null @@ -1,8 +0,0 @@ - -object Test: - val x = 0b1__0000_0000_0000_0000__0000_0000_0000_0000 // error: number too large - val X = 0B1__0000_0000_0000_0000__0000_0000_0000_0000 // error: number too large - val y = 0b1__0000_0000_0000_0000__0000_0000_0000_0000__0000_0000_0000_0000__0000_0000_0000_0000L // error: number too large - val Y = 0B1__0000_0000_0000_0000__0000_0000_0000_0000__0000_0000_0000_0000__0000_0000_0000_0000L // error: number too large - 0b // error: invalid literal number - 0b2 // error: invalid literal number diff --git a/tests/run/binaryLiterals.scala b/tests/run/binaryLiterals.scala deleted file mode 100644 index 5ac8c7b6f8bc..000000000000 --- a/tests/run/binaryLiterals.scala +++ /dev/null @@ -1,72 +0,0 @@ -@main -def Test = - val kenobi = 0b1 - - assert(kenobi == 1) - - assert(0B0000 == 0) - assert(0B0001 == 1) - assert(0B0010 == 2) - assert(0B0100 == 4) - assert(0B1000 == 8) - - assert(0b0000 == 0) - assert(0b0001 == 1) - assert(0b0010 == 2) - assert(0b0100 == 4) - assert(0b1000 == 8) - - assert(0b0001_0000 == 16) - assert(0b0010_0000 == 32) - assert(0b0100_0000 == 64) - assert(0b1000_0000 == 128) - - assert(0b0001_0000_0000 == 256) - assert(0b0010_0000_0000 == 512) - assert(0b0100_0000_0000 == 1024) - assert(0b1000_0000_0000 == 2048) - - assert(0b0001_0000_0000_0000 == 4096) - assert(0b0010_0000_0000_0000 == 8192) - assert(0b0100_0000_0000_0000 == 16384) - assert(0b1000_0000_0000_0000 == 32768) - - assert(0b0001__0000_0000_0000_0000 == 65536) - assert(0b0010__0000_0000_0000_0000 == 131072) - assert(0b0100__0000_0000_0000_0000 == 262144) - assert(0b1000__0000_0000_0000_0000 == 524288) - - assert(0b0001_0000__0000_0000_0000_0000 == 1048576) - assert(0b0010_0000__0000_0000_0000_0000 == 2097152) - 
assert(0b0100_0000__0000_0000_0000_0000 == 4194304) - assert(0b1000_0000__0000_0000_0000_0000 == 8388608) - - assert(0b0001_0000_0000__0000_0000_0000_0000 == 16777216) - assert(0b0010_0000_0000__0000_0000_0000_0000 == 33554432) - assert(0b0100_0000_0000__0000_0000_0000_0000 == 67108864) - assert(0b1000_0000_0000__0000_0000_0000_0000 == 134217728) - - assert(0b0001_0000_0000_0000__0000_0000_0000_0000 == 268435456) - assert(0b0010_0000_0000_0000__0000_0000_0000_0000 == 536870912) - assert(0b0100_0000_0000_0000__0000_0000_0000_0000 == 1073741824) - assert(0b1000_0000_0000_0000__0000_0000_0000_0000L == 2147483648L) - - assert(0b1000_0000_0000_0000__0000_0000_0000_0000 == -2147483648) // Signed ! - assert(0b1111_1111_1111_1111__1111_1111_1111_1111 == -1) - - // Randomly generated using https://numbergenerator.org/random-32-bit-binary-number#!numbers=10&length=32&addfilters= - // Converted to signed decimal using https://onlinetoolz.net/unsigned-signed#base=2&bits=32 - assert(0b0110_1000_1100_0101_0010_1100_0100_0011 == 1757752387) - assert(0b1111_0101_0100_1011_0101_1000_0011_0110 == -179611594) - assert(0b0000_0011_0000_1010_1010_0011_0000_0000 == 51028736) - assert(0b0101_0010_1111_1001_0100_0101_1101_1011 == 1392068059) - assert(0b1001_0000_1111_1001_1011_1101_1100_1111 == -1862681137) - - assert(0B0000_0111_1110_1100_0111_1100_1000_0010 == 132938882) - assert(0B0000_1011_0111_1011_0001_1010_1010_1000 == 192617128) - assert(0B1100_1100_1000_1010_1111_0111_0100_1101 == -863307955) - assert(0B1000_0000_0001_0010_0001_1001_0101_1110 == -2146297506) - assert(0B1110_0000_0110_1100_0111_0110_1100_1111 == -529762609) - - assert(0b0010_1001_0101_1001__1010_0100_1000_1010__1001_1000_0011_0111__1100_1011_0111_0101L == 2979593543648529269L) - assert(0b1101_1110_0100_1000__0010_1101_1010_0010__0111_1000_1111_1001__1010_1001_0101_1000L == -2429641823128802984L) From 53075ba78842b87d49b324d2578f50d28cb7d3ef Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 14 Feb 2024 17:52:33 +0100 Subject: [PATCH 182/371] Add changelog for 3.4.1 --- changelogs/3.4.1-RC1.md | 190 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 190 insertions(+) create mode 100644 changelogs/3.4.1-RC1.md diff --git a/changelogs/3.4.1-RC1.md b/changelogs/3.4.1-RC1.md new file mode 100644 index 000000000000..f374c6768497 --- /dev/null +++ b/changelogs/3.4.1-RC1.md @@ -0,0 +1,190 @@ +# Highlights of the release + +- Add support for `@deprecatedInheritance` [#19082](https://github.com/lampepfl/dotty/pull/19082) +- Avoid generating given definitions that loop [#19282](https://github.com/lampepfl/dotty/pull/19282) + +# Other changes and fixes + +## Coverage + +- Correctly prettify names in coverage info [#18542](https://github.com/lampepfl/dotty/pull/18542) + +## Desugaring + +- Make apply proxies work with overloaded ctors [#19464](https://github.com/lampepfl/dotty/pull/19464) +- Fix possible crash in Desugar [#19567](https://github.com/lampepfl/dotty/pull/19567) + +## Documentation + +- Update `private[this]` deprecation warning and documentation [#19393](https://github.com/lampepfl/dotty/pull/19393) + +## Erasure + +- Make eraseInfo work for classes with EmptyScopes [#19550](https://github.com/lampepfl/dotty/pull/19550) + +## Exports + +- Do not propagate `@tailrec` to exported methods [#19509](https://github.com/lampepfl/dotty/pull/19509) +- Fix retained flags in exports [#19636](https://github.com/lampepfl/dotty/pull/19636) + +## GADTs + +- Only cache base types 
when gadt state is empty [#19562](https://github.com/lampepfl/dotty/pull/19562) + +## Implicits + +- Run CheckStatic after UncacheGivenAliases [#19318](https://github.com/lampepfl/dotty/pull/19318) +- Add tests to verify that crash is fixed elsewhere. Fixes #19328 [#19329](https://github.com/lampepfl/dotty/pull/19329) +- Don't search for implicit conversions to NoType [#19563](https://github.com/lampepfl/dotty/pull/19563) +- Instantiate argument type vars before implicit search [#19096](https://github.com/lampepfl/dotty/pull/19096) + +## Java Interop + +- Classfile reader: handle JDK 9+ constant types in constant pool [#19533](https://github.com/lampepfl/dotty/pull/19533) + +## Linting + +- Make fatal warnings not fail compilation early & aggregate warns [#19245](https://github.com/lampepfl/dotty/pull/19245) + +## Macro Annotations + +- Check and enter missing symbols in MacroAnnotations only for definitions [#19579](https://github.com/lampepfl/dotty/pull/19579) + +## Match Types + +- Normalize MatchAlias in unrollTupleTypes [#19565](https://github.com/lampepfl/dotty/pull/19565) +- Fix #19445: Remove too-strict test in match type matching. [#19511](https://github.com/lampepfl/dotty/pull/19511) + +## Opaque Types + +- Fix problems with cycle checks [#19453](https://github.com/lampepfl/dotty/pull/19453) + +## Parser + +- Fix(#18265): crash on extension method without type nor RHS [#18743](https://github.com/lampepfl/dotty/pull/18743) +- Warn when @volatile is used on vals [#19462](https://github.com/lampepfl/dotty/pull/19462) +- Fix(#16459) xml parse regression [#19531](https://github.com/lampepfl/dotty/pull/19531) + +## Pattern Matching + +- Fix false unreachable due to opaqueness [#19368](https://github.com/lampepfl/dotty/pull/19368) +- Improve recursive decompose prefix fix [#19375](https://github.com/lampepfl/dotty/pull/19375) +- Allow constraining a parameter to Nothing [#19397](https://github.com/lampepfl/dotty/pull/19397) +- Add a test case, proving i15661 is fixed [#19432](https://github.com/lampepfl/dotty/pull/19432) + +## Presentation Compiler + +- Improvement: Support completions for implicit classes [#19314](https://github.com/lampepfl/dotty/pull/19314) +- Chore: Backport changes from Metals [#19410](https://github.com/lampepfl/dotty/pull/19410) +- Fix goto-def on exported forwarders [#19494](https://github.com/lampepfl/dotty/pull/19494) +- Backport pc changes from metals [#19617](https://github.com/lampepfl/dotty/pull/19617) +- Chore: Backport changes from Metals [#19592](https://github.com/lampepfl/dotty/pull/19592) +- Use comma counting for all signature help types [#19520](https://github.com/lampepfl/dotty/pull/19520) +- Make PC more resilient to crashes [#19488](https://github.com/lampepfl/dotty/pull/19488) +- Make order of renames and missing imports deterministic [#19468](https://github.com/lampepfl/dotty/pull/19468) +- Chore: backport changes from metals [#19452](https://github.com/lampepfl/dotty/pull/19452) +- Improve signature help by more stable position calculation + better named arg support [#19214](https://github.com/lampepfl/dotty/pull/19214) +- Instantiate Type Vars in completion labels of extension methods [#18914](https://github.com/lampepfl/dotty/pull/18914) + +## Quotes + +- Only evaluate transparent inline unapply once [#19380](https://github.com/lampepfl/dotty/pull/19380) +- Update `staging.Compiler.make` 
documentation [#19428](https://github.com/lampepfl/dotty/pull/19428) +- Error instead of StaleSymbol crash for certain cyclic macro dependencies [#19549](https://github.com/lampepfl/dotty/pull/19549) +- Refine handling of StaleSymbol type errors [#19605](https://github.com/lampepfl/dotty/pull/19605) +- Fix module symbol recovery from `NoClassDefFoundError` [#19645](https://github.com/lampepfl/dotty/pull/19645) +- Fix HOAS pattern example and error message [#19655](https://github.com/lampepfl/dotty/pull/19655) +- Set the correct type when copying reflect Inlined trees [#19409](https://github.com/lampepfl/dotty/pull/19409) + +## Reporting + +- Don't explain erroneous bounds [#19338](https://github.com/lampepfl/dotty/pull/19338) +- Better error diagnostics for cyclic references [#19408](https://github.com/lampepfl/dotty/pull/19408) +- Properly identify empty bounds in error message [#19310](https://github.com/lampepfl/dotty/pull/19310) + +## Scala-JS + +- Fix #19528: Actually remove Dynamic from interfaces of native JS classes. [#19536](https://github.com/lampepfl/dotty/pull/19536) +- Consider static and non-static methods as non-double def [#19400](https://github.com/lampepfl/dotty/pull/19400) + +## Scaladoc + +- Scaladoc - add option for dynamic side menu [#19337](https://github.com/lampepfl/dotty/pull/19337) +- Scaladoc: Fix "case case Foo" in enum's cases [#19519](https://github.com/lampepfl/dotty/pull/19519) +- Fix(#19377): show inherited abstract members in dedicated section [#19552](https://github.com/lampepfl/dotty/pull/19552) +- Jsoup: 1.14.3 → 1.17.2 [#19564](https://github.com/lampepfl/dotty/pull/19564) +- Extend copyright into 2024 [#19367](https://github.com/lampepfl/dotty/pull/19367) + +## Tooling + +- Prioritize TASTy files over classfiles on classpath aggregation [#19431](https://github.com/lampepfl/dotty/pull/19431) + +## Transform + +- Fix purity check for val inside of object [#19598](https://github.com/lampepfl/dotty/pull/19598) +- Drop special treatment of function types in overloading resolution [#19654](https://github.com/lampepfl/dotty/pull/19654) +- Add checks for the consistency of the parents in TreeChecker [#18935](https://github.com/lampepfl/dotty/pull/18935) + +## Type Inference + +- More careful type variable instance improvements [#19659](https://github.com/lampepfl/dotty/pull/19659) + +## Typer + +- Reject wildcard types in using clauses [#19459](https://github.com/lampepfl/dotty/pull/19459) +- Don't leave underspecified SAM types in the code [#19461](https://github.com/lampepfl/dotty/pull/19461) +- Also compute base classes of wildcardTypes [#19465](https://github.com/lampepfl/dotty/pull/19465) +- Fix(#15784): ident rule for pat match was too strict [#19501](https://github.com/lampepfl/dotty/pull/19501) +- Heal occurrences of => T between ElimByName and Erasure [#19558](https://github.com/lampepfl/dotty/pull/19558) +- Fix(#i18645): overload ext method body in braces didn't compile [#19651](https://github.com/lampepfl/dotty/pull/19651) +- Fix #19202: Passing NotNullInfos to a mutable field of a Completer [#19463](https://github.com/lampepfl/dotty/pull/19463) +- Fix Java record problems (#19578) and (#19386) [#19583](https://github.com/lampepfl/dotty/pull/19583) +- Improve when deprecation warnings are emitted [#19621](https://github.com/lampepfl/dotty/pull/19621) +- Space: Replace showType & make Space Showable 
[#19370](https://github.com/lampepfl/dotty/pull/19370) + + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.0..3.4.1-RC1` these are: + +``` + 53 Martin Odersky + 53 Nicolas Stucki + 20 Dale Wijnand + 11 Szymon Rodziewicz + 11 i10416 + 7 noti0na1 + 6 Yilin Wei + 4 Hamza REMMAL + 4 Jędrzej Rochala + 3 Eugene Flesselle + 3 Paweł Marks + 3 Seth Tisue + 2 Florian3k + 2 Hamza Remmal + 2 Jan Chyb + 2 Katarzyna Marek + 2 Sébastien Doeraene + 2 Tomasz Godzik + 2 dependabot[bot] + 1 Bersier + 1 Fabián Heredia Montiel + 1 Jakub Ciesluk + 1 Jakub Cieśluk + 1 Kacper Korban + 1 Kenji Yoshida + 1 Mehdi Alaoui + 1 Nikita Gazarov + 1 Oron Port + 1 Pascal Weisenburger + 1 Philippus Baalman + 1 Quentin Bernet + 1 Som Snytt + 1 Wojciech Mazur + 1 Yichen Xu + 1 aherlihy + 1 rochala + +``` From 59085f1b903b1810971bab14d400210f2ed3e086 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 14 Feb 2024 17:56:02 +0100 Subject: [PATCH 183/371] Release 3.4.1 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 720aa329bbdd..caede50d0c6c 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,7 +83,7 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.3.1" + val referenceVersion = "3.4.0" val baseVersion = "3.4.1-RC1" @@ -101,7 +101,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.4.0-RC3" + val previousDottyVersion = "3.4.0" /** Version against which we check binary compatibility. */ val ltsDottyVersion = "3.3.0" From 97138fbb6eff1fa2206f48de8f20ec3a42805779 Mon Sep 17 00:00:00 2001 From: Aleksander Boruch-Gruszecki Date: Wed, 14 Feb 2024 19:59:51 +0100 Subject: [PATCH 184/371] Fix the capture checking documentation The syntax on the official Scala page was outdated. Also, new docs nudge people to use the latest version of Scala, which allow trying capture checking out on a stable compiler version. --- docs/_docs/reference/experimental/cc.md | 178 +++++++++++------------- 1 file changed, 85 insertions(+), 93 deletions(-) diff --git a/docs/_docs/reference/experimental/cc.md b/docs/_docs/reference/experimental/cc.md index 2a7236453eab..5bdf91f628ec 100644 --- a/docs/_docs/reference/experimental/cc.md +++ b/docs/_docs/reference/experimental/cc.md @@ -8,7 +8,8 @@ Capture checking is a research project that modifies the Scala type system to tr ```scala import language.experimental.captureChecking ``` -At present, capture checking is still highly experimental and unstable. +At present, capture checking is still highly experimental and unstable, and it evolves quickly. +Before trying it out, make sure you have the latest version of Scala. To get an idea what capture checking can do, let's start with a small example: ```scala @@ -34,17 +35,17 @@ results in an uncaught `IOException`. Capture checking gives us the mechanism to prevent such errors _statically_. To prevent unsafe usages of `usingLogFile`, we can declare it like this: ```scala -def usingLogFile[T](op: ({*} FileOutputStream) => T): T = +def usingLogFile[T](op: FileOutputStream^ => T): T = // same body as before ``` The only thing that's changed is that the `FileOutputStream` parameter of `op` is now -tagged with `{*}`. We'll see that this turns the parameter into a _capability_ whose lifetime is tracked. 
+followed by `^`. We'll see that this turns the parameter into a _capability_ whose lifetime is tracked. If we now try to define the problematic value `later`, we get a static error: ``` | val later = usingLogFile { f => () => f.write(0) } | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - |The expression's type {*} () -> Unit is not allowed to capture the root capability `*`. + |The expression's type () => Unit is not allowed to capture the root capability `cap`. |This usually means that a capability persists longer than its allowed lifetime. ``` In this case, it was easy to see that the `logFile` capability escapes in the closure passed to `usingLogFile`. But capture checking also works for more complex cases. @@ -84,26 +85,27 @@ The capture checker extension introduces a new kind of types and it enforces som ## Capabilities and Capturing Types Capture checking is done in terms of _capturing types_ of the form -`{c₁, ..., cᵢ} T`. Here `T` is a type, and `{c₁, ..., cᵢ}` is a _capture set_ consisting of references to capabilities `c₁, ..., cᵢ`. +`T^{c₁, ..., cᵢ}`. Here `T` is a type, and `{c₁, ..., cᵢ}` is a _capture set_ consisting of references to capabilities `c₁, ..., cᵢ`. A _capability_ is syntactically a method- or class-parameter, a local variable, or the `this` of an enclosing class. The type of a capability must be a capturing type with a non-empty capture set. We also say that variables that are capabilities are _tracked_. In a sense, every -capability gets its authority from some other, more sweeping capability which it captures. The most sweeping capability, from which ultimately all others are derived is written `*`. We call it the _universal capability_. +capability gets its authority from some other, more sweeping capability which it captures. The most sweeping capability, from which ultimately all others are derived is written `cap`. We call it the _universal capability_. +If `T` is a type, then `T^` is a shorthand for `T^{cap}`, meaning `T` can capture arbitrary capabilities. Here is an example: ```scala class FileSystem -class Logger(fs: {*} FileSystem): +class Logger(fs: FileSystem^): def log(s: String): Unit = ... // Write to a log file, using `fs` -def test(fs: {*} FileSystem) = - val l: {fs} Logger = Logger(fs) +def test(fs: FileSystem^) = + val l: Logger^{fs} = Logger(fs) l.log("hello world!") - val xs: {l} LazyList[Int] = + val xs: LazyList[Int]^{l} = LazyList.from(1) .map { i => l.log(s"computing elem # $i") @@ -113,12 +115,12 @@ def test(fs: {*} FileSystem) = ``` Here, the `test` method takes a `FileSystem` as a parameter. `fs` is a capability since its type has a non-empty capture set. The capability is passed to the `Logger` constructor and retained as a field in class `Logger`. Hence, the local variable `l` has type -`{fs} Logger`: it is a `Logger` which retains the `fs` capability. +`Logger^{fs}`: it is a `Logger` which retains the `fs` capability. The second variable defined in `test` is `xs`, a lazy list that is obtained from `LazyList.from(1)` by logging and mapping consecutive numbers. Since the list is lazy, it needs to retain the reference to the logger `l` for its computations. Hence, the -type of the list is `{l} LazyList[Int]`. On the other hand, since `xs` only logs but does +type of the list is `LazyList[Int]^{l}`. On the other hand, since `xs` only logs but does not do other file operations, it retains the `fs` capability only indirectly. That's why `fs` does not show up in the capture set of `xs`. 
@@ -129,24 +131,14 @@ any capturing type that adds a capture set to `T`. The usual function type `A => B` now stands for a function that can capture arbitrary capabilities. We call such functions _impure_. By contrast, the new single arrow function type `A -> B` stands for a function that cannot capture any capabilities, or otherwise said, is _pure_. -One can add a capture set in front of an otherwise pure function. -For instance, `{c, d} A -> B` would be a function that can capture capabilities `c` and `d`, but no others. +One can add a capture set after the arrow of an otherwise pure function. +For instance, `A ->{c, d} B` would be a function that can capture capabilities `c` and `d`, but no others. +This type is a shorthand for `(A -> B)^{c, d}`, i.e. the function type `A -> B` with possible captures `{c, d}`. -The impure function type `A => B` is treated as an alias for `{*} A -> B`. That is, impure functions are functions that can capture anything. +The impure function type `A => B` is treated as an alias for `A ->{cap} B`. That is, impure functions are functions that can capture anything. -Function types and captures both associate to the right, so -```scala -{c} A -> {d} B -> C -``` -is the same as -```scala -{c} (A -> {d} (B -> C)) -``` -Contrast with -```scala -({c} A) -> ({d} B) -> C -``` -which is a curried pure function over argument types that can capture `c` and `d`, respectively. +A capture annotation `^` binds more strongly than a function arrow. So +`A -> B^{c}` is read as `A` -> (B^{c})`. Analogous conventions apply to context function types. `A ?=> B` is an impure context function, with `A ?-> B` as its pure complement. @@ -173,13 +165,10 @@ def f(x: -> Int): Int the actual argument to `f` could not refer to any capabilities, so the call above would be rejected. One can also allow specific capabilities like this: ```scala -def f(x: {c}-> Int): Int +def f(x: ->{c} Int): Int ``` Here, the actual argument to `f` is allowed to use the `c` capability but no others. -**Note:** It is strongly recommended to write the capability set and the arrow `->` without intervening spaces, -as otherwise the notation would look confusingly like a function type. - ## Subtyping and Subcapturing Capturing influences subtyping. As usual we write `T₁ <: T₂` to express that the type @@ -201,35 +190,35 @@ A subcapturing relation `C₁ <: C₂` holds if `C₂` _accounts for_ every elem **Example 1.** Given ```scala -fs: {*} FileSystem -ct: {*} CanThrow[Exception] -l : {fs} Logger +fs: FileSystem^ +ct: CanThrow[Exception]^ +l : Logger^{fs} ``` we have ``` -{l} <: {fs} <: {*} -{fs} <: {fs, ct} <: {*} -{ct} <: {fs, ct} <: {*} +{l} <: {fs} <: {cap} +{fs} <: {fs, ct} <: {cap} +{ct} <: {fs, ct} <: {cap} ``` -The set consisting of the root capability `{*}` covers every other capture set. This is -a consequence of the fact that, ultimately, every capability is created from `*`. +The set consisting of the root capability `{cap}` covers every other capture set. This is +a consequence of the fact that, ultimately, every capability is created from `cap`. -**Example 2.** Consider again the FileSystem/Logger example from before. `LazyList[Int]` is a proper subtype of `{l} LazyList[Int]`. So if the `test` method in that example +**Example 2.** Consider again the FileSystem/Logger example from before. `LazyList[Int]` is a proper subtype of `LazyList[Int]^{l}`. So if the `test` method in that example was declared with a result type `LazyList[Int]`, we'd get a type error. 
Here is the error message: ``` -11 |def test(using fs: {*} FileSystem): LazyList[Int] = { - | ^ - | Found: {fs} LazyList[Int] - | Required: LazyList[Int] +11 |def test(using fs: FileSystem^): LazyList[Int] = { + | ^ + | Found: LazyList[Int]^{fs} + | Required: LazyList[Int] ``` -Why does it say `{fs} LazyList[Int]` and not `{l} LazyList[Int]`, which is, after all, the type of the returned value `xs`? The reason is that `l` is a local variable in the body of `test`, so it cannot be referred to in a type outside that body. What happens instead is that the type is _widened_ to the smallest supertype that does not mention `l`. Since `l` has capture set `fs`, we have that `{fs}` covers `{l}`, and `{fs}` is acceptable in a result type of `test`, so `{fs}` is the result of that widening. +Why does it say `LazyList[Int]^{fs}` and not `LazyList[Int]^{l}`, which is, after all, the type of the returned value `xs`? The reason is that `l` is a local variable in the body of `test`, so it cannot be referred to in a type outside that body. What happens instead is that the type is _widened_ to the smallest supertype that does not mention `l`. Since `l` has capture set `fs`, we have that `{fs}` covers `{l}`, and `{fs}` is acceptable in a result type of `test`, so `{fs}` is the result of that widening. This widening is called _avoidance_; it is not specific to capture checking but applies to all variable references in Scala types. ## Capability Classes Classes like `CanThrow` or `FileSystem` have the property that their values are always intended to be capabilities. We can make this intention explicit and save boilerplate by declaring these classes with a `@capability` annotation. -The capture set of a capability class type is always `{*}`. This means we could equivalently express the `FileSystem` and `Logger` classes as follows: +The capture set of a capability class type is always `{cap}`. This means we could equivalently express the `FileSystem` and `Logger` classes as follows: ```scala import annotation.capability @@ -239,14 +228,14 @@ class Logger(using FileSystem): def log(s: String): Unit = ??? def test(using fs: FileSystem) = - val l: {fs} Logger = Logger() + val l: Logger^{fs} = Logger() ... ``` -In this version, `FileSystem` is a capability class, which means that the `{*}` capture set is implied on the parameters of `Logger` and `test`. Writing the capture set explicitly produces a warning: +In this version, `FileSystem` is a capability class, which means that the `{cap}` capture set is implied on the parameters of `Logger` and `test`. Writing the capture set explicitly produces a warning: ```scala -class Logger(using {*} FileSystem): +class Logger(using FileSystem^{cap}): ^^^^^^^^^^^^^^ - redundant capture: FileSystem already accounts for * + redundant capture: FileSystem already accounts for cap ``` Another, unrelated change in the version of the last example here is that the `FileSystem` capability is now passed as an implicit parameter. It is quite natural to model capabilities with implicit parameters since it greatly reduces the wiring overhead once multiple capabilities are in play. @@ -254,11 +243,11 @@ Another, unrelated change in the version of the last example here is that the `F If a closure refers to capabilities in its body, it captures these capabilities in its type. 
For instance, consider: ```scala -def test(fs: FileSystem): {fs} String -> Unit = +def test(fs: FileSystem): String ->{fs} Unit = (x: String) => Logger(fs).log(x) ``` Here, the body of `test` is a lambda that refers to the capability `fs`, which means that `fs` is retained in the lambda. -Consequently, the type of the lambda is `{fs} String -> Unit`. +Consequently, the type of the lambda is `String ->{fs} Unit`. **Note:** Function values are always written with `=>` (or `?=>` for context functions). There is no syntactic distinction for pure _vs_ impure function values. The distinction is only made in their types. @@ -271,7 +260,7 @@ def test(fs: FileSystem) = def g() = (x: String) => Logger(fs).log(x) f ``` -the result of `test` has type `{fs} String -> Unit` even though function `f` itself does not refer to `fs`. +the result of `test` has type `String ->{fs} Unit` even though function `f` itself does not refer to `fs`. ## Capture Checking of Classes @@ -280,11 +269,11 @@ The principles for capture checking closures also apply to classes. For instance class Logger(using fs: FileSystem): def log(s: String): Unit = ... summon[FileSystem] ... -def test(xfs: FileSystem): {xfs} Logger = +def test(xfs: FileSystem): Logger^{xfs} = Logger(xfs) ``` Here, class `Logger` retains the capability `fs` as a (private) field. Hence, the result -of `test` is of type `{xfs} Logger` +of `test` is of type `Logger^{xfs}` Sometimes, a tracked capability is meant to be used only in the constructor of a class, but is not intended to be retained as a field. This fact can be communicated to the capture @@ -317,7 +306,7 @@ the capture set of that call is `{a, b, c}`. The capture set of the type of `this` of a class is inferred by the capture checker, unless the type is explicitly declared with a self type annotation like this one: ```scala class C: - self: {a, b} D => ... + self: D^{a, b} => ... ``` The inference observes the following constraints: @@ -351,13 +340,13 @@ class Pair[+A, +B](x: A, y: B): ``` What happens if `Pair` is instantiated like this (assuming `ct` and `fs` are two capabilities in scope)? ```scala -def x: {ct} Int -> String -def y: {fs} Logger +def x: Int ->{ct} String +def y: Logger^{fs} def p = Pair(x, y) ``` The last line will be typed as follows: ```scala -def p: Pair[{ct} Int -> String, {fs} Logger] = Pair(x, y) +def p: Pair[Int ->{ct} String, Logger^{fs}] = Pair(x, y) ``` This might seem surprising. The `Pair(x, y)` value does capture capabilities `ct` and `fs`. Why don't they show up in its type at the outside? @@ -365,53 +354,53 @@ The answer is capture tunnelling. Once a type variable is instantiated to a capt capture is not propagated beyond this point. On the other hand, if the type variable is instantiated again on access, the capture information "pops out" again. For instance, even though `p` is technically pure because its capture set is empty, writing `p.fst` would record a reference to the captured capability `ct`. So if this access was put in a closure, the capability would again form part of the outer capture set. E.g. ```scala -() => p.fst : {ct} () -> {ct} Int -> String +() => p.fst : () -> Int ->{ct} String ``` In other words, references to capabilities "tunnel through" in generic instantiations from creation to access; they do not affect the capture set of the enclosing generic data constructor applications. This principle plays an important part in making capture checking concise and practical. 
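
To make the tunnelling behaviour a little more tangible, here is a rough sketch that reuses the `Pair` class and the `x`, `y`, `ct` and `fs` definitions above; the name `reveal` is only for illustration, and the types in the comments restate what the preceding discussion predicts rather than verbatim compiler output:

```scala
def p = Pair(x, y)       // p: Pair[Int ->{ct} String, Logger^{fs}] -- p itself counts as pure
def reveal = () => p.fst // reveal: () -> Int ->{ct} String -- the ct capture "pops out" on access
```
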
## Escape Checking -The universal capability `*` should be conceptually available only as a parameter to the main program. Indeed, if it was available everywhere, capability checking would be undermined since one could mint new capabilities -at will. In line with this reasoning, some capture sets are restricted so that +Some capture sets are restricted so that they are not allowed to contain the universal capability. Specifically, if a capturing type is an instance of a type variable, that capturing type -is not allowed to carry the universal capability `{*}`. There's a connection to tunnelling here. +is not allowed to carry the universal capability `cap`. There's a connection to tunnelling here. The capture set of a type has to be present in the environment when a type is instantiated from -a type variable. But `*` is not itself available as a global entity in the environment. Hence, +a type variable. But `cap` is not itself available as a global entity in the environment. Hence, an error should result. We can now reconstruct how this principle produced the error in the introductory example, where `usingLogFile` was declared like this: ```scala -def usingLogFile[T](op: ({*} FileOutputStream) => T): T = ... +def usingLogFile[T](op: FileOutputStream^ => T): T = ... ``` The error message was: ``` | val later = usingLogFile { f => () => f.write(0) } | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - |The expression's type {*} () -> Unit is not allowed to capture the root capability `*`. + |The expression's type () => Unit is not allowed to capture the root capability `cap`. |This usually means that a capability persists longer than its allowed lifetime. ``` This error message was produced by the following logic: - - The `f` parameter has type `{*} FileOutputStream`, which makes it a capability. - - Therefore, the type of the expression `() => f.write(0)` is `{f} () -> Unit`. + - The `f` parameter has type `FileOutputStream^`, which makes it a capability. + - Therefore, the type of the expression `() => f.write(0)` is `() ->{f} Unit`. - This makes the type of the whole closure passed to `usingLogFile` the dependent function type - `(f: {*} FileOutputStream) -> {f} () -> Unit`. - - The expected type of the closure is a simple, parametric, impure function type `({*} FileOutputStream) => T`, + `(f: FileOutputStream^) -> () ->{f} Unit`. + - The expected type of the closure is a simple, parametric, impure function type `FileOutputStream^ => T`, for some instantiation of the type variable `T`. - The smallest supertype of the closure's dependent function type that is a parametric function type is - `({*} FileOutputStream) => {*} () -> Unit` - - Hence, the type variable `T` is instantiated to `* () -> Unit`, which causes the error. + `FileOutputStream^ => () ->{cap} Unit` + - Hence, the type variable `T` is instantiated to `() ->{cap} Unit`, or abbreviated `() => Unit`, + which causes the error. An analogous restriction applies to the type of a mutable variable. Another way one could try to undermine capture checking would be to assign a closure with a local capability to a global variable. Maybe like this: ```scala -var loophole: {*} () -> Unit = () => () +var loophole: () => Unit = () => () usingLogFile { f => loophole = () => f.write(0) } @@ -427,11 +416,11 @@ val sneaky = usingLogFile { f => Cell(() => f.write(0)) } sneaky.x() ``` At the point where the `Cell` is created, the capture set of the argument is `f`, which -is OK. 
But at the point of use, it is `*` (because `f` is no longer in scope), which causes again an error: +is OK. But at the point of use, it is `cap` (because `f` is no longer in scope), which causes again an error: ``` | sneaky.x() | ^^^^^^^^ - |The expression's type {*} () -> Unit is not allowed to capture the root capability `*`. + |The expression's type () => Unit is not allowed to capture the root capability `cap`. |This usually means that a capability persists longer than its allowed lifetime. ``` @@ -507,7 +496,7 @@ Under the language import `language.experimental.captureChecking`, the code is i ``` 14 | try () => xs.map(f).sum | ^ - |The expression's type {*} () -> Double is not allowed to capture the root capability `*`. + |The expression's type () => Double is not allowed to capture the root capability `cap`. |This usually means that a capability persists longer than its allowed lifetime. 15 | catch case ex: LimitExceeded => () => -1 ``` @@ -527,7 +516,7 @@ Here is the base trait `LzyList` for our version of lazy lists: trait LzyList[+A]: def isEmpty: Boolean def head: A - def tail: {this} LzyList[A] + def tail: LzyList[A]^{this} ``` Note that `tail` carries a capture annotation. It says that the tail of a lazy list can potentially capture the same references as the lazy list as a whole. @@ -543,16 +532,16 @@ Here is a formulation of the class for lazy cons nodes: ```scala import scala.compiletime.uninitialized -final class LzyCons[+A](hd: A, tl: () => {*} LzyList[A]) extends LzyList[A]: +final class LzyCons[+A](hd: A, tl: () => LzyList[A]^) extends LzyList[A]: private var forced = false - private var cache: {this} LzyList[A] = uninitialized + private var cache: LzyList[A]^{this} = uninitialized private def force = if !forced then { cache = tl(); forced = true } cache def isEmpty = false def head = hd - def tail: {this} LzyList[A] = force + def tail: LzyList[A]^{this} = force end LzyCons ``` The `LzyCons` class takes two parameters: A head `hd` and a tail `tl`, which is a function @@ -564,7 +553,7 @@ Here is an extension method to define an infix cons operator `#:` for lazy lists to `::` but instead of a strict list it produces a lazy list without evaluating its right operand. ```scala extension [A](x: A) - def #:(xs1: => {*} LzyList[A]): {xs1} LzyList[A] = + def #:(xs1: => LzyList[A]^): LzyList[A]^{xs1} = LzyCons(x, () => xs1) ``` Note that `#:` takes an impure call-by-name parameter `xs1` as its right argument. The result @@ -575,7 +564,7 @@ of given length with a generator function `gen`. The generator function is allow to have side effects. ```scala def tabulate[A](n: Int)(gen: Int => A) = - def recur(i: Int): {gen} LzyList[A] = + def recur(i: Int): LzyList[A]^{gen} = if i == n then LzyNil else gen(i) #: recur(i + 1) recur(0) @@ -584,32 +573,31 @@ Here is a use of `tabulate`: ```scala class LimitExceeded extends Exception def squares(n: Int)(using ct: CanThrow[LimitExceeded]) = - tabulate(10) { i => + tabulate(10): i => if i > 9 then throw LimitExceeded() i * i - } ``` -The inferred result type of `squares` is `{ct} LzyList[Int]`, i.e it is a lazy list of +The inferred result type of `squares` is `LzyList[Int]^{ct}`, i.e it is a lazy list of `Int`s that can throw the `LimitExceeded` exception when it is elaborated by calling `tail` one or more times. 
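
As a rough usage sketch, assuming `import language.experimental.saferExceptions` is enabled so that the `try` below supplies the required `CanThrow[LimitExceeded]` capability (the name `firstTwoSquares` is only for illustration), such a list has to be consumed while that capability is still available; unlike the erroneous example shown earlier, nothing that captures the capability escapes the `try` here:

```scala
import language.experimental.saferExceptions

def firstTwoSquares: (Int, Int) =
  try
    val xs = squares(10)    // xs retains the CanThrow capability in its capture set
    (xs.head, xs.tail.head) // forcing elements may throw LimitExceeded
  catch
    case _: LimitExceeded => (-1, -1)
```
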
Here are some further extension methods for mapping, filtering, and concatenating lazy lists: ```scala -extension [A](xs: {*} LzyList[A]) - def map[B](f: A => B): {xs, f} LzyList[B] = +extension [A](xs: LzyList[A]^) + def map[B](f: A => B): LzyList[B]^{xs, f} = if xs.isEmpty then LzyNil else f(xs.head) #: xs.tail.map(f) - def filter(p: A => Boolean): {xs, p} LzyList[A] = + def filter(p: A => Boolean): LzyList[A]^{xs, p} = if xs.isEmpty then LzyNil else if p(xs.head) then xs.head #: xs.tail.filter(p) else xs.tail.filter(p) - def concat(ys: {*} LzyList[A]): {xs, ys} LzyList[A] = + def concat(ys: LzyList[A]^): LzyList[A]^{xs, ys} = if xs.isEmpty then ys else xs.head #: xs.tail.concat(ys) - def drop(n: Int): {xs} LzyList[A] = + def drop(n: Int): LzyList[A]^{xs} = if n == 0 then xs else xs.tail.drop(n - 1) ``` Their capture annotations are all as one would expect: @@ -621,11 +609,11 @@ Their capture annotations are all as one would expect: - Concatenating two lazy lists produces a lazy list that captures both arguments. - Dropping elements from a lazy list gives a safe approximation where the original list is captured in the result. In fact, it's only some suffix of the list that is retained at run time, but our modelling identifies lazy lists and their suffixes, so this additional knowledge would not be useful. -Of course the function passed to `map` or `filter` could also be pure. After all, `A -> B` is a subtype of `{*} A -> B` which is the same as `A => B`. In that case, the pure function +Of course the function passed to `map` or `filter` could also be pure. After all, `A -> B` is a subtype of `(A -> B)^{cap}` which is the same as `A => B`. In that case, the pure function argument will _not_ show up in the result type of `map` or `filter`. For instance: ```scala val xs = squares(10) -val ys: {xs} LzyList[Int] = xs.map(_ + 1) +val ys: LzyList[Int]^{xs} = xs.map(_ + 1) ``` The type of the mapped list `ys` has only `xs` in its capture set. The actual function argument does not show up since it is pure. Likewise, if the lazy list @@ -718,6 +706,8 @@ Generally, the string following the capture set consists of alternating numbers - `F` : a variable resulting from _filtering_ the elements of the variable indicated by the string to the right, - `I` : a variable resulting from an _intersection_ of two capture sets, - `D` : a variable resulting from the set _difference_ of two capture sets. + - `R` : a regular variable that _refines_ a class parameter, so that the capture + set of a constructor argument is known in the class instance type. At the end of a compilation run, `-Ycc-debug` will print all variable dependencies of variables referred to in previous output. Here is an example: ``` @@ -736,3 +726,5 @@ This section lists all variables that appeared in previous diagnostics and their - variable `31` has a constant fixed superset `{xs, f}` - variable `32` has no dependencies. 
+ + From 017c38dfd8e61581fa854ad8f62d6e178425cb97 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Wed, 7 Feb 2024 22:24:17 +0000 Subject: [PATCH 185/371] Add GADT symbols when typing typing-ahead lambda bodies (cherry picked from commit cc1e37ea1b0060d26b1360e1917e9576bd44c8d3) --- .../src/dotty/tools/dotc/typer/Namer.scala | 8 ++++++- tests/pos/i19570.min1.scala | 23 ++++++++++++++++++ tests/pos/i19570.min2.scala | 24 +++++++++++++++++++ tests/pos/i19570.orig.scala | 14 +++++++++++ 4 files changed, 68 insertions(+), 1 deletion(-) create mode 100644 tests/pos/i19570.min1.scala create mode 100644 tests/pos/i19570.min2.scala create mode 100644 tests/pos/i19570.orig.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 03ff6e168666..715c1e1be9c4 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -1734,8 +1734,14 @@ class Namer { typer: Typer => val tpe = (paramss: @unchecked) match case TypeSymbols(tparams) :: TermSymbols(vparams) :: Nil => tpFun(tparams, vparams) case TermSymbols(vparams) :: Nil => tpFun(Nil, vparams) + val rhsCtx = (paramss: @unchecked) match + case TypeSymbols(tparams) :: TermSymbols(_) :: Nil => + val rhsCtx = ctx.fresh.setFreshGADTBounds + rhsCtx.gadtState.addToConstraint(tparams) + rhsCtx + case TermSymbols(_) :: Nil => ctx if (isFullyDefined(tpe, ForceDegree.none)) tpe - else typedAheadExpr(mdef.rhs, tpe).tpe + else typedAheadExpr(mdef.rhs, tpe)(using rhsCtx).tpe case TypedSplice(tpt: TypeTree) if !isFullyDefined(tpt.tpe, ForceDegree.none) => mdef match { diff --git a/tests/pos/i19570.min1.scala b/tests/pos/i19570.min1.scala new file mode 100644 index 000000000000..2cbc852641d3 --- /dev/null +++ b/tests/pos/i19570.min1.scala @@ -0,0 +1,23 @@ +enum Op[A]: + case Dup[T]() extends Op[(T, T)] + +def foo[R](f: [A] => Op[A] => R): R = ??? + +def test = + foo([A] => (o: Op[A]) => o match + case o: Op.Dup[u] => + summon[A =:= (u, u)] // Error: Cannot prove that A =:= (u, u) + () + ) + foo[Unit]([A] => (o: Op[A]) => o match + case o: Op.Dup[u] => + summon[A =:= (u, u)] // Ok + () + ) + foo({ + val f1 = [B] => (o: Op[B]) => o match + case o: Op.Dup[u] => + summon[B =:= (u, u)] // Also ok + () + f1 + }) diff --git a/tests/pos/i19570.min2.scala b/tests/pos/i19570.min2.scala new file mode 100644 index 000000000000..b1450d7e2d1a --- /dev/null +++ b/tests/pos/i19570.min2.scala @@ -0,0 +1,24 @@ +sealed trait Op[A, B] { def giveA: A; def giveB: B } +final case class Dup[T](x: T) extends Op[T, (T, T)] { def giveA: T = x; def giveB: (T, T) = (x, x) } + +class Test: + def foo[R](f: [A, B] => (o: Op[A, B]) => R): R = ??? + + def m1: Unit = + foo([A, B] => (o: Op[A, B]) => o match + case o: Dup[t] => + var a1: t = o.giveA + var a2: A = o.giveA + a1 = a2 + a2 = a1 + + var b1: (t, t) = o.giveB + var b2: B = o.giveB + b1 = b2 + b2 = b1 + + summon[A =:= t] // ERROR: Cannot prove that A =:= t. + summon[B =:= (t, t)] // ERROR: Cannot prove that B =:= (t, t). + + () + ) diff --git a/tests/pos/i19570.orig.scala b/tests/pos/i19570.orig.scala new file mode 100644 index 000000000000..6e574f52be91 --- /dev/null +++ b/tests/pos/i19570.orig.scala @@ -0,0 +1,14 @@ +enum Op[A, B]: + case Dup[T]() extends Op[T, (T, T)] + +def foo[R](f: [A, B] => (o: Op[A, B]) => R): R = + f(Op.Dup()) + +def test = + foo([A, B] => (o: Op[A, B]) => { + o match + case o: Op.Dup[t] => + summon[A =:= t] // ERROR: Cannot prove that A =:= t. 
+ summon[B =:= (t, t)] // ERROR: Cannot prove that B =:= (t, t). + 42 + }) From e6359f570d7dfd2faf1ad5e4ac2f8fb399107438 Mon Sep 17 00:00:00 2001 From: Dale Wijnand Date: Thu, 22 Feb 2024 14:39:47 +0000 Subject: [PATCH 186/371] Extract shared prepareRhsCtx (cherry picked from commit 2c81588e620e0ae62aa6641db4aebf9683bd97d3) --- .../src/dotty/tools/dotc/typer/Namer.scala | 27 +++++++++---------- 1 file changed, 13 insertions(+), 14 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 715c1e1be9c4..3af87d311d9d 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -1734,12 +1734,7 @@ class Namer { typer: Typer => val tpe = (paramss: @unchecked) match case TypeSymbols(tparams) :: TermSymbols(vparams) :: Nil => tpFun(tparams, vparams) case TermSymbols(vparams) :: Nil => tpFun(Nil, vparams) - val rhsCtx = (paramss: @unchecked) match - case TypeSymbols(tparams) :: TermSymbols(_) :: Nil => - val rhsCtx = ctx.fresh.setFreshGADTBounds - rhsCtx.gadtState.addToConstraint(tparams) - rhsCtx - case TermSymbols(_) :: Nil => ctx + val rhsCtx = prepareRhsCtx(ctx.fresh, paramss) if (isFullyDefined(tpe, ForceDegree.none)) tpe else typedAheadExpr(mdef.rhs, tpe)(using rhsCtx).tpe @@ -1939,14 +1934,7 @@ class Namer { typer: Typer => var rhsCtx = ctx.fresh.addMode(Mode.InferringReturnType) if sym.isInlineMethod then rhsCtx = rhsCtx.addMode(Mode.InlineableBody) if sym.is(ExtensionMethod) then rhsCtx = rhsCtx.addMode(Mode.InExtensionMethod) - val typeParams = paramss.collect { case TypeSymbols(tparams) => tparams }.flatten - if (typeParams.nonEmpty) { - // we'll be typing an expression from a polymorphic definition's body, - // so we must allow constraining its type parameters - // compare with typedDefDef, see tests/pos/gadt-inference.scala - rhsCtx.setFreshGADTBounds - rhsCtx.gadtState.addToConstraint(typeParams) - } + rhsCtx = prepareRhsCtx(rhsCtx, paramss) def typedAheadRhs(pt: Type) = PrepareInlineable.dropInlineIfError(sym, @@ -1991,4 +1979,15 @@ class Namer { typer: Typer => lhsType orElse WildcardType } end inferredResultType + + /** Prepare a GADT-aware context used to type the RHS of a ValOrDefDef. 
*/ + def prepareRhsCtx(rhsCtx: FreshContext, paramss: List[List[Symbol]])(using Context): FreshContext = + val typeParams = paramss.collect { case TypeSymbols(tparams) => tparams }.flatten + if typeParams.nonEmpty then + // we'll be typing an expression from a polymorphic definition's body, + // so we must allow constraining its type parameters + // compare with typedDefDef, see tests/pos/gadt-inference.scala + rhsCtx.setFreshGADTBounds + rhsCtx.gadtState.addToConstraint(typeParams) + rhsCtx } From 30f86c5f4e250f33d9c0875e205a9f8f0e734dcd Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Sat, 17 Feb 2024 20:13:25 +0100 Subject: [PATCH 187/371] Add dotty to the safe directories in nightly_documentation --- .github/workflows/ci.yaml | 1 + 1 file changed, 1 insertion(+) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index eb29d98632f6..4bcc1a37640c 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -645,6 +645,7 @@ jobs: - name: Generate Website run: | + git config --global --add safe.directory /__w/dotty/dotty ./project/scripts/genDocs -doc-snapshot - name: Deploy Website to dotty-website From 9f8f1507bbb3c8979c54dca9d47a5a9036d2c368 Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Sat, 17 Feb 2024 21:37:08 +0100 Subject: [PATCH 188/371] Address actions/runner#2033 --- .github/workflows/ci.yaml | 57 +++++++++++++++++++++++++--------- .github/workflows/releases.yml | 4 ++- 2 files changed, 46 insertions(+), 15 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index 4bcc1a37640c..d2d54ffcc9ff 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -68,8 +68,11 @@ jobs: - name: Set JDK 16 as default run: echo "/usr/lib/jvm/java-16-openjdk-amd64/bin" >> $GITHUB_PATH + ## Workaround for https://github.com/actions/runner/issues/2033 (See https://github.com/lampepfl/dotty/pull/19720) - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -120,7 +123,9 @@ jobs: run: echo "/usr/lib/jvm/java-16-openjdk-amd64/bin" >> $GITHUB_PATH - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -174,7 +179,9 @@ jobs: run: echo "/usr/lib/jvm/java-16-openjdk-amd64/bin" >> $GITHUB_PATH - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: 
actions/checkout@v4 @@ -211,8 +218,10 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true shell: cmd + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Git Checkout uses: actions/checkout@v4 @@ -253,8 +262,10 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true shell: cmd + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Git Checkout uses: actions/checkout@v4 @@ -291,7 +302,9 @@ jobs: )" steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -339,7 +352,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -388,7 +403,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -437,7 +454,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -484,7 +503,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no 
"https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -531,7 +552,9 @@ jobs: run: echo "/usr/lib/jvm/java-8-openjdk-amd64/bin" >> $GITHUB_PATH - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -575,7 +598,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -629,7 +654,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -680,7 +707,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 diff --git a/.github/workflows/releases.yml b/.github/workflows/releases.yml index f2cd0706cfe7..dde8b0372d52 100644 --- a/.github/workflows/releases.yml +++ b/.github/workflows/releases.yml @@ -18,7 +18,9 @@ jobs: steps: - name: Reset existing repo - run: git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + run: | + git config --global --add safe.directory /__w/dotty/dotty + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - name: Cleanup run: .github/workflows/cleanup.sh From a086db7f9c84b9cdb362e5a3f5e7ebc112d1774a Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Fri, 1 Mar 2024 13:38:27 +0100 Subject: [PATCH 189/371] Adapt for scala/scala3 --- .github/workflows/ci.yaml | 105 ++++++++++++++++--------------- project/Build.scala | 16 ++--- project/scripts/cmdScaladocTests | 6 +- 3 files changed, 65 insertions(+), 62 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index d2d54ffcc9ff..cf375b147793 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -53,7 +53,7 @@ jobs: - ${{ 
github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || ( github.event_name == 'pull_request' @@ -62,17 +62,17 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: - name: Set JDK 16 as default run: echo "/usr/lib/jvm/java-16-openjdk-amd64/bin" >> $GITHUB_PATH - ## Workaround for https://github.com/actions/runner/issues/2033 (See https://github.com/lampepfl/dotty/pull/19720) + ## Workaround for https://github.com/actions/runner/issues/2033 (See https://github.com/scala/scala3/pull/19720) - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -105,7 +105,7 @@ jobs: - ${{ github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || github.event_name == 'merge_group' || ( @@ -115,7 +115,7 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: @@ -124,8 +124,8 @@ jobs: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -164,14 +164,14 @@ jobs: - ${{ github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || ( github.event_name == 'pull_request' && contains(github.event.pull_request.body, '[test_scala2_library_tasty]') ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: @@ -180,8 +180,8 @@ jobs: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add 
safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -220,8 +220,8 @@ jobs: - name: Reset existing repo shell: cmd run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Git Checkout uses: actions/checkout@v4 @@ -252,7 +252,7 @@ jobs: test_windows_full: runs-on: [self-hosted, Windows] - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || ( github.event_name == 'pull_request' @@ -264,8 +264,8 @@ jobs: - name: Reset existing repo shell: cmd run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Git Checkout uses: actions/checkout@v4 @@ -288,7 +288,7 @@ jobs: - ${{ github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || github.event_name == 'merge_group' || ( @@ -298,13 +298,13 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -336,7 +336,7 @@ jobs: - ${{ github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || github.event_name == 'merge_group' || ( @@ -347,14 +347,14 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c 
"http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -370,6 +370,7 @@ jobs: - name: Test run: | + git config --global --add safe.directory /__w/scala3/scala3 git submodule sync git submodule update --init --recursive --jobs 7 ./project/scripts/sbt "community-build/testOnly dotty.communitybuild.CommunityBuildTestA" @@ -387,7 +388,7 @@ jobs: - ${{ github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || github.event_name == 'merge_group' || ( @@ -398,14 +399,14 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -421,6 +422,7 @@ jobs: - name: Test run: | + git config --global --add safe.directory /__w/scala3/scala3 git submodule sync git submodule update --init --recursive --jobs 7 ./project/scripts/sbt "community-build/testOnly dotty.communitybuild.CommunityBuildTestB" @@ -438,7 +440,7 @@ jobs: - ${{ github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || github.event_name == 'merge_group' || ( @@ -449,14 +451,14 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -472,6 +474,7 @@ jobs: - name: Test run: | + git config --global --add safe.directory /__w/scala3/scala3 git submodule sync git submodule update --init --recursive --jobs 7 ./project/scripts/sbt "community-build/testOnly dotty.communitybuild.CommunityBuildTestC" @@ -489,7 +492,7 @@ jobs: - ${{ 
github.workspace }}/../../cache/sbt:/root/.sbt - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || github.event_name == 'push' || ( github.event_name == 'pull_request' @@ -498,14 +501,14 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -532,7 +535,7 @@ jobs: - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache - if: "github.event_name == 'schedule' && github.repository == 'lampepfl/dotty' + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' || ( github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags/') @@ -544,7 +547,7 @@ jobs: ) || ( github.event_name == 'workflow_dispatch' - && github.repository == 'lampepfl/dotty' + && github.repository == 'scala/scala3' )" steps: @@ -553,8 +556,8 @@ jobs: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -588,7 +591,7 @@ jobs: - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache needs: [test_non_bootstrapped, test, mima, community_build_a, community_build_b, community_build_c, test_sbt, test_java8] - if: "(github.event_name == 'schedule' || github.event_name == 'workflow_dispatch') && github.repository == 'lampepfl/dotty'" + if: "(github.event_name == 'schedule' || github.event_name == 'workflow_dispatch') && github.repository == 'scala/scala3'" env: NIGHTLYBUILD: yes PGP_PW: ${{ secrets.PGP_PW }} # PGP passphrase @@ -599,8 +602,8 @@ jobs: steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -641,7 +644,7 @@ jobs: - ${{ github.workspace }}/../../cache/ivy:/root/.ivy2/cache - ${{ github.workspace }}/../../cache/general:/root/.cache needs: 
[publish_nightly] - if: "(github.event_name == 'schedule' || github.event_name == 'workflow_dispatch') && github.repository == 'lampepfl/dotty'" + if: "(github.event_name == 'schedule' || github.event_name == 'workflow_dispatch') && github.repository == 'scala/scala3'" env: NIGHTLYBUILD: yes DOTTY_WEBSITE_BOT_TOKEN: ${{ secrets.BOT_TOKEN }} # If you need to change this: @@ -655,8 +658,8 @@ jobs: steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 @@ -672,7 +675,7 @@ jobs: - name: Generate Website run: | - git config --global --add safe.directory /__w/dotty/dotty + git config --global --add safe.directory /__w/scala3/scala3 ./project/scripts/genDocs -doc-snapshot - name: Deploy Website to dotty-website @@ -708,8 +711,8 @@ jobs: steps: - name: Reset existing repo run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true + git config --global --add safe.directory /__w/scala3/scala3 + git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/scala/scala3" && git reset --hard FETCH_HEAD || true - name: Checkout cleanup script uses: actions/checkout@v4 diff --git a/project/Build.scala b/project/Build.scala index caede50d0c6c..49e9e5163cd8 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -140,8 +140,8 @@ object Build { val stdlibBootstrappedVersion = "2.13.12" val dottyOrganization = "org.scala-lang" - val dottyGithubUrl = "https://github.com/lampepfl/dotty" - val dottyGithubRawUserContentUrl = "https://raw.githubusercontent.com/lampepfl/dotty" + val dottyGithubUrl = "https://github.com/scala/scala3" + val dottyGithubRawUserContentUrl = "https://raw.githubusercontent.com/scala/scala3" val isRelease = sys.env.get("RELEASEBUILD") == Some("yes") @@ -379,7 +379,7 @@ object Build { "-skip-by-regex:.+\\.impl($|\\..+)", "-project-logo", "docs/_assets/images/logo.svg", "-social-links:" + - "github::https://github.com/lampepfl/dotty," + + "github::https://github.com/scala/scala3," + "discord::https://discord.com/invite/scala," + "twitter::https://twitter.com/scala_lang", // contains special definitions which are "transplanted" elsewhere @@ -1857,7 +1857,7 @@ object Build { .add(ProjectVersion(baseVersion)) .remove[VersionsDictionaryUrl] .add(SourceLinks(List( - s"${temp.getAbsolutePath}=github://lampepfl/dotty/language-reference-stable" + s"${temp.getAbsolutePath}=github://scala/scala3/language-reference-stable" ))) .withTargets(List("___fake___.scala")) } @@ -2025,7 +2025,7 @@ object Build { scmInfo := Some( ScmInfo( url(dottyGithubUrl), - "scm:git:git@github.com:lampepfl/dotty.git" + "scm:git:git@github.com:scala/scala3.git" ) ), developers := List( @@ -2250,7 +2250,7 @@ object ScaladocConfigs { sys.env.get("GITHUB_SHA") match { case Some(sha) => s"${sourcesPrefix}github://${sys.env("GITHUB_REPOSITORY")}/$sha$outputPrefix" - case None => 
s"${sourcesPrefix}github://lampepfl/dotty/$v$outputPrefix" + case None => s"${sourcesPrefix}github://scala/scala3/$v$outputPrefix" } def defaultSourceLinks(version: String = dottyNonBootstrappedVersion, refVersion: String = dottyVersion) = Def.task { @@ -2261,7 +2261,7 @@ object ScaladocConfigs { scalaSrcLink(stdLibVersion, srcManaged(version, "scala") + "="), dottySrcLink(refVersion, "library/src=", "#library/src"), dottySrcLink(refVersion), - "docs=github://lampepfl/dotty/main#docs" + "docs=github://scala/scala3/main#docs" ) ) } @@ -2269,7 +2269,7 @@ object ScaladocConfigs { lazy val DefaultGenerationSettings = Def.task { def projectVersion = version.value def socialLinks = SocialLinks(List( - "github::https://github.com/lampepfl/dotty", + "github::https://github.com/scala/scala3", "discord::https://discord.com/invite/scala", "twitter::https://twitter.com/scala_lang", )) diff --git a/project/scripts/cmdScaladocTests b/project/scripts/cmdScaladocTests index e9403d988b98..06353af693f1 100755 --- a/project/scripts/cmdScaladocTests +++ b/project/scripts/cmdScaladocTests @@ -16,7 +16,7 @@ DOTTY_NONBOOTSTRAPPED_VERSION=$(eval $DOTTY_NONBOOTSTRAPPED_VERSION_COMMAND | ta DOTTY_BOOTSTRAPPED_VERSION_COMMAND="$SBT \"eval println(Build.dottyVersion)\"" DOTTY_BOOTSTRAPPED_VERSION=$(eval $DOTTY_BOOTSTRAPPED_VERSION_COMMAND | tail -n 2 | head -n 1) -SOURCE_LINKS_REPOSITORY="lampepfl/dotty" +SOURCE_LINKS_REPOSITORY="scala/scala3" SOURCE_LINKS_VERSION="${GITHUB_SHA:-$DOTTY_BOOTSTRAPPED_VERSION}" "$SBT" "scaladoc/generateTestcasesDocumentation" > "$tmp" 2>&1 || echo "generated testcases project with sbt" @@ -30,7 +30,7 @@ dist/target/pack/bin/scaladoc \ "-skip-by-regex:.+\.internal($|\..+)" \ "-skip-by-regex:.+\.impl($|\..+)" \ -project-logo docs/_assets/images/logo.svg \ - -social-links:github::https://github.com/lampepfl/dotty,discord::https://discord.com/invite/scala,twitter::https://twitter.com/scala_lang \ + -social-links:github::https://github.com/scala/scala3,discord::https://discord.com/invite/scala,twitter::https://twitter.com/scala_lang \ -Ygenerate-inkuire \ "-skip-by-id:scala.runtime.stdLibPatches" \ "-skip-by-id:scala.runtime.MatchCase" \ @@ -42,4 +42,4 @@ dist/target/pack/bin/scaladoc \ -author -groups -revision main -project-version "${DOTTY_BOOTSTRAPPED_VERSION}" \ "-quick-links:Learn::https://docs.scala-lang.org/,Install::https://www.scala-lang.org/download/,Playground::https://scastie.scala-lang.org,Find A Library::https://index.scala-lang.org,Community::https://www.scala-lang.org/community/,Blog::https://www.scala-lang.org/blog/," \ out/bootstrap/scaladoc-testcases/scala-"${DOTTY_NONBOOTSTRAPPED_VERSION}"/classes > "$tmp" 2>&1 || echo "generated testcases project with scripts" -diff -rq "$OUT1" "scaladoc/output/testcases" +diff -r "$OUT1" "scaladoc/output/testcases" From 08be98c44d14c0f877ce69fee9b551e7cb15e87f Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Fri, 1 Mar 2024 20:40:08 +0100 Subject: [PATCH 190/371] Update links in the repository to scala/scala3 --- .github/ISSUE_TEMPLATE/feature.md | 4 +- .github/ISSUE_TEMPLATE/improve-error.md | 6 +- .github/workflows/scripts/publish-sdkman.sh | 4 +- MAINTENANCE.md | 6 +- NOTICE.md | 2 +- README.md | 4 +- bench/profiles/compiletime.yml | 12 ++-- bench/profiles/default.yml | 2 +- bench/profiles/empty.yml | 8 +-- bench/profiles/exhaustivity.yml | 20 +++---- bench/profiles/implicits.yml | 10 ++-- bench/profiles/misc.yml | 4 +- bench/profiles/projects.yml | 2 +- bench/profiles/pull.yml | 2 +- bench/profiles/quotes.yml | 
8 +-- bench/profiles/sbt.yml | 2 +- bench/profiles/tuples.yml | 16 ++--- bench/profiles/typing.yml | 4 +- .../tools/backend/jvm/BCodeHelpers.scala | 4 +- .../tools/backend/jvm/BCodeSkelBuilder.scala | 4 +- .../backend/jvm/DottyBackendInterface.scala | 2 +- compiler/src/dotty/tools/dotc/Driver.scala | 6 +- compiler/src/dotty/tools/dotc/ast/Trees.scala | 2 +- .../tools/dotc/interactive/Interactive.scala | 2 +- .../tools/dotc/interactive/SourceTree.scala | 2 +- compiler/src/dotty/tools/dotc/report.scala | 2 +- .../tools/dotc/transform/ExpandPrivate.scala | 4 +- .../dotc/transform/InstrumentCoverage.scala | 2 +- .../dotty/tools/dotc/typer/ProtoTypes.scala | 2 +- compiler/test/dotty/tools/io/PathTest.scala | 2 +- dist/bin/scalac | 2 +- dist/bin/scalac.bat | 2 +- .../2016-02-17-scaling-dot-soundness.md | 8 +-- .../_posts/2016-05-05-multiversal-equality.md | 4 +- .../2016-12-05-implicit-function-types.md | 4 +- ...017-05-31-first-dotty-milestone-release.md | 8 +-- ...17-07-12-second-dotty-milestone-release.md | 60 +++++++++---------- ...017-09-07-third-dotty-milestone-release.md | 26 ++++---- ...17-10-16-fourth-dotty-milestone-release.md | 14 ++--- ...017-12-01-fifth-dotty-milestone-release.md | 12 ++-- ...8-03-05-seventh-dotty-milestone-release.md | 18 +++--- ...18-04-27-eighth-dotty-milestone-release.md | 16 ++--- ...018-07-06-ninth-dotty-milestone-release.md | 20 +++---- ...2018-10-10-10th-dotty-milestone-release.md | 10 ++-- ...2018-11-30-11th-dotty-milestone-release.md | 8 +-- ...2019-01-21-12th-dotty-milestone-release.md | 8 +-- ...2019-03-05-13th-dotty-milestone-release.md | 32 +++++----- ...2019-04-15-14th-dotty-milestone-release.md | 6 +- ...2019-05-23-15th-dotty-milestone-release.md | 16 ++--- ...2019-06-11-16th-dotty-milestone-release.md | 18 +++--- ...2019-07-25-17th-dotty-milestone-release.md | 14 ++--- ...2019-08-30-18th-dotty-milestone-release.md | 30 +++++----- ...2019-09-23-19th-dotty-milestone-release.md | 6 +- ...2019-11-04-20th-dotty-milestone-release.md | 12 ++-- ...2019-12-20-21th-dotty-milestone-release.md | 8 +-- ...2020-02-05-22nd-dotty-milestone-release.md | 18 +++--- ...2020-03-18-23rd-dotty-milestone-release.md | 6 +- ...2020-04-29-24th-dotty-milestone-release.md | 8 +-- ...2020-06-22-25th-dotty-milestone-release.md | 8 +-- ...2020-07-27-26th-dotty-milestone-release.md | 10 ++-- ...2020-08-31-27th-dotty-milestone-release.md | 24 ++++---- docs/_blog/_posts/2020-11-09-scala3-m1.md | 26 ++++---- docs/_blog/_posts/2020-12-18-scala3-m3.md | 42 ++++++------- docs/_blog/_posts/2021-02-17-scala3-rc1.md | 50 ++++++++-------- docs/_blog/_posts/2021-03-31-scala3-rc2.md | 24 ++++---- docs/_blog/_posts/2021-04-21-scala3-rc3.md | 24 ++++---- .../2021-06-07-scala3.0.1-rc1-release.md | 42 ++++++------- docs/_blog/_posts/2021-06-25-scala301-rc2.md | 6 +- .../contributing/architecture/context.md | 2 +- .../contributing/architecture/lifecycle.md | 16 ++--- .../_docs/contributing/architecture/phases.md | 40 ++++++------- .../contributing/architecture/symbols.md | 16 ++--- docs/_docs/contributing/architecture/time.md | 10 ++-- docs/_docs/contributing/architecture/types.md | 4 +- docs/_docs/contributing/community-build.md | 6 +- .../contributing/debugging/ide-debugging.md | 8 +-- .../contributing/debugging/inspection.md | 6 +- .../contributing/debugging/other-debugging.md | 16 ++--- docs/_docs/contributing/getting-started.md | 6 +- docs/_docs/contributing/index.md | 4 +- docs/_docs/contributing/issues/areas.md | 28 ++++----- docs/_docs/contributing/issues/cause.md | 8 +-- 
docs/_docs/contributing/issues/reproduce.md | 4 +- docs/_docs/contributing/procedures/release.md | 12 ++-- docs/_docs/contributing/sending-in-a-pr.md | 6 +- .../_docs/contributing/setting-up-your-ide.md | 2 +- docs/_docs/contributing/testing.md | 8 +-- docs/_docs/internals/coverage.md | 2 +- docs/_docs/internals/debug-macros.md | 2 +- docs/_docs/internals/dotc-scalac.md | 6 +- docs/_docs/internals/overall-structure.md | 14 ++--- docs/_docs/internals/periods.md | 6 +- docs/_docs/internals/type-system.md | 8 +-- .../changed-features/eta-expansion-spec.md | 2 +- .../implicit-conversions-spec.md | 2 +- .../changed-features/structural-types-spec.md | 4 +- .../contextual/by-name-context-parameters.md | 2 +- .../contextual/multiversal-equality.md | 6 +- .../reference/dropped-features/auto-apply.md | 2 +- .../dropped-features/type-projection.md | 2 +- docs/_docs/reference/enums/adts.md | 2 +- docs/_docs/reference/enums/enums.md | 6 +- .../reference/experimental/explicit-nulls.md | 2 +- .../reference/experimental/tupled-function.md | 6 +- .../reference/features-classification.md | 6 +- .../metaprogramming/compiletime-ops.md | 4 +- .../dependent-function-types-spec.md | 6 +- .../new-types/intersection-types-spec.md | 2 +- .../new-types/polymorphic-function-types.md | 2 +- .../other-new-features/open-classes.md | 2 +- .../parameter-untupling-spec.md | 2 +- .../other-new-features/parameter-untupling.md | 2 +- docs/_docs/reference/overview.md | 2 +- docs/_docs/release-notes-0.1.2.md | 40 ++++++------- .../dropped-features/auto-apply.md | 2 +- docs/_spec/APPLIEDreference/enums/enums.md | 2 +- .../APPLIEDreference/new-types/union-types.md | 4 +- .../changed-features/eta-expansion-spec.md | 2 +- .../implicit-conversions-spec.md | 2 +- .../changed-features/structural-types-spec.md | 4 +- .../contextual/by-name-context-parameters.md | 2 +- .../contextual/multiversal-equality.md | 6 +- .../dropped-features/type-projection.md | 2 +- .../experimental/explicit-nulls.md | 2 +- .../experimental/tupled-function.md | 6 +- .../TODOreference/features-classification.md | 6 +- .../metaprogramming/compiletime-ops.md | 4 +- .../dependent-function-types-spec.md | 6 +- .../new-types/polymorphic-function-types.md | 2 +- .../parameter-untupling-spec.md | 2 +- .../other-new-features/parameter-untupling.md | 2 +- docs/_spec/TODOreference/overview.md | 2 +- docs/_spec/_layouts/default.yml | 2 +- docs/_spec/_layouts/toc.yml | 2 +- .../runtime/stdLibPatches/language.scala | 2 +- .../main/dotty/tools/pc/HoverProvider.scala | 2 +- .../dotty/tools/pc/MetalsInteractive.scala | 2 +- .../src/main/dotty/tools/pc/PcCollector.scala | 6 +- .../dotty/tools/pc/SemanticdbSymbols.scala | 4 +- .../pc/completions/CompletionProvider.scala | 2 +- .../pc/tests/completion/CompletionSuite.scala | 2 +- .../tools/pc/tests/hover/HoverTermSuite.scala | 2 +- .../tools/pc/tests/hover/HoverTypeSuite.scala | 4 +- .../signaturehelp/SignatureHelpSuite.scala | 2 +- project/scripts/bootstrappedOnlyCmdTests | 4 +- .../tools/xsbt/CompilerBridgeDriver.java | 2 +- sbt-bridge/src/xsbt/CachedCompilerImpl.java | 2 +- sbt-test/sbt-bridge/zinc-13-compat/test | 2 +- .../src/main/scala/MultiversalEquality.scala | 2 +- .../inspector/src/main/scala/main.scala | 2 +- .../malformed-class-name-with-dollar/test | 2 +- .../src/tests/nonScala3Parent.scala | 2 +- .../tools/scaladoc/tasty/TypesSupport.scala | 2 +- .../dotty/tools/scaladoc/ScaladocTest.scala | 2 +- tests/explicit-nulls/pos/nn2.scala | 2 +- tests/init/pos/i9795.scala | 4 +- tests/neg/i11118.scala | 2 +- 
tests/neg/i4060.scala | 2 +- tests/pos-macros/i9361.scala | 2 +- .../backend/jvm/BCodeHelpers.scala | 4 +- .../backend/jvm/BCodeSkelBuilder.scala | 4 +- .../backend/jvm/DottyBackendInterface.scala | 2 +- tests/pos/erasure-array.scala | 2 +- tests/pos/i10242.scala | 2 +- tests/pos/i11681.scala | 2 +- tests/pos/i12663.scala | 2 +- tests/pos/i12679.scala | 2 +- tests/pos/i14096.scala | 2 +- tests/pos/i14271.scala | 2 +- tests/pos/i14278.scala | 2 +- tests/pos/i14642.scala | 2 +- tests/pos/i14830.scala | 2 +- tests/pos/i15546.scala | 2 +- tests/pos/i5700.scala | 2 +- tests/pos/i7414.scala | 2 +- tests/pos/i7445a.scala | 2 +- tests/pos/i7445b.scala | 2 +- tests/pos/i7653.scala | 2 +- tests/pos/i7790.scala | 2 +- tests/pos/i7807.scala | 2 +- tests/pos/i8300.scala | 2 +- tests/pos/kind-projector.scala | 2 +- tests/run-macros/f-interpolator-tests.scala | 2 +- tests/run/i11583.scala | 2 +- tests/run/i11706.scala | 2 +- tests/run/i12032.scala | 2 +- tests/run/i13216.scala | 2 +- tests/run/i13334.scala | 2 +- tests/run/i13691b.scala | 2 +- tests/run/i14582.scala | 2 +- tests/run/i15913.scala | 2 +- tests/run/i4192/TestCases.scala | 2 +- tests/run/i4496b.scala | 2 +- tests/semanticdb/expect/Advanced.expect.scala | 2 +- tests/semanticdb/expect/Advanced.scala | 2 +- 195 files changed, 672 insertions(+), 672 deletions(-) diff --git a/.github/ISSUE_TEMPLATE/feature.md b/.github/ISSUE_TEMPLATE/feature.md index 52f8010c372e..afbaa8020e07 100644 --- a/.github/ISSUE_TEMPLATE/feature.md +++ b/.github/ISSUE_TEMPLATE/feature.md @@ -1,10 +1,10 @@ --- name: "\U0001F389 Suggest a feature" -about: Please create a feature request here https://github.com/lampepfl/dotty/discussions/new?category=feature-requests +about: Please create a feature request here https://github.com/scala/scala3/discussions/new?category=feature-requests title: '' labels: '' assignees: '' --- -Please create a feature request in the [Dotty Discussions](https://github.com/lampepfl/dotty/discussions/new?category=feature-requests). +Please create a feature request in the [Dotty Discussions](https://github.com/scala/scala3/discussions/new?category=feature-requests). diff --git a/.github/ISSUE_TEMPLATE/improve-error.md b/.github/ISSUE_TEMPLATE/improve-error.md index 918196e1ec53..72d556951e53 100644 --- a/.github/ISSUE_TEMPLATE/improve-error.md +++ b/.github/ISSUE_TEMPLATE/improve-error.md @@ -19,7 +19,7 @@ This code should be self-contained, reproducible (i.e. produces the expected err Ideally, we should be able to just copy this code in a file and run `scalac` (and maybe `scala`) to reproduce the issue. -For a good example, see https://github.com/lampepfl/dotty/issues/18657 +For a good example, see https://github.com/scala/scala3/issues/18657 --> ```Scala @@ -44,12 +44,12 @@ for example: ## Why this Error/Warning was not helpful - + The message was unhelpful because... ## Suggested improvement - + It could be made more helpful by... 
diff --git a/.github/workflows/scripts/publish-sdkman.sh b/.github/workflows/scripts/publish-sdkman.sh index 70987bff175b..f959c426e9d8 100755 --- a/.github/workflows/scripts/publish-sdkman.sh +++ b/.github/workflows/scripts/publish-sdkman.sh @@ -10,8 +10,8 @@ set -u # latest stable dotty version -DOTTY_VERSION=$(curl -s https://api.github.com/repos/lampepfl/dotty/releases/latest | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/') -DOTTY_URL="https://github.com/lampepfl/dotty/releases/download/$DOTTY_VERSION/scala3-$DOTTY_VERSION.zip" +DOTTY_VERSION=$(curl -s https://api.github.com/repos/scala/scala3/releases/latest | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/') +DOTTY_URL="https://github.com/scala/scala3/releases/download/$DOTTY_VERSION/scala3-$DOTTY_VERSION.zip" # checking if dotty version is available if ! curl --output /dev/null --silent --head --fail "$DOTTY_URL"; then diff --git a/MAINTENANCE.md b/MAINTENANCE.md index 1e80f891e987..fd14bab68153 100644 --- a/MAINTENANCE.md +++ b/MAINTENANCE.md @@ -15,11 +15,11 @@ The issue supervisor is responsible for: - Modifying issue labels to best capture information about the issues - Attempting to reproduce the issue (or label “stat:cannot reproduce”) - Further minimizing the issue or asking the reporter of the issue to minimize it correctly (or label “stat:needs minimization”) - - Identifying which issues are of considerable importance and bringing them to the attention of the team during the Dotty meeting, where they can be filtered and added to the [Future Versions](https://github.com/lampepfl/dotty/milestone/46) milestone. + - Identifying which issues are of considerable importance and bringing them to the attention of the team during the Dotty meeting, where they can be filtered and added to the [Future Versions](https://github.com/scala/scala3/milestone/46) milestone. - Identifying if a report is really a feature request and if so, converting it to - a [feature request discussion](https://github.com/lampepfl/dotty/discussions/categories/feature-requests). + a [feature request discussion](https://github.com/scala/scala3/discussions/categories/feature-requests). - Keeping an eye on new -[discussions](https://github.com/lampepfl/dotty/discussions), making sure they +[discussions](https://github.com/scala/scala3/discussions), making sure they don't go unanswered and also correctly labeling new feature requests. Other core teammates are responsible for providing information to the issue supervisor in a timely manner when it is requested if they have that information. diff --git a/NOTICE.md b/NOTICE.md index e9b64ac262f2..fd931397a500 100644 --- a/NOTICE.md +++ b/NOTICE.md @@ -104,4 +104,4 @@ major authors were omitted by oversight. 
[3] https://github.com/sbt/sbt/tree/0.13/compile/interface/src/main/scala/xsbt [4] https://github.com/scoverage/scalac-scoverage-plugin [5] https://github.com/scalameta/metals -[6] https://github.com/lampepfl/dotty/pull/5783/files +[6] https://github.com/scala/scala3/pull/5783/files diff --git a/README.md b/README.md index 1e62e90a1845..6c3212f0676b 100644 --- a/README.md +++ b/README.md @@ -1,6 +1,6 @@ Dotty ===== -[![Dotty CI](https://github.com/lampepfl/dotty/workflows/Dotty/badge.svg?branch=master)](https://github.com/lampepfl/dotty/actions?query=branch%3Amain) +[![Dotty CI](https://github.com/scala/scala3/workflows/Dotty/badge.svg?branch=master)](https://github.com/scala/scala3/actions?query=branch%3Amain) [![Join the chat at https://discord.com/invite/scala](https://img.shields.io/discord/632150470000902164)](https://discord.com/invite/scala) * [Documentation](https://docs.scala-lang.org/scala3/) @@ -23,7 +23,7 @@ other more direct lines of communication such as email. How to Contribute ================= * [Getting Started as Contributor](https://docs.scala-lang.org/scala3/guides/contribution/contribution-intro.html) -* [Issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) +* [Issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) License ======= diff --git a/bench/profiles/compiletime.yml b/bench/profiles/compiletime.yml index fd77df7dfe9a..520af3750aa4 100644 --- a/bench/profiles/compiletime.yml +++ b/bench/profiles/compiletime.yml @@ -1,31 +1,31 @@ charts: - name: "Compile-time sums of constant integer types (generated)" - url: https://github.com/lampepfl/dotty/blob/main/bench/src/main/scala/generateBenchmarks.scala + url: https://github.com/scala/scala3/blob/main/bench/src/main/scala/generateBenchmarks.scala lines: - key: compiletime-sum-constants label: bootstrapped - name: "Compile-time sums of term reference types (generated)" - url: https://github.com/lampepfl/dotty/blob/main/bench/src/main/scala/generateBenchmarks.scala + url: https://github.com/scala/scala3/blob/main/bench/src/main/scala/generateBenchmarks.scala lines: - key: compiletime-sum-termrefs label: bootstrapped - name: "Sums of term references, result type inferred (generated)" - url: https://github.com/lampepfl/dotty/blob/main/bench/src/main/scala/generateBenchmarks.scala + url: https://github.com/scala/scala3/blob/main/bench/src/main/scala/generateBenchmarks.scala lines: - key: compiletime-sum-termrefs-terms label: bootstrapped - name: "Compile-time sums of type applications (generated)" - url: https://github.com/lampepfl/dotty/blob/main/bench/src/main/scala/generateBenchmarks.scala + url: https://github.com/scala/scala3/blob/main/bench/src/main/scala/generateBenchmarks.scala lines: - key: compiletime-sum-applications label: bootstrapped - name: "Compile-time additions inside multiplications (generated)" - url: https://github.com/lampepfl/dotty/blob/main/bench/src/main/scala/generateBenchmarks.scala + url: https://github.com/scala/scala3/blob/main/bench/src/main/scala/generateBenchmarks.scala lines: - key: compiletime-distribute label: bootstrapped @@ -48,4 +48,4 @@ scripts: - measure 6 6 7 1 $PROG_HOME/dotty/bench/tests-generated/compiletime-ops/distribute.scala config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git 
a/bench/profiles/default.yml b/bench/profiles/default.yml index 22ed6d5f31df..5867816217fe 100644 --- a/bench/profiles/default.yml +++ b/bench/profiles/default.yml @@ -11,4 +11,4 @@ includes: config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/empty.yml b/bench/profiles/empty.yml index ac571e64e831..108150d3934e 100644 --- a/bench/profiles/empty.yml +++ b/bench/profiles/empty.yml @@ -1,19 +1,19 @@ charts: - name: "empty class" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/empty-class.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/empty-class.scala lines: - key: empty-class label: bootstrapped - name: "empty object" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/empty-object.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/empty-object.scala lines: - key: empty-object label: bootstrapped - name: "empty file" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/empty-file.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/empty-file.scala lines: - key: empty-file label: bootstrapped @@ -30,4 +30,4 @@ scripts: - measure $PROG_HOME/dotty/tests/bench/empty-file.scala config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/exhaustivity.yml b/bench/profiles/exhaustivity.yml index af6eb4041f6c..47710f7cb39f 100644 --- a/bench/profiles/exhaustivity.yml +++ b/bench/profiles/exhaustivity.yml @@ -1,54 +1,54 @@ charts: - name: "exhaustivity check" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/patmatexhaust.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/patmatexhaust.scala lines: - key: patmatexhaust label: bootstrapped - name: "exhaustivity I" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/exhaustivity-I.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/exhaustivity-I.scala lines: - key: exhaustivity-I label: bootstrapped - name: "exhaustivity S" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/exhaustivity-S.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/exhaustivity-S.scala lines: - key: exhaustivity-S label: bootstrapped - name: "exhaustivity T" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/exhaustivity-T.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/exhaustivity-T.scala lines: - key: exhaustivity-T label: bootstrapped - name: "exhaustivity V" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/exhaustivity-V.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/exhaustivity-V.scala lines: - key: exhaustivity-V label: bootstrapped - name: "exhaustivity MIPS" - url: https://github.com/lampepfl/dotty/blob/main/tests/patmat/i7186.scala + url: https://github.com/scala/scala3/blob/main/tests/patmat/i7186.scala lines: - key: exhaustivity-mips label: bootstrapped - name: "exhaustivity i12241" - url: https://github.com/lampepfl/dotty/blob/main/tests/patmat/i12241.scala + url: https://github.com/scala/scala3/blob/main/tests/patmat/i12241.scala lines: - key: exhaustivity-i12241 label: bootstrapped - name: "exhaustivity i12358" - url: https://github.com/lampepfl/dotty/blob/main/tests/patmat/i12358.scala + url: 
https://github.com/scala/scala3/blob/main/tests/patmat/i12358.scala lines: - key: exhaustivity-i12358 label: bootstrapped - name: "exhaustivity i13565" - url: https://github.com/lampepfl/dotty/blob/main/tests/pos/i13565.scala + url: https://github.com/scala/scala3/blob/main/tests/pos/i13565.scala lines: - key: exhaustivity-i13565 label: bootstrapped @@ -83,4 +83,4 @@ scripts: - measure 20 40 3 $PROG_HOME/dotty/tests/pos/i13565.scala config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/implicits.yml b/bench/profiles/implicits.yml index 3e944b5be28b..ff7f8c34b872 100644 --- a/bench/profiles/implicits.yml +++ b/bench/profiles/implicits.yml @@ -1,6 +1,6 @@ charts: - name: "implicit cache I" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/implicit_cache.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/implicit_cache.scala lines: - key: implicit-cache label: bootstrapped @@ -8,7 +8,7 @@ charts: label: from tasty - name: "implicit cache II" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/implicitNums.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/implicitNums.scala lines: - key: implicitNums label: bootstrapped @@ -16,13 +16,13 @@ charts: label: from tasty - name: "implicit scope loop" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/implicit-scope-loop.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/implicit-scope-loop.scala lines: - key: implicit-scope-loop label: bootstrapped - name: "inductive implicits" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/inductive-implicits.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/inductive-implicits.scala lines: - key: inductive-implicits label: bootstrapped @@ -48,4 +48,4 @@ scripts: - source $PROG_HOME/dotty/bench/scripts/implicitNums-from-tasty config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/misc.yml b/bench/profiles/misc.yml index 668f8e60c176..7ef168a0eea9 100644 --- a/bench/profiles/misc.yml +++ b/bench/profiles/misc.yml @@ -1,13 +1,13 @@ charts: - name: "issue #1535" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/i1535.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/i1535.scala lines: - key: i1535 label: bootstrapped - name: "issue #1687" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/i1687.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/i1687.scala lines: - key: i1687 label: bootstrapped diff --git a/bench/profiles/projects.yml b/bench/profiles/projects.yml index f1133d180c54..72e506290bad 100644 --- a/bench/profiles/projects.yml +++ b/bench/profiles/projects.yml @@ -41,4 +41,4 @@ scripts: - source $PROG_HOME/dotty/bench/scripts/stdlib213 config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/pull.yml b/bench/profiles/pull.yml index 163d75a8769d..4760e27daf95 100644 --- a/bench/profiles/pull.yml +++ b/bench/profiles/pull.yml @@ -5,4 +5,4 @@ includes: - empty.yml config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/quotes.yml b/bench/profiles/quotes.yml 
index afd970543aa1..454cd0dc5faa 100644 --- a/bench/profiles/quotes.yml +++ b/bench/profiles/quotes.yml @@ -1,18 +1,18 @@ charts: - name: "Inline a quote" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/power-macro/PowerInlined-1.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/power-macro/PowerInlined-1.scala lines: - key: power-macro-power-inlined-1 label: bootstrapped - name: "Inline 1k quotes" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/power-macro/PowerInlined-1k.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/power-macro/PowerInlined-1k.scala lines: - key: power-macro-power-inlined-1k label: bootstrapped - name: "Quote String interpolation matching" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/string-interpolation-macro/Test.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/string-interpolation-macro/Test.scala lines: - key: quote-string-interpolation-matching label: bootstrapped @@ -29,4 +29,4 @@ scripts: - source $PROG_HOME/dotty/bench/scripts/quote-string-interpolation-matching config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/sbt.yml b/bench/profiles/sbt.yml index 3ab0e43f3db2..653b10381959 100644 --- a/bench/profiles/sbt.yml +++ b/bench/profiles/sbt.yml @@ -12,4 +12,4 @@ scripts: - measure -with-compiler -Yforce-sbt-phases -with-dotty $(find $PROG_HOME/dotty/compiler/src/dotty -name *.scala -o -name *.java) config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/tuples.yml b/bench/profiles/tuples.yml index 5e41ecf7c80d..24bf76f786cc 100644 --- a/bench/profiles/tuples.yml +++ b/bench/profiles/tuples.yml @@ -1,42 +1,42 @@ charts: - name: "Tuple22 creation with Tuple22.apply" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/tuple22-creation-apply.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/tuple22-creation-apply.scala lines: - key: tuple22-creation-apply label: bootstrapped - name: "Tuple22 creation with *:" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/tuple22-creation-cons.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/tuple22-creation-cons.scala lines: - key: tuple22-creation-cons label: bootstrapped - name: "Tuple22.tail" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/tuple22-tails.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/tuple22-tails.scala lines: - key: tuple22-tails label: bootstrapped - name: "Tuple22.apply" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/tuple22-apply.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/tuple22-apply.scala lines: - key: tuple22-apply label: bootstrapped - name: "Tuple22.size" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/tuple22-size.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/tuple22-size.scala lines: - key: tuple22-size label: bootstrapped - name: "Tuple reverse (Runtime)" - url: https://github.com/lampepfl/dotty/blob/main/bench-run/src/main/scala/dotty/tools/benchmarks/tuples/TupleOps.scala#L59 + url: https://github.com/scala/scala3/blob/main/bench-run/src/main/scala/dotty/tools/benchmarks/tuples/TupleOps.scala#L59 lines: - key: tuple-reverse label: 
bootstrapped - name: "Tuple flatMap (Runtime)" - url: https://github.com/lampepfl/dotty/blob/main/bench-run/src/main/scala/dotty/tools/benchmarks/tuples/TupleOps.scala#L64 + url: https://github.com/scala/scala3/blob/main/bench-run/src/main/scala/dotty/tools/benchmarks/tuples/TupleOps.scala#L64 lines: - key: tuple-flatMap label: bootstrapped @@ -65,4 +65,4 @@ scripts: - measure-run TupleOps.flatMap config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/bench/profiles/typing.yml b/bench/profiles/typing.yml index f6476bca7006..929438cb5a93 100644 --- a/bench/profiles/typing.yml +++ b/bench/profiles/typing.yml @@ -1,6 +1,6 @@ charts: - name: "Find Ref" - url: https://github.com/lampepfl/dotty/blob/main/tests/bench/FindRef.scala + url: https://github.com/scala/scala3/blob/main/tests/bench/FindRef.scala lines: - key: find-ref label: bootstrapped @@ -11,4 +11,4 @@ scripts: - measure $PROG_HOME/dotty/tests/bench/FindRef.scala config: - pr_base_url: "https://github.com/lampepfl/dotty/pull/" + pr_base_url: "https://github.com/scala/scala3/pull/" diff --git a/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala b/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala index 2ad58fea4cd1..385521e2785f 100644 --- a/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala +++ b/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala @@ -754,7 +754,7 @@ trait BCodeHelpers extends BCodeIdiomatic { case tp => report.warning( s"an unexpected type representation reached the compiler backend while compiling ${ctx.compilationUnit}: $tp. " + - "If possible, please file a bug on https://github.com/lampepfl/dotty/issues") + "If possible, please file a bug on https://github.com/scala/scala3/issues") tp match { case tp: ThisType if tp.cls == defn.ArrayClass => ObjectRef.asInstanceOf[ct.bTypes.ClassBType] // was introduced in 9b17332f11 to fix SI-999, but this code is not reached in its test, or any other test @@ -795,7 +795,7 @@ trait BCodeHelpers extends BCodeIdiomatic { report.error( em"""|compiler bug: created invalid generic signature for $sym in ${sym.denot.owner.showFullName} |signature: $sig - |if this is reproducible, please report bug at https://github.com/lampepfl/dotty/issues + |if this is reproducible, please report bug at https://github.com/scala/scala3/issues """, sym.sourcePos) throw ex } diff --git a/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala b/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala index 0ab9ed85b6cf..394700c2898e 100644 --- a/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala +++ b/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala @@ -191,7 +191,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { // Should we do this transformation earlier, say in Constructors? Or would that just cause // pain for scala-{js, native}? // - // @sjrd (https://github.com/lampepfl/dotty/pull/9181#discussion_r457458205): + // @sjrd (https://github.com/scala/scala3/pull/9181#discussion_r457458205): // moving that before the back-end would make things significantly more complicated for // Scala.js and Native. Both have a first-class concept of ModuleClass, and encode the // singleton pattern of MODULE$ in a completely different way. 
In the Scala.js IR, there @@ -202,7 +202,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { // TODO: remove `!f.name.is(LazyBitMapName)` once we change lazy val encoding - // https://github.com/lampepfl/dotty/issues/7140 + // https://github.com/scala/scala3/issues/7140 // // Lazy val encoding assumes bitmap fields are non-static // diff --git a/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala b/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala index 37045bda17ec..8016c2bfc209 100644 --- a/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala +++ b/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala @@ -126,7 +126,7 @@ object DottyBackendInterface { * See also `genPlainClass` in `BCodeSkelBuilder.scala`. * * TODO: remove the special handing of `LazyBitMapName` once we swtich to - * the new lazy val encoding: https://github.com/lampepfl/dotty/issues/7140 + * the new lazy val encoding: https://github.com/scala/scala3/issues/7140 */ def isStaticModuleField(using Context): Boolean = sym.owner.isStaticModuleClass && sym.isField && !sym.name.is(LazyBitMapName) && !sym.name.is(LazyLocalName) diff --git a/compiler/src/dotty/tools/dotc/Driver.scala b/compiler/src/dotty/tools/dotc/Driver.scala index fc5367d2ccba..196752aceb29 100644 --- a/compiler/src/dotty/tools/dotc/Driver.scala +++ b/compiler/src/dotty/tools/dotc/Driver.scala @@ -126,7 +126,7 @@ class Driver { * The trade-off is that you can only pass a SimpleReporter to this method * and not a normal Reporter which is more powerful. * - * Usage example: [[https://github.com/lampepfl/dotty/tree/master/compiler/test/dotty/tools/dotc/InterfaceEntryPointTest.scala]] + * Usage example: [[https://github.com/scala/scala3/tree/master/compiler/test/dotty/tools/dotc/InterfaceEntryPointTest.scala]] * * @param args Arguments to pass to the compiler. * @param simple Used to log errors, warnings, and info messages. @@ -143,7 +143,7 @@ class Driver { /** Principal entry point to the compiler. * - * Usage example: [[https://github.com/lampepfl/dotty/tree/master/compiler/test/dotty/tools/dotc/EntryPointsTest.scala.disabled]] + * Usage example: [[https://github.com/scala/scala3/tree/master/compiler/test/dotty/tools/dotc/EntryPointsTest.scala.disabled]] * in method `runCompiler` * * @param args Arguments to pass to the compiler. @@ -182,7 +182,7 @@ class Driver { * the other overloads cannot be overridden, instead you * should override this one which they call internally. * - * Usage example: [[https://github.com/lampepfl/dotty/tree/master/compiler/test/dotty/tools/dotc/EntryPointsTest.scala.disabled]] + * Usage example: [[https://github.com/scala/scala3/tree/master/compiler/test/dotty/tools/dotc/EntryPointsTest.scala.disabled]] * in method `runCompilerWithContext` * * @param args Arguments to pass to the compiler. diff --git a/compiler/src/dotty/tools/dotc/ast/Trees.scala b/compiler/src/dotty/tools/dotc/ast/Trees.scala index 4ec41b95a90b..c64b636648ee 100644 --- a/compiler/src/dotty/tools/dotc/ast/Trees.scala +++ b/compiler/src/dotty/tools/dotc/ast/Trees.scala @@ -1501,7 +1501,7 @@ object Trees { * It ensures that the source is correct, and that the local context is used if * that's necessary for transforming the whole tree. 
* TODO: ensure transform is always called with the correct context as argument - * @see https://github.com/lampepfl/dotty/pull/13880#discussion_r836395977 + * @see https://github.com/scala/scala3/pull/13880#discussion_r836395977 */ def transformCtx(tree: Tree)(using Context): Context = val sourced = diff --git a/compiler/src/dotty/tools/dotc/interactive/Interactive.scala b/compiler/src/dotty/tools/dotc/interactive/Interactive.scala index 928a9be6103b..5adb68a9e491 100644 --- a/compiler/src/dotty/tools/dotc/interactive/Interactive.scala +++ b/compiler/src/dotty/tools/dotc/interactive/Interactive.scala @@ -250,7 +250,7 @@ object Interactive { * Note that if the given `pos` points out places for incomplete parses, * this method returns `errorTermTree` (`Literal(Consotant(null)`). * - * @see https://github.com/lampepfl/dotty/issues/15294 + * @see https://github.com/scala/scala3/issues/15294 */ def pathTo(trees: List[SourceTree], pos: SourcePosition)(using Context): List[Tree] = pathTo(trees.map(_.tree), pos.span) diff --git a/compiler/src/dotty/tools/dotc/interactive/SourceTree.scala b/compiler/src/dotty/tools/dotc/interactive/SourceTree.scala index 258d92a2d1a8..b950d71eb045 100644 --- a/compiler/src/dotty/tools/dotc/interactive/SourceTree.scala +++ b/compiler/src/dotty/tools/dotc/interactive/SourceTree.scala @@ -33,7 +33,7 @@ case class SourceTree(tree: tpd.Import | tpd.NameTree, source: SourceFile) { } val position = { // FIXME: This is incorrect in some cases, like with backquoted identifiers, - // see https://github.com/lampepfl/dotty/pull/1634#issuecomment-257079436 + // see https://github.com/scala/scala3/pull/1634#issuecomment-257079436 val (start, end) = if (!treeSpan.isSynthetic) (treeSpan.point, treeSpan.point + nameLength) diff --git a/compiler/src/dotty/tools/dotc/report.scala b/compiler/src/dotty/tools/dotc/report.scala index 8e39afdd6e7d..a63b6569fefe 100644 --- a/compiler/src/dotty/tools/dotc/report.scala +++ b/compiler/src/dotty/tools/dotc/report.scala @@ -154,7 +154,7 @@ object report: | | An unhandled exception was thrown in the compiler. | Please file a crash report here: - | https://github.com/lampepfl/dotty/issues/new/choose + | https://github.com/scala/scala3/issues/new/choose | For non-enriched exceptions, compile with -Yno-enrich-error-messages. | |$info1 diff --git a/compiler/src/dotty/tools/dotc/transform/ExpandPrivate.scala b/compiler/src/dotty/tools/dotc/transform/ExpandPrivate.scala index fa2492a261d5..9a6a04621074 100644 --- a/compiler/src/dotty/tools/dotc/transform/ExpandPrivate.scala +++ b/compiler/src/dotty/tools/dotc/transform/ExpandPrivate.scala @@ -25,8 +25,8 @@ import ValueClasses.* * This is necessary since private methods are not allowed to have the same name * as inherited public ones. 
* - * See discussion in https://github.com/lampepfl/dotty/pull/784 - * and https://github.com/lampepfl/dotty/issues/783 + * See discussion in https://github.com/scala/scala3/pull/784 + * and https://github.com/scala/scala3/issues/783 */ class ExpandPrivate extends MiniPhase with IdentityDenotTransformer { thisPhase => import ast.tpd.* diff --git a/compiler/src/dotty/tools/dotc/transform/InstrumentCoverage.scala b/compiler/src/dotty/tools/dotc/transform/InstrumentCoverage.scala index eac44e982603..3ff72d61d41f 100644 --- a/compiler/src/dotty/tools/dotc/transform/InstrumentCoverage.scala +++ b/compiler/src/dotty/tools/dotc/transform/InstrumentCoverage.scala @@ -98,7 +98,7 @@ class InstrumentCoverage extends MacroTransform with IdentityDenotTransformer: start = pos.start, end = pos.end, // +1 to account for the line number starting at 1 - // the internal line number is 0-base https://github.com/lampepfl/dotty/blob/18ada516a85532524a39a962b2ddecb243c65376/compiler/src/dotty/tools/dotc/util/SourceFile.scala#L173-L176 + // the internal line number is 0-base https://github.com/scala/scala3/blob/18ada516a85532524a39a962b2ddecb243c65376/compiler/src/dotty/tools/dotc/util/SourceFile.scala#L173-L176 line = pos.line + 1, desc = sourceFile.content.slice(pos.start, pos.end).mkString, symbolName = tree.symbol.name.toSimpleName.show, diff --git a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala index 6b72f3a8b56e..53ae7438d381 100644 --- a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala +++ b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala @@ -126,7 +126,7 @@ object ProtoTypes { constrainResult(mt, pt) } } else { - // Best-effort to fix https://github.com/lampepfl/dotty/issues/9685 in the 3.3.x series + // Best-effort to fix https://github.com/scala/scala3/issues/9685 in the 3.3.x series // while preserving source compatibility as much as possible val methodMatchedType = constrainResult(mt, wildApprox(pt)) meth.is(Transparent) || methodMatchedType diff --git a/compiler/test/dotty/tools/io/PathTest.scala b/compiler/test/dotty/tools/io/PathTest.scala index 731d29cee8e6..a3755e0ea7dc 100644 --- a/compiler/test/dotty/tools/io/PathTest.scala +++ b/compiler/test/dotty/tools/io/PathTest.scala @@ -3,7 +3,7 @@ package dotty.tools.io import org.junit.Test class PathTest { - // Ref https://github.com/lampepfl/dotty/issues/11644#issuecomment-792457275 + // Ref https://github.com/scala/scala3/issues/11644#issuecomment-792457275 @Test def parent(): Unit = { testParent(Path(""), Directory("..")) testParent(Path("."), Directory("..")) diff --git a/dist/bin/scalac b/dist/bin/scalac index 4b888641e786..d9bd21ca425b 100644 --- a/dist/bin/scalac +++ b/dist/bin/scalac @@ -40,7 +40,7 @@ while [[ $# -gt 0 ]]; do # pass all remaining arguments to scala, e.g. to avoid interpreting them here as -D or -J while [[ $# -gt 0 ]]; do addScala "$1" && shift ; done ;; - # Optimize for short-running applications, see https://github.com/lampepfl/dotty/issues/222 + # Optimize for short-running applications, see https://github.com/scala/scala3/issues/222 -Oshort) addScala "-Oshort" && \ addJava "-XX:+TieredCompilation" && addJava "-XX:TieredStopAtLevel=1" && shift ;; diff --git a/dist/bin/scalac.bat b/dist/bin/scalac.bat index 454158c85666..cb1a76471f70 100644 --- a/dist/bin/scalac.bat +++ b/dist/bin/scalac.bat @@ -55,7 +55,7 @@ if defined _CONSUME_REMAINING ( set _SCALA_ARGS=!_SCALA_ARGS! 
"%__ARG%" shift ) else if "%__ARG%"=="-Oshort" ( - @rem optimize for short-running applications, see https://github.com/lampepfl/dotty/issues/222 + @rem optimize for short-running applications, see https://github.com/scala/scala3/issues/222 set _JAVA_ARGS=!_JAVA_ARGS! "-XX:+TieredCompilation" "-XX:TieredStopAtLevel=1" set _SCALA_ARGS=!_SCALA_ARGS! -Oshort shift diff --git a/docs/_blog/_posts/2016-02-17-scaling-dot-soundness.md b/docs/_blog/_posts/2016-02-17-scaling-dot-soundness.md index 7619545b844e..bc9c558e3011 100644 --- a/docs/_blog/_posts/2016-02-17-scaling-dot-soundness.md +++ b/docs/_blog/_posts/2016-02-17-scaling-dot-soundness.md @@ -69,9 +69,9 @@ categories: is not associated with a runtime value. We can in fact construct soundness issues in all of these cases. Look -at the discussion for issues [#50](https://github.com/lampepfl/dotty/issues/50) -and [#1050](https://github.com/lampepfl/dotty/issues/1050) in the -[Dotty](https://github.com/lampepfl/dotty/issues/1050) repository +at the discussion for issues [#50](https://github.com/scala/scala3/issues/50) +and [#1050](https://github.com/scala/scala3/issues/1050) in the +[Dotty](https://github.com/scala/scala3/issues/1050) repository on GitHub. All issues work fundamentally in the same way: Construct a type `S` which has a type member `T` with bad bounds, say: @@ -98,7 +98,7 @@ course. The promise is usually broken at run-time by failing with a ## Plugging the Loopholes To get back to soundness we need to plug the loopholes. Some of the -necessary measures are taken in pull request [#1051](https://github.com/lampepfl/dotty/issues/1051). +necessary measures are taken in pull request [#1051](https://github.com/scala/scala3/issues/1051). That pull request - tightens the rules for overrides of lazy values: lazy values diff --git a/docs/_blog/_posts/2016-05-05-multiversal-equality.md b/docs/_blog/_posts/2016-05-05-multiversal-equality.md index 236225eec318..d9a48a9f4424 100644 --- a/docs/_blog/_posts/2016-05-05-multiversal-equality.md +++ b/docs/_blog/_posts/2016-05-05-multiversal-equality.md @@ -6,7 +6,7 @@ authorImg: images/martin.jpg date: 2016-05-05 --- -I have been working recently on making equality tests using `==` and `!=` safer in Scala. This has led to a [Language Enhancement Proposal](https://github.com/lampepfl/dotty/issues/1247) which I summarize in this blog. +I have been working recently on making equality tests using `==` and `!=` safer in Scala. This has led to a [Language Enhancement Proposal](https://github.com/scala/scala3/issues/1247) which I summarize in this blog. ## Why Change Equality? @@ -77,7 +77,7 @@ Given a set of `Eq` instances, the idea is that the Scala compiler will check ev So this means we still keep universal equality as it is in Scala now - we don't have a choice here anyway, because of backwards compatibility. But we render it safe by checking that for each comparison the corresponding `Eq` instance exists. -What about types for which no `Eq` instance exists? To maintain backwards compatibility, we allow comparisons of such types as well, by means of a fall-back `eqAny` instance. But we do not allow comparisons between types that have an `Eq` instance and types that have none. Details are explained in the [proposal](https://github.com/lampepfl/dotty/issues/1247). +What about types for which no `Eq` instance exists? To maintain backwards compatibility, we allow comparisons of such types as well, by means of a fall-back `eqAny` instance. 
But we do not allow comparisons between types that have an `Eq` instance and types that have none. Details are explained in the [proposal](https://github.com/scala/scala3/issues/1247). ## Properties diff --git a/docs/_blog/_posts/2016-12-05-implicit-function-types.md b/docs/_blog/_posts/2016-12-05-implicit-function-types.md index ba28159c0fa3..0d9eae53d37f 100644 --- a/docs/_blog/_posts/2016-12-05-implicit-function-types.md +++ b/docs/_blog/_posts/2016-12-05-implicit-function-types.md @@ -6,7 +6,7 @@ authorImg: images/martin.jpg date: 2016-12-05 --- -I just made the [first pull request](https://github.com/lampepfl/dotty/pull/1775) to add _implicit function types_ to +I just made the [first pull request](https://github.com/scala/scala3/pull/1775) to add _implicit function types_ to Scala. I am pretty excited about it, because - citing the explanation of the pull request - "_This is the first step to bring contextual abstraction to Scala_". What do I mean by this? @@ -181,7 +181,7 @@ implicit Transaction => Int ``` Just like the normal function type syntax `A => B`, desugars to `scala.Function1[A, B]` the implicit function type syntax `implicit A => B` desugars to `scala.ImplicitFunction1[A, B]`. -The same holds at other function arities. With Dotty's [pull request #1758](https://github.com/lampepfl/dotty/pull/1758) +The same holds at other function arities. With Dotty's [pull request #1758](https://github.com/scala/scala3/pull/1758) merged there is no longer an upper limit of 22 for such functions. The type `ImplicitFunction1` can be thought of being defined as follows: diff --git a/docs/_blog/_posts/2017-05-31-first-dotty-milestone-release.md b/docs/_blog/_posts/2017-05-31-first-dotty-milestone-release.md index 9bfd22b2e3db..3063e658537d 100644 --- a/docs/_blog/_posts/2017-05-31-first-dotty-milestone-release.md +++ b/docs/_blog/_posts/2017-05-31-first-dotty-milestone-release.md @@ -46,7 +46,7 @@ using Dotty with sbt, see the Releases are available for download on the _Releases_ section of the Dotty repository: -https://github.com/lampepfl/dotty/releases +https://github.com/scala/scala3/releases We also provide a [homebrew](https://brew.sh/) package that can be installed by running @@ -92,9 +92,9 @@ See here for the full [version number explanation](https://dotty.epfl.ch/docs/us Over the coming weeks and months, we plan to work on the following topics: - - [Integrate Local optimizations developed in Dotty linker](https://github.com/lampepfl/dotty/pull/2513); - - [Add Language-level support for HMaps and HLists](https://github.com/lampepfl/dotty/pull/2199); - - [Port global optimizations from Dotty linker](https://github.com/lampepfl/dotty/pull/1840). + - [Integrate Local optimizations developed in Dotty linker](https://github.com/scala/scala3/pull/2513); + - [Add Language-level support for HMaps and HLists](https://github.com/scala/scala3/pull/2199); + - [Port global optimizations from Dotty linker](https://github.com/scala/scala3/pull/1840). If you want to get your hands dirty with any of this, now is a good moment to get involved! 
Join the team of contributors, including diff --git a/docs/_blog/_posts/2017-07-12-second-dotty-milestone-release.md b/docs/_blog/_posts/2017-07-12-second-dotty-milestone-release.md index 9172b5ad67ec..ff314eed13f2 100644 --- a/docs/_blog/_posts/2017-07-12-second-dotty-milestone-release.md +++ b/docs/_blog/_posts/2017-07-12-second-dotty-milestone-release.md @@ -34,52 +34,52 @@ The [previous technology preview](/_blog/_posts/2017-05-31-first-dotty-milestone This technology preview is geared towards improving stability and reliability. It includes: - - [Local optimizations upstreamed from the Dotty Linker](https://github.com/lampepfl/dotty/pull/2513), [2647](https://github.com/lampepfl/dotty/pull/2647) by ([@OlivierBlanvillain](https://github.com/OlivierBlanvillain)). See more details below. - - [Optimizing Pattern Matcher](https://github.com/lampepfl/dotty/pull/2829) by ([@odersky](https://github.com/odersky)) - - [Idempotency checks](https://github.com/lampepfl/dotty/pull/2756) are the first step to reproducible builds - - [Faster Base class sets](https://github.com/lampepfl/dotty/pull/2676) by ([@odersky](https://github.com/odersky)) and ([@darkdimius](https://twitter.com/darkdimius)) + - [Local optimizations upstreamed from the Dotty Linker](https://github.com/scala/scala3/pull/2513), [2647](https://github.com/scala/scala3/pull/2647) by ([@OlivierBlanvillain](https://github.com/OlivierBlanvillain)). See more details below. + - [Optimizing Pattern Matcher](https://github.com/scala/scala3/pull/2829) by ([@odersky](https://github.com/odersky)) + - [Idempotency checks](https://github.com/scala/scala3/pull/2756) are the first step to reproducible builds + - [Faster Base class sets](https://github.com/scala/scala3/pull/2676) by ([@odersky](https://github.com/odersky)) and ([@darkdimius](https://twitter.com/darkdimius)) - Numerous fixes to IDE and Dotty Language Server covering: - - [Windows support for VS Code plugin](https://github.com/lampepfl/dotty/pull/2776) - - [Fix hover-on-type for implicitly converted expressions](https://github.com/lampepfl/dotty/pull/2836) - - [Fixes to find all references in external projects](https://github.com/lampepfl/dotty/pull/2810), [2773](https://github.com/lampepfl/dotty/pull/2773/files) - - [Fix conflict with dragos-vscode-scala](https://github.com/lampepfl/dotty/pull/2777) - - [Fix ide crash on non-parsable file](https://github.com/lampepfl/dotty/pull/2752) - - [Fix hover functionality for enum classes](https://github.com/lampepfl/dotty/pull/2722) - - [Report errors on Dotty Language Server initialization](https://github.com/lampepfl/dotty/pull/2708) - - [Fixes to sbt setting up Dotty IDE](https://github.com/lampepfl/dotty/pull/2690) - - General stability improvements [2838](https://github.com/lampepfl/dotty/pull/2838), [2787](https://github.com/lampepfl/dotty/pull/2787), [2692](https://github.com/lampepfl/dotty/pull/2692) + - [Windows support for VS Code plugin](https://github.com/scala/scala3/pull/2776) + - [Fix hover-on-type for implicitly converted expressions](https://github.com/scala/scala3/pull/2836) + - [Fixes to find all references in external projects](https://github.com/scala/scala3/pull/2810), [2773](https://github.com/scala/scala3/pull/2773/files) + - [Fix conflict with dragos-vscode-scala](https://github.com/scala/scala3/pull/2777) + - [Fix ide crash on non-parsable 
file](https://github.com/scala/scala3/pull/2752) + - [Fix hover functionality for enum classes](https://github.com/scala/scala3/pull/2722) + - [Report errors on Dotty Language Server initialization](https://github.com/scala/scala3/pull/2708) + - [Fixes to sbt setting up Dotty IDE](https://github.com/scala/scala3/pull/2690) + - General stability improvements [2838](https://github.com/scala/scala3/pull/2838), [2787](https://github.com/scala/scala3/pull/2787), [2692](https://github.com/scala/scala3/pull/2692) - Scalac compatibility improvements: - - [Support Scala 2.12 traits](https://github.com/lampepfl/dotty/pull/2685) - - [Fixes to handling of Scala 2 classfiles](https://github.com/lampepfl/dotty/pull/2834/files) - - [Scalac parser crashes on Dotty.jar](https://github.com/lampepfl/dotty/pull/2719) + - [Support Scala 2.12 traits](https://github.com/scala/scala3/pull/2685) + - [Fixes to handling of Scala 2 classfiles](https://github.com/scala/scala3/pull/2834/files) + - [Scalac parser crashes on Dotty.jar](https://github.com/scala/scala3/pull/2719) - Java compatibility improvements: - - [Fixes to handing of Java generic signatures](https://github.com/lampepfl/dotty/pull/2831) - - [java.lang.System.out is final but that's a lie](https://github.com/lampepfl/dotty/pull/2781) + - [Fixes to handing of Java generic signatures](https://github.com/scala/scala3/pull/2831) + - [java.lang.System.out is final but that's a lie](https://github.com/scala/scala3/pull/2781) - Improved error messages: - - [Nicer error message for "implicit function type needs non-empty parameter list"](https://github.com/lampepfl/dotty/pull/2821) - - [Nicer error message for nonsensical modifier combination](https://github.com/lampepfl/dotty/pull/2807/files), [2747](https://github.com/lampepfl/dotty/pull/2747) - - [Nicer error message for supercall inside @inline method](https://github.com/lampepfl/dotty/pull/2740) - - [Check that case classes don't inherit case classes](https://github.com/lampepfl/dotty/pull/2790) - - [Check that named parameters don't conflict with positional ones](https://github.com/lampepfl/dotty/pull/2785) + - [Nicer error message for "implicit function type needs non-empty parameter list"](https://github.com/scala/scala3/pull/2821) + - [Nicer error message for nonsensical modifier combination](https://github.com/scala/scala3/pull/2807/files), [2747](https://github.com/scala/scala3/pull/2747) + - [Nicer error message for supercall inside @inline method](https://github.com/scala/scala3/pull/2740) + - [Check that case classes don't inherit case classes](https://github.com/scala/scala3/pull/2790) + - [Check that named parameters don't conflict with positional ones](https://github.com/scala/scala3/pull/2785) - Improved command line handling: - - [Support params in a file like @file.txt](https://github.com/lampepfl/dotty/pull/2765) + - [Support params in a file like @file.txt](https://github.com/scala/scala3/pull/2765) - Type system stability: - - [Handle wildcard types in unions and intersections](https://github.com/lampepfl/dotty/pull/2742) + - [Handle wildcard types in unions and intersections](https://github.com/scala/scala3/pull/2742) - Fixes to implicit search: - - [Fix shadowing of higher order implicits](https://github.com/lampepfl/dotty/pull/2739) + - [Fix shadowing of higher order 
implicits](https://github.com/scala/scala3/pull/2739) ### Better generated code: @@ -313,7 +313,7 @@ using Dotty with sbt, see the Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) We also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -338,10 +338,10 @@ You can try it out there without installing anything. Over the coming weeks and months, we plan to work on the following topics: - - [Add support for using Dotty generated classes with Scala 2.12](https://github.com/lampepfl/dotty/pull/2827) - - [Add Language-level support for HMaps and HLists](https://github.com/lampepfl/dotty/pull/2199); + - [Add support for using Dotty generated classes with Scala 2.12](https://github.com/scala/scala3/pull/2827) + - [Add Language-level support for HMaps and HLists](https://github.com/scala/scala3/pull/2199); - Upstream more optimizations from Dotty Linker - - [Add support for existing in the same classpath with Scala 2.12](https://github.com/lampepfl/dotty/pull/2827) + - [Add support for existing in the same classpath with Scala 2.12](https://github.com/scala/scala3/pull/2827) If you want to get your hands dirty with any of this, now is a good moment to get involved! Join the team of contributors, including diff --git a/docs/_blog/_posts/2017-09-07-third-dotty-milestone-release.md b/docs/_blog/_posts/2017-09-07-third-dotty-milestone-release.md index 236591139105..01fec156ecde 100644 --- a/docs/_blog/_posts/2017-09-07-third-dotty-milestone-release.md +++ b/docs/_blog/_posts/2017-09-07-third-dotty-milestone-release.md @@ -33,12 +33,12 @@ stability and reliability: This technology preview further improves stability and reliability. Some highlighted PRs are: - IDE bug fixes: - [#2986](https://github.com/lampepfl/dotty/pull/2986), - [#2932](https://github.com/lampepfl/dotty/pull/2932), - [#2885](https://github.com/lampepfl/dotty/pull/2885), - [#2876](https://github.com/lampepfl/dotty/pull/2876), - [#2870](https://github.com/lampepfl/dotty/pull/2870), - [#2872](https://github.com/lampepfl/dotty/pull/2872) by [@odersky] and [@smarter]. + [#2986](https://github.com/scala/scala3/pull/2986), + [#2932](https://github.com/scala/scala3/pull/2932), + [#2885](https://github.com/scala/scala3/pull/2885), + [#2876](https://github.com/scala/scala3/pull/2876), + [#2870](https://github.com/scala/scala3/pull/2870), + [#2872](https://github.com/scala/scala3/pull/2872) by [@odersky] and [@smarter]. ## How can you try it out? @@ -65,7 +65,7 @@ using Dotty with sbt, see the ### Standalone installation Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) We also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -87,16 +87,16 @@ You can try it out there without installing anything. ## What are the next steps? 
Over the coming weeks and months, we plan to work on the following topics: - - [Add support for using Dotty generated classes with Scala 2.12](https://github.com/lampepfl/dotty/pull/2827) - - [Add Language-level support for HMaps and HLists](https://github.com/lampepfl/dotty/pull/2199); + - [Add support for using Dotty generated classes with Scala 2.12](https://github.com/scala/scala3/pull/2827) + - [Add Language-level support for HMaps and HLists](https://github.com/scala/scala3/pull/2199); - Upstream more optimizations from Dotty Linker - - [Add support for existing in the same classpath with Scala 2.12](https://github.com/lampepfl/dotty/pull/2827) - - [Add native Dotty REPL](https://github.com/lampepfl/dotty/pull/2991) + - [Add support for existing in the same classpath with Scala 2.12](https://github.com/scala/scala3/pull/2827) + - [Add native Dotty REPL](https://github.com/scala/scala3/pull/2991) ## Questions / Reporting Bugs If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible! @@ -122,7 +122,7 @@ According to `git shortlog -sn --no-merges 0.2.0-RC1..0.3.0-RC2` these are: If you want to get your hands dirty with any of this, now is a good moment to get involved! You can have a look at our [Getting Started page](https://dotty.epfl.ch/docs/contributing/getting-started.html), our [Awesome Error Messages](http://scala-lang.org/blog/2016/10/14/dotty-errors.html) or some of -the simple [Dotty issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +the simple [Dotty issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry-points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2017-10-16-fourth-dotty-milestone-release.md b/docs/_blog/_posts/2017-10-16-fourth-dotty-milestone-release.md index 7f98c4835ee6..1edf198380c8 100644 --- a/docs/_blog/_posts/2017-10-16-fourth-dotty-milestone-release.md +++ b/docs/_blog/_posts/2017-10-16-fourth-dotty-milestone-release.md @@ -25,14 +25,14 @@ stability and reliability. ## What’s new in the 0.4.0-RC1 technology preview? -### Rewritten REPL [#2991](https://github.com/lampepfl/dotty/pull/2991) +### Rewritten REPL [#2991](https://github.com/scala/scala3/pull/2991) The original Dotty REPL was a proof of concept hacked together from -[an ancient version of the scalac REPL](https://github.com/lampepfl/dotty/pull/1082#issuecomment-183905504). +[an ancient version of the scalac REPL](https://github.com/scala/scala3/pull/1082#issuecomment-183905504). It worked by creating Scala source files from the user input using string concatenation, this made it easy to adapt it for Dotty since it did not rely on the internals of scalac, but it was also fragile and hard to reason about. 
-The [new REPL](https://github.com/lampepfl/dotty/pull/2991) instead works by +The [new REPL](https://github.com/scala/scala3/pull/2991) instead works by manipulating ASTs (Abstract Syntax Trees), this is more robust and will make it easier to develop new features: we have already implemented auto-completion support (by reusing the APIs we had created for @@ -42,7 +42,7 @@ Note that the user interface of the REPL has not changed: like in the old REPL we use code adapted from the [Ammonite REPL](http://ammonite.io/#Ammonite-REPL) to provide syntax highlighting, multi-line editing, history, etc. -### Scala 2.12 support [#2827](https://github.com/lampepfl/dotty/pull/2827) +### Scala 2.12 support [#2827](https://github.com/scala/scala3/pull/2827) Since our first release, it has been possible to use Scala 2 libraries in a Dotty project as explained in the [dotty-example-project](https://github.com/smarter/dotty-example-project#getting-your-project-to-compile-with-dotty). @@ -82,7 +82,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala ### Standalone installation Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) We also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -99,7 +99,7 @@ brew upgrade dotty ## Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible! @@ -131,7 +131,7 @@ According to `git shortlog -sn --no-merges 0.3.0-RC2..0.4.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! You can have a look at our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), the [Awesome Error Messages](http://scala-lang.org/blog/2016/10/14/dotty-errors.html) project or some of -the simple [Dotty issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +the simple [Dotty issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry-points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2017-12-01-fifth-dotty-milestone-release.md b/docs/_blog/_posts/2017-12-01-fifth-dotty-milestone-release.md index 60d1062b7eac..dfc97e64f496 100644 --- a/docs/_blog/_posts/2017-12-01-fifth-dotty-milestone-release.md +++ b/docs/_blog/_posts/2017-12-01-fifth-dotty-milestone-release.md @@ -25,7 +25,7 @@ support for Scala 2.12 and came with a brand new REPL. ## What’s new in the 0.5.0-RC1 technology preview? -### Reworked implicit search [#3421](https://github.com/lampepfl/dotty/pull/3421) +### Reworked implicit search [#3421](https://github.com/scala/scala3/pull/3421) The treatment of ambiguity errors has changed. 
If an ambiguity is encountered in some recursive step of an implicit search, the ambiguity is propagated to the caller. Example: Say you have the following definitions: @@ -56,7 +56,7 @@ techniques no longer work. But there is now a new special type `scala.implicits. implements negation directly. For any query type `Q`: `Not[Q]` succeeds if and only if the implicit search for `Q` fails. -### Dependent function types [#3464](https://github.com/lampepfl/dotty/pull/3464) +### Dependent function types [#3464](https://github.com/scala/scala3/pull/3464) A dependent function type describes functions where the result type may depend on the function's parameter values. Example: @@ -108,7 +108,7 @@ are currently two backends using the TASTY frontend: This is the first step toward linking and whole word optimisations, recompiling code to a different backends... -### Generic java signatures [#3234](https://github.com/lampepfl/dotty/pull/3234) +### Generic java signatures [#3234](https://github.com/scala/scala3/pull/3234) Dotty now emits generic signatures for classes and methods. Those signatures are used by compilers, debuggers and to support runtime reflection. For example: @@ -144,7 +144,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala ### Standalone installation Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) We also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -161,7 +161,7 @@ brew upgrade dotty ## Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible! @@ -198,7 +198,7 @@ According to `git shortlog -sn --no-merges 0.4.0-RC1..0.5.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! You can have a look at our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), the [Awesome Error Messages](http://scala-lang.org/blog/2016/10/14/dotty-errors.html) project or some of -the simple [Dotty issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +the simple [Dotty issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry-points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2018-03-05-seventh-dotty-milestone-release.md b/docs/_blog/_posts/2018-03-05-seventh-dotty-milestone-release.md index a72e04409ea0..a70d71bc17b8 100644 --- a/docs/_blog/_posts/2018-03-05-seventh-dotty-milestone-release.md +++ b/docs/_blog/_posts/2018-03-05-seventh-dotty-milestone-release.md @@ -19,12 +19,12 @@ You can learn more about Dotty on our [website](http://dotty.epfl.ch). This is our seventh scheduled release according to our [6-week release schedule](https://dotty.epfl.ch/docs/usage/version-numbers.html). 
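The dependent function types described in the 0.5.0-RC1 notes above can be illustrated with a short, self-contained sketch; the `Entry`/`extractKey` names are hypothetical and the syntax is that of current Scala 3, so treat this as an illustrative example rather than code from the release itself.

```scala
// Hypothetical sketch of a dependent function type: the result type of
// `extractKey` depends on the value of its argument.
trait Entry { type Key; val key: Key }

// `(e: Entry) => e.Key` is a dependent function type.
val extractKey: (e: Entry) => e.Key = e => e.key

val intEntry: Entry { type Key = Int } =
  new Entry { type Key = Int; val key = 42 }

val k: Int = extractKey(intEntry)   // intEntry.Key is known to be Int

@main def depFunDemo(): Unit = println(k)
```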
-The [previous technology preview](https://github.com/lampepfl/dotty/releases/tag/0.6.0-RC1) focussed +The [previous technology preview](https://github.com/scala/scala3/releases/tag/0.6.0-RC1) focussed on bug fixes and stability work. ## What’s new in the 0.7.0-RC1 technology preview? -### Enum Simplification [#4003](https://github.com/lampepfl/dotty/pull/4003) +### Enum Simplification [#4003](https://github.com/scala/scala3/pull/4003) The previously introduced syntax and rules for enum were arguably too complex. We can considerably simplify them by taking away one capability: that cases can have bodies which can define members. Arguably, if we choose an ADT decomposition of a problem, it's good style to write all methods using @@ -75,7 +75,7 @@ and how to use them to model [Algebraic Data Types](https://dotty.epfl.ch/docs/r visit the respective sections in our documentation. -### Erased terms [#3342](https://github.com/lampepfl/dotty/pull/3342) +### Erased terms [#3342](https://github.com/scala/scala3/pull/3342) The `erased` modifier can be used on parameters, `val` and `def` to enforce that no reference to those terms is ever used. As they are never used, they can safely be removed during compilation. @@ -103,10 +103,10 @@ For more information, visit the [Erased Terms](https://dotty.epfl.ch/docs/refere section of our documentation. **Note**: Erased terms replace _phantom types_: they have similar semantics, but with the added -advantage that any type can be an erased parameter. See [#3410](https://github.com/lampepfl/dotty/pull/3410). +advantage that any type can be an erased parameter. See [#3410](https://github.com/scala/scala3/pull/3410). -### Improved IDE support [#3960](https://github.com/lampepfl/dotty/pull/3960) +### Improved IDE support [#3960](https://github.com/scala/scala3/pull/3960) The Dotty language server now supports context sensitive IDE completions. Completions now include local and imported definitions. Members completions take possible implicit conversions into account. @@ -183,7 +183,7 @@ compile-time. For example, writing `(eval(a), eval(a))` instead of `(eval(a), eval(b))` in the example above should be an error, but it was not caught by Scala 2 or previous versions of Dotty, whereas we now get a type mismatch error as expected. More work remains to be done to fix the remaining [GADT-related -issues](https://github.com/lampepfl/dotty/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+gadt), +issues](https://github.com/scala/scala3/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+gadt), but so far no show-stopper has been found. ## Trying out Dotty @@ -210,7 +210,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala ### Standalone installation Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) We also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -227,7 +227,7 @@ brew upgrade dotty ## Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). 
+[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible! @@ -255,7 +255,7 @@ According to `git shortlog -sn --no-merges 0.6.0..0.7.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry-points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2018-04-27-eighth-dotty-milestone-release.md b/docs/_blog/_posts/2018-04-27-eighth-dotty-milestone-release.md index 72c0747e659d..d2194594edcc 100644 --- a/docs/_blog/_posts/2018-04-27-eighth-dotty-milestone-release.md +++ b/docs/_blog/_posts/2018-04-27-eighth-dotty-milestone-release.md @@ -24,12 +24,12 @@ You can learn more about Dotty on our [website](https://dotty.epfl.ch). This is our eighth scheduled release according to our [6-week release schedule](https://dotty.epfl.ch/docs/usage/version-numbers.html). -The [previous technology preview](https://github.com/lampepfl/dotty/releases/tag/0.7.0-RC1) simplified +The [previous technology preview](https://github.com/scala/scala3/releases/tag/0.7.0-RC1) simplified enums, introduced erased terms, improved IDE support and improved pattern matching for GADT. ## What’s new in the 0.8.0-RC1 technology preview? -### sbt 1 support [#3872](https://github.com/lampepfl/dotty/pull/3872) +### sbt 1 support [#3872](https://github.com/scala/scala3/pull/3872) Starting with Dotty 0.8.0, we will only support versions of sbt >= 1.1.4. Migrating to sbt 1 lets us use the new improved incremental compiler for Scala called [Zinc](https://github.com/sbt/zinc), and enables integration with tools such as [Bloop](https://scalacenter.github.io/bloop/). @@ -43,7 +43,7 @@ If you are already using Dotty with sbt 0.13, follow these simple steps to upgra ``` - replace usages of `.withDottyCompat()` by `.withDottyCompat(scalaVersion.value)` -### Unchecked warnings [#4045](https://github.com/lampepfl/dotty/pull/4045) +### Unchecked warnings [#4045](https://github.com/scala/scala3/pull/4045) Dotty now emits `unchecked` warnings like `scalac` whenever a type test is performed but cannot be fully checked at runtime because of type erasure. For example: @@ -64,7 +64,7 @@ def foo[T](x: T) = x match { } ``` -### Kind Polymorphism [#4108](https://github.com/lampepfl/dotty/pull/4108) +### Kind Polymorphism [#4108](https://github.com/scala/scala3/pull/4108) Normally type parameters in Scala are partitioned into kinds. First-level types are types of values. Higher-kinded types are type constructors such as `List` or `Map`. The kind of a type is indicated by the top type of which it is a subtype. Normal types are subtypes of `Any`, covariant single @@ -94,7 +94,7 @@ f[[X] =>> String] (i.e. `-Ykind-polymorphism`). For more information, visit the [Kind Polymorphism](https://dotty.epfl.ch/docs/reference/other-new-features/kind-polymorphism.html) section of our documentation. 
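The kind polymorphism feature described above boils down to a single upper bound, `AnyKind`. A minimal, hypothetical sketch in current Scala 3, where the feature no longer needs the `-Ykind-polymorphism` flag:

```scala
// Hypothetical sketch: one definition whose type parameter is bounded by
// AnyKind, so it accepts type arguments of any kind.
def kindAgnostic[T <: AnyKind]: String = "accepted, whatever the kind of T"

@main def kindDemo(): Unit =
  println(kindAgnostic[Int])             // a proper type
  println(kindAgnostic[List])            // a unary type constructor
  println(kindAgnostic[Map])             // a binary type constructor
  println(kindAgnostic[[X] =>> String])  // a type lambda
```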
-### Improved support for SAM type [#4152](https://github.com/lampepfl/dotty/pull/4152) +### Improved support for SAM type [#4152](https://github.com/scala/scala3/pull/4152) This release includes fixes to [SAM types](https://www.scala-lang.org/news/2.12.0/#lambda-syntax-for-sam-types) that greatly improve interoperability with Java 8 lambdas. One can now easily write Scala code that uses Java streams: @@ -139,7 +139,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala ### Standalone installation Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) We also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -156,7 +156,7 @@ brew upgrade dotty ## Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible! @@ -187,7 +187,7 @@ According to `git shortlog -sn --no-merges 0.7.0..0.8.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry-points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2018-07-06-ninth-dotty-milestone-release.md b/docs/_blog/_posts/2018-07-06-ninth-dotty-milestone-release.md index 914eca4d73d3..73066551300b 100644 --- a/docs/_blog/_posts/2018-07-06-ninth-dotty-milestone-release.md +++ b/docs/_blog/_posts/2018-07-06-ninth-dotty-milestone-release.md @@ -24,23 +24,23 @@ You can learn more about Dotty on our [website](https://dotty.epfl.ch). This is our ninth scheduled release according to our [6-week release schedule](https://dotty.epfl.ch/docs/usage/version-numbers.html). -The [previous technology preview](https://github.com/lampepfl/dotty/releases/tag/0.8.0-RC1) added +The [previous technology preview](https://github.com/scala/scala3/releases/tag/0.8.0-RC1) added support for sbt 1, introduced improved unchecked warnings and improved SAM type support. ## What’s new in the 0.9.0-RC1 technology preview? -### Improved REPL [#4680](https://github.com/lampepfl/dotty/pull/4680) +### Improved REPL [#4680](https://github.com/scala/scala3/pull/4680) The REPL now uses [JLine 3](https://github.com/jline/jline3) under the hood which improves on many aspects such as, auto-completions and multi-line editing. The REPL now also works on Windows! 
-### Documentation support in the IDE [#4461](https://github.com/lampepfl/dotty/pull/4461), [#4648](https://github.com/lampepfl/dotty/pull/4648) +### Documentation support in the IDE [#4461](https://github.com/scala/scala3/pull/4461), [#4648](https://github.com/scala/scala3/pull/4648) The Dotty IDE will now display documentation while hovering over symbols that were previously compiled by the Dotty compiler. In the future, we plan to let users query the documentation in the REPL as well. -### Drop requirement that implicit functions must be non-empty [#4549](https://github.com/lampepfl/dotty/pull/4549) +### Drop requirement that implicit functions must be non-empty [#4549](https://github.com/scala/scala3/pull/4549) We remove the arbitrary restriction that parameters of implicit functions must by non-empty. We can now write: ```scala @@ -63,7 +63,7 @@ timed { Both definitions above are equivalent. -### Emit feature warnings for implicit conversions [#4229](https://github.com/lampepfl/dotty/pull/4229) +### Emit feature warnings for implicit conversions [#4229](https://github.com/scala/scala3/pull/4229) Implicit conversions are easily the most misused feature in Scala. We now emit feature warnings when encountering an implicit conversion definition, just like Scala 2 does. @@ -76,7 +76,7 @@ unless the conversion is: (we might extend this to more conversions). -### Optimise s and raw interpolators [#3961](https://github.com/lampepfl/dotty/pull/3961) +### Optimise s and raw interpolators [#3961](https://github.com/scala/scala3/pull/3961) `s` and `raw` string interpolators were known to be slower than their not type-safe counterparts: ```scala s"Hello $name!" @@ -89,7 +89,7 @@ The compiler will now desugar the former into the latter. Special thanks to compiler! -### Support for compiler plugins [#3438](https://github.com/lampepfl/dotty/pull/3438) +### Support for compiler plugins [#3438](https://github.com/scala/scala3/pull/3438) Dotty now supports Compiler plugins. Compiler plugins let you customize the compiler pipeline without having to modify the compiler source code. A major difference compared to Scala 2 is that Dotty plugins must run after the type checker. Being able to influence normal type checking @@ -123,7 +123,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala ### Standalone installation Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) We also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -144,7 +144,7 @@ installing anything. Note however that Scastie only supports Dotty 0.7.0-RC1. ## Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible! @@ -180,7 +180,7 @@ According to `git shortlog -sn --no-merges 0.8.0..0.9.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! 
Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry-points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2018-10-10-10th-dotty-milestone-release.md b/docs/_blog/_posts/2018-10-10-10th-dotty-milestone-release.md index b583d1bd0f49..0a5ebd067391 100644 --- a/docs/_blog/_posts/2018-10-10-10th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2018-10-10-10th-dotty-milestone-release.md @@ -99,7 +99,7 @@ or the enclosing class did not change at call site. E.g. This restriction has now been removed. We also improve upon `scalac` which is not able to optimise methods that change the type of `this` on a polymorphic recursive call. -[Examples](https://github.com/lampepfl/dotty/blob/7a45a4a386d33180e5b7b21aa74271a77cce4707/tests/neg-tailcall/tailrec.scala#L43-L44) +[Examples](https://github.com/scala/scala3/blob/7a45a4a386d33180e5b7b21aa74271a77cce4707/tests/neg-tailcall/tailrec.scala#L43-L44) can be found in our test suite. ### Experimental support for generic Tuples @@ -125,7 +125,7 @@ val t2: (Int, String, Long, Int, String, Long) = (1,2,3,1,2,3) ### And much more! -Please read our [release notes](https://github.com/lampepfl/dotty/releases/tag/0.10.0-RC1) +Please read our [release notes](https://github.com/scala/scala3/releases/tag/0.10.0-RC1) for more details! ## Breaking changes @@ -160,7 +160,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) For macOS users, we also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -179,7 +179,7 @@ brew upgrade dotty If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -222,7 +222,7 @@ According to `git shortlog -sn --no-merges 0.9.0..0.10.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. 
diff --git a/docs/_blog/_posts/2018-11-30-11th-dotty-milestone-release.md b/docs/_blog/_posts/2018-11-30-11th-dotty-milestone-release.md index 344900e2e164..268d86db00c6 100644 --- a/docs/_blog/_posts/2018-11-30-11th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2018-11-30-11th-dotty-milestone-release.md @@ -115,7 +115,7 @@ information from the Scaladoc comment, then format it before we display it in th ### And much more! -Please read our [release notes](https://github.com/lampepfl/dotty/releases/tag/0.11.0-RC1) +Please read our [release notes](https://github.com/scala/scala3/releases/tag/0.11.0-RC1) for more details! ## Trying out Dotty @@ -146,7 +146,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) For macOS users, we also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -165,7 +165,7 @@ brew upgrade dotty If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -201,7 +201,7 @@ According to `git shortlog -sn --no-merges 0.10.0..0.11.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-01-21-12th-dotty-milestone-release.md b/docs/_blog/_posts/2019-01-21-12th-dotty-milestone-release.md index f9fc25375d86..d56a03b4e345 100644 --- a/docs/_blog/_posts/2019-01-21-12th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-01-21-12th-dotty-milestone-release.md @@ -106,7 +106,7 @@ This kind of reasoning is necessary for many advanced GADT usages! ### And much more! -Please read our [release notes](https://github.com/lampepfl/dotty/releases/tag/0.12.0-RC1) +Please read our [release notes](https://github.com/scala/scala3/releases/tag/0.12.0-RC1) for more details! 
## Trying out Dotty @@ -137,7 +137,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) For macOS users, we also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -156,7 +156,7 @@ brew upgrade dotty If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -195,7 +195,7 @@ According to `git shortlog -sn --no-merges 0.11.0-RC1..0.12.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-03-05-13th-dotty-milestone-release.md b/docs/_blog/_posts/2019-03-05-13th-dotty-milestone-release.md index f1838847d81e..8ebb70ea59e5 100644 --- a/docs/_blog/_posts/2019-03-05-13th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-03-05-13th-dotty-milestone-release.md @@ -50,11 +50,11 @@ smoothly, but the *Parallelism and Concurrency* course given in the Spring semester teaches Spark, which means we needed to support it in Dotty! Luckily, this turned out to be mostly straightforward: we adopted the [object -serialization scheme](https://github.com/lampepfl/dotty/pull/5775) and [lambda -serialization scheme](https://github.com/lampepfl/dotty/pull/5837) pioneered by +serialization scheme](https://github.com/scala/scala3/pull/5775) and [lambda +serialization scheme](https://github.com/scala/scala3/pull/5837) pioneered by Scala 2, and that was enough to make our Spark assignments run correctly! This doesn't mean that our support is perfect however, so don't hesitate to [open an -issue](http://github.com/lampepfl/dotty/issues) if something is amiss. +issue](http://github.com/scala/scala3/issues) if something is amiss. ## Introducing top level definitions @@ -74,7 +74,7 @@ def b = a._2 You can read about [dropping package objects](https://dotty.epfl.ch/docs/reference/dropped-features/package-objects.html) at the documentation linked or at the relevant PR -[#5754](https://github.com/lampepfl/dotty/pull/5754). +[#5754](https://github.com/scala/scala3/pull/5754). ## All things impl... implied @@ -211,7 +211,7 @@ object B { **You can read more about** [implied imports](https://dotty.epfl.ch/docs/reference/contextual/import-delegate.html) from the docs or the relevant PR -[#5868](https://github.com/lampepfl/dotty/pull/5868). +[#5868](https://github.com/scala/scala3/pull/5868). 
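As a rough sketch of the underlying idea (written with the `given` and `import ... given` syntax that this mechanism later evolved into in Scala 3, rather than the exact `implied` syntax of this release), importing instances is kept separate from importing ordinary members:

```scala
object Instances:
  given intOrd: Ordering[Int] = Ordering.Int
  val answer: Int = 42

// Brings the given instances of Instances into scope, but not `answer`:
import Instances.given

// A plain wildcard import, by contrast, imports `answer` but not the givens:
// import Instances.*
```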
As we mentioned above, *context queries* are functions with (only) inferable parameters. Here is an example of such a function: @@ -227,9 +227,9 @@ merely an alignment of IFTs into the new scheme. **You can read more about** the alternative to implicits through the *Contextual Abstractions* section of our documentation or for a deep dive from the relevant PR chain that originated from -[#5458](https://github.com/lampepfl/dotty/pull/5458). The syntax changes for new +[#5458](https://github.com/scala/scala3/pull/5458). The syntax changes for new implicits are summarized in -[#5825](https://github.com/lampepfl/dotty/pull/5825). +[#5825](https://github.com/scala/scala3/pull/5825). This release offers the support for _type class derivation_ as a language feature. Type class derivation is a way to generate instances of certain type @@ -279,8 +279,8 @@ def derived[T] given Generic[T] = ... **You can read more about** [Type class Derivation](https://dotty.epfl.ch/docs/reference/contextual/derivation.html) or have a deep dive at the relevant PRs: -[#5540](https://github.com/lampepfl/dotty/pull/5540) and -[#5839](https://github.com/lampepfl/dotty/pull/5839). +[#5540](https://github.com/scala/scala3/pull/5540) and +[#5839](https://github.com/scala/scala3/pull/5839). _Multiversal equality_ is now supported through the `Eql` marker trait (renamed from `Eq` to differentiate it from Cats' `Eq`). For example, in order to be able @@ -292,7 +292,7 @@ implied for Eql[Int, String] = Eql.derived ``` **You can read more about** how we based multiversal equality on type class derivation through -the relevant PR [#5843](https://github.com/lampepfl/dotty/pull/5843). +the relevant PR [#5843](https://github.com/scala/scala3/pull/5843). _Implicit conversions_ are now defined by implied instances of the `scala.Conversion` class. For example: @@ -311,7 +311,7 @@ important with the documentation of each feature, please consult the ## Implicit resolution rule changes -PR [#5887](https://github.com/lampepfl/dotty/pull/5887) applies the following +PR [#5887](https://github.com/scala/scala3/pull/5887) applies the following changes to implicit resolution: 1. nested implicits always take precedence over outer ones @@ -324,12 +324,12 @@ changes to implicit resolution: data model for semantic information such as symbols and types about programs in Scala and other languages. SemanticDB decouples production and consumption of semantic information, establishing documented means for communication between -tools. With PR [#5761](https://github.com/lampepfl/dotty/pull/5761) we add the +tools. With PR [#5761](https://github.com/scala/scala3/pull/5761) we add the first prototype for the generation of SemanticDB information from TASTy. ## And much more! -Please read our [release notes](https://github.com/lampepfl/dotty/releases/tag/0.13.0-RC1) +Please read our [release notes](https://github.com/scala/scala3/releases/tag/0.13.0-RC1) for more details! 
# Trying out Dotty @@ -360,7 +360,7 @@ the IDE sections of the [getting-started page](https://docs.scala-lang.org/scala Releases are available for download on the _Releases_ section of the Dotty repository: -[https://github.com/lampepfl/dotty/releases](https://github.com/lampepfl/dotty/releases) +[https://github.com/scala/scala3/releases](https://github.com/scala/scala3/releases) For macOS users, we also provide a [homebrew](https://brew.sh/) package that can be installed by running: @@ -379,7 +379,7 @@ brew upgrade dotty If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -417,7 +417,7 @@ According to `git shortlog -sn --no-merges 0.12.0-RC1..0.13.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-04-15-14th-dotty-milestone-release.md b/docs/_blog/_posts/2019-04-15-14th-dotty-milestone-release.md index 6724ae11cef1..1ac1c3ae9fbb 100644 --- a/docs/_blog/_posts/2019-04-15-14th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-04-15-14th-dotty-milestone-release.md @@ -72,7 +72,7 @@ For more information, please read more in the [documentation](https://dotty.epfl ## An immutable array type -A new type, `scala.IArray[T]`, is added, which is an immutable version of the `Array` type. Its implementation deserves a special attention, as it uses the new Dotty features in an elegant way (the below is an abstract from the corresponding [commit](https://github.com/lampepfl/dotty/commit/af2a0e66eb4b1204eac5dcb1d979486b92ef93d7#diff-156dc405d9f228bbc0fe406dfba63f65): +A new type, `scala.IArray[T]`, is added, which is an immutable version of the `Array` type. Its implementation deserves special attention, as it uses the new Dotty features in an elegant way (the excerpt below is from the corresponding [commit](https://github.com/scala/scala3/commit/af2a0e66eb4b1204eac5dcb1d979486b92ef93d7#diff-156dc405d9f228bbc0fe406dfba63f65)): ```scala opaque type IArray[T] = Array[T] @@ -131,7 +131,7 @@ Some of the other changes include: If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -171,7 +171,7 @@ According to `git shortlog -sn --no-merges 0.13.0-RC1..0.14.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved!
Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-05-23-15th-dotty-milestone-release.md b/docs/_blog/_posts/2019-05-23-15th-dotty-milestone-release.md index 68337d78ca8c..1ffb07377da4 100644 --- a/docs/_blog/_posts/2019-05-23-15th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-05-23-15th-dotty-milestone-release.md @@ -6,7 +6,7 @@ authorImg: images/anatolii.png date: 2019-05-23 --- -Hi! We are very excited to announce the 15th release of Dotty. The most exciting thing in this release is the full bootstrap for Dotty introduced by PR [#5923](https://github.com/lampepfl/dotty/pull/5923)🎉😍. This means that we now always compile Dotty with Dotty itself, hence we can use use all the new features in the compiler code base. +Hi! We are very excited to announce the 15th release of Dotty. The most exciting thing in this release is the full bootstrap for Dotty introduced by PR [#5923](https://github.com/scala/scala3/pull/5923)🎉😍. This means that we now always compile Dotty with Dotty itself, hence we can use all the new features in the compiler code base. With this release comes a bunch of new features and improvements, such as the ability to enforce whether an operator is intended to be used in an infix position, the type safe pattern bindings and more. @@ -36,7 +36,7 @@ This is our 15th scheduled release according to our ## Full Bootstrap Bootstrapping Dotty is a big milestone for us and in compiler construction in general. Firstly, we feel more confident that our compiler works as is (even without reusing the new features). Secondly, in the immediate future, we will be able to reuse many of the features that dotty proposes within dotty itself. For example, we have no fewer than 2641 occurrences of the text string (implicit ctx: Context) in the compiler that we can scrap with [Contextual Function types](https://www.scala-lang.org/blog/2016/12/07/implicit-function-types.html). Big milestones have high risk/high gain and we must be attentive. That is the reason that we will wait a bit until we start using new features. Consequently, at the moment we cross-compile the build with 2.12 on the CI so that we don't accidentally start using Dotty features in case we need to revise the bootstrap process (we'll start using Dotty features eventually, but let's wait until we're confident that this setup works well enough). -Check the following for more information [#5923 (comment)](https://github.com/lampepfl/dotty/pull/5923#issuecomment-485421148) and please let us know if you have any incremental compilation issues or anything else! +Check the following for more information [#5923 (comment)](https://github.com/scala/scala3/pull/5923#issuecomment-485421148) and please let us know if you have any incremental compilation issues or anything else! ## Operator Rules This change addresses the problem of the regulation of whether an operator is supposed to be used in an infix position.
The motivation is for the library authors to be able to enforce whether a method or a type is supposed to be used in an infix position by the users. This ability will help to make code bases more consistent in the way the calls to methods are performed. @@ -150,7 +150,7 @@ For the migration purposes, the above change will only take effect in Scala 3.1. For more information, see the [documentation](https://dotty.epfl.ch/docs/reference/changed-features/pattern-bindings.html). ## Further improvements to Generalised Algebraic Data Types (GADTs) support -In this release, we've further improved our support for GADTs. Most notably, we now support variant GADTs, thus fixing [#2985](https://github.com/lampepfl/dotty/issues/2985): +In this release, we've further improved our support for GADTs. Most notably, we now support variant GADTs, thus fixing [#2985](https://github.com/scala/scala3/issues/2985): ```scala enum Expr[+T] { @@ -164,20 +164,20 @@ def eval[T](e: Expr[T]): T = e match { } ``` -We've also plugged a few soundness problems (e.g. [#5667](https://github.com/lampepfl/dotty/issues/5667)) caused by inferring too much when matching on abstract, union and intersection types. For more information, see PR [#5736](https://github.com/lampepfl/dotty/pull/5736). +We've also plugged a few soundness problems (e.g. [#5667](https://github.com/scala/scala3/issues/5667)) caused by inferring too much when matching on abstract, union and intersection types. For more information, see PR [#5736](https://github.com/scala/scala3/pull/5736). ## Other changes Some of the other notable changes include the following: - Singletons are now allowed in union types. E.g. the following is allowed: `object foo; type X = Int | foo.type`. -- A bunch of improvements was made for the type inference system – see, e.g., PRs [#6454](https://github.com/lampepfl/dotty/pull/6454) and [#6467](https://github.com/lampepfl/dotty/pull/6467). -- Improvements to the Scala 2 code support which, in particular, improves Cats support – see PRs [#6494](https://github.com/lampepfl/dotty/pull/6494) and [#6498](https://github.com/lampepfl/dotty/pull/6498). +- A bunch of improvements was made for the type inference system – see, e.g., PRs [#6454](https://github.com/scala/scala3/pull/6454) and [#6467](https://github.com/scala/scala3/pull/6467). +- Improvements to the Scala 2 code support which, in particular, improves Cats support – see PRs [#6494](https://github.com/scala/scala3/pull/6494) and [#6498](https://github.com/scala/scala3/pull/6498). # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -205,7 +205,7 @@ According to `git shortlog -sn --no-merges 0.14.0-RC1..0.15.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). 
+and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-06-11-16th-dotty-milestone-release.md b/docs/_blog/_posts/2019-06-11-16th-dotty-milestone-release.md index 41194df26625..91f63a5610b7 100644 --- a/docs/_blog/_posts/2019-06-11-16th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-06-11-16th-dotty-milestone-release.md @@ -49,7 +49,7 @@ represented by `=>>`. As a result a function from types to types is written as `[X] =>> F[X]`. For those who are interested in the discussions, -[#6558](https://github.com/lampepfl/dotty/pull/6558) introduced the new syntax. +[#6558](https://github.com/scala/scala3/pull/6558) introduced the new syntax. ## Syntax Change: Wildcard Arguments in Types @@ -92,7 +92,7 @@ delegate ListOrd[T] for Ord[List[T]] given (ord: Ord[T]) { ``` For more information, the documentation has been updated as part of the relevant -PR [#6649](https://github.com/lampepfl/dotty/pull/6649) +PR [#6649](https://github.com/scala/scala3/pull/6649) ## Polymorphic function types @@ -127,7 +127,7 @@ With PFTs we can now achieve what we want: ``` For those who are interested in the discussions and more test cases, -[#4672](https://github.com/lampepfl/dotty/pull/4672/) introduced PFTs. +[#4672](https://github.com/scala/scala3/pull/4672/) introduced PFTs. ## `lazy val`s are now thread-safe by default @@ -156,9 +156,9 @@ enum B(val gravity: Double) extends java.lang.Enum[B] { } ``` -For more information please check the [test case](https://github.com/lampepfl/dotty/tree/main/tests/run/enum-java) and -also the relevant PRs [#6602](https://github.com/lampepfl/dotty/pull/6602) and -[#6629](https://github.com/lampepfl/dotty/pull/6629). +For more information please check the [test case](https://github.com/scala/scala3/tree/main/tests/run/enum-java) and +also the relevant PRs [#6602](https://github.com/scala/scala3/pull/6602) and +[#6629](https://github.com/scala/scala3/pull/6629). In the test, the enums are defined in the `MainScala.scala` file and used from a Java source, `Test.java`. @@ -212,13 +212,13 @@ Advantages of new scheme: - Complete decoupling between derives clauses and mirror generation. For the technical details of these changes please consule the corresponding PR -[#6531](https://github.com/lampepfl/dotty/pull/6531). +[#6531](https://github.com/scala/scala3/pull/6531). # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -245,7 +245,7 @@ According to `git shortlog -sn --no-merges 0.15.0-RC1..0.16.0-RC3` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). 
+and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-07-25-17th-dotty-milestone-release.md b/docs/_blog/_posts/2019-07-25-17th-dotty-milestone-release.md index eea99263def9..62e0a550598d 100644 --- a/docs/_blog/_posts/2019-07-25-17th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-07-25-17th-dotty-milestone-release.md @@ -31,7 +31,7 @@ This is our 17th scheduled release according to our # What’s new in the 0.17.0-RC1 technology preview? ## New implicit scoping rules -We aim to make the implicit scoping rules clean and intuitive. In this release, the scoping rules were refactored to facilitate this goal. As specified in the [code documentation](https://github.com/lampepfl/dotty/pull/6832/files#diff-584b631c45ba6f2d4bc5d803074b8f12R474): +We aim to make the implicit scoping rules clean and intuitive. In this release, the scoping rules were refactored to facilitate this goal. As specified in the [code documentation](https://github.com/scala/scala3/pull/6832/files#diff-584b631c45ba6f2d4bc5d803074b8f12R474): The implicit scope of a type `tp` is the smallest set S of object references (i.e. TermRefs with Module symbol) such that: @@ -48,14 +48,14 @@ with Module symbol) such that: - If `tp` is some other type, its implicit scope is the union of the implicit scopes of its parts (parts defined as in the spec). -You can learn more from PR [#6832](https://github.com/lampepfl/dotty/pull/6832). +You can learn more from PR [#6832](https://github.com/scala/scala3/pull/6832). ## Metaprogramming We are making steady progress developing metaprogramming features. The highlights for this release are: -- Tasty Reflection's `Reflection` object moved inside `QuoteContext` object. This means that if previously to do Tasty Reflection you had to implicitly depend on `Reflection`, now you need to depend on `QuoteContext`. To know more, see [#6723](https://github.com/lampepfl/dotty/pull/6723). -- Progress made on quoted patterns – see [#6504](https://github.com/lampepfl/dotty/pull/6504). -- `code` string interpolator allows to obtain the code a user passes to a macro as a String. See [#6661](https://github.com/lampepfl/dotty/pull/6661). To enable this feature, do the following import: `import scala.compiletime._`. +- Tasty Reflection's `Reflection` object moved inside `QuoteContext` object. This means that if previously to do Tasty Reflection you had to implicitly depend on `Reflection`, now you need to depend on `QuoteContext`. To know more, see [#6723](https://github.com/scala/scala3/pull/6723). +- Progress made on quoted patterns – see [#6504](https://github.com/scala/scala3/pull/6504). +- `code` string interpolator allows to obtain the code a user passes to a macro as a String. See [#6661](https://github.com/scala/scala3/pull/6661). To enable this feature, do the following import: `import scala.compiletime._`. ## 2.12 build removed from the CI tests 2.12 build is removed from the test suite. The 2.12 build compiled and tested the Dotty compiler with the Scala 2.12 compiler. This means that, even though Dotty is bootstrapped (i.e. capable of compiling itself), we were not able to use any of the new Dotty features in the Dotty codebase since these features would not compile with Scala 2.12. 
The decision to abstain from using the new features was made to give us the time to see if something goes wrong with the bootstrap and the ability to revert to Scala 2.12 if it becomes necessary. @@ -74,7 +74,7 @@ There were some organizational and infrastructural changes worth mentioning. If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -113,7 +113,7 @@ According to `git shortlog -sn --no-merges 0.16.0-RC3..0.17.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-08-30-18th-dotty-milestone-release.md b/docs/_blog/_posts/2019-08-30-18th-dotty-milestone-release.md index 420b9103d3b7..8c06b2058230 100644 --- a/docs/_blog/_posts/2019-08-30-18th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-08-30-18th-dotty-milestone-release.md @@ -168,7 +168,7 @@ val iterator = Iterator.from(10, -1) } ``` -For more information, see PR [#6994](https://github.com/lampepfl/dotty/pull/6994). +For more information, see PR [#6994](https://github.com/scala/scala3/pull/6994). ## Brace-less syntax for control expressions This is an effort to clean-up the control expressions. Scala 2 has two ways of writing `if` statements – with and without parentheses. Parentheses can be dropped in Scala 2 `if`s inside `match` or `for` statements. We'd like to have a single style of writing all of the control expressions, and the cleaner the better. @@ -202,7 +202,7 @@ This release, hence, brings the ability to write all of the control expressions Moreover, the compiler can automatically rewrite your sources from the old syntax to the new syntax and vice versa. To rewrite the sources to the new syntax, run the compiler with the `-rewrite -new-syntax` flags, and to rewrite to the old syntax, use `-rewrite -old-syntax`. So far, both syntaxes are supported. -For more information and the precise rules, see PR [#7024](https://github.com/lampepfl/dotty/pull/7024). +For more information and the precise rules, see PR [#7024](https://github.com/scala/scala3/pull/7024). ## Significant indentation syntax Significant indentations syntax is here! A logical continuation of the brace-less syntax for control expressions described above, meant as an exploration into a better way to write Scala, it allows writing Scala programs without braces. For example: @@ -232,7 +232,7 @@ given as scala.util.FromString[Day]: So far, it is a purely experimental effort. This means there is no final decision yet on whether or not it will be included in Scala 3. However, we treat this feature seriously enough to give it an extended period of trial and see if it is viable as the new look and feel for Scala. 
-For more details and the discussion, see PRs [#7083](https://github.com/lampepfl/dotty/pull/7083) and [#7114](https://github.com/lampepfl/dotty/pull/7114). +For more details and the discussion, see PRs [#7083](https://github.com/scala/scala3/pull/7083) and [#7114](https://github.com/scala/scala3/pull/7114). ## Generic Number Literals It is now possible to seamlessly integrate with different number formats: that is, to write a number and get it automatically converted to your class of choice. E.g.: @@ -256,13 +256,13 @@ For precise rules, semantics and a larger example of `BigFloat`, see [the docume ## Metaprogramming Progress We are making steady progress with the language metaprogramming features. The metaprogramming spotlights of this release are as follows: -- `toExprOfTuple` method which allows converting a `Seq[Expr[Any]]` to `Expr[Tuple]`. The types of the expressions will be preserved in the tuple. See [#7037](https://github.com/lampepfl/dotty/pull/7037) and [#7076](https://github.com/lampepfl/dotty/pull/7076) for the details. -- `toExprOfTuple` method that converts a tuple of expressions to an expression of tuple – see [#7047](https://github.com/lampepfl/dotty/pull/7047). -- `toExprOfSeq` which converts an `Seq[Expr[A]]` to `Expr[Seq[A]]` – see [#6935](https://github.com/lampepfl/dotty/pull/6935). -- More `Liftable` instances – for Tuples of arity greater than 22, `BigInt` and `BigDecimal` – see [#6947](https://github.com/lampepfl/dotty/pull/6947) and [#6944](https://github.com/lampepfl/dotty/pull/6944). -- Leverage implicit lambdas to simplify `Liftable.toExpr` method – see [#6924](https://github.com/lampepfl/dotty/pull/6924) to learn how it is done. -- Runtime staging `run` moved to `scala.quoted.staging` in [#7077](https://github.com/lampepfl/dotty/pull/7077). -- Runtime staging factored out to a separate library in [#7080](https://github.com/lampepfl/dotty/pull/7080). +- `toExprOfTuple` method which allows converting a `Seq[Expr[Any]]` to `Expr[Tuple]`. The types of the expressions will be preserved in the tuple. See [#7037](https://github.com/scala/scala3/pull/7037) and [#7076](https://github.com/scala/scala3/pull/7076) for the details. +- `toExprOfTuple` method that converts a tuple of expressions to an expression of tuple – see [#7047](https://github.com/scala/scala3/pull/7047). +- `toExprOfSeq` which converts an `Seq[Expr[A]]` to `Expr[Seq[A]]` – see [#6935](https://github.com/scala/scala3/pull/6935). +- More `Liftable` instances – for Tuples of arity greater than 22, `BigInt` and `BigDecimal` – see [#6947](https://github.com/scala/scala3/pull/6947) and [#6944](https://github.com/scala/scala3/pull/6944). +- Leverage implicit lambdas to simplify `Liftable.toExpr` method – see [#6924](https://github.com/scala/scala3/pull/6924) to learn how it is done. +- Runtime staging `run` moved to `scala.quoted.staging` in [#7077](https://github.com/scala/scala3/pull/7077). +- Runtime staging factored out to a separate library in [#7080](https://github.com/scala/scala3/pull/7080). ## Type Class Derivation Type class derivation has received a major rework and an [updated documentation](https://dotty.epfl.ch/docs/reference/contextual/derivation.html). We have dropped the usage of the `Shape` type to describe the shape of a type. Instead, all the relevant information is now encoded in the `Mirror` type and its subtypes as tuples. 
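To make the tuple encoding concrete, here is a small sketch in current Scala 3 syntax (the `Mirror` API settled after this post, so treat it as the shape of the idea rather than the exact 0.18.x surface):

```scala
import scala.deriving.Mirror

case class Point(x: Int, y: Int)

@main def mirrorDemo(): Unit =
  // The compiler-synthesised mirror records the element types as a tuple type.
  val m = summon[Mirror.ProductOf[Point]]
  summon[m.MirroredElemTypes =:= (Int, Int)]

  // A value can be rebuilt from a tuple of the same shape.
  val p: Point = m.fromProduct((1, 2))
  println(p) // Point(1,2)
```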
@@ -270,15 +270,15 @@ Type class derivation has received a major rework and an [updated documentation] For more information, see the [documentation](https://dotty.epfl.ch/docs/reference/contextual/derivation.html). ## Other -- This release also features the new version of the SBT Dotty Plugin – 0.3.4. It contains some bug fixes – see [#7120](https://github.com/lampepfl/dotty/pull/7120) for details. -- Scala Days 2019 talks related to Dotty are now [mentioned](https://dotty.epfl.ch/docs/resources/talks.html) at our website – this allows to systematize the knowledge about the next generation of Scala in one place – see [#6984](https://github.com/lampepfl/dotty/pull/6984). -- ScalaJS needs your help! We would like to have robust support for ScalaJS in Dotty, which unfortunately is not the case so far. If you are interested in contributing, please see [the getting started tutorial](https://gist.github.com/sjrd/e0823a5bddbcef43999cdaa032b1220c) and [the discussion](https://github.com/lampepfl/dotty/issues/7113). +- This release also features the new version of the SBT Dotty Plugin – 0.3.4. It contains some bug fixes – see [#7120](https://github.com/scala/scala3/pull/7120) for details. +- Scala Days 2019 talks related to Dotty are now [mentioned](https://dotty.epfl.ch/docs/resources/talks.html) at our website – this allows to systematize the knowledge about the next generation of Scala in one place – see [#6984](https://github.com/scala/scala3/pull/6984). +- ScalaJS needs your help! We would like to have robust support for ScalaJS in Dotty, which unfortunately is not the case so far. If you are interested in contributing, please see [the getting started tutorial](https://gist.github.com/sjrd/e0823a5bddbcef43999cdaa032b1220c) and [the discussion](https://github.com/scala/scala3/issues/7113). # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -310,7 +310,7 @@ According to `git shortlog -sn --no-merges 0.17.0-RC1..0.18.1-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-09-23-19th-dotty-milestone-release.md b/docs/_blog/_posts/2019-09-23-19th-dotty-milestone-release.md index 0de8d87b92bb..55458b5a20c6 100644 --- a/docs/_blog/_posts/2019-09-23-19th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-09-23-19th-dotty-milestone-release.md @@ -160,7 +160,7 @@ For instance, `-project-logo dotty-logo.svg` will make `/images/dotty-logo.svg` [The front page](https://dotty.epfl.ch) has been redesigned too, with a new responsive menu and improved contrast. -Overall, every page has been updated with consistent settings of fonts and colors. 
A more detailed comparison between the new and the old design can be found [here](https://github.com/lampepfl/dotty/pull/7153). +Overall, every page has been updated with consistent settings of fonts and colors. A more detailed comparison between the new and the old design can be found [here](https://github.com/scala/scala3/pull/7153). ## Metaprogramming Progress We're making steady progress on the Dotty metaprogramming capability. In our previous work, we've implemented a bunch of functions for working with expressions. For example, we have a capability to convert a list of expressions into an expression of list, or a tuple of expressions into an expression of tuple. @@ -179,7 +179,7 @@ Also, `x.toExpr` syntax which lifts `x` into an expression is now deprecated. It If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -206,7 +206,7 @@ According to `git shortlog -sn --no-merges 0.18.1-RC1..0.19.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-11-04-20th-dotty-milestone-release.md b/docs/_blog/_posts/2019-11-04-20th-dotty-milestone-release.md index 78cbe171ca11..e28b2304831c 100644 --- a/docs/_blog/_posts/2019-11-04-20th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-11-04-20th-dotty-milestone-release.md @@ -127,16 +127,16 @@ It is easy to forget to put `then` at the end of the line if nothing else follow ## Metaprogramming Progress We are making a steady progress developing and improving the metaprogramming features of Dotty. Here are metaprogramming highlights of this release: -- Fix #7189: Do not try to load contents if file does not exist [#7476](https://github.com/lampepfl/dotty/pull/7476) -- Add customizable names for definitions in quotes [#7346](https://github.com/lampepfl/dotty/pull/7346) -- Rename scala.quoted.matching.{Bind => Sym} [#7332](https://github.com/lampepfl/dotty/pull/7332) -- Replace AsFunction implicit class with Expr.reduce [#7299](https://github.com/lampepfl/dotty/pull/7299) +- Fix #7189: Do not try to load contents if file does not exist [#7476](https://github.com/scala/scala3/pull/7476) +- Add customizable names for definitions in quotes [#7346](https://github.com/scala/scala3/pull/7346) +- Rename scala.quoted.matching.{Bind => Sym} [#7332](https://github.com/scala/scala3/pull/7332) +- Replace AsFunction implicit class with Expr.reduce [#7299](https://github.com/scala/scala3/pull/7299) # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). 
If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -173,7 +173,7 @@ According to `git shortlog -sn --no-merges 0.19.0-RC1..0.20.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2019-12-20-21th-dotty-milestone-release.md b/docs/_blog/_posts/2019-12-20-21th-dotty-milestone-release.md index 94d8ee61bec9..794eb875e3fc 100644 --- a/docs/_blog/_posts/2019-12-20-21th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2019-12-20-21th-dotty-milestone-release.md @@ -17,7 +17,7 @@ happy to announce that we are now feature complete! # Feature Complete! This release is a HUGE milestone for us, for Dotty, for Scala 3, for our community. Since that -[initial commit](https://github.com/lampepfl/dotty/commit/90962407e72d88f8f3249ade0f6bd60ff15af5ce) +[initial commit](https://github.com/scala/scala3/commit/90962407e72d88f8f3249ade0f6bd60ff15af5ce) on the 6th December of 2012 when the only feature was the basic structure of a compiler based on the DOT calculus, we have come a long way. @@ -48,7 +48,7 @@ It means that we can now put the Scala 3 compiler under heavy load, getting it ready for industrial strength applications. At the moment we have 23 projects on our community projects and we expect this number to go up! -> https://github.com/lampepfl/dotty/tree/main/community-build/community-projects +> https://github.com/scala/scala3/tree/main/community-build/community-projects This project contains tests to build and test a corpus of open sources Scala 2.x projects against Scala 3. @@ -333,7 +333,7 @@ root for `.semanticdb` files) and `-sourceroot` to calculate a relative path for If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -378,7 +378,7 @@ According to `git shortlog -sn --no-merges 0.20.0-RC1..0.21.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. 
diff --git a/docs/_blog/_posts/2020-02-05-22nd-dotty-milestone-release.md b/docs/_blog/_posts/2020-02-05-22nd-dotty-milestone-release.md index a901e83130d8..1dc100a9a741 100644 --- a/docs/_blog/_posts/2020-02-05-22nd-dotty-milestone-release.md +++ b/docs/_blog/_posts/2020-02-05-22nd-dotty-milestone-release.md @@ -55,7 +55,7 @@ println(s"Third: ${list.third}") // 3 This syntax is a completely separate one from the `given` syntax and hence is aimed to bring more clarity and disentangle the two different concepts. -For the discussion, see [PR #7917](https://github.com/lampepfl/dotty/pull/7917). For more information on how to use extension methods in general and collective extension methods in particular, see the [documentation](https://dotty.epfl.ch/docs/reference/contextual/extension-methods.html). +For the discussion, see [PR #7917](https://github.com/scala/scala3/pull/7917). For more information on how to use extension methods in general and collective extension methods in particular, see the [documentation](https://dotty.epfl.ch/docs/reference/contextual/extension-methods.html). # Kind projector syntax support [Kind projector](https://github.com/typelevel/kind-projector) is a popular compiler plugin for Scala 2. It is especially useful in the context of purely functional programming and type class derivation – everywhere where you need to work extensively with types. @@ -90,7 +90,7 @@ object tupleFunctor extends Functor[λ[x => (x, x)]] println(squared) // (1,4) ``` -For the discussion, see [PR #7775](https://github.com/lampepfl/dotty/pull/7775). Also see the GitHub [repository](https://github.com/typelevel/kind-projector) of the kind projector Scala 2 plugin for more context. +For the discussion, see [PR #7775](https://github.com/scala/scala3/pull/7775). Also see the GitHub [repository](https://github.com/typelevel/kind-projector) of the kind projector Scala 2 plugin for more context. # Further improvements to the context parameters syntax Scala 3 context parameters are successors of Scala 2 implicits. In Scala 2, they proved useful for a wide range of applications including purely functional programming, dependency injection, type class derivation, type-level programming. Because their apparent value, one of the priorities in Scala 3 for us is to improve the conceptual framework behind them. @@ -127,7 +127,7 @@ As opposed to the previous: f(2)(given 20) ``` -For the time being, the change is experimental and the old syntax is also supported. For the discussion, see [PR #8162](https://github.com/lampepfl/dotty/pull/8162). You can browse the documentation concerning the new syntax [here](https://dotty.epfl.ch/docs/reference/contextual/motivation-new.html). +For the time being, the change is experimental and the old syntax is also supported. For the discussion, see [PR #8162](https://github.com/scala/scala3/pull/8162). You can browse the documentation concerning the new syntax [here](https://dotty.epfl.ch/docs/reference/contextual/motivation-new.html). # Semantics of inline parameters changed Inline parameters is a metaprogramming feature of Dotty which allows to splice the body of the parameter on its call site. Previously, inline parameters to methods were required to be known on compile time. With this release, this constraint has been relaxed. 
The following: @@ -149,7 +149,7 @@ Notice how the value of the by-name parameter `b` is not inlined but is bound to So, if previously you had a macro `inline def operationOnCode(code: => Unit) = ${ mcrImpl('code) }` which did something on the AST of the passed `code`, with this release you need to change it to `inline def operationOnCode(inline code: Unit) = ${ mcrImpl('code) }`. -This change was introduced by [PR #8060](https://github.com/lampepfl/dotty/pull/8060/). +This change was introduced by [PR #8060](https://github.com/scala/scala3/pull/8060/). Another change in the semantics of the inline parameters involves the fact that the can no longer be passed as constants to macro implementations. Previously, the following was possible: @@ -169,7 +169,7 @@ inline def power(x: Double, inline n: Int) = ${ powerCode('x, 'n) } private def powerCode(x: Expr[Double], n: Expr[Int])(given QuoteContext): Expr[Double] = ??? ``` -You can obtain the constant value of `n` from within the macro implementation by calling `n.getValue` on it which returns an `Option`. This change was introduced by [PR #8061](https://github.com/lampepfl/dotty/pull/8061). +You can obtain the constant value of `n` from within the macro implementation by calling `n.getValue` on it which returns an `Option`. This change was introduced by [PR #8061](https://github.com/scala/scala3/pull/8061). For more information about the inline capability of Dotty, see [documentation](https://dotty.epfl.ch/docs/reference/metaprogramming/inline.html). @@ -194,7 +194,7 @@ The compile-time error above will say: This feature is particularly useful for data science applications. In data science, it is very easy to make a linear algebra mistake, multiply matrices of wrong dimensions and get a runtime error – sometimes after a few hours of running the model. Hence compile-time verification of the models has a great potential for saving time. With such a type-level arithmetic, Scala becomes well-positioned to implement such type-safe data science frameworks. -For the discussion, see [PR #7628](https://github.com/lampepfl/dotty/pull/7628). The documentation is available [here](https://dotty.epfl.ch/docs/reference/metaprogramming/inline.html#the-scalacompiletimeops-package). +For the discussion, see [PR #7628](https://github.com/scala/scala3/pull/7628). The documentation is available [here](https://dotty.epfl.ch/docs/reference/metaprogramming/inline.html#the-scalacompiletimeops-package). # Suggestions on missing context parameters If there's a compile-time error due to a missing context parameter and this error can be fixed with an import, the compiler will attempt to suggest such an import in the error message. Here is an example of how this error looks like: @@ -212,7 +212,7 @@ If there's a compile-time error due to a missing context parameter and this erro One area where these suggestions will make life easier is purely functional programming with type-classes, with libraries like [cats](https://typelevel.org/cats/). Having the fix for a missing type class in the error message itself is a big time-saver. -For the discussion, see [PR #7862](https://github.com/lampepfl/dotty/pull/7862). +For the discussion, see [PR #7862](https://github.com/scala/scala3/pull/7862). # TASTy Inspector library TASTy Consumer was renamed to TASTy Inspector as of this release. It was also published in a library of its own. 
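As a rough sketch of what consuming TASTy through this library looks like (using today's `scala3-tasty-inspector` artifact and its current entry points, which differ in detail from the 0.22-era API; class and method names below are illustrative):

```scala
import scala.quoted.*
import scala.tasty.inspector.*

// Prints the path of every inspected .tasty file and binds its tree for further analysis.
class PrintPaths extends Inspector:
  def inspect(using Quotes)(tastys: List[Tasty[quotes.type]]): Unit =
    import quotes.reflect.*
    for tasty <- tastys do
      println(tasty.path)
      val tree: Tree = tasty.ast // traverse or analyse the tree from here

@main def runInspector(files: String*): Unit =
  TastyInspector.inspectTastyFiles(files.toList)(new PrintPaths)
```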
For more information, see the [documentation](https://dotty.epfl.ch/docs/reference/metaprogramming/tasty-inspect.html) on this library. @@ -221,7 +221,7 @@ TASTy Consumer was renamed to TASTy Inspector as of this release. It was also pu If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -267,7 +267,7 @@ According to `git shortlog -sn --no-merges 0.21.0-RC1..0.22.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2020-03-18-23rd-dotty-milestone-release.md b/docs/_blog/_posts/2020-03-18-23rd-dotty-milestone-release.md index e52db993dd19..7725d7e7d254 100644 --- a/docs/_blog/_posts/2020-03-18-23rd-dotty-milestone-release.md +++ b/docs/_blog/_posts/2020-03-18-23rd-dotty-milestone-release.md @@ -52,7 +52,7 @@ In this release, we have added an aid for the programmer to detect such mistakes 1 error found ``` -You can learn more about the feature from the [documentation](https://dotty.epfl.ch/0.23.0-RC1/docs/reference/other-new-features/safe-initialization.html). For the discussion, see PR [#7789](https://github.com/lampepfl/dotty/pull/7789). +You can learn more about the feature from the [documentation](https://dotty.epfl.ch/0.23.0-RC1/docs/reference/other-new-features/safe-initialization.html). For the discussion, see PR [#7789](https://github.com/scala/scala3/pull/7789). ## Bitwise Int compiletime operations In the previous release, Dotty has [received](https://dotty.epfl.ch/blog/2020/02/05/22nd-dotty-milestone-release.html#primitive-compiletime-operations-on-singleton-types) a support for type-level arithmetic operations on integers. In this release, we are extending this support by adding bitwise operations. For example: @@ -246,7 +246,7 @@ Notice how above, we are calling `app.fun` and `app.args`. `fun` and `args` are If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -294,7 +294,7 @@ According to `git shortlog -sn --no-merges 0.22.0-RC1..0.23.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). 
+and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2020-04-29-24th-dotty-milestone-release.md b/docs/_blog/_posts/2020-04-29-24th-dotty-milestone-release.md index e32df8cacc55..79148254e409 100644 --- a/docs/_blog/_posts/2020-04-29-24th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2020-04-29-24th-dotty-milestone-release.md @@ -76,13 +76,13 @@ Bar 22 Bar 22 ``` -This new change, however, comes with rather intricated rules – if you are interested to learn about them in details, see [documentation](https://dotty.epfl.ch/docs/reference/metaprogramming/inline.html#rules-for-overriding) on inlines and the PR #[8543](https://github.com/lampepfl/dotty/pull/8543/files) which introduced the change. +This new change, however, comes with rather intricated rules – if you are interested to learn about them in details, see [documentation](https://dotty.epfl.ch/docs/reference/metaprogramming/inline.html#rules-for-overriding) on inlines and the PR #[8543](https://github.com/scala/scala3/pull/8543/files) which introduced the change. # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -125,7 +125,7 @@ According to `git shortlog -sn --no-merges 0.23.0-RC1..0.24.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. @@ -134,7 +134,7 @@ We are looking forward to having you join the team of contributors. Dotty now has a set of widely-used community libraries that are built against every nightly Dotty snapshot. Currently, this includes shapeless, ScalaPB, algebra, scalatest, scopt and squants. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2020-06-22-25th-dotty-milestone-release.md b/docs/_blog/_posts/2020-06-22-25th-dotty-milestone-release.md index dd5def04bfe9..db73513c2413 100644 --- a/docs/_blog/_posts/2020-06-22-25th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2020-06-22-25th-dotty-milestone-release.md @@ -35,13 +35,13 @@ This default budget is configurable via a compiler flag `-Ximport-suggestion-tim This change should speed up the compiler when it comes to programming with givens. 
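To picture the kind of situation both the suggestions and the time budget apply to, consider a given instance that lives in another object (a self-contained sketch of ours, not taken from the PR):

```scala
trait Show[A]:
  def show(a: A): String

object Instances:
  given Show[Int] with
    def show(a: Int): String = a.toString

def render[A](a: A)(using s: Show[A]): String = s.show(a)

@main def showDemo(): Unit =
  // Without this import the compiler reports a missing Show[Int]
  // and suggests `import Instances.given` as a likely fix.
  import Instances.given
  println(render(42))
```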
-For more information, see PR [#9167](https://github.com/lampepfl/dotty/pull/9167). +For more information, see PR [#9167](https://github.com/scala/scala3/pull/9167). # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible 🎉 @@ -89,7 +89,7 @@ According to `git shortlog -sn --no-merges 0.24.0-RC1..0.25.0-RC2` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. @@ -98,7 +98,7 @@ We are looking forward to having you join the team of contributors. Dotty now has a set of widely-used community libraries that are built against every nightly Dotty snapshot. Currently, this includes shapeless, ScalaPB, algebra, scalatest, scopt and squants. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2020-07-27-26th-dotty-milestone-release.md b/docs/_blog/_posts/2020-07-27-26th-dotty-milestone-release.md index b0d153dded7e..06bf6fc5dabb 100644 --- a/docs/_blog/_posts/2020-07-27-26th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2020-07-27-26th-dotty-milestone-release.md @@ -35,7 +35,7 @@ extension (ss: Seq[String]): def longestString: String = longestStrings.head ``` -You can read more about the new syntax in the [documentation](https://dotty.epfl.ch/docs/reference/contextual/extension-methods.html). For the discussion, see [PR](https://github.com/lampepfl/dotty/pull/9255). +You can read more about the new syntax in the [documentation](https://dotty.epfl.ch/docs/reference/contextual/extension-methods.html). For the discussion, see [PR](https://github.com/scala/scala3/pull/9255). # Local Selectable Instances Local and anonymous classes that extend `Selectable` get more refined types than other classes. For example: @@ -78,7 +78,7 @@ val result = constValueTuple["foo" *: "bar" *: 10 *: 2.5 *: EmptyTuple] println(result) // (foo,bar,10,2.5) ``` -This feature was introduced by PR [#9209](https://github.com/lampepfl/dotty/pull/9209). +This feature was introduced by PR [#9209](https://github.com/scala/scala3/pull/9209). # Per-run time budget for import suggestions Import suggestions is a feature useful for debugging but potentially taxing for performance. Therefore, we have added the `-Ximport-suggestion-timeout ` to allow specifying the timeout (in milliseconds) after which the suggestions mechanism should stop the lookup. 
The timeout budget is per-run (and not per suggestion) which ensures that the performance does not degrade in case of too many suggestions. @@ -87,7 +87,7 @@ Import suggestions is a feature useful for debugging but potentially taxing for If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing @@ -119,7 +119,7 @@ According to `git shortlog -sn --no-merges 0.25.0-RC2..0.26.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. @@ -128,7 +128,7 @@ We are looking forward to having you join the team of contributors. Dotty now has a set of widely-used community libraries that are built against every nightly Dotty snapshot. Currently, this includes shapeless, ScalaPB, algebra, scalatest, scopt and squants. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2020-08-31-27th-dotty-milestone-release.md b/docs/_blog/_posts/2020-08-31-27th-dotty-milestone-release.md index e42e98a1385b..7d0eebe369e2 100644 --- a/docs/_blog/_posts/2020-08-31-27th-dotty-milestone-release.md +++ b/docs/_blog/_posts/2020-08-31-27th-dotty-milestone-release.md @@ -41,30 +41,30 @@ To the best of our knowledge, cross-compiling libraries should be able to use Sc If you experience a bug with anything except the unsupported features mentioned above, please file a bug report. # Stability -As we're getting closer to the release of Scala 3, we are continuing to focus on the stability and performance of the language. In this release, we have fixed support of objects under JDK9+ (PR [#9181](https://github.com/lampepfl/dotty/pull/9181)). The issue was, due to the changes in JDK9+ compared to JDK8, our initialization scheme for objects did not work under JDK9+. The aforementioned fixed that issue, thereby unblocking JDK9+ support for Dotty. +As we're getting closer to the release of Scala 3, we are continuing to focus on the stability and performance of the language. In this release, we have fixed support of objects under JDK9+ (PR [#9181](https://github.com/scala/scala3/pull/9181)). The issue was, due to the changes in JDK9+ compared to JDK8, our initialization scheme for objects did not work under JDK9+. The aforementioned fixed that issue, thereby unblocking JDK9+ support for Dotty. -We are also continuing to work on stabilising enums. 
PR [#9532](https://github.com/lampepfl/dotty/pull/9532) corrects the deserialization and serialization of singleton enum values with `ObjectInputStream` and `ObjectOutputStream`. PR [#9549](https://github.com/lampepfl/dotty/pull/9549) enables overriding the `toString` method on enums – previously this was not possible because of the way enums were desugared. +We are also continuing to work on stabilising enums. PR [#9532](https://github.com/scala/scala3/pull/9532) corrects the deserialization and serialization of singleton enum values with `ObjectInputStream` and `ObjectOutputStream`. PR [#9549](https://github.com/scala/scala3/pull/9549) enables overriding the `toString` method on enums – previously this was not possible because of the way enums were desugared. # Performance We are also focusing these days on making the compiler faster and memory-efficient. For the past month, we were looking in the compiler's memory footprint. We were trying to determine what was allocated in unreasonable amounts during compilation and trying to resolve these allocation issues. The following PRs attempt to increase the performance of the compiler: -- Optimize megaphase [#9597](https://github.com/lampepfl/dotty/pull/9597) -- Cache all memberNamed results [#9633](https://github.com/lampepfl/dotty/pull/9633) -- Parallelize position pickling [#9619](https://github.com/lampepfl/dotty/pull/9619) -- Simplify TypeComparer [#9405](https://github.com/lampepfl/dotty/pull/9405) -- Optimize and simplify SourcePosition handling [#9561](https://github.com/lampepfl/dotty/pull/9561) +- Optimize megaphase [#9597](https://github.com/scala/scala3/pull/9597) +- Cache all memberNamed results [#9633](https://github.com/scala/scala3/pull/9633) +- Parallelize position pickling [#9619](https://github.com/scala/scala3/pull/9619) +- Simplify TypeComparer [#9405](https://github.com/scala/scala3/pull/9405) +- Optimize and simplify SourcePosition handling [#9561](https://github.com/scala/scala3/pull/9561) # Metaprogramming We are keeping the work on the metaprogramming API improvements. For this release, the following PRs bring better API to metaprogrammers: -- Avoid leak of internal implementation in tasty.Reflection [#9613](https://github.com/lampepfl/dotty/pull/9613) -- Redefine quoted.Expr.betaReduce [#9469](https://github.com/lampepfl/dotty/pull/9469) +- Avoid leak of internal implementation in tasty.Reflection [#9613](https://github.com/scala/scala3/pull/9613) +- Redefine quoted.Expr.betaReduce [#9469](https://github.com/scala/scala3/pull/9469) # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributing Thank you to all the contributors who made this release possible 🎉 @@ -100,7 +100,7 @@ According to `git shortlog -sn --no-merges 0.26.0-RC1..0.27.0-RC1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). 
+and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. @@ -109,7 +109,7 @@ We are looking forward to having you join the team of contributors. Dotty now has a set of widely-used community libraries that are built against every nightly Dotty snapshot. Currently, this includes shapeless, ScalaPB, algebra, scalatest, scopt and squants. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2020-11-09-scala3-m1.md b/docs/_blog/_posts/2020-11-09-scala3-m1.md index ffef5618f9ff..228a9f8eb99e 100644 --- a/docs/_blog/_posts/2020-11-09-scala3-m1.md +++ b/docs/_blog/_posts/2020-11-09-scala3-m1.md @@ -19,8 +19,8 @@ Below, you can find a short summary of the changes that took place during betwee Dotty 0.27.0-RC1 had introduced preliminary Scala.js support with the portable subset of Scala and native JavaScript types. Scala 3.0.0-M1 significantly expands on that support: -* support for non-native JS types ([#9774](https://github.com/lampepfl/dotty/pull/9774)), and -* better support for other JS interop features, notably their interactions with Scala 3 features such as top-level declarations and `enum`s (e.g, [#9725](https://github.com/lampepfl/dotty/pull/9725) and [#9955](https://github.com/lampepfl/dotty/pull/9955)). +* support for non-native JS types ([#9774](https://github.com/scala/scala3/pull/9774)), and +* better support for other JS interop features, notably their interactions with Scala 3 features such as top-level declarations and `enum`s (e.g, [#9725](https://github.com/scala/scala3/pull/9725) and [#9955](https://github.com/scala/scala3/pull/9955)). The only remaining feature of Scala.js that is not supported yet is JS exports: `@JSExport` and its siblings `@JSExportAll`, `@JSExportTopLevel` and `@JSExportStatic` are all ignored by Scala 3.0.0-M1. Support for JS exports will come in the next release. @@ -46,7 +46,7 @@ x match { As of Scala 3.1.0, the `@` syntax will be deprecated and the codebases should switch to `as` instead. -This change was implemented by PR [#9837](https://github.com/lampepfl/dotty/pull/9837). +This change was implemented by PR [#9837](https://github.com/scala/scala3/pull/9837). # Pattern-Bound Given Instances The syntax for `given` instances in patterns has also changed. In the `for`-comprehensions, the correct way of using `given`s is as follows: @@ -62,7 +62,7 @@ pair match case (ctx as given Context, y) => ... ``` -For more information, see [documentation](https://dotty.epfl.ch/docs/reference/contextual/givens.html#pattern-bound-given-instances), and for discussion, see PR [#10091](https://github.com/lampepfl/dotty/pull/10091). +For more information, see [documentation](https://dotty.epfl.ch/docs/reference/contextual/givens.html#pattern-bound-given-instances), and for discussion, see PR [#10091](https://github.com/scala/scala3/pull/10091). # Change wildcard given selectors This is another syntactic change which aims to simplify the code. 
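(A note for readers following along in stable Scala 3: the `as` binder shown above was later dropped, as described further below; the same pattern-bound given is written with `@` today. A small sketch of ours:)

```scala
case class Context(id: Int)

def describe(pair: (Context, String)): String =
  pair match
    // `given Context` makes the matched value available implicitly in the case body;
    // `ctx @` additionally binds it by name.
    case (ctx @ given Context, label) =>
      s"$label -> ${summon[Context].id}"

@main def patternDemo(): Unit =
  println(describe(Context(1) -> "request")) // request -> 1
```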
Instead of: @@ -77,7 +77,7 @@ The correct version of the wildcard `given` import now becomes: import p.given ``` -This change was implemented by PR [#9949](https://github.com/lampepfl/dotty/pull/9949). +This change was implemented by PR [#9949](https://github.com/scala/scala3/pull/9949). # Final API for enumerations `enum` definitions are now released in their final design. since `0.27.0-RC1` we have made the following changes: @@ -125,22 +125,22 @@ val res1: Opt[?] = Nn # Keep `@alpha` optional for operators Following the discussion on [contributors](https://contributors.scala-lang.org/t/the-alpha-notation/4583), we now keep `@alpha` optional for operators. The checking behavior is still available when compiling with the `-Yrequire-alpha`. -`@alpha` annotations provide a way to define an alternate name for symbolic operators. You can learn more about `@alpha` annotations from the [documentation](https://dotty.epfl.ch/docs/reference/changed-features/operators.html#the-alpha-annotation). The change was implemented by PR [#10093](https://github.com/lampepfl/dotty/pull/10093). +`@alpha` annotations provide a way to define an alternate name for symbolic operators. You can learn more about `@alpha` annotations from the [documentation](https://dotty.epfl.ch/docs/reference/changed-features/operators.html#the-alpha-annotation). The change was implemented by PR [#10093](https://github.com/scala/scala3/pull/10093). # Optimizing the compiler During the last months, a considerable amount of effort went into investigating performance bottlenecks in the compiler and optimizing its workflow. We also work on stabilizing the compiler and porting relevant changes from the Scala 2 compiler to Scala 3. The following PRs are relevant to highlighting this work: -- Port classfile parsing improvements [#10037](https://github.com/lampepfl/dotty/pull/10037) -- Semanticdb usability enhancements [#9768](https://github.com/lampepfl/dotty/pull/9768) -- Optimize core and frontend [#9867](https://github.com/lampepfl/dotty/pull/9867) +- Port classfile parsing improvements [#10037](https://github.com/scala/scala3/pull/10037) +- Semanticdb usability enhancements [#9768](https://github.com/scala/scala3/pull/9768) +- Optimize core and frontend [#9867](https://github.com/scala/scala3/pull/9867) # Known issues -This release of Scala 3 doesn't work on JDK 14 because of a regression fixed in [#10135](https://github.com/lampepfl/dotty/pull/10135). JDK 15 doesn't work either because of [scala/bug#12172](https://github.com/scala/bug/issues/12172) which will be fixed in the new scala-library release. +This release of Scala 3 doesn't work on JDK 14 because of a regression fixed in [#10135](https://github.com/scala/scala3/pull/10135). JDK 15 doesn't work either because of [scala/bug#12172](https://github.com/scala/bug/issues/12172) which will be fixed in the new scala-library release. # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributors @@ -190,7 +190,7 @@ According to `git shortlog -sn --no-merges 0.27.0-RC1..3.0.0-M1` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! 
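Before moving on, here is a compact illustration of the finalized `enum` API discussed above (`values`, `valueOf` and `ordinal`); the enum itself is our own example:

```scala
enum Planet(val radiusKm: Double):
  case Mercury extends Planet(2439.7)
  case Earth   extends Planet(6371.0)

@main def enumDemo(): Unit =
  println(Planet.Earth.ordinal)        // 1
  println(Planet.valueOf("Mercury"))   // Mercury
  println(Planet.values.toList)        // List(Mercury, Earth)
```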
Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. @@ -199,7 +199,7 @@ We are looking forward to having you join the team of contributors. Dotty now has a set of widely-used community libraries that are built against every nightly Dotty snapshot. Currently, this includes shapeless, ScalaPB, algebra, scalatest, scopt and squants. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2020-12-18-scala3-m3.md b/docs/_blog/_posts/2020-12-18-scala3-m3.md index 41cfa76c0265..84fa9e823bda 100644 --- a/docs/_blog/_posts/2020-12-18-scala3-m3.md +++ b/docs/_blog/_posts/2020-12-18-scala3-m3.md @@ -19,7 +19,7 @@ You can try out the M3 version online via [Scastie](https://scastie.scala-lang.o # sbt plugin update -We published a new version of the sbt plugin `sbt-dotty`, v0.5.1. Because of the changes in PR [#10607](https://github.com/lampepfl/dotty/pull/10607), this release of Scala 3 will not work with earlier versions of sbt-dotty. You will need to upgrade sbt-dotty to 0.5.1 to be able to use Scala 3.0.0-M3. +We published a new version of the sbt plugin `sbt-dotty`, v0.5.1. Because of the changes in PR [#10607](https://github.com/scala/scala3/pull/10607), this release of Scala 3 will not work with earlier versions of sbt-dotty. You will need to upgrade sbt-dotty to 0.5.1 to be able to use Scala 3.0.0-M3. # Final syntactic tweaks ## `as` dropped from the `given` syntax @@ -57,7 +57,7 @@ given global: ExecutionContext = ForkJoinContext() given Context = ctx ``` -You can find a discussion of the above change in the [PR #10538](https://github.com/lampepfl/dotty/pull/10538). +You can find a discussion of the above change in the [PR #10538](https://github.com/scala/scala3/pull/10538). ## Drop `as` in patterns Since we dropped `as` from `given`s, we lost a strong reason for having `as` at all. Therefore, we dropped `as` from patterns as well. The following syntax, valid in Scala 3.0.0-M2, is not accepted anymore: @@ -128,16 +128,16 @@ In the meantime, the compiler will emit warnings when trying to call those metho Note that the warnings are only active with language mode `3.1-migration` or higher - see the documentation on the [Language Versions](https://dotty.epfl.ch/docs/usage/language-versions.html) to learn how to enable it. -You can read the discussion of this change in the [PR #10670](https://github.com/lampepfl/dotty/pull/10670). You can also read more about it in the [documentation](https://dotty.epfl.ch/docs/reference/other-new-features/matchable.html). +You can read the discussion of this change in the [PR #10670](https://github.com/scala/scala3/pull/10670). You can also read more about it in the [documentation](https://dotty.epfl.ch/docs/reference/other-new-features/matchable.html). 
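As a quick sketch of how the new `Matchable` bound is meant to be used in generic code (our own example, not taken from the PR):

```scala
// A type parameter must be known to be Matchable before its values are scrutinized
// in a pattern match; without the bound, matching on a bare `T` is flagged once the
// Matchable check is enforced (language mode 3.1-migration or higher).
def describe[T <: Matchable](x: T): String = x match
  case n: Int    => s"an Int: $n"
  case s: String => s"a String: $s"
  case _         => "something else"

@main def matchableDemo(): Unit =
  println(describe(42))   // an Int: 42
  println(describe("hi")) // a String: hi
```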
# Tooling improvements As we are getting closer to a stable release of Scala 3, the focus increasingly shifts on the tooling available to get started with Scala 3. For a while now, we are not using the old dottydoc documentation tool for building the documentation. We are developing an entirely new tool, scala3doc, from scratch. This new documentation tool is more robust and faster than the old one. -As part of the tooling effort, this new Scala 3 documentation tool is rapidly improved. [PR #10522](https://github.com/lampepfl/dotty/pull/10522) proves that the doctool can generate documentation for the community build projects. You can access this documentation via the following [link](https://scala3doc.virtuslab.com/pr-master-docs/index.html). +As part of the tooling effort, this new Scala 3 documentation tool is rapidly improved. [PR #10522](https://github.com/scala/scala3/pull/10522) proves that the doctool can generate documentation for the community build projects. You can access this documentation via the following [link](https://scala3doc.virtuslab.com/pr-master-docs/index.html). -[PR #10491](https://github.com/lampepfl/dotty/pull/10491) introduced scripting support in Scala 3. Consider the following source named `Main.scala`: +[PR #10491](https://github.com/scala/scala3/pull/10491) introduced scripting support in Scala 3. Consider the following source named `Main.scala`: ```scala @main def Test(name: String): Unit = @@ -157,25 +157,25 @@ The documentation for this feature is available [here](https://dotty.epfl.ch/doc # Metaprogramming changes We have been polishing the metaprogramming API and making it more uniform. The following notable changes occurred between M2 and M3: -- Add `Expr.asTerm` [#10694](https://github.com/lampepfl/dotty/pull/10694) -- Add reflect `MatchCase` `TypeRepr` [#10735](https://github.com/lampepfl/dotty/pull/10735) -- Rework reflect Symbol fields API [#10705](https://github.com/lampepfl/dotty/pull/10705) -- Remove `Expr.StringContext.unapply` [#10675](https://github.com/lampepfl/dotty/pull/10675) -- Rename `Liftable` to `ToExpr` and `Unliftable` to `FromExpr` [#10618](https://github.com/lampepfl/dotty/pull/10618) -- Remove Unliftable[Unit] [#10570](https://github.com/lampepfl/dotty/pull/10570) -- Remove reflect.LambdaType [#10548](https://github.com/lampepfl/dotty/pull/10548) -- Add `scala.quoted.Expr.unapply` as dual of `Expr.apply` [#10580](https://github.com/lampepfl/dotty/pull/10580) -- Move `Quotes` as last parameter in `ExprMap.transform` [#10519](https://github.com/lampepfl/dotty/pull/10519) -- Rework reflect Constant API [#10753](https://github.com/lampepfl/dotty/pull/10753) -- Unify quoted.report and reflect.Reporting [#10474](https://github.com/lampepfl/dotty/pull/10474) -- Fix #10359: Add GivenSelector to reflection API [#10469](https://github.com/lampepfl/dotty/pull/10469) -- Rework reflect show API [#10661](https://github.com/lampepfl/dotty/pull/10661) -- Fix #10709: Add missing level check before inlining [#10781](https://github.com/lampepfl/dotty/pull/10781) +- Add `Expr.asTerm` [#10694](https://github.com/scala/scala3/pull/10694) +- Add reflect `MatchCase` `TypeRepr` [#10735](https://github.com/scala/scala3/pull/10735) +- Rework reflect Symbol fields API [#10705](https://github.com/scala/scala3/pull/10705) +- Remove `Expr.StringContext.unapply` [#10675](https://github.com/scala/scala3/pull/10675) +- Rename `Liftable` to `ToExpr` and 
`Unliftable` to `FromExpr` [#10618](https://github.com/scala/scala3/pull/10618) +- Remove Unliftable[Unit] [#10570](https://github.com/scala/scala3/pull/10570) +- Remove reflect.LambdaType [#10548](https://github.com/scala/scala3/pull/10548) +- Add `scala.quoted.Expr.unapply` as dual of `Expr.apply` [#10580](https://github.com/scala/scala3/pull/10580) +- Move `Quotes` as last parameter in `ExprMap.transform` [#10519](https://github.com/scala/scala3/pull/10519) +- Rework reflect Constant API [#10753](https://github.com/scala/scala3/pull/10753) +- Unify quoted.report and reflect.Reporting [#10474](https://github.com/scala/scala3/pull/10474) +- Fix #10359: Add GivenSelector to reflection API [#10469](https://github.com/scala/scala3/pull/10469) +- Rework reflect show API [#10661](https://github.com/scala/scala3/pull/10661) +- Fix #10709: Add missing level check before inlining [#10781](https://github.com/scala/scala3/pull/10781) # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributors @@ -231,7 +231,7 @@ We are looking forward to having you join the team of contributors. ## Library authors: Join our community build -Scala 3 is regularly tested against a sample of libraries known as the "community build". You can add your library to the [community build](https://github.com/lampepfl/dotty/tree/main/community-build) by submitting a PR. +Scala 3 is regularly tested against a sample of libraries known as the "community build". You can add your library to the [community build](https://github.com/scala/scala3/tree/main/community-build) by submitting a PR. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2021-02-17-scala3-rc1.md b/docs/_blog/_posts/2021-02-17-scala3-rc1.md index 9751f7b7461e..011dc0819107 100644 --- a/docs/_blog/_posts/2021-02-17-scala3-rc1.md +++ b/docs/_blog/_posts/2021-02-17-scala3-rc1.md @@ -11,7 +11,7 @@ Greetings from the Scala 3 team! We are delighted to announce the first release This release brings some last-minute polishings, clean-ups and changes before the big release. There were a few language changes to improve the user experience, as well as the polishings of the metaprogramming framework. We have also worked on the issues that had to be fixed before the stable release. -Overall, more than [400 PRs](https://github.com/lampepfl/dotty/pulls?q=is%3Apr+is%3Aclosed+closed%3A%3E2020-12-02+sort%3Acomments-desc) were merged after the M3 release and until today! Read more below! +Overall, more than [400 PRs](https://github.com/scala/scala3/pulls?q=is%3Apr+is%3Aclosed+closed%3A%3E2020-12-02+sort%3Acomments-desc) were merged after the M3 release and until today! Read more below! ## Allow secondary type parameter list in extension methods @@ -43,7 +43,7 @@ Or, when passing both type arguments: sumBy[String](List("a", "bb", "ccc"))[Int](_.length) ``` -For discussion, see [PR #10940](https://github.com/lampepfl/dotty/pull/10940). For more information about the extension methods, see [documentation](https://dotty.epfl.ch/docs/reference/contextual/extension-methods.html). +For discussion, see [PR #10940](https://github.com/scala/scala3/pull/10940). 
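One possible way such a `sumBy` could be written is sketched below (our own definition; the one used in the PR discussion may differ):

```scala
extension [A](xs: List[A])
  // The extension clause contributes the type parameter `A`,
  // and the method adds its own secondary type parameter `B`.
  def sumBy[B](f: A => B)(using num: Numeric[B]): B =
    xs.map(f).foldLeft(num.zero)(num.plus)

@main def sumByDemo(): Unit =
  println(List("a", "bb", "ccc").sumBy(_.length)) // 6
```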
For more information about the extension methods, see [documentation](https://dotty.epfl.ch/docs/reference/contextual/extension-methods.html). ## New `import` syntax @@ -62,11 +62,11 @@ import scala.collection.mutable as mut import NumPy as np ``` -For the details and discussion, see [PR #11244](https://github.com/lampepfl/dotty/pull/11244). Read more about this change in the [documentation](https://dotty.epfl.ch/docs/reference/changed-features/imports.html). +For the details and discussion, see [PR #11244](https://github.com/scala/scala3/pull/11244). Read more about this change in the [documentation](https://dotty.epfl.ch/docs/reference/changed-features/imports.html). ## Use `*` for vararg splices -[PR #11240](https://github.com/lampepfl/dotty/pull/11240) changed the syntax of vararg splices in patterns and function arguments. The new syntax uses a postfix `*`, instead of `: _*`, analogously to how a vararg parameter is declared. +[PR #11240](https://github.com/scala/scala3/pull/11240) changed the syntax of vararg splices in patterns and function arguments. The new syntax uses a postfix `*`, instead of `: _*`, analogously to how a vararg parameter is declared. ## Use `uninitialized` for wildcard initializers @@ -88,7 +88,7 @@ var x: A = uninitialized This way expresses the intent of the idiom in a more verbose and easy to read way than simply writing an underscore. -For discussion, see [PR #11231](https://github.com/lampepfl/dotty/pull/11231), and the [documentation](https://dotty.epfl.ch/docs/reference/dropped-features/wildcard-init.html) is available on our website. +For discussion, see [PR #11231](https://github.com/scala/scala3/pull/11231), and the [documentation](https://dotty.epfl.ch/docs/reference/dropped-features/wildcard-init.html) is available on our website. ## Eta-expand companion object if functions are expected @@ -109,49 +109,49 @@ Results in: |The method `apply` is inserted. The auto insertion will be deprecated, please write `Foo.apply` explicitly. ``` -As the warning suggests, now you should write `Foo.apply` instead of `Foo`. See [Issue #6190](https://github.com/lampepfl/dotty/issues/6190) and [PR #7207](https://github.com/lampepfl/dotty/pull/7207) for discussion. +As the warning suggests, now you should write `Foo.apply` instead of `Foo`. See [Issue #6190](https://github.com/scala/scala3/issues/6190) and [PR #7207](https://github.com/scala/scala3/pull/7207) for discussion. ## Settling on `scaladoc` as the documentation tool We have settled on using the well-known `scaladoc` as a name for the documentation tool for Scala 3 (known previously as `scala3doc`). The obsolete `dotty-doc` (or `scala3-doc`) is removed in RC1. We have also removed all the Kotlin dependencies (Dokka, etc.) from scaladoc. -For details, see [PR #11349](https://github.com/lampepfl/dotty/pull/11349). To read more about `scaladoc`, see [documentation](https://dotty.epfl.ch/docs/usage/scaladoc/index.html) +For details, see [PR #11349](https://github.com/scala/scala3/pull/11349). To read more about `scaladoc`, see [documentation](https://dotty.epfl.ch/docs/usage/scaladoc/index.html) ## Use `future` and `future-migration` to specify language versions after 3.0 in `-source` -[PR #11355](https://github.com/lampepfl/dotty/pull/11355) changes the `-source` specifier for the Scala version(s) after 3.0 from `3.1` to `future`. I.e. 
it is now +[PR #11355](https://github.com/scala/scala3/pull/11355) changes the `-source` specifier for the Scala version(s) after 3.0 from `3.1` to `future`. I.e. it is now `-source future` and `-source future-migration` instead of `-source 3.1` and `-source 3.1-migration`. Language imports are changed analogously. The reason for the change is that we want to keep the possibility open to ship a `3.1` version that does not yet contain all the changes enabled under `-source future`. ## Other language changes -- Warn when matching against an opaque type [#10664](https://github.com/lampepfl/dotty/pull/10664) -- Fix [#8634](https://github.com/lampepfl/dotty/issues/8634): Support -release option [#10746](https://github.com/lampepfl/dotty/pull/10746) – the same way Scala 2 does. +- Warn when matching against an opaque type [#10664](https://github.com/scala/scala3/pull/10664) +- Fix [#8634](https://github.com/scala/scala3/issues/8634): Support -release option [#10746](https://github.com/scala/scala3/pull/10746) – the same way Scala 2 does. This setting allows you to specify a version of the Java platform (8, 9 etc) and compile the code with classes specific to the that Java platform, and emit the bytecode for that version. ## Metaprogramming changes A lot of work has been done on the metaprogramming side of things. Mostly we are cleaning up and polishing the API to prepare it for the stable release. The following are the important metaprogramming changes that took place: -- Add `scala.quoted.Expr.unapply` as dual of `Expr.apply` [#10580](https://github.com/lampepfl/dotty/pull/10580) -- Remove `Expr.StringContext.unapply` [#10675](https://github.com/lampepfl/dotty/pull/10675) -- Add reflect `MatchCase` `TypeRepr` [#10735](https://github.com/lampepfl/dotty/pull/10735) -- Rename `scala.quoted.staging.{Toolbox => Compiler}` [#11129](https://github.com/lampepfl/dotty/pull/11129) -- Fix [#10863](https://github.com/lampepfl/dotty/issues/10863): Make show `AnyKind`ed [#10988](https://github.com/lampepfl/dotty/pull/10988) -- Add ParamClause to allow multiple type param clauses [#11074](https://github.com/lampepfl/dotty/pull/11074) -- Rework reflect Symbol fields API [#10705](https://github.com/lampepfl/dotty/pull/10705) -- Rename `Liftable` to `ToExpr` and `Unliftable` to `FromExpr` [#10618](https://github.com/lampepfl/dotty/pull/10618) -- Expand non-transparent macros after Typer [#9984](https://github.com/lampepfl/dotty/pull/9984) -- Rework TastyInspector API to allow inspection of all files [#10792](https://github.com/lampepfl/dotty/pull/10792) -- Allow leading context parameters in extension methods [#10940](https://github.com/lampepfl/dotty/pull/10940) -- Rename `Not` to `NotGiven` to make its purpose clearer [#10720](https://github.com/lampepfl/dotty/pull/10720) -- Fix [#10709](https://github.com/lampepfl/dotty/issues/10709): Add missing level check before inlining [#10781](https://github.com/lampepfl/dotty/pull/10781) +- Add `scala.quoted.Expr.unapply` as dual of `Expr.apply` [#10580](https://github.com/scala/scala3/pull/10580) +- Remove `Expr.StringContext.unapply` [#10675](https://github.com/scala/scala3/pull/10675) +- Add reflect `MatchCase` `TypeRepr` [#10735](https://github.com/scala/scala3/pull/10735) +- Rename `scala.quoted.staging.{Toolbox => Compiler}` [#11129](https://github.com/scala/scala3/pull/11129) +- Fix 
[#10863](https://github.com/scala/scala3/issues/10863): Make show `AnyKind`ed [#10988](https://github.com/scala/scala3/pull/10988) +- Add ParamClause to allow multiple type param clauses [#11074](https://github.com/scala/scala3/pull/11074) +- Rework reflect Symbol fields API [#10705](https://github.com/scala/scala3/pull/10705) +- Rename `Liftable` to `ToExpr` and `Unliftable` to `FromExpr` [#10618](https://github.com/scala/scala3/pull/10618) +- Expand non-transparent macros after Typer [#9984](https://github.com/scala/scala3/pull/9984) +- Rework TastyInspector API to allow inspection of all files [#10792](https://github.com/scala/scala3/pull/10792) +- Allow leading context parameters in extension methods [#10940](https://github.com/scala/scala3/pull/10940) +- Rename `Not` to `NotGiven` to make its purpose clearer [#10720](https://github.com/scala/scala3/pull/10720) +- Fix [#10709](https://github.com/scala/scala3/issues/10709): Add missing level check before inlining [#10781](https://github.com/scala/scala3/pull/10781) ## Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributors @@ -225,7 +225,7 @@ According to `git shortlog -sn --no-merges 3.0.0-M3..3.0.0-RC1` these are: If you want to get your hands dirty and contribute to Scala 3, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. diff --git a/docs/_blog/_posts/2021-03-31-scala3-rc2.md b/docs/_blog/_posts/2021-03-31-scala3-rc2.md index 178dfabfbffc..13170adbcfad 100644 --- a/docs/_blog/_posts/2021-03-31-scala3-rc2.md +++ b/docs/_blog/_posts/2021-03-31-scala3-rc2.md @@ -6,7 +6,7 @@ authorImg: /images/anatolii.png date: 2021-03-31 --- -Hello! We are happy to announce Scala 3.0.0-RC2. With this release, we are getting ready for 3.0.0. The significance of it is to give the community the chance and time to test all the changes meant for 3.0.0 final. A lot of bug fixes found their way into this release to ensure stability for 3.0.0 – more than [250 PRs](https://github.com/lampepfl/dotty/pulls?q=is%3Apr+is%3Aclosed+closed%3A%3E2021-02-16) were merged after the 3.0.0-RC1 release and until today! +Hello! We are happy to announce Scala 3.0.0-RC2. With this release, we are getting ready for 3.0.0. The significance of it is to give the community the chance and time to test all the changes meant for 3.0.0 final. A lot of bug fixes found their way into this release to ensure stability for 3.0.0 – more than [250 PRs](https://github.com/scala/scala3/pulls?q=is%3Apr+is%3Aclosed+closed%3A%3E2021-02-16) were merged after the 3.0.0-RC1 release and until today! Read more about this release below. @@ -28,31 +28,31 @@ As mentioned above, we are currently in an issue-fixing mode. 
So a lot of those There are some notable changes worth mentioning. ## Restrict experimental features to unstable releases only -PR [#11920](https://github.com/lampepfl/dotty/pull/11920) restricts usage of experimental features only to nightlies and snapshots. This change ensures that changes deemed experimental will not propagate into the wider ecosystem provided that the wider ecosystem depends on stable releases. This is needed so that if an experimental feature is modified or removed from the language, the ecosystem will not be impacted. +PR [#11920](https://github.com/scala/scala3/pull/11920) restricts usage of experimental features only to nightlies and snapshots. This change ensures that changes deemed experimental will not propagate into the wider ecosystem provided that the wider ecosystem depends on stable releases. This is needed so that if an experimental feature is modified or removed from the language, the ecosystem will not be impacted. ## New `unsafeNulls` language feature -PR [#9884](https://github.com/lampepfl/dotty/pull/9884) adds a new language feature which enables unsafe null operations under explicit nulls. This is a tool to help projects migrating to full explicit nulls gradually. From now on, you can use an import `import scala.language.unsafeNulls` to create an unsafe scope. For discussion, see the PR linked above, and for more information on the feature, see the [documentation](https://dotty.epfl.ch/docs/reference/other-new-features/explicit-nulls.html). +PR [#9884](https://github.com/scala/scala3/pull/9884) adds a new language feature which enables unsafe null operations under explicit nulls. This is a tool to help projects migrating to full explicit nulls gradually. From now on, you can use an import `import scala.language.unsafeNulls` to create an unsafe scope. For discussion, see the PR linked above, and for more information on the feature, see the [documentation](https://dotty.epfl.ch/docs/reference/other-new-features/explicit-nulls.html). ## Treat Scala.js pseudo-unions as real unions -In PR [#11671](https://github.com/lampepfl/dotty/pull/11671), we now treat the `scala.scalajs.js.|[A, B]` as if it was a real Scala 3 union `A | B`, which further boosts the support for Scala.js in Scala 3. +In PR [#11671](https://github.com/scala/scala3/pull/11671), we now treat the `scala.scalajs.js.|[A, B]` as if it was a real Scala 3 union `A | B`, which further boosts the support for Scala.js in Scala 3. ## Other API changes -`-Ycheck-init` was renamed to `-Ysafe-init`. This flag is used to check safe initialization, more about which you can read in the [documentation](https://dotty.epfl.ch/docs/reference/other-new-features/safe-initialization.html). See also PR [#11920](https://github.com/lampepfl/dotty/pull/11920). +`-Ycheck-init` was renamed to `-Ysafe-init`. This flag is used to check safe initialization, more about which you can read in the [documentation](https://dotty.epfl.ch/docs/reference/other-new-features/safe-initialization.html). See also PR [#11920](https://github.com/scala/scala3/pull/11920). -PR [#11745](https://github.com/lampepfl/dotty/pull/11745) changes the `compiletime` package API a bit. `compiletime.S` was moved to `compiletime.ops.int.S` and the package object `compiletime` was removed in favor of top-level definitions. +PR [#11745](https://github.com/scala/scala3/pull/11745) changes the `compiletime` package API a bit. 
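Returning to `unsafeNulls` for a moment, here is roughly how the escape hatch is meant to be used during a migration. This is an illustrative sketch; it only changes anything when the project is compiled with `-Yexplicit-nulls`:

```scala
import scala.language.unsafeNulls

// Under -Yexplicit-nulls, Java methods such as getProperty are seen as returning
// `String | Null`. Inside an unsafeNulls scope they can be used as if they were
// non-nullable, which eases migrating a codebase gradually (at the usual price of
// a potential NullPointerException at run time).
def javaVersion: String =
  System.getProperty("java.version").trim
```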
`compiletime.S` was moved to `compiletime.ops.int.S` and the package object `compiletime` was removed in favor of top-level definitions. ## Metaprogramming The following are some notable metaprogramming changes included into this release: -- Add quotes.Type.valueOfConstant [#11715](https://github.com/lampepfl/dotty/pull/11715) -- Remove compiletime.Widen [#11569](https://github.com/lampepfl/dotty/pull/11569) -- Add -Xcheck-macros scalac option [#11655](https://github.com/lampepfl/dotty/pull/11655) +- Add quotes.Type.valueOfConstant [#11715](https://github.com/scala/scala3/pull/11715) +- Remove compiletime.Widen [#11569](https://github.com/scala/scala3/pull/11569) +- Add -Xcheck-macros scalac option [#11655](https://github.com/scala/scala3/pull/11655) # Let us know what you think! If you have questions or any sort of feedback, feel free to send us a message on our [Gitter channel](https://gitter.im/lampepfl/dotty). If you encounter a bug, please -[open an issue on GitHub](https://github.com/lampepfl/dotty/issues/new). +[open an issue on GitHub](https://github.com/scala/scala3/issues/new). ## Contributors @@ -107,7 +107,7 @@ According to `git shortlog -sn --no-merges 3.0.0-RC1..3.0.0-RC2` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. @@ -116,7 +116,7 @@ We are looking forward to having you join the team of contributors. Dotty now has a set of widely-used community libraries that are built against every nightly Dotty snapshot. Currently, this includes shapeless, ScalaPB, algebra, scalatest, scopt and squants. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2021-04-21-scala3-rc3.md b/docs/_blog/_posts/2021-04-21-scala3-rc3.md index 8651730da93a..6eddc82e7273 100644 --- a/docs/_blog/_posts/2021-04-21-scala3-rc3.md +++ b/docs/_blog/_posts/2021-04-21-scala3-rc3.md @@ -13,16 +13,16 @@ This release also impacts the release date for 3.0.0 stable. 
3.0.0 stable will g # Bug fixes included -- Fix type test for trait parameter arguments [#12066](https://github.com/lampepfl/dotty/pull/12066) -- Set file filter correctly [#12119](https://github.com/lampepfl/dotty/pull/12119) -- Provide mirror support after inlining [#12079](https://github.com/lampepfl/dotty/pull/12079) -- Revert "Recursively check nonvariant arguments of base types for realizability" [#12067](https://github.com/lampepfl/dotty/pull/12067) -- When simplifying match types, ensure fully defined before reducing [#12068](https://github.com/lampepfl/dotty/pull/12068) -- sbt-dotty: the binary version is 3 for Scala >= 3.0.0 [#12084](https://github.com/lampepfl/dotty/pull/12084) -- Fix isInstanceOf[Array[?]] returning true on non-Array [#12108](https://github.com/lampepfl/dotty/pull/12108) -- Scala2Unpickler: don't unpickle the same type parameter twice [#12129](https://github.com/lampepfl/dotty/pull/12129) -- Overloading resolution: Handle SAM types more like Java and Scala 2 [#12131](https://github.com/lampepfl/dotty/pull/12131) -- Add TermParamClause.isGiven [#12042](https://github.com/lampepfl/dotty/pull/12042) +- Fix type test for trait parameter arguments [#12066](https://github.com/scala/scala3/pull/12066) +- Set file filter correctly [#12119](https://github.com/scala/scala3/pull/12119) +- Provide mirror support after inlining [#12079](https://github.com/scala/scala3/pull/12079) +- Revert "Recursively check nonvariant arguments of base types for realizability" [#12067](https://github.com/scala/scala3/pull/12067) +- When simplifying match types, ensure fully defined before reducing [#12068](https://github.com/scala/scala3/pull/12068) +- sbt-dotty: the binary version is 3 for Scala >= 3.0.0 [#12084](https://github.com/scala/scala3/pull/12084) +- Fix isInstanceOf[Array[?]] returning true on non-Array [#12108](https://github.com/scala/scala3/pull/12108) +- Scala2Unpickler: don't unpickle the same type parameter twice [#12129](https://github.com/scala/scala3/pull/12129) +- Overloading resolution: Handle SAM types more like Java and Scala 2 [#12131](https://github.com/scala/scala3/pull/12131) +- Add TermParamClause.isGiven [#12042](https://github.com/scala/scala3/pull/12042) ## Contributors Thank you to all the contributors who made this release possible 🎉 @@ -40,7 +40,7 @@ According to `git shortlog -sn --no-merges 3.0.0-RC2..3.0.0-RC3` these are: If you want to get your hands dirty and contribute to Dotty, now is a good time to get involved! Head to our [Getting Started page for new contributors](https://dotty.epfl.ch/docs/contributing/getting-started.html), -and have a look at some of the [good first issues](https://github.com/lampepfl/dotty/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). +and have a look at some of the [good first issues](https://github.com/scala/scala3/issues?q=is%3Aissue+is%3Aopen+label%3Aexp%3Anovice). They make perfect entry points into hacking on the compiler. We are looking forward to having you join the team of contributors. @@ -49,7 +49,7 @@ We are looking forward to having you join the team of contributors. Dotty now has a set of widely-used community libraries that are built against every nightly Dotty snapshot. Currently, this includes shapeless, ScalaPB, algebra, scalatest, scopt and squants. 
-Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2021-06-07-scala3.0.1-rc1-release.md b/docs/_blog/_posts/2021-06-07-scala3.0.1-rc1-release.md index e9fac2d0447c..617f175e74b7 100644 --- a/docs/_blog/_posts/2021-06-07-scala3.0.1-rc1-release.md +++ b/docs/_blog/_posts/2021-06-07-scala3.0.1-rc1-release.md @@ -21,13 +21,13 @@ The spirit of this policy is to make sure that effectively, no library published Having said that, we still encourage people to play with the experimental features from the `NIGHTLY` compiler versions and discuss their findings. Without the curious and adventurous part of the community playing with the new features, there is no way of knowing what they are good for, and no way to decide whether they should be dropped or promoted to a stable feature. -More about this change you can read in the PR [#12102](https://github.com/lampepfl/dotty/pull/12102). +More about this change you can read in the PR [#12102](https://github.com/scala/scala3/pull/12102). # Kind-projector work -This release also brings extra features for the [Kind Projector](https://docs.scala-lang.org/scala3/guides/migration/plugin-kind-projector.html) migration support. First, PR [#12378](https://github.com/lampepfl/dotty/pull/12378) allows `_` as type lambda placeholder. Second, PR [#12341](https://github.com/lampepfl/dotty/pull/12341) brings support for the variance annotations on the placeholder. This work enhances the ability to cross-compile Scala 2 code that uses the Kind Projector plugin to Scala 3. +This release also brings extra features for the [Kind Projector](https://docs.scala-lang.org/scala3/guides/migration/plugin-kind-projector.html) migration support. First, PR [#12378](https://github.com/scala/scala3/pull/12378) allows `_` as type lambda placeholder. Second, PR [#12341](https://github.com/scala/scala3/pull/12341) brings support for the variance annotations on the placeholder. This work enhances the ability to cross-compile Scala 2 code that uses the Kind Projector plugin to Scala 3. # Improved error reporting -Down the error reporting lane, match type reduction errors were improved. When using a match type, it may or may not reduce to one of its cases. If it doesn't match type is used as specified, e.g. if `M[T]` is a match type and it didn't reduce for `M[Int]`, `M[Int]` will be used. This behavior, however, is frequently not what you want: there is a lot of cases where you would expect a match type to reduce but it doesn't. In such cases, it would be nice to have some diagnostic regarding why it didn't reduce. PR [#12053](https://github.com/lampepfl/dotty/pull/12053/) adds just such a diagnostic. E.g. the following code: +Down the error reporting lane, match type reduction errors were improved. When using a match type, it may or may not reduce to one of its cases. If it doesn't match type is used as specified, e.g. if `M[T]` is a match type and it didn't reduce for `M[Int]`, `M[Int]` will be used. This behavior, however, is frequently not what you want: there is a lot of cases where you would expect a match type to reduce but it doesn't. In such cases, it would be nice to have some diagnostic regarding why it didn't reduce. 
PR [#12053](https://github.com/scala/scala3/pull/12053/) adds just such a diagnostic. E.g. the following code: ```scala trait A @@ -58,32 +58,32 @@ will report the following error: ``` # Scaladoc -We have updated the [documentation](http://dotty.epfl.ch/docs/usage/scaladoc/index.html) for Scaladoc making it easier for you to get started. Also, PR [#11582](https://github.com/lampepfl/dotty/pull/11582) has added the snippet compiler to ensure the snippets in your scaladoc documentation comments aren't broken. You can read more about this feature on the [mailing list](https://contributors.scala-lang.org/t/snippet-validation-in-scaladoc-for-scala-3/4976). +We have updated the [documentation](http://dotty.epfl.ch/docs/usage/scaladoc/index.html) for Scaladoc making it easier for you to get started. Also, PR [#11582](https://github.com/scala/scala3/pull/11582) has added the snippet compiler to ensure the snippets in your scaladoc documentation comments aren't broken. You can read more about this feature on the [mailing list](https://contributors.scala-lang.org/t/snippet-validation-in-scaladoc-for-scala-3/4976). # Metaprogramming A lot of metaprogramming work was focused on improving the performance. Some of the notable PRs include: -- Cache quote unpickling [#12242](https://github.com/lampepfl/dotty/pull/12242) -- Avoid pickled tasty for some captured quote reference [#12248](https://github.com/lampepfl/dotty/pull/12248) -- Improve quote matcher performance [#12418](https://github.com/lampepfl/dotty/pull/12418) -- Port scala.quoted.runtime.impl.QuoteMatcher [#12402](https://github.com/lampepfl/dotty/pull/12402) +- Cache quote unpickling [#12242](https://github.com/scala/scala3/pull/12242) +- Avoid pickled tasty for some captured quote reference [#12248](https://github.com/scala/scala3/pull/12248) +- Improve quote matcher performance [#12418](https://github.com/scala/scala3/pull/12418) +- Port scala.quoted.runtime.impl.QuoteMatcher [#12402](https://github.com/scala/scala3/pull/12402) # Issue fixing Otherwise, we are making an effort to reduce our issue tracker. 
Among others, the following are some of the PRs dedicated to issue fixing: -- IArray.toArray: Deprecate broken method [#12598](https://github.com/lampepfl/dotty/pull/12598) -- Fix comparison of dependent function types [#12214](https://github.com/lampepfl/dotty/pull/12214) -- Make translucentSuperType handle match types [#12153](https://github.com/lampepfl/dotty/pull/12153) -- Harden Type Inference [#12560](https://github.com/lampepfl/dotty/pull/12560) -- Reject references to self in super constructor calls [#12567](https://github.com/lampepfl/dotty/pull/12567) -- Provide mirror support after inlining [#12062](https://github.com/lampepfl/dotty/pull/12062) -- Allow export paths to see imports [#12134](https://github.com/lampepfl/dotty/pull/12134) -- Streamline given syntax [#12107](https://github.com/lampepfl/dotty/pull/12107) -- Export constructor proxies [#12311](https://github.com/lampepfl/dotty/pull/12311) -- Identify package and nested package object in isSubPrefix [#12297](https://github.com/lampepfl/dotty/pull/12297) -- Treat Refinements more like AndTypes [#12317](https://github.com/lampepfl/dotty/pull/12317) -- Fix [#9871](https://github.com/lampepfl/dotty/pull/9871): use toNestedPairs in provablyDisjoint [#10560](https://github.com/lampepfl/dotty/pull/10560) +- IArray.toArray: Deprecate broken method [#12598](https://github.com/scala/scala3/pull/12598) +- Fix comparison of dependent function types [#12214](https://github.com/scala/scala3/pull/12214) +- Make translucentSuperType handle match types [#12153](https://github.com/scala/scala3/pull/12153) +- Harden Type Inference [#12560](https://github.com/scala/scala3/pull/12560) +- Reject references to self in super constructor calls [#12567](https://github.com/scala/scala3/pull/12567) +- Provide mirror support after inlining [#12062](https://github.com/scala/scala3/pull/12062) +- Allow export paths to see imports [#12134](https://github.com/scala/scala3/pull/12134) +- Streamline given syntax [#12107](https://github.com/scala/scala3/pull/12107) +- Export constructor proxies [#12311](https://github.com/scala/scala3/pull/12311) +- Identify package and nested package object in isSubPrefix [#12297](https://github.com/scala/scala3/pull/12297) +- Treat Refinements more like AndTypes [#12317](https://github.com/scala/scala3/pull/12317) +- Fix [#9871](https://github.com/scala/scala3/pull/9871): use toNestedPairs in provablyDisjoint [#10560](https://github.com/scala/scala3/pull/10560) # Contributors @@ -147,7 +147,7 @@ According to `git shortlog -sn --no-merges 3.0.0-RC2..3.0.1-RC1`† these are: ## Library authors: Join our community build Scala 3 now has a set of widely-used community libraries that are built against every nightly Scala 3 snapshot. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_blog/_posts/2021-06-25-scala301-rc2.md b/docs/_blog/_posts/2021-06-25-scala301-rc2.md index 76257d1a8664..054a8dfc9d56 100644 --- a/docs/_blog/_posts/2021-06-25-scala301-rc2.md +++ b/docs/_blog/_posts/2021-06-25-scala301-rc2.md @@ -6,9 +6,9 @@ authorImg: /images/anatolii.png date: 2021-06-25 --- -This post is a quick announcement of Scala 3.0.1-RC2. 
This is the second release candidate for 3.0.1. The reason for this release is that a regression with respect to 3.0.0 was introduced by PR [#12519](https://github.com/lampepfl/dotty/pull/12519) which caused the compiler to fail where it shouldn't. We have fixed this regression in PR [#12827](https://github.com/lampepfl/dotty/pull/12827) and backported it to 3.0.1. This is the main reason for having 3.0.1-RC2 before 3.0.1 which is due in one week. +This post is a quick announcement of Scala 3.0.1-RC2. This is the second release candidate for 3.0.1. The reason for this release is that a regression with respect to 3.0.0 was introduced by PR [#12519](https://github.com/scala/scala3/pull/12519) which caused the compiler to fail where it shouldn't. We have fixed this regression in PR [#12827](https://github.com/scala/scala3/pull/12827) and backported it to 3.0.1. This is the main reason for having 3.0.1-RC2 before 3.0.1 which is due in one week. -Besides this main change, taking advantage of the fact that RC2 is happening, we have also included various SBT reporting improvements (PR [#12845](https://github.com/lampepfl/dotty/pull/12845)) which should improve interaction with [Metals](https://scalameta.org/metals/). Also we've backported a few infrastructural fixes even though they aren't a regression from 3.0.0. +Besides this main change, taking advantage of the fact that RC2 is happening, we have also included various SBT reporting improvements (PR [#12845](https://github.com/scala/scala3/pull/12845)) which should improve interaction with [Metals](https://scalameta.org/metals/). Also we've backported a few infrastructural fixes even though they aren't a regression from 3.0.0. @@ -29,7 +29,7 @@ According to `git shortlog -sn --no-merges 3.0.1-RC1..3.0.1-RC2` these are: ## Library authors: Join our community build Scala 3 now has a set of widely-used community libraries that are built against every nightly Scala 3 snapshot. -Join our [community build](https://github.com/lampepfl/dotty/tree/main/community-build) +Join our [community build](https://github.com/scala/scala3/tree/main/community-build) to make sure that our regression suite includes your library. [Scastie]: https://scastie.scala-lang.org/?target=dotty diff --git a/docs/_docs/contributing/architecture/context.md b/docs/_docs/contributing/architecture/context.md index cd38ee437867..61cb88ad5494 100644 --- a/docs/_docs/contributing/architecture/context.md +++ b/docs/_docs/contributing/architecture/context.md @@ -50,4 +50,4 @@ convention is that the `Context` be an explicit parameter, to track its usage. | ... 
| and so on | -[Contexts]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Contexts.scala +[Contexts]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Contexts.scala diff --git a/docs/_docs/contributing/architecture/lifecycle.md b/docs/_docs/contributing/architecture/lifecycle.md index 2cf58f477da3..30ca934ede71 100644 --- a/docs/_docs/contributing/architecture/lifecycle.md +++ b/docs/_docs/contributing/architecture/lifecycle.md @@ -78,13 +78,13 @@ tools // contains helpers and the `scala` generic runner ``` -[Phases]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Phases.scala -[CompilationUnit]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/CompilationUnit.scala +[Phases]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Phases.scala +[CompilationUnit]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/CompilationUnit.scala -[dotty.tools]: https://github.com/lampepfl/dotty/tree/master/compiler/src/dotty/tools -[ScalaSettings]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala +[dotty.tools]: https://github.com/scala/scala3/tree/master/compiler/src/dotty/tools +[ScalaSettings]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala [syntax]: https://docs.scala-lang.org/scala3/reference/syntax.html -[Main]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/Main.scala -[Driver]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/Driver.scala -[Compiler]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/Compiler.scala -[Run]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/Run.scala \ No newline at end of file +[Main]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/Main.scala +[Driver]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/Driver.scala +[Compiler]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/Compiler.scala +[Run]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/Run.scala \ No newline at end of file diff --git a/docs/_docs/contributing/architecture/phases.md b/docs/_docs/contributing/architecture/phases.md index 844ae144dddb..8e63de04dadb 100644 --- a/docs/_docs/contributing/architecture/phases.md +++ b/docs/_docs/contributing/architecture/phases.md @@ -85,24 +85,24 @@ suitable for the runtime system, with two sub-groupings: ### `backendPhases` These map the transformed trees to Java classfiles or SJSIR files. 
-[CompilationUnit]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/CompilationUnit.scala -[Compiler]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/Compiler.scala -[Phase]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Phases.scala -[MiniPhase]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/MegaPhase.scala -[Run]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/Run.scala -[parser]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/parsing/ParserPhase.scala -[typer]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/typer/TyperPhase.scala -[posttyper]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/PostTyper.scala -[prepjsinterop]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSInterop.scala -[pickler]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/Pickler.scala -[inlining]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/Inlining.scala -[postInlining]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/PostInlining.scala -[staging]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/Staging.scala -[pickleQuotes]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/PickleQuotes.scala -[refchecks]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/typer/RefChecks.scala -[initChecker]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/init/Checker.scala -[firstTransform]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/FirstTransform.scala -[patternMatcher]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/PatternMatcher.scala -[erasure]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/Erasure.scala -[Mirror]: https://github.com/lampepfl/dotty/blob/master/library/src/scala/deriving/Mirror.scala +[CompilationUnit]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/CompilationUnit.scala +[Compiler]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/Compiler.scala +[Phase]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Phases.scala +[MiniPhase]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/MegaPhase.scala +[Run]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/Run.scala +[parser]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/parsing/ParserPhase.scala +[typer]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/typer/TyperPhase.scala +[posttyper]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +[prepjsinterop]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSInterop.scala +[pickler]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/Pickler.scala +[inlining]: 
https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/Inlining.scala +[postInlining]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/PostInlining.scala +[staging]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/Staging.scala +[pickleQuotes]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/PickleQuotes.scala +[refchecks]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +[initChecker]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/init/Checker.scala +[firstTransform]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/FirstTransform.scala +[patternMatcher]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/PatternMatcher.scala +[erasure]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/Erasure.scala +[Mirror]: https://github.com/scala/scala3/blob/master/library/src/scala/deriving/Mirror.scala [PCP]: ../../reference/metaprogramming/macros.md#the-phase-consistency-principle diff --git a/docs/_docs/contributing/architecture/symbols.md b/docs/_docs/contributing/architecture/symbols.md index c19588a4ff12..c11c054b4967 100644 --- a/docs/_docs/contributing/architecture/symbols.md +++ b/docs/_docs/contributing/architecture/symbols.md @@ -60,11 +60,11 @@ All definition symbols will contain a `SymDenotation`. The denotation, in turn, A class symbol will instead be associated with a `ClassDenotation`, which extends `SymDenotation` with some additional fields specific for classes. -[Signature1]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Signature.scala#L9-L33 -[Symbols]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Symbols.scala -[flatten]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/Flatten.scala -[lambdaLift]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/LambdaLift.scala -[CompilationUnit]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/CompilationUnit.scala -[Denotations]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Denotations.scala -[SymDenotations]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/SymDenotations.scala -[flags]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Flags.scala +[Signature1]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Signature.scala#L9-L33 +[Symbols]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Symbols.scala +[flatten]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/Flatten.scala +[lambdaLift]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/LambdaLift.scala +[CompilationUnit]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/CompilationUnit.scala +[Denotations]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Denotations.scala +[SymDenotations]: 
https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/SymDenotations.scala +[flags]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Flags.scala diff --git a/docs/_docs/contributing/architecture/time.md b/docs/_docs/contributing/architecture/time.md index 588b1ce40bb2..56a6cf7447a1 100644 --- a/docs/_docs/contributing/architecture/time.md +++ b/docs/_docs/contributing/architecture/time.md @@ -61,8 +61,8 @@ method foo after typer => (b: Box)(x: b.X): scala.collection.immutable.List[b. method foo after erasure => (b: Box, x: Object): scala.collection.immutable.List ``` -[runs]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/Run.scala -[periods]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Periods.scala -[Contexts]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Contexts.scala -[typer]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/typer/TyperPhase.scala -[erasure]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/Erasure.scala +[runs]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/Run.scala +[periods]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Periods.scala +[Contexts]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Contexts.scala +[typer]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/typer/TyperPhase.scala +[erasure]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/Erasure.scala diff --git a/docs/_docs/contributing/architecture/types.md b/docs/_docs/contributing/architecture/types.md index 2dfdc33101a0..ed8995c08643 100644 --- a/docs/_docs/contributing/architecture/types.md +++ b/docs/_docs/contributing/architecture/types.md @@ -143,5 +143,5 @@ Type -+- proxy_type --+- NamedType --------+- TypeRef ``` -[Types.scala]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Types.scala -[DottyTypeStealer]: https://github.com/lampepfl/dotty/blob/master/compiler/test/dotty/tools/DottyTypeStealer.scala +[Types.scala]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Types.scala +[DottyTypeStealer]: https://github.com/scala/scala3/blob/master/compiler/test/dotty/tools/DottyTypeStealer.scala diff --git a/docs/_docs/contributing/community-build.md b/docs/_docs/contributing/community-build.md index b382786c614e..e333e4985e36 100644 --- a/docs/_docs/contributing/community-build.md +++ b/docs/_docs/contributing/community-build.md @@ -32,14 +32,14 @@ project to the community build you can follow these steps: check out the [Scala 3 Migration Guide](https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html). You can see the submodules in - [community-projects](https://github.com/lampepfl/dotty/tree/main/community-build/community-projects/) + [community-projects](https://github.com/scala/scala3/tree/main/community-build/community-projects/) for examples of projects that compile with Scala 3. 2. 
Open a PR against this repo that: - Adds your project as a new git submodule - `git submodule add https://github.com/dotty-staging/XYZ.git community-build/community-projects/XYZ` - - Add the project to [projects.scala](https://github.com/lampepfl/dotty/blob/main/community-build/src/scala/dotty/communitybuild/projects.scala) - - Adds a test in [CommunityBuildTest.scala](https://github.com/lampepfl/dotty/blob/main/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala) + - Add the project to [projects.scala](https://github.com/scala/scala3/blob/main/community-build/src/scala/dotty/communitybuild/projects.scala) + - Adds a test in [CommunityBuildTest.scala](https://github.com/scala/scala3/blob/main/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala) 3. Once the CI is green, someone from the Dotty team will fork your repo and add it to [dotty-staging](https://github.com/dotty-staging). This enables us to diff --git a/docs/_docs/contributing/debugging/ide-debugging.md b/docs/_docs/contributing/debugging/ide-debugging.md index af817826565a..8548235672af 100644 --- a/docs/_docs/contributing/debugging/ide-debugging.md +++ b/docs/_docs/contributing/debugging/ide-debugging.md @@ -74,7 +74,7 @@ To locate them on your filesystem you can run the `export scala3-library-bootstr ``` $ sbt > export scala3-library-bootstrapped/fullClasspath -/home/user/lampepfl/dotty/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar +/home/user/scala/scala3/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar [success] Total time: 1 s, completed Mar 10, 2023, 4:37:43 PM ``` @@ -93,7 +93,7 @@ Here is the final configuration: "../tests/pos/HelloWorld.scala", "-classpath", // To replace with your own paths - "/home/user/lampepfl/dotty/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar", + "/home/user/scala/scala3/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar", "-color", "never" ], @@ -112,7 +112,7 @@ You can compile more than one Scala file, by adding them in the `args`: "file1.scala", "file2.scala", "-classpath", - "/home/user/lampepfl/dotty/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar" + "/home/user/scala/scala3/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar" ] ``` @@ -132,7 +132,7 @@ And concatenate the output into the classpath argument, which should already con "args": [ "using-cats.scala", "-classpath", - 
"/home/user/lampepfl/dotty/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/typelevel/cats-core_3/2.9.0/cats-core_3-2.9.0.jar:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/typelevel/cats-kernel_3/2.9.0/cats-kernel_3-2.9.0.jar" + "/home/user/scala/scala3/out/bootstrap/scala3-library-bootstrapped/scala-3.3.1-RC1-bin-SNAPSHOT-nonbootstrapped/classes:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.10/scala-library-2.13.10.jar:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/typelevel/cats-core_3/2.9.0/cats-core_3-2.9.0.jar:/home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/org/typelevel/cats-kernel_3/2.9.0/cats-kernel_3-2.9.0.jar" ] ``` diff --git a/docs/_docs/contributing/debugging/inspection.md b/docs/_docs/contributing/debugging/inspection.md index a80c3d3462ae..7cb1fa68abff 100644 --- a/docs/_docs/contributing/debugging/inspection.md +++ b/docs/_docs/contributing/debugging/inspection.md @@ -181,6 +181,6 @@ class StealBox: assert(empty.name.toString == "") ``` -[DottyTypeStealer]: https://github.com/lampepfl/dotty/blob/master/compiler/test/dotty/tools/DottyTypeStealer.scala -[ScalaSettings]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala -[symbols]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/SymDenotations.scala +[DottyTypeStealer]: https://github.com/scala/scala3/blob/master/compiler/test/dotty/tools/DottyTypeStealer.scala +[ScalaSettings]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala +[symbols]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/SymDenotations.scala diff --git a/docs/_docs/contributing/debugging/other-debugging.md b/docs/_docs/contributing/debugging/other-debugging.md index 50be43db51ab..e8b72bcca656 100644 --- a/docs/_docs/contributing/debugging/other-debugging.md +++ b/docs/_docs/contributing/debugging/other-debugging.md @@ -152,7 +152,7 @@ To print out the trees after all phases: scalac -Xprint:all ../issues/Playground.scala ``` -To find out the list of all the phases and their names, check out [this](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/Compiler.scala#L34) line in `Compiler.scala`. Each `Phase` object has `phaseName` defined on it, this is the phase name. +To find out the list of all the phases and their names, check out [this](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/Compiler.scala#L34) line in `Compiler.scala`. Each `Phase` object has `phaseName` defined on it, this is the phase name. ## Printing out stack traces of compile time errors You can use the flag `-Ydebug-error` to get the stack trace of all the compile-time errors. 
Consider the following file: @@ -207,7 +207,7 @@ val YshowVarBounds = BooleanSetting("-Yshow-var-bounds" , "Print type varia val YtestPickler = BooleanSetting("-Ytest-pickler" , "self-test for pickling functionality; should be used with -Ystop-after:pickler") ``` -They are defined in [ScalaSettings.scala](https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala). E.g. `YprintPos` is defined as: +They are defined in [ScalaSettings.scala](https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala). E.g. `YprintPos` is defined as: ```scala val YprintPos: Setting[Boolean] = BooleanSetting("-Yprint-pos", "show tree positions.") @@ -244,7 +244,7 @@ package @ { ### Figuring out an object creation site #### Via ID -Every [Positioned](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/ast/Positioned.scala) (a parent class of `Tree`) object has a `uniqueId` field. It is an integer that is unique for that tree and doesn't change from compile run to compile run. You can output these IDs from any printer (such as the ones used by `.show` and `-Xprint`) via `-Yshow-tree-ids` flag, e.g.: +Every [Positioned](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/ast/Positioned.scala) (a parent class of `Tree`) object has a `uniqueId` field. It is an integer that is unique for that tree and doesn't change from compile run to compile run. You can output these IDs from any printer (such as the ones used by `.show` and `-Xprint`) via `-Yshow-tree-ids` flag, e.g.: ```shell scalac -Xprint:typer -Yshow-tree-ids ../issues/Playground.scala @@ -355,7 +355,7 @@ if (tree.show == """println("Hello World")""") { } ``` -In other words, you have a reference to the object and want to know were it was created. To do so, go to the class definition of that object. In our case, `tree` is a [`Tree`](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/ast/Trees.scala#L52). Now, create a new `val` member of that type: +In other words, you have a reference to the object and want to know were it was created. To do so, go to the class definition of that object. In our case, `tree` is a [`Tree`](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/ast/Trees.scala#L52). Now, create a new `val` member of that type: ```scala val tracer = Thread.currentThread.getStackTrace.mkString("\n") @@ -380,7 +380,7 @@ Dotty has a lot of debug calls scattered throughout the code, most of which are These do not follow any particular system and so probably it will be easier to go with `println` most of the times instead. #### Printers -Defined in [Printers.scala](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/config/Printers.scala) as a set of variables, each responsible for its own domain. To enable them, replace `noPrinter` with `default`. [Example](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/typer/Typer.scala#L2226) from the code: +Defined in [Printers.scala](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/config/Printers.scala) as a set of variables, each responsible for its own domain. 
To enable them, replace `noPrinter` with `default`. [Example](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/typer/Typer.scala#L2226) from the code: ```scala typr.println(i"make contextual function $tree / $pt ---> $ifun") @@ -389,13 +389,13 @@ typr.println(i"make contextual function $tree / $pt ---> $ifun") `typr` is a printer. #### Tracing -Defined in [trace.scala](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/reporting/trace.scala). [Example](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/typer/Typer.scala#L2232) from the code: +Defined in [trace.scala](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/reporting/trace.scala). [Example](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/typer/Typer.scala#L2232) from the code: ```scala trace(i"typing $tree", typr, show = true) { // ... ``` -To enable globally, change [tracingEnabled](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/config/Config.scala#L164) to `true` (will recompile a lot of code). +To enable globally, change [tracingEnabled](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/config/Config.scala#L164) to `true` (will recompile a lot of code). You also need to set the printer referenced in the call (in the example, `typr`) to `default` as explained in the section on printers. @@ -406,4 +406,4 @@ trace.force(i"typing $tree", typr, show = true) { // ... ``` #### Reporter -Defined in [Reporter.scala](https://github.com/lampepfl/dotty/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/reporting/Reporter.scala). Enables calls such as `report.log`. To enable, run scalac with `-Ylog:typer` option. +Defined in [Reporter.scala](https://github.com/scala/scala3/blob/10526a7d0aa8910729b6036ee51942e05b71abf6/compiler/src/dotty/tools/dotc/reporting/Reporter.scala). Enables calls such as `report.log`. To enable, run scalac with `-Ylog:typer` option. diff --git a/docs/_docs/contributing/getting-started.md b/docs/_docs/contributing/getting-started.md index 938e7ff36d42..63e968902600 100644 --- a/docs/_docs/contributing/getting-started.md +++ b/docs/_docs/contributing/getting-started.md @@ -127,7 +127,7 @@ For more information, see the [scaladoc section](./scaladoc.md). 
## Community The main development discussion channels are: -- [github.com/lampepfl/dotty/discussions](https://github.com/lampepfl/dotty/discussions) +- [github.com/scala/scala3/discussions](https://github.com/scala/scala3/discussions) - [contributors.scala-lang.org](https://contributors.scala-lang.org) - [gitter.im/scala/contributors](https://gitter.im/scala/contributors) @@ -141,5 +141,5 @@ The main development discussion channels are: [adopt]: https://adoptopenjdk.net/ [compat]: https://docs.scala-lang.org/overviews/jdk-compatibility/overview.html [scala-cla]: https://www.lightbend.com/contribute/cla/scala -[dotty-issue]: https://github.com/lampepfl/dotty/issues -[dotty-discussion]: https://github.com/lampepfl/dotty/discussions +[dotty-issue]: https://github.com/scala/scala3/issues +[dotty-discussion]: https://github.com/scala/scala3/discussions diff --git a/docs/_docs/contributing/index.md b/docs/_docs/contributing/index.md index 507149340941..965847e39a94 100644 --- a/docs/_docs/contributing/index.md +++ b/docs/_docs/contributing/index.md @@ -12,9 +12,9 @@ also documents the inner workings of the Scala 3 compiler, `dotc`. Keep in mind that the code for `dotc` is continually changing, so the ideas discussed in this guide may fall out of date. This is a living document, so please consider contributing to it on -[GitHub](https://github.com/lampepfl/dotty/tree/main/docs/_docs/contributing) if +[GitHub](https://github.com/scala/scala3/tree/main/docs/_docs/contributing) if you notice anything out of date, or report any issues -[here](https://github.com/lampepfl/dotty/issues). +[here](https://github.com/scala/scala3/issues). ### Get the most from This Guide diff --git a/docs/_docs/contributing/issues/areas.md b/docs/_docs/contributing/issues/areas.md index ce27e9c0a5aa..9206d608ffbb 100644 --- a/docs/_docs/contributing/issues/areas.md +++ b/docs/_docs/contributing/issues/areas.md @@ -55,17 +55,17 @@ See [Inliner]. #### Compiletime Ops Types See `tryCompiletimeConstantFold` in [Types]. 
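For orientation on this area, the following is a minimal sketch of what compiletime ops types look like from the user's side; the `scala.compiletime.ops` import is the standard-library one, while the value and alias names are made up for the example. The constant folding they rely on is what the `tryCompiletimeConstantFold` method mentioned above handles.

```scala
import scala.compiletime.ops.int.*

// The compiler constant-folds the type-level expression `2 + 3` to the
// literal type `5`, so this annotation typechecks with no runtime arithmetic.
val five: 2 + 3 = 5

// Folding also applies through type aliases.
type Doubled[N <: Int] = N * 2
val ten: Doubled[5] = 10
```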
-[Showable]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/printing/Showable.scala -[PlainPrinter]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala -[RefinedPrinter]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala -[ErrorMessageID]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala -[messages]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/reporting/messages.scala -[Synthesizer]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala -[SyntheticMembers]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/SyntheticMembers.scala -[quotes-impl]: https://github.com/lampepfl/dotty/tree/master/compiler/src/scala/quoted/runtime/impl -[Inliner]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/inlines/Inliner.scala -[Types]: https://github.com/lampepfl/dotty/tree/master/compiler/src/dotty/tools/dotc/core/Types.scala -[Completion]: https://github.com/lampepfl/dotty/tree/master/compiler/src/dotty/tools/dotc/interactive/Completion.scala -[DesugarEnums]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala -[Desugar]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/ast/Desugar.scala -[Space]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala +[Showable]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/printing/Showable.scala +[PlainPrinter]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala +[RefinedPrinter]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala +[ErrorMessageID]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala +[messages]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/reporting/messages.scala +[Synthesizer]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala +[SyntheticMembers]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/SyntheticMembers.scala +[quotes-impl]: https://github.com/scala/scala3/tree/master/compiler/src/scala/quoted/runtime/impl +[Inliner]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/inlines/Inliner.scala +[Types]: https://github.com/scala/scala3/tree/master/compiler/src/dotty/tools/dotc/core/Types.scala +[Completion]: https://github.com/scala/scala3/tree/master/compiler/src/dotty/tools/dotc/interactive/Completion.scala +[DesugarEnums]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala +[Desugar]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/ast/Desugar.scala +[Space]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala diff --git a/docs/_docs/contributing/issues/cause.md b/docs/_docs/contributing/issues/cause.md index e23f6d1f747f..f96d3b6d2f8a 100644 --- a/docs/_docs/contributing/issues/cause.md +++ 
b/docs/_docs/contributing/issues/cause.md @@ -116,7 +116,7 @@ def myInfo: Type = myInfo_debug, def myInfo_=(x: Type) = { tracer = Thread.currentThread.getStackTrace.mkString("\n"); myInfo_debug = x } ``` -[Printers]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/config/Printers.scala -[Denotation]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/Denotations.scala -[PostTyper]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/transform/PostTyper.scala -[ScalaSettings]: https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala +[Printers]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/config/Printers.scala +[Denotation]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/core/Denotations.scala +[PostTyper]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +[ScalaSettings]: https://github.com/scala/scala3/blob/master/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala diff --git a/docs/_docs/contributing/issues/reproduce.md b/docs/_docs/contributing/issues/reproduce.md index ca5da324a867..ae031a44d76f 100644 --- a/docs/_docs/contributing/issues/reproduce.md +++ b/docs/_docs/contributing/issues/reproduce.md @@ -159,7 +159,7 @@ scala -classpath $here/out Test # Run main method of `Test` generated by the co In this section, you have seen how to reproduce an issue locally, and next you will see how to try and detect its root cause. -[lampepfl/dotty]: https://github.com/lampepfl/dotty/issues -[#7710]: https://github.com/lampepfl/dotty/issues/7710 +[lampepfl/dotty]: https://github.com/scala/scala3/issues +[#7710]: https://github.com/scala/scala3/issues/7710 [dotty-issue-workspace]: https://github.com/anatoliykmetyuk/dotty-issue-workspace [workspace-readme]: https://github.com/anatoliykmetyuk/dotty-issue-workspace#getting-started diff --git a/docs/_docs/contributing/procedures/release.md b/docs/_docs/contributing/procedures/release.md index c54bb637aff5..88e1beb1b93b 100644 --- a/docs/_docs/contributing/procedures/release.md +++ b/docs/_docs/contributing/procedures/release.md @@ -28,7 +28,7 @@ Say we want to release the 0.14.0 version. In this section we describe the proce CI is set to automatically detect the tags of the format discussed above and perform the required release operations. Precisely, it will do two things for the release tags: - Publish the release jars to Maven -- Create the drafts at the GitHub [release](https://github.com/lampepfl/dotty/releases) page of the repository with the artefacts of the release. +- Create the drafts at the GitHub [release](https://github.com/scala/scala3/releases) page of the repository with the artefacts of the release. The CI operation is entirely automatic provided you have tagged the release correctly. No need to do anything here. @@ -50,11 +50,11 @@ However, you may end up with as many as 6 tasks being run. The auxiliary tasks m ### Release Procedure Checklist Before we start the release procedure, we create an issue with a release checklist. As we go through the release, we update the checklist. 
To generate the checklist, run the following command: -`bash <(curl -sL https://raw.githubusercontent.com/lampepfl/dotty/main/docs/docs/contributing/checklist.sh) ` +`bash <(curl -sL https://raw.githubusercontent.com/scala/scala3/main/docs/docs/contributing/checklist.sh) ` Above, `` is the stable version being released. For example, if you are releasing `0.14.0` and `0.15.0-RC1`, this variable is `14` and the command is as follows: -`bash <(curl -sL https://raw.githubusercontent.com/lampepfl/dotty/main/docs/docs/contributing/checklist.sh) 14` +`bash <(curl -sL https://raw.githubusercontent.com/scala/scala3/main/docs/docs/contributing/checklist.sh) 14` Copy and paste the output into the release issue. @@ -73,10 +73,10 @@ After the release is done, we document it as follows: During the release process we ensure that various parts of the community are also prepared for the new version of Scala so that users can hit the ground running when the new release is announced. You can see an example of this -[here](https://github.com/lampepfl/dotty/issues/17559). +[here](https://github.com/scala/scala3/issues/17559). # Procedure in Bash Scripts -The below procedure is compiled from [this](https://github.com/lampepfl/dotty/issues/5907#issue-409313505) and [this](https://github.com/lampepfl/dotty/issues/6235#issue-429265748) checklists. It assumes we want to publish the `0.14.0` given the `0.14.0-RC1` release candidate. +The below procedure is compiled from [this](https://github.com/scala/scala3/issues/5907#issue-409313505) and [this](https://github.com/scala/scala3/issues/6235#issue-429265748) checklists. It assumes we want to publish the `0.14.0` given the `0.14.0-RC1` release candidate. Note that at the same time we will also publish the `0.15.0-RC1` release. We publish two releases at the same time as per the logic outlined at the [Example/At the Dotty Repo](#at-the-dotty-repo) and the [Model](#model) sections above: the step (5) in the algorithm outlined in the [Example](#at-the-dotty-repo) for the release cycle of `0.14.0` is the step (1) in the release cycle of `0.15.0`. @@ -101,7 +101,7 @@ git merge 0.14.x git push origin main ######## Publish the 0.15.0-RC1 unstable version – begin the release cycle for 0.15.0 ######## -# Move all the unfinished tasks from Milestone 15 to Milestone 16 on GitHub – see https://github.com/lampepfl/dotty/milestones +# Move all the unfinished tasks from Milestone 15 to Milestone 16 on GitHub – see https://github.com/scala/scala3/milestones git checkout -b 0.15.x diff --git a/docs/_docs/contributing/sending-in-a-pr.md b/docs/_docs/contributing/sending-in-a-pr.md index c99e6a28172b..0c276e2c9287 100644 --- a/docs/_docs/contributing/sending-in-a-pr.md +++ b/docs/_docs/contributing/sending-in-a-pr.md @@ -104,7 +104,7 @@ every part of CI. For example, maybe you're just updating some documentation and there is no need to run the community build for this. We skip parts of the CI by utilizing keywords inside of brackets. The most up-to-date way to see this are by looking in the `if` statements of jobs. For example you can see some -[here](https://github.com/lampepfl/dotty/blob/5d2812a5937389f8a46f9e97ab9cbfbb3f298d87/.github/workflows/ci.yaml#L54-L64). +[here](https://github.com/scala/scala3/blob/5d2812a5937389f8a46f9e97ab9cbfbb3f298d87/.github/workflows/ci.yaml#L54-L64). Below are commonly used ones: @@ -160,8 +160,8 @@ you're PR will be merged in! 
[pull-request]: https://docs.github.com/en?query=pull+requests [lampepfl/dotty]: https://github.com/lampepfl/dotty [cla]: http://typesafe.com/contribute/cla/scala -[issues]: https://github.com/lampepfl/dotty/issues -[full-list]: https://github.com/lampepfl/dotty/blob/master/CONTRIBUTING.md +[issues]: https://github.com/scala/scala3/issues +[full-list]: https://github.com/scala/scala3/blob/master/CONTRIBUTING.md [discord]: https://discord.gg/TSmY9zkHar [dry]: https://www.oreilly.com/library/view/97-things-every/9780596809515/ch30.html [scouts]: https://www.oreilly.com/library/view/97-things-every/9780596809515/ch08.html diff --git a/docs/_docs/contributing/setting-up-your-ide.md b/docs/_docs/contributing/setting-up-your-ide.md index 3bb7d329d50c..a02c1dee63cb 100644 --- a/docs/_docs/contributing/setting-up-your-ide.md +++ b/docs/_docs/contributing/setting-up-your-ide.md @@ -34,7 +34,7 @@ want to make sure you do two things: 1. You'll want to find and change the following under `commonBootstrappedSettings` which is found in the - [`Build.scala`](https://github.com/lampepfl/dotty/blob/main/project/Build.scala) + [`Build.scala`](https://github.com/scala/scala3/blob/main/project/Build.scala) file. ```diff diff --git a/docs/_docs/contributing/testing.md b/docs/_docs/contributing/testing.md index 3f3e5421b9bd..039b37ead8bf 100644 --- a/docs/_docs/contributing/testing.md +++ b/docs/_docs/contributing/testing.md @@ -246,7 +246,7 @@ can enter an inconsistent state and cause spurious test failures. If you suspect you can run `rm -rf out/*` from the root of the repository and run your tests again. If that fails, you can try `git clean -xfd`. -[CompilationTests]: https://github.com/lampepfl/dotty/blob/master/compiler/test/dotty/tools/dotc/CompilationTests.scala -[compiler/test]: https://github.com/lampepfl/dotty/blob/master/compiler/test/ -[compiler/test/dotc]: https://github.com/lampepfl/dotty/tree/master/compiler/test/dotc -[SemanticdbTests]: https://github.com/lampepfl/dotty/blob/master/compiler/test/dotty/tools/dotc/semanticdb/SemanticdbTests.scala +[CompilationTests]: https://github.com/scala/scala3/blob/master/compiler/test/dotty/tools/dotc/CompilationTests.scala +[compiler/test]: https://github.com/scala/scala3/blob/master/compiler/test/ +[compiler/test/dotc]: https://github.com/scala/scala3/tree/master/compiler/test/dotc +[SemanticdbTests]: https://github.com/scala/scala3/blob/master/compiler/test/dotty/tools/dotc/semanticdb/SemanticdbTests.scala diff --git a/docs/_docs/internals/coverage.md b/docs/_docs/internals/coverage.md index 162aa182a1e0..923908683721 100644 --- a/docs/_docs/internals/coverage.md +++ b/docs/_docs/internals/coverage.md @@ -5,7 +5,7 @@ title: "Code Coverage for Scala 3" ## Instrument code for coverage analysis -[PR#13880](https://github.com/lampepfl/dotty/pull/13880) has implemented code coverage support for Dotty. +[PR#13880](https://github.com/scala/scala3/pull/13880) has implemented code coverage support for Dotty. In general, code coverage "instruments" the program at compile time: code is inserted to record which statement are called. This does not change the behavior of the program. Also, a list of all the coverable statements is produced. To use this feature, add the compile option `-coverage-out:DIR`, where `DIR` is the destination of the measurement files. 
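As a rough sketch of how that option might be wired into a build (this sbt snippet is only an illustration and not part of the feature itself; the flag is the one described above, and the output directory is arbitrary):

```scala
// build.sbt — pass the coverage flag described above to the Scala 3 compiler,
// writing the measurement files to target/coverage
Compile / scalacOptions += "-coverage-out:target/coverage"
```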
diff --git a/docs/_docs/internals/debug-macros.md b/docs/_docs/internals/debug-macros.md index 4d5cab52c568..a085ebbde355 100644 --- a/docs/_docs/internals/debug-macros.md +++ b/docs/_docs/internals/debug-macros.md @@ -34,7 +34,7 @@ the stack trace, we will be able to figure out where the tree is created. If the position is in the compiler, then either report a compiler bug or fix the problem with `.withSpan(tree.span)`. The following fix is an example: -- https://github.com/lampepfl/dotty/pull/6581 +- https://github.com/scala/scala3/pull/6581 ## unresolved symbols in pickling diff --git a/docs/_docs/internals/dotc-scalac.md b/docs/_docs/internals/dotc-scalac.md index 03baad375eb1..e5335c734891 100644 --- a/docs/_docs/internals/dotc-scalac.md +++ b/docs/_docs/internals/dotc-scalac.md @@ -133,6 +133,6 @@ if (sym is Flags.PackageClass) // Scala 3 (*) * `MethodType(paramSyms, resultType)` from scalac => `mt @ MethodType(paramNames, paramTypes)`. Result type is `mt.resultType` -[Denotations1]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Denotations.scala#L27-L72 -[Denotations2]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Denotations.scala#L77-L103 -[Signature1]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Signature.scala#L9-L33 +[Denotations1]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Denotations.scala#L27-L72 +[Denotations2]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Denotations.scala#L77-L103 +[Signature1]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Signature.scala#L9-L33 diff --git a/docs/_docs/internals/overall-structure.md b/docs/_docs/internals/overall-structure.md index ab936ddd8512..a25c287e16c9 100644 --- a/docs/_docs/internals/overall-structure.md +++ b/docs/_docs/internals/overall-structure.md @@ -235,10 +235,10 @@ Phases fall into four categories: * Code generators: These map the transformed trees to Java classfiles or .sjsir files. 
-[dotty.tools]: https://github.com/lampepfl/dotty/tree/main/compiler/src/dotty/tools -[dotc]: https://github.com/lampepfl/dotty/tree/main/compiler/src/dotty/tools/dotc -[Main]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/Main.scala -[Driver]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/Driver.scala -[Compiler]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/Compiler.scala -[Run]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/Run.scala -[Context]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/core/Contexts.scala +[dotty.tools]: https://github.com/scala/scala3/tree/main/compiler/src/dotty/tools +[dotc]: https://github.com/scala/scala3/tree/main/compiler/src/dotty/tools/dotc +[Main]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/Main.scala +[Driver]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/Driver.scala +[Compiler]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/Compiler.scala +[Run]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/Run.scala +[Context]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/core/Contexts.scala diff --git a/docs/_docs/internals/periods.md b/docs/_docs/internals/periods.md index 46241da0bb17..bf9c4a5fe786 100644 --- a/docs/_docs/internals/periods.md +++ b/docs/_docs/internals/periods.md @@ -88,6 +88,6 @@ object Period { As a sentinel value there's `Nowhere`, a period that is empty. -[runs]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/Run.scala -[phases]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Phases.scala -[period]: https://github.com/lampepfl/dotty/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Periods.scala +[runs]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/Run.scala +[phases]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Phases.scala +[period]: https://github.com/scala/scala3/blob/a527f3b1e49c0d48148ccfb2eb52e3302fc4a349/compiler/src/dotty/tools/dotc/core/Periods.scala diff --git a/docs/_docs/internals/type-system.md b/docs/_docs/internals/type-system.md index 8fa8912a7118..d2c0cd869e61 100644 --- a/docs/_docs/internals/type-system.md +++ b/docs/_docs/internals/type-system.md @@ -95,7 +95,7 @@ checks if `tp1` is a subtype of `tp2`. ### Type rebasing ### **FIXME**: This section is no longer accurate because -https://github.com/lampepfl/dotty/pull/331 changed the handling of refined +https://github.com/scala/scala3/pull/331 changed the handling of refined types. 
Consider [tests/pos/refinedSubtyping.scala][5] @@ -132,8 +132,8 @@ TODO ## Type inference via constraint solving ## TODO -[1]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/core/Types.scala +[1]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/core/Types.scala [2]: https://github.com/samuelgruetter/dotty/blob/classdiagrampdf/dotty-types.pdf [3]: https://github.com/samuelgruetter/scaladiagrams/tree/print-descendants -[4]: https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/core/TypeComparer.scala -[5]: https://github.com/lampepfl/dotty/blob/main/tests/pos/refinedSubtyping.scala +[4]: https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +[5]: https://github.com/scala/scala3/blob/main/tests/pos/refinedSubtyping.scala diff --git a/docs/_docs/reference/changed-features/eta-expansion-spec.md b/docs/_docs/reference/changed-features/eta-expansion-spec.md index 714ab37ae11a..516764ef5370 100644 --- a/docs/_docs/reference/changed-features/eta-expansion-spec.md +++ b/docs/_docs/reference/changed-features/eta-expansion-spec.md @@ -74,4 +74,4 @@ The method value syntax `m _` is deprecated. ## Reference -For more information, see [PR #2701](https://github.com/lampepfl/dotty/pull/2701). +For more information, see [PR #2701](https://github.com/scala/scala3/pull/2701). diff --git a/docs/_docs/reference/changed-features/implicit-conversions-spec.md b/docs/_docs/reference/changed-features/implicit-conversions-spec.md index dc19e10c8b8f..a70321b70c15 100644 --- a/docs/_docs/reference/changed-features/implicit-conversions-spec.md +++ b/docs/_docs/reference/changed-features/implicit-conversions-spec.md @@ -114,4 +114,4 @@ changes to implicit resolution, refer to the [Changes in Implicit Resolution](im ## Reference For more information about implicit resolution, see [Changes in Implicit Resolution](implicit-resolution.md). -Other details are available in [PR #2065](https://github.com/lampepfl/dotty/pull/2065). +Other details are available in [PR #2065](https://github.com/scala/scala3/pull/2065). diff --git a/docs/_docs/reference/changed-features/structural-types-spec.md b/docs/_docs/reference/changed-features/structural-types-spec.md index 18d0f31ee6fe..216b738ae61c 100644 --- a/docs/_docs/reference/changed-features/structural-types-spec.md +++ b/docs/_docs/reference/changed-features/structural-types-spec.md @@ -16,7 +16,7 @@ RefineStat ::= ‘val’ VarDcl | ‘def’ DefDcl | ‘type’ {nl} TypeDcl ## Implementation of Structural Types The standard library defines a universal marker trait -[`scala.Selectable`](https://github.com/lampepfl/dotty/blob/main/library/src/scala/Selectable.scala): +[`scala.Selectable`](https://github.com/scala/scala3/blob/main/library/src/scala/Selectable.scala): ```scala trait Selectable extends Any @@ -150,4 +150,4 @@ conversion that can turn `v` into a `Selectable`, and the selection methods coul ## Context -For more information, see [Rethink Structural Types](https://github.com/lampepfl/dotty/issues/1886). +For more information, see [Rethink Structural Types](https://github.com/scala/scala3/issues/1886). 
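To make the `Selectable`-based selection described above concrete, here is a small self-contained sketch; `Record` and `Person` are illustrative names for this example, not part of the specification:

```scala
class Record(elems: (String, Any)*) extends Selectable:
  private val fields = elems.toMap
  def selectDynamic(name: String): Any = fields(name)

type Person = Record { val name: String; val age: Int }

// The cast is how a structural type is typically introduced over a Selectable;
// member selections on `person` are then routed through `selectDynamic`.
val person = Record("name" -> "Emma", "age" -> 42).asInstanceOf[Person]
val greeting = s"${person.name} is ${person.age}"
```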
diff --git a/docs/_docs/reference/contextual/by-name-context-parameters.md b/docs/_docs/reference/contextual/by-name-context-parameters.md index 3515efd78fa5..7c517abe9406 100644 --- a/docs/_docs/reference/contextual/by-name-context-parameters.md +++ b/docs/_docs/reference/contextual/by-name-context-parameters.md @@ -61,5 +61,5 @@ No local given instance was generated because the synthesized argument is not re ## Reference -For more information, see [Issue #1998](https://github.com/lampepfl/dotty/issues/1998) +For more information, see [Issue #1998](https://github.com/scala/scala3/issues/1998) and the associated [Scala SIP](https://docs.scala-lang.org/sips/byname-implicits.html). diff --git a/docs/_docs/reference/contextual/multiversal-equality.md b/docs/_docs/reference/contextual/multiversal-equality.md index b51d03b10963..6258973c0cda 100644 --- a/docs/_docs/reference/contextual/multiversal-equality.md +++ b/docs/_docs/reference/contextual/multiversal-equality.md @@ -25,7 +25,7 @@ the program will still typecheck, since values of all types can be compared with But it will probably give unexpected results and fail at runtime. Multiversal equality is an opt-in way to make universal equality safer. -It uses a binary type class [`scala.CanEqual`](https://github.com/lampepfl/dotty/blob/main/library/src/scala/CanEqual.scala) +It uses a binary type class [`scala.CanEqual`](https://github.com/scala/scala3/blob/main/library/src/scala/CanEqual.scala) to indicate that values of two given types can be compared with each other. The example above would not typecheck if `S` or `T` was a class that derives `CanEqual`, e.g. @@ -71,7 +71,7 @@ given CanEqual[A, B] = CanEqual.derived given CanEqual[B, A] = CanEqual.derived ``` -The [`scala.CanEqual`](https://github.com/lampepfl/dotty/blob/main/library/src/scala/CanEqual.scala) +The [`scala.CanEqual`](https://github.com/scala/scala3/blob/main/library/src/scala/CanEqual.scala) object defines a number of `CanEqual` given instances that together define a rule book for what standard types can be compared (more details below). @@ -225,4 +225,4 @@ work under `-language:strictEquality`, since otherwise the universal `Eq[Any]` i More on multiversal equality is found in a [blog post](http://www.scala-lang.org/blog/2016/05/06/multiversal-equality.html) -and a [GitHub issue](https://github.com/lampepfl/dotty/issues/1247). +and a [GitHub issue](https://github.com/scala/scala3/issues/1247). diff --git a/docs/_docs/reference/dropped-features/auto-apply.md b/docs/_docs/reference/dropped-features/auto-apply.md index eadfe2f429ea..1a809275d4d0 100644 --- a/docs/_docs/reference/dropped-features/auto-apply.md +++ b/docs/_docs/reference/dropped-features/auto-apply.md @@ -93,4 +93,4 @@ stricter checking. ## Reference -For more information, see [Issue #2570](https://github.com/lampepfl/dotty/issues/2570) and [PR #2716](https://github.com/lampepfl/dotty/pull/2716). +For more information, see [Issue #2570](https://github.com/scala/scala3/issues/2570) and [PR #2716](https://github.com/scala/scala3/pull/2716). diff --git a/docs/_docs/reference/dropped-features/type-projection.md b/docs/_docs/reference/dropped-features/type-projection.md index 08b5ffb34eca..2c3e82ce99b8 100644 --- a/docs/_docs/reference/dropped-features/type-projection.md +++ b/docs/_docs/reference/dropped-features/type-projection.md @@ -9,7 +9,7 @@ and `A` names a type member of `T`. 
Scala 3 disallows this if `T` is an abstract type (class types and type aliases are fine). This change was made because unrestricted type projection -is [unsound](https://github.com/lampepfl/dotty/issues/1050). +is [unsound](https://github.com/scala/scala3/issues/1050). This restriction rules out the [type-level encoding of a combinator calculus](https://michid.wordpress.com/2010/01/29/scala-type-level-encoding-of-the-ski-calculus/). diff --git a/docs/_docs/reference/enums/adts.md b/docs/_docs/reference/enums/adts.md index 5219e062a633..36dd1daf0b59 100644 --- a/docs/_docs/reference/enums/adts.md +++ b/docs/_docs/reference/enums/adts.md @@ -170,4 +170,4 @@ The changes are specified below as deltas with respect to the Scala syntax given ## Reference -For more information, see [Issue #1970](https://github.com/lampepfl/dotty/issues/1970). +For more information, see [Issue #1970](https://github.com/scala/scala3/issues/1970). diff --git a/docs/_docs/reference/enums/enums.md b/docs/_docs/reference/enums/enums.md index 65051bdfb39f..8d4fca3268b0 100644 --- a/docs/_docs/reference/enums/enums.md +++ b/docs/_docs/reference/enums/enums.md @@ -180,7 +180,7 @@ scala> Color.Red.compareTo(Color.Green) val res15: Int = -1 ``` -For a more in-depth example of using Scala 3 enums from Java, see [this test](https://github.com/lampepfl/dotty/tree/main/tests/run/enum-java). In the test, the enums are defined in the `MainScala.scala` file and used from a Java source, `Test.java`. +For a more in-depth example of using Scala 3 enums from Java, see [this test](https://github.com/scala/scala3/tree/main/tests/run/enum-java). In the test, the enums are defined in the `MainScala.scala` file and used from a Java source, `Test.java`. ## Implementation @@ -218,5 +218,5 @@ val Red: Color = $new(0, "Red") ## Reference -For more information, see [Issue #1970](https://github.com/lampepfl/dotty/issues/1970) and -[PR #4003](https://github.com/lampepfl/dotty/pull/4003). +For more information, see [Issue #1970](https://github.com/scala/scala3/issues/1970) and +[PR #4003](https://github.com/scala/scala3/pull/4003). diff --git a/docs/_docs/reference/experimental/explicit-nulls.md b/docs/_docs/reference/experimental/explicit-nulls.md index 47b21d3a5e23..1925b0b3c925 100644 --- a/docs/_docs/reference/experimental/explicit-nulls.md +++ b/docs/_docs/reference/experimental/explicit-nulls.md @@ -431,7 +431,7 @@ When dealing with local mutable variables, there are two questions: x = null ``` -See [more examples](https://github.com/lampepfl/dotty/blob/main/tests/explicit-nulls/neg/flow-varref-in-closure.scala). +See [more examples](https://github.com/scala/scala3/blob/main/tests/explicit-nulls/neg/flow-varref-in-closure.scala). Currently, we are unable to track paths with a mutable variable prefix. For example, `x.a` if `x` is mutable. diff --git a/docs/_docs/reference/experimental/tupled-function.md b/docs/_docs/reference/experimental/tupled-function.md index 0cc016953a80..de683128fb0c 100644 --- a/docs/_docs/reference/experimental/tupled-function.md +++ b/docs/_docs/reference/experimental/tupled-function.md @@ -35,7 +35,7 @@ The compiler will synthesize an instance of `TupledFunction[F, G]` if: Examples -------- `TupledFunction` can be used to generalize the `Function1.tupled`, ... `Function22.tupled` methods to functions of any arities. 
-The following defines `tupled` as [extension method](../contextual/extension-methods.html) ([full example](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-tupled.scala)). +The following defines `tupled` as [extension method](../contextual/extension-methods.html) ([full example](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-tupled.scala)). ```scala /** Creates a tupled version of this function: instead of N arguments, @@ -49,7 +49,7 @@ extension [F, Args <: Tuple, R](f: F) def tupled(using tf: TupledFunction[F, Args => R]): Args => R = tf.tupled(f) ``` -`TupledFunction` can be used to generalize the `Function.untupled` to a function of any arities ([full example](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-untupled.scala)) +`TupledFunction` can be used to generalize the `Function.untupled` to a function of any arities ([full example](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-untupled.scala)) ```scala /** Creates an untupled version of this function: instead of a single argument of type [[scala.Tuple]] with N elements, @@ -65,7 +65,7 @@ extension [F, Args <: Tuple, R](f: Args => R) def untupled(using tf: TupledFunction[F, Args => R]): F = tf.untupled(f) ``` -`TupledFunction` can also be used to generalize the [`Tuple1.compose`](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-compose.scala) and [`Tuple1.andThen`](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-andThen.scala) methods to compose functions of larger arities and with functions that return tuples. +`TupledFunction` can also be used to generalize the [`Tuple1.compose`](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-compose.scala) and [`Tuple1.andThen`](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-andThen.scala) methods to compose functions of larger arities and with functions that return tuples. ```scala /** Composes two instances of TupledFunction into a new TupledFunction, with this function applied last. diff --git a/docs/_docs/reference/features-classification.md b/docs/_docs/reference/features-classification.md index 36cea3b9e72d..550130780b44 100644 --- a/docs/_docs/reference/features-classification.md +++ b/docs/_docs/reference/features-classification.md @@ -72,7 +72,7 @@ These constructs are restricted to make the language safer. - [Given Imports](contextual/given-imports.md): implicits now require a special form of import, to make the import clearly visible. - [Type Projection](dropped-features/type-projection.md): only classes can be used as prefix `C` of a type projection `C#A`. Type projection on abstract types is no longer supported since it is unsound. - [Multiversal equality](contextual/multiversal-equality.md) implements an "opt-in" scheme to rule out nonsensical comparisons with `==` and `!=`. - - [infix](https://github.com/lampepfl/dotty/pull/5975) + - [infix](https://github.com/scala/scala3/pull/5975) makes method application syntax uniform across code bases. Unrestricted implicit conversions continue to be available in Scala 3.0, but will be deprecated and removed later. Unrestricted versions of the other constructs in the list above are available only under `-source 3.0-migration`. 
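Editorial note, not part of the patch: the multiversal-equality hunks above describe `CanEqual` as an opt-in type class, so a small hedged sketch may help; it assumes `strictEquality` is enabled via the language import, and `Title` is a made-up example class.

```scala
import scala.language.strictEquality

// `derives CanEqual` opts this type into == / != comparisons with itself.
case class Title(value: String) derives CanEqual

@main def canEqualDemo(): Unit =
  println(Title("Dotty") == Title("Scala 3"))   // compiles: CanEqual[Title, Title] is derived
  // println(Title("Dotty") == "Dotty")         // rejected under strictEquality: no CanEqual[Title, String]
```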
@@ -100,7 +100,7 @@ These constructs are proposed to be dropped without a new construct replacing th - [Auto application](dropped-features/auto-apply.md), - [Weak conformance](dropped-features/weak-conformance.md), - [Compound types](new-types/intersection-types.md), - - [Auto tupling](https://github.com/lampepfl/dotty/pull/4311) (implemented, but not merged). + - [Auto tupling](https://github.com/scala/scala3/pull/4311) (implemented, but not merged). The date when these constructs are dropped varies. The current status is: @@ -148,7 +148,7 @@ These are additions to the language that make it more powerful or pleasant to us - [Enums](enums/enums.md) provide concise syntax for enumerations and [algebraic data types](enums/adts.md). - [Parameter untupling](other-new-features/parameter-untupling.md) avoids having to use `case` for tupled parameter destructuring. - [Dependent function types](new-types/dependent-function-types.md) generalize dependent methods to dependent function values and types. - - [Polymorphic function types](https://github.com/lampepfl/dotty/pull/4672) generalize polymorphic methods to dependent function values and types. _Current status_: There is a proposal, and a prototype implementation, but the implementation has not been finalized or merged yet. + - [Polymorphic function types](https://github.com/scala/scala3/pull/4672) generalize polymorphic methods to dependent function values and types. _Current status_: There is a proposal, and a prototype implementation, but the implementation has not been finalized or merged yet. - [Kind polymorphism](other-new-features/kind-polymorphism.md) allows the definition of operators working equally on types and type constructors. **Status: mixed** diff --git a/docs/_docs/reference/metaprogramming/compiletime-ops.md b/docs/_docs/reference/metaprogramming/compiletime-ops.md index 048c6b6165bb..dcefb3a16ed3 100644 --- a/docs/_docs/reference/metaprogramming/compiletime-ops.md +++ b/docs/_docs/reference/metaprogramming/compiletime-ops.md @@ -259,5 +259,5 @@ val notMissing : NotMissing = summonInlineCheck(3) ## Reference -For more information about compile-time operations, see [PR #4768](https://github.com/lampepfl/dotty/pull/4768), -which explains how `summonFrom`'s predecessor (implicit matches) can be used for typelevel programming and code specialization and [PR #7201](https://github.com/lampepfl/dotty/pull/7201) which explains the new `summonFrom` syntax. +For more information about compile-time operations, see [PR #4768](https://github.com/scala/scala3/pull/4768), +which explains how `summonFrom`'s predecessor (implicit matches) can be used for typelevel programming and code specialization and [PR #7201](https://github.com/scala/scala3/pull/7201) which explains the new `summonFrom` syntax. diff --git a/docs/_docs/reference/new-types/dependent-function-types-spec.md b/docs/_docs/reference/new-types/dependent-function-types-spec.md index f603200b1ae0..3084ff4de71c 100644 --- a/docs/_docs/reference/new-types/dependent-function-types-spec.md +++ b/docs/_docs/reference/new-types/dependent-function-types-spec.md @@ -4,7 +4,7 @@ title: "Dependent Function Types - More Details" nightlyOf: https://docs.scala-lang.org/scala3/reference/new-types/dependent-function-types-spec.html --- -Initial implementation in [PR #3464](https://github.com/lampepfl/dotty/pull/3464). +Initial implementation in [PR #3464](https://github.com/scala/scala3/pull/3464). 
## Syntax @@ -46,7 +46,7 @@ same way that other functions do, see The example below defines a trait `C` and the two dependent function types `DF` and `IDF` and prints the results of the respective function applications: -[depfuntype.scala]: https://github.com/lampepfl/dotty/blob/main/tests/pos/depfuntype.scala +[depfuntype.scala]: https://github.com/scala/scala3/blob/main/tests/pos/depfuntype.scala ```scala trait C { type M; val m: M } @@ -70,7 +70,7 @@ type IDF = (x: C) ?=> x.M In the following example the depend type `f.Eff` refers to the effect type `CanThrow`: -[eff-dependent.scala]: https://github.com/lampepfl/dotty/blob/main/tests/run/eff-dependent.scala +[eff-dependent.scala]: https://github.com/scala/scala3/blob/main/tests/run/eff-dependent.scala ```scala trait Effect diff --git a/docs/_docs/reference/new-types/intersection-types-spec.md b/docs/_docs/reference/new-types/intersection-types-spec.md index 4e26626c0b36..723720fdef89 100644 --- a/docs/_docs/reference/new-types/intersection-types-spec.md +++ b/docs/_docs/reference/new-types/intersection-types-spec.md @@ -97,7 +97,7 @@ glb(A, _) = A // use first In the above, `|T|` means the erased type of `T`, `JArray` refers to the type of Java Array. -See also: [`TypeErasure#erasedGlb`](https://github.com/lampepfl/dotty/blob/main/compiler/src/dotty/tools/dotc/core/TypeErasure.scala#L289). +See also: [`TypeErasure#erasedGlb`](https://github.com/scala/scala3/blob/main/compiler/src/dotty/tools/dotc/core/TypeErasure.scala#L289). ## Relationship with Compound Type (`with`) diff --git a/docs/_docs/reference/new-types/polymorphic-function-types.md b/docs/_docs/reference/new-types/polymorphic-function-types.md index 1754bf844831..04f4a3483896 100644 --- a/docs/_docs/reference/new-types/polymorphic-function-types.md +++ b/docs/_docs/reference/new-types/polymorphic-function-types.md @@ -33,7 +33,7 @@ In Scala 3 this is now possible. The type of the `bar` value above is This type describes function values which take a type `A` as a parameter, then take a list of type `List[A]`, and return a list of the same type `List[A]`. -[More details](https://github.com/lampepfl/dotty/pull/4672) +[More details](https://github.com/scala/scala3/pull/4672) ## Example Usage diff --git a/docs/_docs/reference/other-new-features/open-classes.md b/docs/_docs/reference/other-new-features/open-classes.md index 10af6ead669e..ff2dbac27b3b 100644 --- a/docs/_docs/reference/other-new-features/open-classes.md +++ b/docs/_docs/reference/other-new-features/open-classes.md @@ -77,4 +77,4 @@ A class that is neither `abstract` nor `open` is similar to a `sealed` class: it ## Migration -`open` is a new modifier in Scala 3. To allow cross compilation between Scala 2.13 and Scala 3.0 without warnings, the feature warning for ad-hoc extensions is produced only under `-source future`. It will be produced by default [from Scala 3.4 on](https://github.com/lampepfl/dotty/issues/16334). +`open` is a new modifier in Scala 3. To allow cross compilation between Scala 2.13 and Scala 3.0 without warnings, the feature warning for ad-hoc extensions is produced only under `-source future`. It will be produced by default [from Scala 3.4 on](https://github.com/scala/scala3/issues/16334). 
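Editorial note, not part of the patch: as a tiny illustration of the `open` modifier discussed in the open-classes hunk just above, here is a hedged sketch; `Writer` and `EchoWriter` are invented names.

```scala
// `open` documents that Writer is designed for extension; subclassing it from
// another file then needs no `scala.language.adhocExtensions` import.
open class Writer:
  def send(msg: String): Unit = println(msg)

class EchoWriter extends Writer:
  override def send(msg: String): Unit = super.send(s"echo: $msg")

@main def openDemo(): Unit = EchoWriter().send("hello")
```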
diff --git a/docs/_docs/reference/other-new-features/parameter-untupling-spec.md b/docs/_docs/reference/other-new-features/parameter-untupling-spec.md index fd462dd610c8..c9e1033f5ea6 100644 --- a/docs/_docs/reference/other-new-features/parameter-untupling-spec.md +++ b/docs/_docs/reference/other-new-features/parameter-untupling-spec.md @@ -56,4 +56,4 @@ Obsolete conversions could be detected and fixed by [`Scalafix`](https://scalace ## Reference -For more information, see [Issue #897](https://github.com/lampepfl/dotty/issues/897). +For more information, see [Issue #897](https://github.com/scala/scala3/issues/897). diff --git a/docs/_docs/reference/other-new-features/parameter-untupling.md b/docs/_docs/reference/other-new-features/parameter-untupling.md index e1e7afcad8fe..27e3850b309c 100644 --- a/docs/_docs/reference/other-new-features/parameter-untupling.md +++ b/docs/_docs/reference/other-new-features/parameter-untupling.md @@ -75,4 +75,4 @@ cannot subvert untupling. For more information see: * [More details](./parameter-untupling-spec.md) -* [Issue #897](https://github.com/lampepfl/dotty/issues/897). +* [Issue #897](https://github.com/scala/scala3/issues/897). diff --git a/docs/_docs/reference/overview.md b/docs/_docs/reference/overview.md index b1e8281dfc16..bdb8aa74c1aa 100644 --- a/docs/_docs/reference/overview.md +++ b/docs/_docs/reference/overview.md @@ -91,7 +91,7 @@ These constructs are proposed to be dropped without a new construct replacing th - [Auto application](dropped-features/auto-apply.md), - [Weak conformance](dropped-features/weak-conformance.md), - Compound types (replaced by [Intersection types](new-types/intersection-types.md)), -- [Auto tupling](https://github.com/lampepfl/dotty/pull/4311) (implemented, but not merged). +- [Auto tupling](https://github.com/scala/scala3/pull/4311) (implemented, but not merged). The date when these constructs are dropped varies. The current status is: diff --git a/docs/_docs/release-notes-0.1.2.md b/docs/_docs/release-notes-0.1.2.md index 98f359c83a15..c6bc9846bf07 100644 --- a/docs/_docs/release-notes-0.1.2.md +++ b/docs/_docs/release-notes-0.1.2.md @@ -18,7 +18,7 @@ Dotty 0.1.2 targets Java 8. We don't have plans to add support for earlier versi # Reporting Bugs / Known Issues -Please [file](https://github.com/lampepfl/dotty/issues) any bugs you encounter. If you’re unsure whether something is a bug, +Please [file](https://github.com/scala/scala3/issues) any bugs you encounter. If you’re unsure whether something is a bug, please ask on the Dotty [gitter channel](https://github.com/lampepfl/dotty). 
# Dotty Doc @@ -86,27 +86,27 @@ This release ships with the following features: [9]: http://docs.scala-lang.org/sips/pending/static-members.html [10]: http://docs.scala-lang.org/sips/pending/improved-lazy-val-initialization.html [11]: http://magarciaepfl.github.io/scala/ -[12]: https://github.com/lampepfl/dotty/commit/b2215ed23311b2c99ea638f9d7fcad9737dba588 -[13]: https://github.com/lampepfl/dotty/pull/187 -[14]: https://github.com/lampepfl/dotty/pull/217 +[12]: https://github.com/scala/scala3/commit/b2215ed23311b2c99ea638f9d7fcad9737dba588 +[13]: https://github.com/scala/scala3/pull/187 +[14]: https://github.com/scala/scala3/pull/217 [15]: reference/other-new-features/trait-parameters.html -[16]: https://github.com/lampepfl/dotty/commit/89540268e6c49fb92b9ca61249e46bb59981bf5a -[17]: https://github.com/lampepfl/dotty/pull/174 -[18]: https://github.com/lampepfl/dotty/pull/488 -[19]: https://github.com/lampepfl/dotty/pull/174 -[20]: https://github.com/lampepfl/dotty/pull/411 -[21]: https://github.com/lampepfl/dotty/pull/1364 -[22]: https://github.com/lampepfl/dotty/pull/1227 -[23]: https://github.com/lampepfl/dotty/pull/117 -[24]: https://github.com/lampepfl/dotty/pull/2532 -[25]: https://github.com/lampepfl/dotty/pull/2194 -[26]: https://github.com/lampepfl/dotty/pull/213 -[27]: https://github.com/lampepfl/dotty/pull/2513 -[28]: https://github.com/lampepfl/dotty/pull/2361 -[29]: https://github.com/lampepfl/dotty/pull/1453 +[16]: https://github.com/scala/scala3/commit/89540268e6c49fb92b9ca61249e46bb59981bf5a +[17]: https://github.com/scala/scala3/pull/174 +[18]: https://github.com/scala/scala3/pull/488 +[19]: https://github.com/scala/scala3/pull/174 +[20]: https://github.com/scala/scala3/pull/411 +[21]: https://github.com/scala/scala3/pull/1364 +[22]: https://github.com/scala/scala3/pull/1227 +[23]: https://github.com/scala/scala3/pull/117 +[24]: https://github.com/scala/scala3/pull/2532 +[25]: https://github.com/scala/scala3/pull/2194 +[26]: https://github.com/scala/scala3/pull/213 +[27]: https://github.com/scala/scala3/pull/2513 +[28]: https://github.com/scala/scala3/pull/2361 +[29]: https://github.com/scala/scala3/pull/1453 [30]: reference/contextual/context-functions.html -[31]: https://github.com/lampepfl/dotty/pull/2136 -[32]: https://github.com/lampepfl/dotty/pull/1758 +[31]: https://github.com/scala/scala3/pull/2136 +[32]: https://github.com/scala/scala3/pull/1758 [33]: reference/metaprogramming/inline.html # Contributors diff --git a/docs/_spec/APPLIEDreference/dropped-features/auto-apply.md b/docs/_spec/APPLIEDreference/dropped-features/auto-apply.md index b9aedb9f046b..95366b5e8f78 100644 --- a/docs/_spec/APPLIEDreference/dropped-features/auto-apply.md +++ b/docs/_spec/APPLIEDreference/dropped-features/auto-apply.md @@ -93,4 +93,4 @@ stricter checking. ## Reference -For more information, see [Issue #2570](https://github.com/lampepfl/dotty/issues/2570) and [PR #2716](https://github.com/lampepfl/dotty/pull/2716). +For more information, see [Issue #2570](https://github.com/scala/scala3/issues/2570) and [PR #2716](https://github.com/scala/scala3/pull/2716). 
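Editorial note, not part of the patch: the auto-apply hunks above only surface the phrase "stricter checking", so here is a brief sketch of the rule they refer to, as I understand it; `Greeter` is a made-up class.

```scala
class Greeter:
  def hello(): String = "hi"    // declared with an empty parameter list
  def name: String = "greeter"  // declared without one

@main def autoApplyDemo(): Unit =
  val g = Greeter()
  g.hello()        // OK: the () must now be written explicitly
  // g.hello       // error in Scala 3: auto-application was dropped
  // g.name()      // error: name does not take parameters
  println(g.name)  // OK
```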
diff --git a/docs/_spec/APPLIEDreference/enums/enums.md b/docs/_spec/APPLIEDreference/enums/enums.md index bcab50d3a36d..d8fdbe9f1db2 100644 --- a/docs/_spec/APPLIEDreference/enums/enums.md +++ b/docs/_spec/APPLIEDreference/enums/enums.md @@ -178,5 +178,5 @@ scala> Color.Red.compareTo(Color.Green) val res15: Int = -1 ``` -For a more in-depth example of using Scala 3 enums from Java, see [this test](https://github.com/lampepfl/dotty/tree/main/tests/run/enum-java). +For a more in-depth example of using Scala 3 enums from Java, see [this test](https://github.com/scala/scala3/tree/main/tests/run/enum-java). In the test, the enums are defined in the `MainScala.scala` file and used from a Java source, `Test.java`. diff --git a/docs/_spec/APPLIEDreference/new-types/union-types.md b/docs/_spec/APPLIEDreference/new-types/union-types.md index 152505d7fc8d..f59bb6f6b851 100644 --- a/docs/_spec/APPLIEDreference/new-types/union-types.md +++ b/docs/_spec/APPLIEDreference/new-types/union-types.md @@ -59,8 +59,8 @@ be changed in the future. For example by not widening unions that have been explicitly written down by the user and not inferred, or by not widening a type argument when the corresponding type parameter is covariant. -See [PR #2330](https://github.com/lampepfl/dotty/pull/2330) and -[Issue #4867](https://github.com/lampepfl/dotty/issues/4867) for further discussions. +See [PR #2330](https://github.com/scala/scala3/pull/2330) and +[Issue #4867](https://github.com/scala/scala3/issues/4867) for further discussions. ### Example diff --git a/docs/_spec/TODOreference/changed-features/eta-expansion-spec.md b/docs/_spec/TODOreference/changed-features/eta-expansion-spec.md index a62d45df9e11..fa5c1e57a066 100644 --- a/docs/_spec/TODOreference/changed-features/eta-expansion-spec.md +++ b/docs/_spec/TODOreference/changed-features/eta-expansion-spec.md @@ -74,4 +74,4 @@ The method value syntax `m _` is deprecated. ## Reference -For more information, see [PR #2701](https://github.com/lampepfl/dotty/pull/2701). +For more information, see [PR #2701](https://github.com/scala/scala3/pull/2701). diff --git a/docs/_spec/TODOreference/changed-features/implicit-conversions-spec.md b/docs/_spec/TODOreference/changed-features/implicit-conversions-spec.md index dc19e10c8b8f..a70321b70c15 100644 --- a/docs/_spec/TODOreference/changed-features/implicit-conversions-spec.md +++ b/docs/_spec/TODOreference/changed-features/implicit-conversions-spec.md @@ -114,4 +114,4 @@ changes to implicit resolution, refer to the [Changes in Implicit Resolution](im ## Reference For more information about implicit resolution, see [Changes in Implicit Resolution](implicit-resolution.md). -Other details are available in [PR #2065](https://github.com/lampepfl/dotty/pull/2065). +Other details are available in [PR #2065](https://github.com/scala/scala3/pull/2065). 
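Editorial note, not part of the patch: the eta-expansion hunk above mentions that the `m _` method-value syntax is deprecated; the sketch below shows the automatic eta-expansion that replaces it. `square` is an invented example method.

```scala
def square(x: Int): Int = x * x

@main def etaDemo(): Unit =
  val f: Int => Int = square         // automatic eta-expansion; no `square _` needed
  println(List(1, 2, 3).map(f))      // List(1, 4, 9)
  println(List(1, 2, 3).map(square)) // methods can also be passed directly
```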
diff --git a/docs/_spec/TODOreference/changed-features/structural-types-spec.md b/docs/_spec/TODOreference/changed-features/structural-types-spec.md index d456932649fb..cb48a593e831 100644 --- a/docs/_spec/TODOreference/changed-features/structural-types-spec.md +++ b/docs/_spec/TODOreference/changed-features/structural-types-spec.md @@ -16,7 +16,7 @@ RefineStat ::= ‘val’ VarDcl | ‘def’ DefDcl | ‘type’ {nl} TypeDcl ## Implementation of Structural Types The standard library defines a universal marker trait -[`scala.Selectable`](https://github.com/lampepfl/dotty/blob/main/library/src/scala/Selectable.scala): +[`scala.Selectable`](https://github.com/scala/scala3/blob/main/library/src/scala/Selectable.scala): ```scala trait Selectable extends Any @@ -150,4 +150,4 @@ conversion that can turn `v` into a `Selectable`, and the selection methods coul ## Context -For more information, see [Rethink Structural Types](https://github.com/lampepfl/dotty/issues/1886). +For more information, see [Rethink Structural Types](https://github.com/scala/scala3/issues/1886). diff --git a/docs/_spec/TODOreference/contextual/by-name-context-parameters.md b/docs/_spec/TODOreference/contextual/by-name-context-parameters.md index 3004bfb2c4c2..7a80b20592ea 100644 --- a/docs/_spec/TODOreference/contextual/by-name-context-parameters.md +++ b/docs/_spec/TODOreference/contextual/by-name-context-parameters.md @@ -61,5 +61,5 @@ No local given instance was generated because the synthesized argument is not re ## Reference -For more information, see [Issue #1998](https://github.com/lampepfl/dotty/issues/1998) +For more information, see [Issue #1998](https://github.com/scala/scala3/issues/1998) and the associated [Scala SIP](https://docs.scala-lang.org/sips/byname-implicits.html). diff --git a/docs/_spec/TODOreference/contextual/multiversal-equality.md b/docs/_spec/TODOreference/contextual/multiversal-equality.md index e9a81b95f472..fa729fda8e61 100644 --- a/docs/_spec/TODOreference/contextual/multiversal-equality.md +++ b/docs/_spec/TODOreference/contextual/multiversal-equality.md @@ -25,7 +25,7 @@ the program will still typecheck, since values of all types can be compared with But it will probably give unexpected results and fail at runtime. Multiversal equality is an opt-in way to make universal equality safer. -It uses a binary type class [`scala.CanEqual`](https://github.com/lampepfl/dotty/blob/main/library/src/scala/CanEqual.scala) +It uses a binary type class [`scala.CanEqual`](https://github.com/scala/scala3/blob/main/library/src/scala/CanEqual.scala) to indicate that values of two given types can be compared with each other. The example above would not typecheck if `S` or `T` was a class that derives `CanEqual`, e.g. @@ -70,7 +70,7 @@ given CanEqual[A, B] = CanEqual.derived given CanEqual[B, A] = CanEqual.derived ``` -The [`scala.CanEqual`](https://github.com/lampepfl/dotty/blob/main/library/src/scala/CanEqual.scala) +The [`scala.CanEqual`](https://github.com/scala/scala3/blob/main/library/src/scala/CanEqual.scala) object defines a number of `CanEqual` given instances that together define a rule book for what standard types can be compared (more details below). @@ -224,4 +224,4 @@ work under `-language:strictEquality`, since otherwise the universal `Eq[Any]` i More on multiversal equality is found in a [blog post](http://www.scala-lang.org/blog/2016/05/06/multiversal-equality.html) -and a [GitHub issue](https://github.com/lampepfl/dotty/issues/1247). 
+and a [GitHub issue](https://github.com/scala/scala3/issues/1247). diff --git a/docs/_spec/TODOreference/dropped-features/type-projection.md b/docs/_spec/TODOreference/dropped-features/type-projection.md index 08b5ffb34eca..2c3e82ce99b8 100644 --- a/docs/_spec/TODOreference/dropped-features/type-projection.md +++ b/docs/_spec/TODOreference/dropped-features/type-projection.md @@ -9,7 +9,7 @@ and `A` names a type member of `T`. Scala 3 disallows this if `T` is an abstract type (class types and type aliases are fine). This change was made because unrestricted type projection -is [unsound](https://github.com/lampepfl/dotty/issues/1050). +is [unsound](https://github.com/scala/scala3/issues/1050). This restriction rules out the [type-level encoding of a combinator calculus](https://michid.wordpress.com/2010/01/29/scala-type-level-encoding-of-the-ski-calculus/). diff --git a/docs/_spec/TODOreference/experimental/explicit-nulls.md b/docs/_spec/TODOreference/experimental/explicit-nulls.md index b3fa53429cfe..3be83afe967e 100644 --- a/docs/_spec/TODOreference/experimental/explicit-nulls.md +++ b/docs/_spec/TODOreference/experimental/explicit-nulls.md @@ -431,7 +431,7 @@ When dealing with local mutable variables, there are two questions: x = null ``` -See [more examples](https://github.com/lampepfl/dotty/blob/main/tests/explicit-nulls/neg/flow-varref-in-closure.scala). +See [more examples](https://github.com/scala/scala3/blob/main/tests/explicit-nulls/neg/flow-varref-in-closure.scala). Currently, we are unable to track paths with a mutable variable prefix. For example, `x.a` if `x` is mutable. diff --git a/docs/_spec/TODOreference/experimental/tupled-function.md b/docs/_spec/TODOreference/experimental/tupled-function.md index da108fc832ad..e01570e9e8a6 100644 --- a/docs/_spec/TODOreference/experimental/tupled-function.md +++ b/docs/_spec/TODOreference/experimental/tupled-function.md @@ -34,7 +34,7 @@ The compiler will synthesize an instance of `TupledFunction[F, G]` if: Examples -------- `TupledFunction` can be used to generalize the `Function1.tupled`, ... `Function22.tupled` methods to functions of any arities. -The following defines `tupled` as [extension method](../contextual/extension-methods.html) ([full example](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-tupled.scala)). +The following defines `tupled` as [extension method](../contextual/extension-methods.html) ([full example](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-tupled.scala)). 
```scala /** Creates a tupled version of this function: instead of N arguments, @@ -48,7 +48,7 @@ extension [F, Args <: Tuple, R](f: F) def tupled(using tf: TupledFunction[F, Args => R]): Args => R = tf.tupled(f) ``` -`TupledFunction` can be used to generalize the `Function.untupled` to a function of any arities ([full example](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-untupled.scala)) +`TupledFunction` can be used to generalize the `Function.untupled` to a function of any arities ([full example](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-untupled.scala)) ```scala /** Creates an untupled version of this function: instead of a single argument of type [[scala.Tuple]] with N elements, @@ -64,7 +64,7 @@ extension [F, Args <: Tuple, R](f: Args => R) def untupled(using tf: TupledFunction[F, Args => R]): F = tf.untupled(f) ``` -`TupledFunction` can also be used to generalize the [`Tuple1.compose`](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-compose.scala) and [`Tuple1.andThen`](https://github.com/lampepfl/dotty/blob/main/tests/run/tupled-function-andThen.scala) methods to compose functions of larger arities and with functions that return tuples. +`TupledFunction` can also be used to generalize the [`Tuple1.compose`](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-compose.scala) and [`Tuple1.andThen`](https://github.com/scala/scala3/blob/main/tests/run/tupled-function-andThen.scala) methods to compose functions of larger arities and with functions that return tuples. ```scala /** Composes two instances of TupledFunction into a new TupledFunction, with this function applied last. diff --git a/docs/_spec/TODOreference/features-classification.md b/docs/_spec/TODOreference/features-classification.md index 36cea3b9e72d..550130780b44 100644 --- a/docs/_spec/TODOreference/features-classification.md +++ b/docs/_spec/TODOreference/features-classification.md @@ -72,7 +72,7 @@ These constructs are restricted to make the language safer. - [Given Imports](contextual/given-imports.md): implicits now require a special form of import, to make the import clearly visible. - [Type Projection](dropped-features/type-projection.md): only classes can be used as prefix `C` of a type projection `C#A`. Type projection on abstract types is no longer supported since it is unsound. - [Multiversal equality](contextual/multiversal-equality.md) implements an "opt-in" scheme to rule out nonsensical comparisons with `==` and `!=`. - - [infix](https://github.com/lampepfl/dotty/pull/5975) + - [infix](https://github.com/scala/scala3/pull/5975) makes method application syntax uniform across code bases. Unrestricted implicit conversions continue to be available in Scala 3.0, but will be deprecated and removed later. Unrestricted versions of the other constructs in the list above are available only under `-source 3.0-migration`. @@ -100,7 +100,7 @@ These constructs are proposed to be dropped without a new construct replacing th - [Auto application](dropped-features/auto-apply.md), - [Weak conformance](dropped-features/weak-conformance.md), - [Compound types](new-types/intersection-types.md), - - [Auto tupling](https://github.com/lampepfl/dotty/pull/4311) (implemented, but not merged). + - [Auto tupling](https://github.com/scala/scala3/pull/4311) (implemented, but not merged). The date when these constructs are dropped varies. 
The current status is: @@ -148,7 +148,7 @@ These are additions to the language that make it more powerful or pleasant to us - [Enums](enums/enums.md) provide concise syntax for enumerations and [algebraic data types](enums/adts.md). - [Parameter untupling](other-new-features/parameter-untupling.md) avoids having to use `case` for tupled parameter destructuring. - [Dependent function types](new-types/dependent-function-types.md) generalize dependent methods to dependent function values and types. - - [Polymorphic function types](https://github.com/lampepfl/dotty/pull/4672) generalize polymorphic methods to dependent function values and types. _Current status_: There is a proposal, and a prototype implementation, but the implementation has not been finalized or merged yet. + - [Polymorphic function types](https://github.com/scala/scala3/pull/4672) generalize polymorphic methods to dependent function values and types. _Current status_: There is a proposal, and a prototype implementation, but the implementation has not been finalized or merged yet. - [Kind polymorphism](other-new-features/kind-polymorphism.md) allows the definition of operators working equally on types and type constructors. **Status: mixed** diff --git a/docs/_spec/TODOreference/metaprogramming/compiletime-ops.md b/docs/_spec/TODOreference/metaprogramming/compiletime-ops.md index a43c941ae943..782cd72886d5 100644 --- a/docs/_spec/TODOreference/metaprogramming/compiletime-ops.md +++ b/docs/_spec/TODOreference/metaprogramming/compiletime-ops.md @@ -290,5 +290,5 @@ val notMissing : NotMissing = summonInlineCheck(3) ## Reference -For more information about compile-time operations, see [PR #4768](https://github.com/lampepfl/dotty/pull/4768), -which explains how `summonFrom`'s predecessor (implicit matches) can be used for typelevel programming and code specialization and [PR #7201](https://github.com/lampepfl/dotty/pull/7201) which explains the new `summonFrom` syntax. +For more information about compile-time operations, see [PR #4768](https://github.com/scala/scala3/pull/4768), +which explains how `summonFrom`'s predecessor (implicit matches) can be used for typelevel programming and code specialization and [PR #7201](https://github.com/scala/scala3/pull/7201) which explains the new `summonFrom` syntax. diff --git a/docs/_spec/TODOreference/new-types/dependent-function-types-spec.md b/docs/_spec/TODOreference/new-types/dependent-function-types-spec.md index f3237ddf7b9a..b27346a687d6 100644 --- a/docs/_spec/TODOreference/new-types/dependent-function-types-spec.md +++ b/docs/_spec/TODOreference/new-types/dependent-function-types-spec.md @@ -4,7 +4,7 @@ title: "Dependent Function Types - More Details" nightlyOf: https://docs.scala-lang.org/scala3/reference/new-types/dependent-function-types-spec.html --- -Initial implementation in [PR #3464](https://github.com/lampepfl/dotty/pull/3464). +Initial implementation in [PR #3464](https://github.com/scala/scala3/pull/3464). 
## Syntax @@ -46,7 +46,7 @@ same way that other functions do, see The example below defines a trait `C` and the two dependent function types `DF` and `IDF` and prints the results of the respective function applications: -[depfuntype.scala]: https://github.com/lampepfl/dotty/blob/main/tests/pos/depfuntype.scala +[depfuntype.scala]: https://github.com/scala/scala3/blob/main/tests/pos/depfuntype.scala ```scala trait C { type M; val m: M } @@ -70,7 +70,7 @@ type IDF = (x: C) ?=> x.M In the following example the depend type `f.Eff` refers to the effect type `CanThrow`: -[eff-dependent.scala]: https://github.com/lampepfl/dotty/blob/main/tests/run/eff-dependent.scala +[eff-dependent.scala]: https://github.com/scala/scala3/blob/main/tests/run/eff-dependent.scala ```scala trait Effect diff --git a/docs/_spec/TODOreference/new-types/polymorphic-function-types.md b/docs/_spec/TODOreference/new-types/polymorphic-function-types.md index 1754bf844831..04f4a3483896 100644 --- a/docs/_spec/TODOreference/new-types/polymorphic-function-types.md +++ b/docs/_spec/TODOreference/new-types/polymorphic-function-types.md @@ -33,7 +33,7 @@ In Scala 3 this is now possible. The type of the `bar` value above is This type describes function values which take a type `A` as a parameter, then take a list of type `List[A]`, and return a list of the same type `List[A]`. -[More details](https://github.com/lampepfl/dotty/pull/4672) +[More details](https://github.com/scala/scala3/pull/4672) ## Example Usage diff --git a/docs/_spec/TODOreference/other-new-features/parameter-untupling-spec.md b/docs/_spec/TODOreference/other-new-features/parameter-untupling-spec.md index e5165550fc0d..6133012463a1 100644 --- a/docs/_spec/TODOreference/other-new-features/parameter-untupling-spec.md +++ b/docs/_spec/TODOreference/other-new-features/parameter-untupling-spec.md @@ -86,4 +86,4 @@ Obsolete conversions could be detected and fixed by [`Scalafix`](https://scalace ## Reference -For more information, see [Issue #897](https://github.com/lampepfl/dotty/issues/897). +For more information, see [Issue #897](https://github.com/scala/scala3/issues/897). diff --git a/docs/_spec/TODOreference/other-new-features/parameter-untupling.md b/docs/_spec/TODOreference/other-new-features/parameter-untupling.md index fcc1fa11d519..09806fc169eb 100644 --- a/docs/_spec/TODOreference/other-new-features/parameter-untupling.md +++ b/docs/_spec/TODOreference/other-new-features/parameter-untupling.md @@ -74,4 +74,4 @@ cannot subvert untupling. For more information see: * [More details](./parameter-untupling-spec.md) -* [Issue #897](https://github.com/lampepfl/dotty/issues/897). +* [Issue #897](https://github.com/scala/scala3/issues/897). diff --git a/docs/_spec/TODOreference/overview.md b/docs/_spec/TODOreference/overview.md index b1e8281dfc16..bdb8aa74c1aa 100644 --- a/docs/_spec/TODOreference/overview.md +++ b/docs/_spec/TODOreference/overview.md @@ -91,7 +91,7 @@ These constructs are proposed to be dropped without a new construct replacing th - [Auto application](dropped-features/auto-apply.md), - [Weak conformance](dropped-features/weak-conformance.md), - Compound types (replaced by [Intersection types](new-types/intersection-types.md)), -- [Auto tupling](https://github.com/lampepfl/dotty/pull/4311) (implemented, but not merged). +- [Auto tupling](https://github.com/scala/scala3/pull/4311) (implemented, but not merged). The date when these constructs are dropped varies. 
The current status is: diff --git a/docs/_spec/_layouts/default.yml b/docs/_spec/_layouts/default.yml index 5d597cb5ea96..0f7fb24d7ce2 100644 --- a/docs/_spec/_layouts/default.yml +++ b/docs/_spec/_layouts/default.yml @@ -27,7 +27,7 @@
diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index d74bb1376912..70d5f2d41907 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -13,7 +13,7 @@ object language: * code should not rely on them. * * Programmers are encouraged to try out experimental features and - * [[https://github.com/lampepfl/dotty/issues report any bugs or API inconsistencies]] + * [[https://github.com/scala/scala3/issues report any bugs or API inconsistencies]] * they encounter so they can be improved in future releases. * * @group experimental diff --git a/presentation-compiler/src/main/dotty/tools/pc/HoverProvider.scala b/presentation-compiler/src/main/dotty/tools/pc/HoverProvider.scala index a36ce83e6aa0..6f39b4871a06 100644 --- a/presentation-compiler/src/main/dotty/tools/pc/HoverProvider.scala +++ b/presentation-compiler/src/main/dotty/tools/pc/HoverProvider.scala @@ -109,7 +109,7 @@ object HoverProvider: val exprTpw = tpe.widenTermRefExpr.metalsDealias val hoverString = tpw match - // https://github.com/lampepfl/dotty/issues/8891 + // https://github.com/scala/scala3/issues/8891 case tpw: ImportType => printer.hoverSymbol(symbol, symbol.paramRef) case _ => diff --git a/presentation-compiler/src/main/dotty/tools/pc/MetalsInteractive.scala b/presentation-compiler/src/main/dotty/tools/pc/MetalsInteractive.scala index 2c2897e401a1..381e0eaec6a5 100644 --- a/presentation-compiler/src/main/dotty/tools/pc/MetalsInteractive.scala +++ b/presentation-compiler/src/main/dotty/tools/pc/MetalsInteractive.scala @@ -220,7 +220,7 @@ object MetalsInteractive: then List((head.symbol, head.typeOpt)) /* Type tree for List(1) has an Int type variable, which has span * but doesn't exist in code. 
- * https://github.com/lampepfl/dotty/issues/15937 + * https://github.com/scala/scala3/issues/15937 */ else if head.isInstanceOf[TypeTree] then enclosingSymbolsWithExpressionType(tail, pos, indexed) diff --git a/presentation-compiler/src/main/dotty/tools/pc/PcCollector.scala b/presentation-compiler/src/main/dotty/tools/pc/PcCollector.scala index 310edd60d87e..60def237badb 100644 --- a/presentation-compiler/src/main/dotty/tools/pc/PcCollector.scala +++ b/presentation-compiler/src/main/dotty/tools/pc/PcCollector.scala @@ -57,13 +57,13 @@ abstract class PcCollector[T]( .pathTo(driver.openedTrees(uri), pos)(using driver.currentCtx) .dropWhile(t => // NamedArg anyway doesn't have symbol t.symbol == NoSymbol && !t.isInstanceOf[NamedArg] || - // same issue https://github.com/lampepfl/dotty/issues/15937 as below + // same issue https://github.com/scala/scala3/issues/15937 as below t.isInstanceOf[TypeTree] ) val path = rawPath match // For type it will sometimes go into the wrong tree since TypeTree also contains the same span - // https://github.com/lampepfl/dotty/issues/15937 + // https://github.com/scala/scala3/issues/15937 case TypeApply(sel: Select, _) :: tail if sel.span.contains(pos.span) => Interactive.pathTo(sel, pos.span) ::: rawPath case _ => rawPath @@ -583,7 +583,7 @@ abstract class PcCollector[T]( t } - // NOTE: Connected to https://github.com/lampepfl/dotty/issues/16771 + // NOTE: Connected to https://github.com/scala/scala3/issues/16771 // `sel.nameSpan` is calculated incorrectly in (1 + 2).toString // See test DocumentHighlightSuite.select-parentheses private def selectNameSpan(sel: Select): Span = diff --git a/presentation-compiler/src/main/dotty/tools/pc/SemanticdbSymbols.scala b/presentation-compiler/src/main/dotty/tools/pc/SemanticdbSymbols.scala index da8add9df327..d298a88fc655 100644 --- a/presentation-compiler/src/main/dotty/tools/pc/SemanticdbSymbols.scala +++ b/presentation-compiler/src/main/dotty/tools/pc/SemanticdbSymbols.scala @@ -118,8 +118,8 @@ object SemanticdbSymbols: b.toString /** - * Taken from https://github.com/lampepfl/dotty/blob/2db43dae1480825227eb30d291b0dd0f0494e0f6/compiler/src/dotty/tools/dotc/semanticdb/ExtractSemanticDB.scala#L293 - * In future might be replaced by usage of compiler implementation after merging https://github.com/lampepfl/dotty/pull/12885 + * Taken from https://github.com/scala/scala3/blob/2db43dae1480825227eb30d291b0dd0f0494e0f6/compiler/src/dotty/tools/dotc/semanticdb/ExtractSemanticDB.scala#L293 + * In future might be replaced by usage of compiler implementation after merging https://github.com/scala/scala3/pull/12885 */ private def addSymName(b: StringBuilder, sym: Symbol)(using Context): Unit = diff --git a/presentation-compiler/src/main/dotty/tools/pc/completions/CompletionProvider.scala b/presentation-compiler/src/main/dotty/tools/pc/completions/CompletionProvider.scala index 9d46a460850a..710e91750362 100644 --- a/presentation-compiler/src/main/dotty/tools/pc/completions/CompletionProvider.scala +++ b/presentation-compiler/src/main/dotty/tools/pc/completions/CompletionProvider.scala @@ -157,7 +157,7 @@ class CompletionProvider( // For overloaded signatures we get multiple symbols, so we need // to recalculate the description - // related issue https://github.com/lampepfl/dotty/issues/11941 + // related issue https://github.com/scala/scala3/issues/11941 lazy val kind: CompletionItemKind = completion.completionItemKind val description = 
completion.description(printer) val label = completion.labelWithDescription(printer) diff --git a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala index 8f47582e4806..68e9f7728a87 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala @@ -888,7 +888,7 @@ class CompletionSuite extends BaseCompletionSuite: topLines = Some(2) ) - // issues with scala 3 https://github.com/lampepfl/dotty/pull/13515 + // issues with scala 3 https://github.com/scala/scala3/pull/13515 @Test def ordering4 = check( s"""|class Main { diff --git a/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTermSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTermSuite.scala index bd044b55528a..82fd0c657e67 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTermSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTermSuite.scala @@ -112,7 +112,7 @@ class HoverTermSuite extends BaseHoverSuite: | } |} |""".stripMargin, - // https://github.com/lampepfl/dotty/issues/8835 + // https://github.com/scala/scala3/issues/8835 """|object num: a.Xtension |""".stripMargin.hover ) diff --git a/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTypeSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTypeSuite.scala index 269dc25069a5..c6e2590f8f29 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTypeSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/hover/HoverTypeSuite.scala @@ -41,7 +41,7 @@ class HoverTypeSuite extends BaseHoverSuite: ) // We should produce a shorter type but: - // https://github.com/lampepfl/dotty/issues/11683 + // https://github.com/scala/scala3/issues/11683 @Test def `enums` = check( """| @@ -125,7 +125,7 @@ class HoverTypeSuite extends BaseHoverSuite: * As user can actually supply params to them by hand when * invoking the extension method, we always show them next to the * method itself. 
- * https://github.com/lampepfl/dotty/issues/13123 + * https://github.com/scala/scala3/issues/13123 */ @Test def `extension-methods-complex` = check( diff --git a/presentation-compiler/test/dotty/tools/pc/tests/signaturehelp/SignatureHelpSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/signaturehelp/SignatureHelpSuite.scala index 01d5e03b6c1e..c1b51b127fd6 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/signaturehelp/SignatureHelpSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/signaturehelp/SignatureHelpSuite.scala @@ -191,7 +191,7 @@ class SignatureHelpSuite extends BaseSignatureHelpSuite: |""".stripMargin ) - // https://github.com/lampepfl/dotty/issues/15244 + // https://github.com/scala/scala3/issues/15244 @Test def `vararg` = check( """ diff --git a/project/scripts/bootstrappedOnlyCmdTests b/project/scripts/bootstrappedOnlyCmdTests index 8a0f5cf78f2f..4e18e3a1d4a4 100755 --- a/project/scripts/bootstrappedOnlyCmdTests +++ b/project/scripts/bootstrappedOnlyCmdTests @@ -113,7 +113,7 @@ scala_version=${versionProps[2]} echo "testing -sourcepath with incremental compile: inlining changed inline def into a def" # Here we will test that a changed inline method symbol loaded from the sourcepath (-sourcepath compiler option) # will have its `defTree` correctly set when its method body is required for inlining. -# So far I have not found a way to replicate issue https://github.com/lampepfl/dotty/issues/13994 +# So far I have not found a way to replicate issue https://github.com/scala/scala3/issues/13994 # with sbt scripted tests, if a way is found, move this test there. cwd=$(pwd) sbt_test_command="++${scala_version}!;clean;prepareSources;compile;copyChanges;compile" @@ -126,7 +126,7 @@ echo "testing -sourcepath with incremental compile: hashing reference to changed # Here we will test that a changed inline method symbol loaded from the sourcepath (-sourcepath compiler option) # will have its `defTree` correctly set when its method body is hashed by extractAPI, when referenced from another # inline method. -# So far I have not found a way to replicate https://github.com/lampepfl/dotty/pull/12931#discussion_r753212124 +# So far I have not found a way to replicate https://github.com/scala/scala3/pull/12931#discussion_r753212124 # with sbt scripted tests, if a way is found, move this test there. 
cwd=$(pwd) sbt_test_dir="$cwd/tests/cmdTest-sbt-tests/sourcepath-with-inline-api-hash" diff --git a/sbt-bridge/src/dotty/tools/xsbt/CompilerBridgeDriver.java b/sbt-bridge/src/dotty/tools/xsbt/CompilerBridgeDriver.java index 2d54d4e83404..20256d9e17cc 100644 --- a/sbt-bridge/src/dotty/tools/xsbt/CompilerBridgeDriver.java +++ b/sbt-bridge/src/dotty/tools/xsbt/CompilerBridgeDriver.java @@ -79,7 +79,7 @@ private static void reportMissingFile(DelegatingReporter reporter, SourceFile so underline + "\n" + " Falling back to placeholder for the given source file (of class " + sourceFile.getClass().getName() + ")\n" + " This is likely a bug in incremental compilation for the Scala 3 compiler.\n" + - " Please report it to the Scala 3 maintainers at https://github.com/lampepfl/dotty/issues."; + " Please report it to the Scala 3 maintainers at https://github.com/scala/scala3/issues."; reporter.reportBasicWarning(message); } diff --git a/sbt-bridge/src/xsbt/CachedCompilerImpl.java b/sbt-bridge/src/xsbt/CachedCompilerImpl.java index 8b7779f9c9cb..c9d4c50485ed 100644 --- a/sbt-bridge/src/xsbt/CachedCompilerImpl.java +++ b/sbt-bridge/src/xsbt/CachedCompilerImpl.java @@ -18,7 +18,7 @@ import dotty.tools.dotc.sbt.interfaces.IncrementalCallback; // deprecation warnings are suppressed because scala3-sbt-bridge must stay compatible with Zinc 1.3 -// see https://github.com/lampepfl/dotty/issues/10816 +// see https://github.com/scala/scala3/issues/10816 @SuppressWarnings("deprecation") public class CachedCompilerImpl implements CachedCompiler { private final String[] args; diff --git a/sbt-test/sbt-bridge/zinc-13-compat/test b/sbt-test/sbt-bridge/zinc-13-compat/test index f7b3295e155c..aecadb2f539d 100644 --- a/sbt-test/sbt-bridge/zinc-13-compat/test +++ b/sbt-test/sbt-bridge/zinc-13-compat/test @@ -1,3 +1,3 @@ # this little app test that scala3-sbt-bridge is compatible with Zinc 1.3 -# this is necessary to maintain the compatibility with Bloop (see https://github.com/lampepfl/dotty/issues/10816) +# this is necessary to maintain the compatibility with Bloop (see https://github.com/scala/scala3/issues/10816) > run diff --git a/sbt-test/sbt-dotty/scaladoc/src/main/scala/MultiversalEquality.scala b/sbt-test/sbt-dotty/scaladoc/src/main/scala/MultiversalEquality.scala index a4089e75de19..b8ebaf5565df 100644 --- a/sbt-test/sbt-dotty/scaladoc/src/main/scala/MultiversalEquality.scala +++ b/sbt-test/sbt-dotty/scaladoc/src/main/scala/MultiversalEquality.scala @@ -4,7 +4,7 @@ import scala.language.strictEquality /** * Multiversal Equality: https://dotty.epfl.ch/docs/reference/contextual/multiversal-equality.html - * scala.Eq definition: https://github.com/lampepfl/dotty/blob/master/library/src/scala/CanEqual.scala + * scala.Eq definition: https://github.com/scala/scala3/blob/master/library/src/scala/CanEqual.scala */ object MultiversalEquality { diff --git a/sbt-test/sbt-dotty/tasty-inspector-cache/inspector/src/main/scala/main.scala b/sbt-test/sbt-dotty/tasty-inspector-cache/inspector/src/main/scala/main.scala index 8335a016578f..b17747aa3ccf 100644 --- a/sbt-test/sbt-dotty/tasty-inspector-cache/inspector/src/main/scala/main.scala +++ b/sbt-test/sbt-dotty/tasty-inspector-cache/inspector/src/main/scala/main.scala @@ -2,7 +2,7 @@ import scala.quoted.Quotes import scala.quoted.quotes import scala.tasty.inspector as ins -// Test for https://github.com/lampepfl/dotty/issues/13919 +// Test for https://github.com/scala/scala3/issues/13919 class MyInspector extends ins.Inspector: 
def inspect(using Quotes)(tastys: List[ins.Tasty[quotes.type]]): Unit = import quotes.reflect._ diff --git a/sbt-test/source-dependencies/malformed-class-name-with-dollar/test b/sbt-test/source-dependencies/malformed-class-name-with-dollar/test index cf2dc1898f3e..bc71a440925e 100644 --- a/sbt-test/source-dependencies/malformed-class-name-with-dollar/test +++ b/sbt-test/source-dependencies/malformed-class-name-with-dollar/test @@ -1,5 +1,5 @@ > compile $ copy-file changes/A.scala A.scala -# It seems that https://github.com/lampepfl/dotty/pull/10784 break incremental compilation here +# It seems that https://github.com/scala/scala3/pull/10784 break incremental compilation here > clean > compile diff --git a/scaladoc-testcases/src/tests/nonScala3Parent.scala b/scaladoc-testcases/src/tests/nonScala3Parent.scala index 91183d25b583..c5a29b2d6132 100644 --- a/scaladoc-testcases/src/tests/nonScala3Parent.scala +++ b/scaladoc-testcases/src/tests/nonScala3Parent.scala @@ -4,7 +4,7 @@ package nonScala3Parent import javax.swing.JPanel import javax.swing.JFrame -// https://github.com/lampepfl/dotty/issues/15927 +// https://github.com/scala/scala3/issues/15927 trait Foo1 extends Numeric[Any] trait Foo2 extends JPanel diff --git a/scaladoc/src/dotty/tools/scaladoc/tasty/TypesSupport.scala b/scaladoc/src/dotty/tools/scaladoc/tasty/TypesSupport.scala index 35cf1cb6eec3..373a26dd0297 100644 --- a/scaladoc/src/dotty/tools/scaladoc/tasty/TypesSupport.scala +++ b/scaladoc/src/dotty/tools/scaladoc/tasty/TypesSupport.scala @@ -298,7 +298,7 @@ trait TypesSupport: } case tpe => - val msg = s"Encountered unsupported type. Report this problem to https://github.com/lampepfl/dotty/.\n" + + val msg = s"Encountered unsupported type. Report this problem to https://github.com/scala/scala3/.\n" + s"${tpe.show(using Printer.TypeReprStructure)}" throw MatchError(msg) diff --git a/scaladoc/test/dotty/tools/scaladoc/ScaladocTest.scala b/scaladoc/test/dotty/tools/scaladoc/ScaladocTest.scala index 540364ec10bf..0c8211865928 100644 --- a/scaladoc/test/dotty/tools/scaladoc/ScaladocTest.scala +++ b/scaladoc/test/dotty/tools/scaladoc/ScaladocTest.scala @@ -27,7 +27,7 @@ abstract class ScaladocTest(val name: String): tastyFiles = tastyFiles(name), output = getTempDir().getRoot, projectVersion = Some("1.0"), - sourceLinks = List("github://lampepfl/dotty/master") + sourceLinks = List("github://scala/scala3/master") ) @Test diff --git a/tests/explicit-nulls/pos/nn2.scala b/tests/explicit-nulls/pos/nn2.scala index a39618b97f22..6c2c67396899 100644 --- a/tests/explicit-nulls/pos/nn2.scala +++ b/tests/explicit-nulls/pos/nn2.scala @@ -1,5 +1,5 @@ // Test that is fixed when explicit nulls are enabled. 
-// https://github.com/lampepfl/dotty/issues/6247 +// https://github.com/scala/scala3/issues/6247 class Foo { val x1: String|Null = null diff --git a/tests/init/pos/i9795.scala b/tests/init/pos/i9795.scala index 33c13b2eb592..0968dfeb2589 100644 --- a/tests/init/pos/i9795.scala +++ b/tests/init/pos/i9795.scala @@ -2,6 +2,6 @@ class A: // Safe initialization check only allows capturing `this` either through primary constructor or synthetic `apply` // `Some` case class comes from Scala 2 stdlib, which is not visible, hence the warning // For reference: - // https://github.com/lampepfl/dotty/pull/12711 - // https://github.com/lampepfl/dotty/pull/14283 + // https://github.com/scala/scala3/pull/12711 + // https://github.com/scala/scala3/pull/14283 val some = Some(this) diff --git a/tests/neg/i11118.scala b/tests/neg/i11118.scala index 23d9b2b604b6..a94bfce47640 100644 --- a/tests/neg/i11118.scala +++ b/tests/neg/i11118.scala @@ -1,2 +1,2 @@ -// https://github.com/lampepfl/dotty/issues/11118 +// https://github.com/scala/scala3/issues/11118 val (a,b) = (1,2,3) // error // warning diff --git a/tests/neg/i4060.scala b/tests/neg/i4060.scala index ba641d633d3c..bd16ed867966 100644 --- a/tests/neg/i4060.scala +++ b/tests/neg/i4060.scala @@ -1,6 +1,6 @@ //> using options -language:experimental.erasedDefinitions -// See https://github.com/lampepfl/dotty/issues/4060#issuecomment-445808377 +// See https://github.com/scala/scala3/issues/4060#issuecomment-445808377 object App { trait A { type L >: Any} diff --git a/tests/pos-macros/i9361.scala b/tests/pos-macros/i9361.scala index 18efd203d885..abd711bbfcaa 100644 --- a/tests/pos-macros/i9361.scala +++ b/tests/pos-macros/i9361.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/9361 +// https://github.com/scala/scala3/issues/9361 import scala.quoted._ diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeHelpers.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeHelpers.scala index 2454bca9d653..5ad6a99f6055 100644 --- a/tests/pos-with-compiler-cc/backend/jvm/BCodeHelpers.scala +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeHelpers.scala @@ -833,7 +833,7 @@ trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { case tp => report.warning( s"an unexpected type representation reached the compiler backend while compiling ${ctx.compilationUnit}: $tp. 
" + - "If possible, please file a bug on https://github.com/lampepfl/dotty/issues") + "If possible, please file a bug on https://github.com/scala/scala3/issues") tp match { case tp: ThisType if tp.cls == defn.ArrayClass => ObjectRef.asInstanceOf[ct.bTypes.ClassBType] // was introduced in 9b17332f11 to fix SI-999, but this code is not reached in its test, or any other test @@ -874,7 +874,7 @@ trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { report.error( em"""|compiler bug: created invalid generic signature for $sym in ${sym.denot.owner.showFullName} |signature: $sig - |if this is reproducible, please report bug at https://github.com/lampepfl/dotty/issues + |if this is reproducible, please report bug at https://github.com/scala/scala3/issues """, sym.sourcePos) throw ex } diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeSkelBuilder.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeSkelBuilder.scala index 1d8a9c579cb9..125ee26b0528 100644 --- a/tests/pos-with-compiler-cc/backend/jvm/BCodeSkelBuilder.scala +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeSkelBuilder.scala @@ -131,7 +131,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { // Should we do this transformation earlier, say in Constructors? Or would that just cause // pain for scala-{js, native}? // - // @sjrd (https://github.com/lampepfl/dotty/pull/9181#discussion_r457458205): + // @sjrd (https://github.com/scala/scala3/pull/9181#discussion_r457458205): // moving that before the back-end would make things significantly more complicated for // Scala.js and Native. Both have a first-class concept of ModuleClass, and encode the // singleton pattern of MODULE$ in a completely different way. In the Scala.js IR, there @@ -142,7 +142,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { // TODO: remove `!f.name.is(LazyBitMapName)` once we change lazy val encoding - // https://github.com/lampepfl/dotty/issues/7140 + // https://github.com/scala/scala3/issues/7140 // // Lazy val encoding assumes bitmap fields are non-static // diff --git a/tests/pos-with-compiler-cc/backend/jvm/DottyBackendInterface.scala b/tests/pos-with-compiler-cc/backend/jvm/DottyBackendInterface.scala index a70d671f9c63..6ce434015b8c 100644 --- a/tests/pos-with-compiler-cc/backend/jvm/DottyBackendInterface.scala +++ b/tests/pos-with-compiler-cc/backend/jvm/DottyBackendInterface.scala @@ -126,7 +126,7 @@ object DottyBackendInterface { * See also `genPlainClass` in `BCodeSkelBuilder.scala`. 
* * TODO: remove the special handing of `LazyBitMapName` once we swtich to - * the new lazy val encoding: https://github.com/lampepfl/dotty/issues/7140 + * the new lazy val encoding: https://github.com/scala/scala3/issues/7140 */ def isStaticModuleField(using Context): Boolean = sym.owner.isStaticModuleClass && sym.isField && !sym.name.is(LazyBitMapName) diff --git a/tests/pos/erasure-array.scala b/tests/pos/erasure-array.scala index 63240e9801f0..83dc2a423306 100644 --- a/tests/pos/erasure-array.scala +++ b/tests/pos/erasure-array.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/1065 +// https://github.com/scala/scala3/issues/1065 package hello object world { diff --git a/tests/pos/i10242.scala b/tests/pos/i10242.scala index 10883633971e..b4a9700e1634 100644 --- a/tests/pos/i10242.scala +++ b/tests/pos/i10242.scala @@ -1,6 +1,6 @@ //> using options -source:3.3 -// https://github.com/lampepfl/dotty/issues/10242 +// https://github.com/scala/scala3/issues/10242 type Foo[A, B <: A] = A type Bar[A] = A match { diff --git a/tests/pos/i11681.scala b/tests/pos/i11681.scala index 587285911610..3374cbf9a4af 100644 --- a/tests/pos/i11681.scala +++ b/tests/pos/i11681.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/11681 +// https://github.com/scala/scala3/issues/11681 import scala.collection.Factory diff --git a/tests/pos/i12663.scala b/tests/pos/i12663.scala index befbc65316cb..dc446acb6bdf 100644 --- a/tests/pos/i12663.scala +++ b/tests/pos/i12663.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/12663 +// https://github.com/scala/scala3/issues/12663 final class HookComponentBuilder[Ctx, CtxFn[_]] { def asd[A](f: Ctx => A): A = ??? diff --git a/tests/pos/i12679.scala b/tests/pos/i12679.scala index fed62c72dd42..3fb14a8a91ed 100644 --- a/tests/pos/i12679.scala +++ b/tests/pos/i12679.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/12679 +// https://github.com/scala/scala3/issues/12679 object Example: def foo[F[_]](qux: String, quux: String = ""): F[Unit] = ??? 
diff --git a/tests/pos/i14096.scala b/tests/pos/i14096.scala index 59365231b121..49f80332483a 100644 --- a/tests/pos/i14096.scala +++ b/tests/pos/i14096.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/14096 +// https://github.com/scala/scala3/issues/14096 object Test: object Forte: def test[T](i: Int, config: String = ""): Int = 1 diff --git a/tests/pos/i14271.scala b/tests/pos/i14271.scala index 8f46940afd09..d29cf306617a 100644 --- a/tests/pos/i14271.scala +++ b/tests/pos/i14271.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/14271 +// https://github.com/scala/scala3/issues/14271 class Bound[T] class MyClass[T <: Bound[T]] diff --git a/tests/pos/i14278.scala b/tests/pos/i14278.scala index ebc9376fbad5..09feb75a0c6f 100644 --- a/tests/pos/i14278.scala +++ b/tests/pos/i14278.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/14278 +// https://github.com/scala/scala3/issues/14278 class Foo extension (foo: Foo) diff --git a/tests/pos/i14642.scala b/tests/pos/i14642.scala index b69da7d8d6d7..f380c404bd03 100644 --- a/tests/pos/i14642.scala +++ b/tests/pos/i14642.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/14642 +// https://github.com/scala/scala3/issues/14642 case object A case class B() case class C() diff --git a/tests/pos/i14830.scala b/tests/pos/i14830.scala index 592a47c1a53c..6664bd44ea4a 100644 --- a/tests/pos/i14830.scala +++ b/tests/pos/i14830.scala @@ -1,5 +1,5 @@ -// https://github.com/lampepfl/dotty/issues/14830 +// https://github.com/scala/scala3/issues/14830 val a: Comparable[String] = "Fred" val b: { def length: Int } = "Fred" val c: Comparable[String] & { def length: Int } = "Fred" diff --git a/tests/pos/i15546.scala b/tests/pos/i15546.scala index 19c7f15b24f1..86303e283baa 100644 --- a/tests/pos/i15546.scala +++ b/tests/pos/i15546.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/15546 +// https://github.com/scala/scala3/issues/15546 trait Foo[F[_]] diff --git a/tests/pos/i5700.scala b/tests/pos/i5700.scala index 69892dea16f4..89e8ca025c64 100644 --- a/tests/pos/i5700.scala +++ b/tests/pos/i5700.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/5700 +// https://github.com/scala/scala3/issues/5700 object noRecursionLimit: type M = { type T[+A]; type Ev >: T[Any] <: T[Nothing] } val M: M = ().asInstanceOf[M] diff --git a/tests/pos/i7414.scala b/tests/pos/i7414.scala index fd85ed2a2265..2c65b6cce466 100644 --- a/tests/pos/i7414.scala +++ b/tests/pos/i7414.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/7414 +// https://github.com/scala/scala3/issues/7414 object DepTest { trait Trait { diff --git a/tests/pos/i7445a.scala b/tests/pos/i7445a.scala index 2b54166de3f0..c23a5ebd12b9 100644 --- a/tests/pos/i7445a.scala +++ b/tests/pos/i7445a.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/7445 +// https://github.com/scala/scala3/issues/7445 object Main { type O1[A] = { diff --git a/tests/pos/i7445b.scala b/tests/pos/i7445b.scala index 1d49479ef0a5..0f6859fdb5c3 100644 --- a/tests/pos/i7445b.scala +++ b/tests/pos/i7445b.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/7445 +// https://github.com/scala/scala3/issues/7445 type O1[A] = { type OutInner[Ts] <: Tuple = Ts match { diff --git a/tests/pos/i7653.scala b/tests/pos/i7653.scala index 8511b6eef69b..61c75634d10e 100644 --- a/tests/pos/i7653.scala +++ 
b/tests/pos/i7653.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/7653 +// https://github.com/scala/scala3/issues/7653 object options2 { type Option[T] = { diff --git a/tests/pos/i7790.scala b/tests/pos/i7790.scala index 0a0e2d2ce347..d2dfc5c892a4 100644 --- a/tests/pos/i7790.scala +++ b/tests/pos/i7790.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/7790 +// https://github.com/scala/scala3/issues/7790 trait Foo: given Int = 10 def map(f: Int ?=> Int) = f diff --git a/tests/pos/i7807.scala b/tests/pos/i7807.scala index df8a41ddcf8d..118ad77c58c4 100644 --- a/tests/pos/i7807.scala +++ b/tests/pos/i7807.scala @@ -11,6 +11,6 @@ object Test: val m: n.type match { case 0 => 1 case 1 => 0 } = flip(n) - // The following do not work, see discussion in https://github.com/lampepfl/dotty/pull/7835/files/6e60814e69be5c8d60265d4ce4bc1758863c23d8#r361741296: + // The following do not work, see discussion in https://github.com/scala/scala3/pull/7835/files/6e60814e69be5c8d60265d4ce4bc1758863c23d8#r361741296: // flip(m) // flip(flip(n)) diff --git a/tests/pos/i8300.scala b/tests/pos/i8300.scala index f106b24dbd1c..f4634f7cd520 100644 --- a/tests/pos/i8300.scala +++ b/tests/pos/i8300.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/8300 +// https://github.com/scala/scala3/issues/8300 type Bar[X] = X match { case List[a] => List[Tuple1[a]] diff --git a/tests/pos/kind-projector.scala b/tests/pos/kind-projector.scala index ff787d0111e2..4d6ec8c932a9 100644 --- a/tests/pos/kind-projector.scala +++ b/tests/pos/kind-projector.scala @@ -53,7 +53,7 @@ class BackticksAreFine6 extends FooPlus[BazPlus[Int => `-*`, `-*`, Int]] class BackticksAreFine7 extends Foo[λ[`-x` => BazPlus[x => `-*`, Int, x]]] class BackticksAreFine8 extends Foo[λ[`x` => BazPlus[x => `*`, Int, x]]] -// https://github.com/lampepfl/dotty/issues/13141 +// https://github.com/scala/scala3/issues/13141 // i13141 object A { class X { type Blah = Int } diff --git a/tests/run-macros/f-interpolator-tests.scala b/tests/run-macros/f-interpolator-tests.scala index 8c59ae19a187..e10301ad3115 100755 --- a/tests/run-macros/f-interpolator-tests.scala +++ b/tests/run-macros/f-interpolator-tests.scala @@ -2,7 +2,7 @@ * * The tests are sorted by argument category as the arguments are on https://docs.oracle.com/javase/6/docs/api/java/util/Formatter.html#detail * - * Some tests come from https://github.com/lampepfl/dotty/pull/3894/files + * Some tests come from https://github.com/scala/scala3/pull/3894/files */ object Test { def main(args: Array[String]) = { diff --git a/tests/run/i11583.scala b/tests/run/i11583.scala index 87701ac95310..fd4d63faa084 100644 --- a/tests/run/i11583.scala +++ b/tests/run/i11583.scala @@ -5,7 +5,7 @@ class Context: class Env: type Extra -// TODO: enable after https://github.com/lampepfl/dotty/issues/11700 is fixed +// TODO: enable after https://github.com/scala/scala3/issues/11700 is fixed // extension [Ctx <: Context](using ctx: Ctx)(tpe: ctx.Type)(using env: Env) // /** essentially: `extension (s: String) def &&:(b: Boolean)(i: Int)` // * but exercises the RefinedPrinter and safety of reordering parameters diff --git a/tests/run/i11706.scala b/tests/run/i11706.scala index 276ee408d266..f87f5697c35d 100644 --- a/tests/run/i11706.scala +++ b/tests/run/i11706.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/11706 +// https://github.com/scala/scala3/issues/11706 import 
scala.compiletime.erasedValue object Obj: diff --git a/tests/run/i12032.scala b/tests/run/i12032.scala index 52358332e2c8..867a485e8222 100644 --- a/tests/run/i12032.scala +++ b/tests/run/i12032.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/12032 +// https://github.com/scala/scala3/issues/12032 class Foo(val strings: Seq[String]) extends FooLowPriority trait FooLowPriority { self: Foo => diff --git a/tests/run/i13216.scala b/tests/run/i13216.scala index 174d0f200f31..59943672eac1 100644 --- a/tests/run/i13216.scala +++ b/tests/run/i13216.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/13216 +// https://github.com/scala/scala3/issues/13216 import scala.annotation.targetName class C(s: String) extends AnyVal { diff --git a/tests/run/i13334.scala b/tests/run/i13334.scala index 2ee0987c13cc..4443114443e0 100644 --- a/tests/run/i13334.scala +++ b/tests/run/i13334.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/13334 +// https://github.com/scala/scala3/issues/13334 trait DFC given DFC = new DFC {} diff --git a/tests/run/i13691b.scala b/tests/run/i13691b.scala index 1da726827467..eef09c38d431 100644 --- a/tests/run/i13691b.scala +++ b/tests/run/i13691b.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/13691 +// https://github.com/scala/scala3/issues/13691 import language.experimental.saferExceptions trait Decoder[+T]: diff --git a/tests/run/i14582.scala b/tests/run/i14582.scala index bce33aa170b2..1f4d26dccf8d 100644 --- a/tests/run/i14582.scala +++ b/tests/run/i14582.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/14582 +// https://github.com/scala/scala3/issues/14582 @main def Test() = val map = Map( "a" -> 1, diff --git a/tests/run/i15913.scala b/tests/run/i15913.scala index f3e98a3bfd6a..f3853e2f8cfe 100644 --- a/tests/run/i15913.scala +++ b/tests/run/i15913.scala @@ -1,4 +1,4 @@ -// https://github.com/lampepfl/dotty/issues/15913 +// https://github.com/lampepfl/scala/scala3/15913 class injector[F] diff --git a/tests/run/i4192/TestCases.scala b/tests/run/i4192/TestCases.scala index 33b4e458ebe7..7935b57d7fc2 100644 --- a/tests/run/i4192/TestCases.scala +++ b/tests/run/i4192/TestCases.scala @@ -62,7 +62,7 @@ class A { topLevel => new AA object AB { - val nestedOnce = this // self alias cannot be used uniformly here: https://github.com/lampepfl/dotty/issues/11648 + val nestedOnce = this // self alias cannot be used uniformly here: https://github.com/scala/scala3/issues/11648 checkMember(this, topLevel) diff --git a/tests/run/i4496b.scala b/tests/run/i4496b.scala index a6ed5b105e59..54459d15a8af 100644 --- a/tests/run/i4496b.scala +++ b/tests/run/i4496b.scala @@ -102,7 +102,7 @@ object Test { assert(consume(v) == 10) assert(consumeInl(v) == 10) assert(v.a == 10) - // Pending, per https://github.com/lampepfl/dotty/issues/4528. + // Pending, per https://github.com/scala/scala3/issues/4528. 
// v.a = 11 // assert(consume(v) == 11) // assert(consumeInl(v) == 11) diff --git a/tests/semanticdb/expect/Advanced.expect.scala b/tests/semanticdb/expect/Advanced.expect.scala index d36fcd611eef..0078bde9a078 100644 --- a/tests/semanticdb/expect/Advanced.expect.scala +++ b/tests/semanticdb/expect/Advanced.expect.scala @@ -43,7 +43,7 @@ object Test/*<-advanced::Test.*/ { } } - // see: https://github.com/lampepfl/dotty/pull/14608#discussion_r835642563 + // see: https://github.com/scala/scala3/pull/14608#discussion_r835642563 lazy val foo/*<-advanced::Test.foo.*/: (reflect.Selectable/*->scala::reflect::Selectable#*/ { type A/*<-local16*/ = Int/*->scala::Int#*/ }) &/*->scala::`&`#*/ (reflect.Selectable/*->scala::reflect::Selectable#*/ { type A/*<-local17*/ = Int/*->scala::Int#*/; val a/*<-local18*/: A/*->local17*/ }) = ???/*->scala::Predef.`???`().*/ def bar/*<-advanced::Test.bar().*/: foo/*->advanced::Test.foo.*/.A/*->local17*/ = foo/*->advanced::Test.foo.*/.a/*->scala::reflect::Selectable#selectDynamic().*/ } diff --git a/tests/semanticdb/expect/Advanced.scala b/tests/semanticdb/expect/Advanced.scala index 8e0d2f3a1692..dffd122aa798 100644 --- a/tests/semanticdb/expect/Advanced.scala +++ b/tests/semanticdb/expect/Advanced.scala @@ -43,7 +43,7 @@ object Test { } } - // see: https://github.com/lampepfl/dotty/pull/14608#discussion_r835642563 + // see: https://github.com/scala/scala3/pull/14608#discussion_r835642563 lazy val foo: (reflect.Selectable { type A = Int }) & (reflect.Selectable { type A = Int; val a: A }) = ??? def bar: foo.A = foo.a } From 7aa17c5f227c92f9c95cedee1801b68b143374e5 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?J=C4=99drzej=20Rochala?= <48657087+rochala@users.noreply.github.com> Date: Tue, 5 Mar 2024 14:36:11 +0100 Subject: [PATCH 191/371] Adjust owner in `Interactive.contextOfPath` causing crash in `ImplicitSearch` (#19875) `Interactive` provided us with the method `contextOfPath` which should return enclosing ctx for given position. It was working fine until given loop detection was improved some time ago. It started crashing as the context owner was set to original context owner, instead of the real owner. This PR changes this and sets context to its outer context owner. 
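To make the ownership point concrete, here is a small self-contained sketch (hypothetical names, not taken from this patch) of the user-level shape involved: completing a selection inside a block whose scope holds local givens triggers an implicit search, and that search needs an enclosing context with a consistent owner chain, which is what this change restores.

```scala
import scala.language.implicitConversions

trait Pretty:
  def pretty: String

object Demo:
  def render(): String =
    // A given local to this block: an implicit search performed while
    // completing a selection on "x" below must run in a context that
    // correctly reflects this enclosing scope and its owner.
    given Conversion[String, Pretty] = s => new Pretty { def pretty = s"<$s>" }
    "x".pretty
```

The regression test added below (`implicitSearchCrash`) drives the same shape through the completion API directly.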
Fixes https://github.com/scalameta/metals/issues/6193 --- .../dotty/tools/dotc/interactive/Interactive.scala | 6 +++--- .../tools/languageserver/CompletionTest.scala | 14 ++++++++++++++ 2 files changed, 17 insertions(+), 3 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/interactive/Interactive.scala b/compiler/src/dotty/tools/dotc/interactive/Interactive.scala index 928a9be6103b..a03ae502f2f1 100644 --- a/compiler/src/dotty/tools/dotc/interactive/Interactive.scala +++ b/compiler/src/dotty/tools/dotc/interactive/Interactive.scala @@ -297,14 +297,14 @@ object Interactive { else outer case tree @ Block(stats, expr) => - val localCtx = outer.fresh.setNewScope + val localCtx = outer.localContext(tree, outer.owner).setNewScope stats.foreach { case stat: MemberDef => localCtx.enter(stat.symbol) case _ => } - contextOfStat(stats, nested, ctx.owner, localCtx) + contextOfStat(stats, nested, localCtx.owner, localCtx) case tree @ CaseDef(pat, _, _) => - val localCtx = outer.fresh.setNewScope + val localCtx = outer.localContext(tree, outer.owner).setNewScope pat.foreachSubTree { case bind: Bind => localCtx.enter(bind.symbol) case _ => diff --git a/language-server/test/dotty/tools/languageserver/CompletionTest.scala b/language-server/test/dotty/tools/languageserver/CompletionTest.scala index 4c54637b367f..f3794c6f3468 100644 --- a/language-server/test/dotty/tools/languageserver/CompletionTest.scala +++ b/language-server/test/dotty/tools/languageserver/CompletionTest.scala @@ -35,6 +35,20 @@ class CompletionTest { .completion(("Conversion", Class, "Conversion")) } + @Test def implicitSearchCrash: Unit = + code""" + |object Test: + | trait Foo: + | def test(): String + | given Int = ??? + | given (using ev: Int): Conversion[String, Foo] = ??? + | + | val test = { + | "".tes$m1 + | 1 + | }""" + .completion(("test", Method, "(): String")) + @Test def completionFromScalaPackageObject: Unit = { code"class Foo { val foo: BigD${m1} }" .completion( From 3cb5d709df3bc6dddcaa22f2a17af10fae935485 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 12 Mar 2024 17:27:15 +0100 Subject: [PATCH 192/371] Add changelog for 3.4.1-RC2 --- changelogs/3.4.1-RC2.md | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) create mode 100644 changelogs/3.4.1-RC2.md diff --git a/changelogs/3.4.1-RC2.md b/changelogs/3.4.1-RC2.md new file mode 100644 index 000000000000..7267d2339c35 --- /dev/null +++ b/changelogs/3.4.1-RC2.md @@ -0,0 +1,18 @@ +# Backported fixes + +- Adjust owner in Interactive.contextOfPath causing crash in ImplicitSearch [#19875](https://github.com/lampepfl/dotty/pull/19875) +- Add GADT symbols when typing typing-ahead lambda bodies[#19644](https://github.com/lampepfl/dotty/pull/19644) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.1-RC1..3.4.1-RC2` these are: + +``` + 4 Hamza REMMAL + 2 Dale Wijnand + 2 Paweł Marks + 1 Jędrzej Rochala + +``` From 4465edadd19f7f8f4ac693736c173fd0293869ee Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 12 Mar 2024 17:28:38 +0100 Subject: [PATCH 193/371] Release 3.4.1-RC2 --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 49e9e5163cd8..9166d3a5ce23 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,9 +83,9 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = 
"3.4.0" + val referenceVersion = "3.4.1-RC1" - val baseVersion = "3.4.1-RC1" + val baseVersion = "3.4.1-RC2" // Versions used by the vscode extension to create a new project // This should be the latest published releases. From d4d71f5a09bd31f2eb41d7364df276a095cd3b91 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Kristof=20L=C3=BCnenschlo=C3=9F?= Date: Fri, 15 Mar 2024 22:23:16 +0100 Subject: [PATCH 194/371] Fix inline code formatting in documentation --- docs/_docs/reference/experimental/cc.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/_docs/reference/experimental/cc.md b/docs/_docs/reference/experimental/cc.md index 5bdf91f628ec..fedc8fe66b65 100644 --- a/docs/_docs/reference/experimental/cc.md +++ b/docs/_docs/reference/experimental/cc.md @@ -138,7 +138,7 @@ This type is a shorthand for `(A -> B)^{c, d}`, i.e. the function type `A -> B` The impure function type `A => B` is treated as an alias for `A ->{cap} B`. That is, impure functions are functions that can capture anything. A capture annotation `^` binds more strongly than a function arrow. So -`A -> B^{c}` is read as `A` -> (B^{c})`. +`A -> B^{c}` is read as `A -> (B^{c})`. Analogous conventions apply to context function types. `A ?=> B` is an impure context function, with `A ?-> B` as its pure complement. From 5c4967f76a82a9b0aed1b5d29ed0ea6324a09f9c Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 27 Mar 2024 13:51:03 +0100 Subject: [PATCH 195/371] Add changelog for 3.4.1 --- changelogs/3.4.1.md | 192 ++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 192 insertions(+) create mode 100644 changelogs/3.4.1.md diff --git a/changelogs/3.4.1.md b/changelogs/3.4.1.md new file mode 100644 index 000000000000..920c78f61e8f --- /dev/null +++ b/changelogs/3.4.1.md @@ -0,0 +1,192 @@ +# Highlights of the release + +- Add support for `@deprecatedInheritance` [#19082](https://github.com/lampepfl/dotty/pull/19082) +- Avoid generating given definitions that loop [#19282](https://github.com/lampepfl/dotty/pull/19282) + +# Other changes and fixes + +## Coverage + +- Correctly prettify names in coverage info [#18542](https://github.com/lampepfl/dotty/pull/18542) + +## Desugaring + +- Make apply proxies work with overloaded ctors [#19464](https://github.com/lampepfl/dotty/pull/19464) +- Fix possible crash in Desugar [#19567](https://github.com/lampepfl/dotty/pull/19567) + +## Documentation + +- Update `private[this]` deprecation warning and documentation [#19393](https://github.com/lampepfl/dotty/pull/19393) + +## Erasure + +- Make eraseInfo work for classes with EmptyScopes [#19550](https://github.com/lampepfl/dotty/pull/19550) + +## Exports + +- Do not propagate `@tailrec` to exported methods [#19509](https://github.com/lampepfl/dotty/pull/19509) +- Fix retained flags in exports [#19636](https://github.com/lampepfl/dotty/pull/19636) + +## GADTs + +- Only cache base types when gadt state is empty [#19562](https://github.com/lampepfl/dotty/pull/19562) +- Add GADT symbols when typing typing-ahead lambda bodies[#19644](https://github.com/lampepfl/dotty/pull/19644) + +## Implicits + +- Run CheckStatic after UncacheGivenAliases [#19318](https://github.com/lampepfl/dotty/pull/19318) +- Add tests to verify that crash is fixed elsewhere. 
Fixes #19328 [#19329](https://github.com/lampepfl/dotty/pull/19329) +- Don't search for implicit conversions to NoType [#19563](https://github.com/lampepfl/dotty/pull/19563) +- Instantiate argument type vars before implicit search [#19096](https://github.com/lampepfl/dotty/pull/19096) +- Adjust owner in Interactive.contextOfPath causing crash in ImplicitSearch [#19875](https://github.com/lampepfl/dotty/pull/19875) + +## Java Interop + +- Classfile reader: handle JDK 9+ constant types in constant pool [#19533](https://github.com/lampepfl/dotty/pull/19533) + +## Linting + +- Make fatal warnings not fail compilation early & aggregate warns [#19245](https://github.com/lampepfl/dotty/pull/19245) + +## Macro Annotations + +- Check and enter missing symbols in MacroAnnotations only for definitions [#19579](https://github.com/lampepfl/dotty/pull/19579) + +## Match Types + +- Normalize MatchAlias in unrollTupleTypes [#19565](https://github.com/lampepfl/dotty/pull/19565) +- Fix #19445: Remove too-strict test in match type matching. [#19511](https://github.com/lampepfl/dotty/pull/19511) + +## Opaque Types + +- Fix problems with cycle checks [#19453](https://github.com/lampepfl/dotty/pull/19453) + +## Parser + +- Fix(#18265): crash on extension method without type nor RHS [#18743](https://github.com/lampepfl/dotty/pull/18743) +- Warn when @volatile is used on vals [#19462](https://github.com/lampepfl/dotty/pull/19462) +- Fix(#16459) xml parse regression [#19531](https://github.com/lampepfl/dotty/pull/19531) + +## Pattern Matching + +- Fix false unreachable due to opaqueness [#19368](https://github.com/lampepfl/dotty/pull/19368) +- Improve recursive decompose prefix fix [#19375](https://github.com/lampepfl/dotty/pull/19375) +- Allow constraining a parameter to Nothing [#19397](https://github.com/lampepfl/dotty/pull/19397) +- Add a test case, proving i15661 is fixed [#19432](https://github.com/lampepfl/dotty/pull/19432) + +## Presentation Compiler + +- Improvement: Support completions for implicit classes [#19314](https://github.com/lampepfl/dotty/pull/19314) +- Chore: Backport changes from Metals [#19410](https://github.com/lampepfl/dotty/pull/19410) +- Fix goto-def on exported forwarders [#19494](https://github.com/lampepfl/dotty/pull/19494) +- Backport pc changes from metals [#19617](https://github.com/lampepfl/dotty/pull/19617) +- Chore: Backport changes from Metals [#19592](https://github.com/lampepfl/dotty/pull/19592) +- Use comma counting for all signature help types [#19520](https://github.com/lampepfl/dotty/pull/19520) +- Make PC more resilient to crashes [#19488](https://github.com/lampepfl/dotty/pull/19488) +- Make order of renames and missing imports deterministic [#19468](https://github.com/lampepfl/dotty/pull/19468) +- Chore: backport changes from metals [#19452](https://github.com/lampepfl/dotty/pull/19452) +- Improve signature help by more stable position calculation + better named arg support [#19214](https://github.com/lampepfl/dotty/pull/19214) +- Instantiate Type Vars in completion labels of extension methods [#18914](https://github.com/lampepfl/dotty/pull/18914) + +## Quotes + +- Only evaluate transparent inline unapply once [#19380](https://github.com/lampepfl/dotty/pull/19380) +- Update `staging.Compiler.make` documentation [#19428](https://github.com/lampepfl/dotty/pull/19428) +- Error instead of StaleSymbol crash for certain cyclic 
macro dependencies [#19549](https://github.com/lampepfl/dotty/pull/19549) +- Refine handling of StaleSymbol type errors [#19605](https://github.com/lampepfl/dotty/pull/19605) +- Fix module symbol recovery from `NoClassDefFoundError` [#19645](https://github.com/lampepfl/dotty/pull/19645) +- Fix HOAS pattern example and error message [#19655](https://github.com/lampepfl/dotty/pull/19655) +- Set the correct type when copying reflect Inlined trees [#19409](https://github.com/lampepfl/dotty/pull/19409) + +## Reporting + +- Don't explain erroneous bounds [#19338](https://github.com/lampepfl/dotty/pull/19338) +- Better error diagnostics for cyclic references [#19408](https://github.com/lampepfl/dotty/pull/19408) +- Properly identify empty bounds in error message [#19310](https://github.com/lampepfl/dotty/pull/19310) + +## Scala-JS + +- Fix #19528: Actually remove Dynamic from interfaces of native JS classes. [#19536](https://github.com/lampepfl/dotty/pull/19536) +- Consider static and non-static methods as non-double def [#19400](https://github.com/lampepfl/dotty/pull/19400) + +## Scaladoc + +- Scaladoc - add option for dynamic side menu [#19337](https://github.com/lampepfl/dotty/pull/19337) +- Scaladoc: Fix "case case Foo" in enum's cases [#19519](https://github.com/lampepfl/dotty/pull/19519) +- Fix(#19377): show inherited abstract members in dedicated section [#19552](https://github.com/lampepfl/dotty/pull/19552) +- Jsoup: 1.14.3 → 1.17.2 [#19564](https://github.com/lampepfl/dotty/pull/19564) +- Extend copyright into 2024 [#19367](https://github.com/lampepfl/dotty/pull/19367) + +## Tooling + +- Prioritize TASTy files over classfiles on classpath aggregation [#19431](https://github.com/lampepfl/dotty/pull/19431) + +## Transform + +- Fix purity check for val inside of object [#19598](https://github.com/lampepfl/dotty/pull/19598) +- Drop special treatment of function types in overloading resolution [#19654](https://github.com/lampepfl/dotty/pull/19654) +- Add checks for the consistency of the parents in TreeChecker [#18935](https://github.com/lampepfl/dotty/pull/18935) + +## Type Inference + +- More careful type variable instance improvements [#19659](https://github.com/lampepfl/dotty/pull/19659) + +## Typer + +- Reject wildcard types in using clauses [#19459](https://github.com/lampepfl/dotty/pull/19459) +- Don't leave underspecified SAM types in the code [#19461](https://github.com/lampepfl/dotty/pull/19461) +- Also compute base classes of wildcardTypes [#19465](https://github.com/lampepfl/dotty/pull/19465) +- Fix(#15784): ident rule for pat match was too strict [#19501](https://github.com/lampepfl/dotty/pull/19501) +- Heal occurrences of => T between ElimByName and Erasure [#19558](https://github.com/lampepfl/dotty/pull/19558) +- Fix(#i18645): overload ext method body in braces didn't compile [#19651](https://github.com/lampepfl/dotty/pull/19651) +- Fix #19202: Passing NotNullInfos to a mutable field of a Completer [#19463](https://github.com/lampepfl/dotty/pull/19463) +- Fix Java record problems (#19578) and (#19386) [#19583](https://github.com/lampepfl/dotty/pull/19583) +- Improve when deprecation warnings are emitted [#19621](https://github.com/lampepfl/dotty/pull/19621) +- Space: Replace showType & make Space Showable [#19370](https://github.com/lampepfl/dotty/pull/19370) + + +# Contributors + +Thank you to all the contributors who made this 
release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.0..3.4.1` these are: + +``` + 53 Martin Odersky + 53 Nicolas Stucki + 22 Dale Wijnand + 11 Szymon Rodziewicz + 11 i10416 + 8 Hamza REMMAL + 7 Paweł Marks + 7 noti0na1 + 6 Yilin Wei + 5 Jędrzej Rochala + 3 Eugene Flesselle + 3 Seth Tisue + 2 Florian3k + 2 Hamza Remmal + 2 Jan Chyb + 2 Katarzyna Marek + 2 Sébastien Doeraene + 2 Tomasz Godzik + 2 dependabot[bot] + 1 Bersier + 1 Fabián Heredia Montiel + 1 Jakub Ciesluk + 1 Jakub Cieśluk + 1 Kacper Korban + 1 Kenji Yoshida + 1 Mehdi Alaoui + 1 Nikita Gazarov + 1 Oron Port + 1 Pascal Weisenburger + 1 Philippus Baalman + 1 Quentin Bernet + 1 Som Snytt + 1 Wojciech Mazur + 1 Yichen Xu + 1 aherlihy + 1 rochala + +``` From 3ffe3223f3dc82aa8e0cc65d7666ba39ed074091 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Wed, 27 Mar 2024 14:16:10 +0100 Subject: [PATCH 196/371] Release 3.4.1 --- project/Build.scala | 4 ++-- tasty/src/dotty/tools/tasty/TastyFormat.scala | 4 ++-- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 9166d3a5ce23..b8bcbde97e7b 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,9 +83,9 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.4.1-RC1" + val referenceVersion = "3.4.0" - val baseVersion = "3.4.1-RC2" + val baseVersion = "3.4.1" // Versions used by the vscode extension to create a new project // This should be the latest published releases. diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index e17c98234691..b5ca6f45f594 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -318,7 +318,7 @@ object TastyFormat { * compatibility, but remains backwards compatible, with all * preceding `MinorVersion`. */ - final val MinorVersion: Int = 5 + final val MinorVersion: Int = 4 /** Natural Number. The `ExperimentalVersion` allows for * experimentation with changes to TASTy without committing @@ -334,7 +334,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. 
* From ab06ff6a87b99c5500eea4d7895d12bd06e2eff7 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 28 Mar 2024 11:31:14 +0100 Subject: [PATCH 197/371] Bring back old completions as a deprecated method --- .../src/dotty/tools/repl/ReplDriver.scala | 20 ++++++++++++++++--- compiler/test/dotty/tools/repl/ReplTest.scala | 2 +- 2 files changed, 18 insertions(+), 4 deletions(-) diff --git a/compiler/src/dotty/tools/repl/ReplDriver.scala b/compiler/src/dotty/tools/repl/ReplDriver.scala index f8bba2f59fe1..0d64c88d9228 100644 --- a/compiler/src/dotty/tools/repl/ReplDriver.scala +++ b/compiler/src/dotty/tools/repl/ReplDriver.scala @@ -163,7 +163,7 @@ class ReplDriver(settings: Array[String], /* complete = */ false // if true adds space when completing ) } - val comps = completions(line.cursor, line.line, state) + val comps = completionsWithSignatures(line.cursor, line.line, state) candidates.addAll(comps.map(_.label).distinct.map(makeCandidate).asJava) val lineWord = line.word() comps.filter(c => c.label == lineWord && c.symbols.nonEmpty) match @@ -255,8 +255,22 @@ class ReplDriver(settings: Array[String], else label + @deprecated("Use completionsWithSignatures instead", "3.4.2") + protected final def completions(cursor: Int, expr: String, state0: State): List[Candidate] = + completionsWithSignatures(cursor, expr, state0).map: c => + new Candidate( + /* value = */ c.label, + /* displ = */ stripBackTicks(c.label), // displayed value + /* group = */ null, // can be used to group completions together + /* descr = */ null, // TODO use for documentation? + /* suffix = */ null, + /* key = */ null, + /* complete = */ false // if true adds space when completing + ) + end completions + /** Extract possible completions at the index of `cursor` in `expr` */ - protected final def completions(cursor: Int, expr: String, state0: State): List[Completion] = + protected final def completionsWithSignatures(cursor: Int, expr: String, state0: State): List[Completion] = if expr.startsWith(":") then ParseResult.commands.collect { case command if command._1.startsWith(expr) => Completion(command._1, "", List()) @@ -275,7 +289,7 @@ class ReplDriver(settings: Array[String], try Completion.completions(srcPos)._2 catch case NonFatal(_) => Nil } .getOrElse(Nil) - end completions + end completionsWithSignatures protected def interpret(res: ParseResult, quiet: Boolean = false)(using state: State): State = { res match { diff --git a/compiler/test/dotty/tools/repl/ReplTest.scala b/compiler/test/dotty/tools/repl/ReplTest.scala index 3e827a0f1e36..3925b61d7de0 100644 --- a/compiler/test/dotty/tools/repl/ReplTest.scala +++ b/compiler/test/dotty/tools/repl/ReplTest.scala @@ -42,7 +42,7 @@ extends ReplDriver(options, new PrintStream(out, true, StandardCharsets.UTF_8.na /** Returns the `(, )`*/ def tabComplete(src: String)(implicit state: State): List[String] = - completions(src.length, src, state).map(_.label).sorted.distinct + completionsWithSignatures(src.length, src, state).map(_.label).sorted.distinct extension [A](state: State) infix def andThen(op: State ?=> A): A = op(using state) From 76cd2dc26bd0b720b30425b5062b445e3bf4c810 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 28 Mar 2024 12:09:35 +0100 Subject: [PATCH 198/371] Add changelog for 3.4.2-RC1 --- changelogs/3.4.2-RC1.md | 209 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 209 insertions(+) create mode 100644 changelogs/3.4.2-RC1.md diff --git a/changelogs/3.4.2-RC1.md b/changelogs/3.4.2-RC1.md new file mode 100644 
index 000000000000..464a5f6b086a --- /dev/null +++ b/changelogs/3.4.2-RC1.md @@ -0,0 +1,209 @@ +# Highlights of the release + +- Bump JLine 3.19.0 -> 3.24.1 & sbt 1.9.7 -> 1.9.9 [#19744](https://github.com/lampepfl/dotty/pull/19744) +- Refactor settings & improve dx [#19766](https://github.com/lampepfl/dotty/pull/19766) +- Publish `scala2-library-tasty-experimental` [#19588](https://github.com/lampepfl/dotty/pull/19588) +- Repl - method signatures in autocomplete [#19917](https://github.com/lampepfl/dotty/pull/19917) + +# Other changes and fixes + +## Annotations + +- Attempt implicit search for old style `implicit` parameters in Application matchArgs [#19737](https://github.com/lampepfl/dotty/pull/19737) + +## Backend + +- Fix(#17255): cannot find Scala companion module from Java [#19773](https://github.com/lampepfl/dotty/pull/19773) +- Change isStatic to isStaticOwner in hasLocalInstantiation [#19803](https://github.com/lampepfl/dotty/pull/19803) + +## Coverage + +- Port coverage filter options for packages and files [#19727](https://github.com/lampepfl/dotty/pull/19727) + +## Default parameters + +- Lift all non trivial prefixes for default parameters [#19739](https://github.com/lampepfl/dotty/pull/19739) + +## Doctool + +- Prevent HTML/XSS Injection in Scala Search [#19980](https://github.com/lampepfl/dotty/pull/19980) +- Parse search query param in Scaladoc [#19669](https://github.com/lampepfl/dotty/pull/19669) + +## Experimental: Capture Checking + +- Disallow covariant `cap`s in the lower bound of type members [#19624](https://github.com/lampepfl/dotty/pull/19624) +- Ignore orphan parameters inside a retains annotation during Ycheck [#19684](https://github.com/lampepfl/dotty/pull/19684) +- Fix the pickling of `This` inside capture sets [#19797](https://github.com/lampepfl/dotty/pull/19797) +- Add updated to SeqViewOps [#19798](https://github.com/lampepfl/dotty/pull/19798) +- Fix Function tree copier [#19822](https://github.com/lampepfl/dotty/pull/19822) +- Drop FreeSeqFactory from stdlib-cc [#19849](https://github.com/lampepfl/dotty/pull/19849) +- Fix i19859 [#19860](https://github.com/lampepfl/dotty/pull/19860) +- Various fixes to stdlib-cc [#19873](https://github.com/lampepfl/dotty/pull/19873) +- Add more methods in `SeqViewOps` [#19993](https://github.com/lampepfl/dotty/pull/19993) +- Check `This` references in `refersToParamOf` [#20005](https://github.com/lampepfl/dotty/pull/20005) + +## Exports + +- Fix the tparam bounds of exported inherited classes [#18647](https://github.com/lampepfl/dotty/pull/18647) + +## Implicits + +- Prefer extensions over conversions for member selection [#19717](https://github.com/lampepfl/dotty/pull/19717) +- Don't allow implicit conversions on prefixes of type selections [#19934](https://github.com/lampepfl/dotty/pull/19934) +- Make sure typeParams returns a stable result even in the presence of completions [#19974](https://github.com/lampepfl/dotty/pull/19974) + +## Incremental Compilation + +- Fix undercompilation upon ctor change [#19911](https://github.com/lampepfl/dotty/pull/19911) +- Load but not enter case accessors fields in Scala2Unpickler [#19926](https://github.com/lampepfl/dotty/pull/19926) + +## Initialization + +- Add supports for type cast and filtering type for field and method owner in global initialization checker [#19612](https://github.com/lampepfl/dotty/pull/19612) +- Added a second trace 
for global init checker showing creation of mutable fields [#19996](https://github.com/lampepfl/dotty/pull/19996) +- Suppressing repetitive warnings in the global initialization checker [#19898](https://github.com/lampepfl/dotty/pull/19898) + +## Inline + +- Specialized retained inline FunctionN apply methods [#19801](https://github.com/lampepfl/dotty/pull/19801) +- Avoid crash after StopMacroExpansion [#19883](https://github.com/lampepfl/dotty/pull/19883) +- Check deprecation of inline methods [#19914](https://github.com/lampepfl/dotty/pull/19914) +- Inline transparent implicit parameters when typing Unapply trees [#19646](https://github.com/lampepfl/dotty/pull/19646) +- Restore pre-3.3.2 behavior of `inline implicit def` [#19877](https://github.com/lampepfl/dotty/pull/19877) + +## Match Types + +- Cover patterns using `reflect.TypeTest` in isMatchTypeShaped [#19923](https://github.com/lampepfl/dotty/pull/19923) +- Rework MatchType recursion in collectParts [#19867](https://github.com/lampepfl/dotty/pull/19867) + +## Nullability + +- Fix #19808: Don't force to compute the owner of a symbol when there is no denotation [#19813](https://github.com/lampepfl/dotty/pull/19813) + +## Parser + +- Add support for JEP-409 (sealed classes) + Add javacOpt directive [#19080](https://github.com/lampepfl/dotty/pull/19080) +- Fix(#16458): regression in xml syntax parsing [#19522](https://github.com/lampepfl/dotty/pull/19522) +- Fix parsing of conditional expressions in parentheses [#19985](https://github.com/lampepfl/dotty/pull/19985) + +## Presentation Compiler + +- Allow range selection on function parameter to select a parameter list [#19777](https://github.com/lampepfl/dotty/pull/19777) + +## Quotes + +- Disallow ill-staged references to local classes [#19869](https://github.com/lampepfl/dotty/pull/19869) +- Add regression test for #19909 [#19915](https://github.com/lampepfl/dotty/pull/19915) +- Detect non `Expr[..]` splice patterns [#19944](https://github.com/lampepfl/dotty/pull/19944) +- Avoid spurious `val` binding in quote pattern [#19948](https://github.com/lampepfl/dotty/pull/19948) +- Add regression test and imporve -Xprint-suspension message [#19688](https://github.com/lampepfl/dotty/pull/19688) + +## REPL + +- Repl truncation copes with null [#17336](https://github.com/lampepfl/dotty/pull/17336) +- Catch stackoverflow errors in the highlighter [#19836](https://github.com/lampepfl/dotty/pull/19836) +- Fix a REPL bad symbolic reference [#19786](https://github.com/lampepfl/dotty/pull/19786) + +## Reflection + +- Fix `TypeTreeTypeTest` to not match `TypeBoundsTree`s [#19485](https://github.com/lampepfl/dotty/pull/19485) +- Improve message when tree cannot be shown as source [#19906](https://github.com/lampepfl/dotty/pull/19906) +- Fix #19732: quotes.reflect.Ref incorrectly casting `This` to `RefTree` [#19930](https://github.com/lampepfl/dotty/pull/19930) +- Add check for parents in Quotes (#19842) [#19870](https://github.com/lampepfl/dotty/pull/19870) + +## Reporting + +- Improve error reporting for missing members [#19800](https://github.com/lampepfl/dotty/pull/19800) +- Avoid repetitions in name hints [#19975](https://github.com/lampepfl/dotty/pull/19975) +- Improve error message when using experimental definitions [#19782](https://github.com/lampepfl/dotty/pull/19782) +- Make -Xprompt work as desired under -Werror 
[#19765](https://github.com/lampepfl/dotty/pull/19765) +- Fix #19402: emit proper error in absence of using in given definitions [#19714](https://github.com/lampepfl/dotty/pull/19714) +- Bugfix: Choose correct signature is signatureHelp for overloaded methods [#19707](https://github.com/lampepfl/dotty/pull/19707) +- Unify completion pos usage, fix presentation compiler crash in interpolation [#19614](https://github.com/lampepfl/dotty/pull/19614) + +## Scaladoc + +- Fix(#16610): warn ignored Scaladoc on multiple enum cases [#19555](https://github.com/lampepfl/dotty/pull/19555) + +## TASTy format + +- Add patch for undefined behavior with `object $` [#19705](https://github.com/lampepfl/dotty/pull/19705) +- Fix(#19806): wrong tasty of scala module class reference [#19827](https://github.com/lampepfl/dotty/pull/19827) +- Used derived types to type arguments of dependent function type [#19838](https://github.com/lampepfl/dotty/pull/19838) + +## Tooling + +- Java TASTy: use new threadsafe writer implementation [#19690](https://github.com/lampepfl/dotty/pull/19690) +- Remove `-Yforce-inline-while-typing` [#19889](https://github.com/lampepfl/dotty/pull/19889) +- Cleanup unnecessary language flag [#19865](https://github.com/lampepfl/dotty/pull/19865) +- Bugfix: Auto imports in worksheets in Scala 3 [#19793](https://github.com/lampepfl/dotty/pull/19793) +- Refine behavior of `-Yno-experimental` [#19741](https://github.com/lampepfl/dotty/pull/19741) + +## Transform + +- Short-circuit isCheckable with classSymbol [#19634](https://github.com/lampepfl/dotty/pull/19634) +- Avoid eta-reduction of `(..., f: T => R, ...) => f.apply(..)` into `f` [#19966](https://github.com/lampepfl/dotty/pull/19966) +- Tweak parameter accessor scheme [#19719](https://github.com/lampepfl/dotty/pull/19719) + +## Typer + +- Update phrasing for NotClassType explain error message [#19635](https://github.com/lampepfl/dotty/pull/19635) +- Fix java typer problems with inner class references and raw types [#19747](https://github.com/lampepfl/dotty/pull/19747) +- Approximate MatchTypes with lub of case bodies, if non-recursive [#19761](https://github.com/lampepfl/dotty/pull/19761) +- Revert broken changes with transparent inline [#19922](https://github.com/lampepfl/dotty/pull/19922) +- Delay hard argument comparisons [#20007](https://github.com/lampepfl/dotty/pull/20007) +- Fix #19607: Allow to instantiate *wildcard* type captures to TypeBounds. [#19627](https://github.com/lampepfl/dotty/pull/19627) +- Fix #19907: Skip soft unions in widenSingle of widenInferred [#19995](https://github.com/lampepfl/dotty/pull/19995) +- Fix untupling of functions in for comprehensions [#19620](https://github.com/lampepfl/dotty/pull/19620) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.1..3.42-RC1` these are: + +``` + 46 Nicolas Stucki + 33 Martin Odersky + 25 Dale Wijnand + 22 Hamza REMMAL + 18 Yichen Xu + 17 Jamie Thompson + 15 Szymon Rodziewicz + 11 EnzeXing + 11 i10416 + 7 Paweł Marks + 6 Kacper Korban + 4 Dan13llljws + 4 Katarzyna Marek + 4 Matt Bovel + 4 Som Snytt + 4 noti0na1 + 3 110416 + 3 Eugene Flesselle + 3 Sébastien Doeraene + 3 dependabot[bot] + 2 Bersier + 2 Hamza Remmal + 2 Jakub Ciesluk + 2 João Costa + 2 Jędrzej Rochala + 2 Natsu Kagami + 2 Stephane Bersier + 2 Taro L. 
Saito + 2 aherlihy + 1 Aleksander Boruch-Gruszecki + 1 Aviv Keller + 1 Eugene Yokota + 1 Guillaume Martres + 1 Jan Chyb + 1 Lukas Rytz + 1 Mikołaj Fornal + 1 Olga Mazhara + 1 Ondřej Lhoták + 1 Robert Stoll + 1 Seth Tisue + 1 Valentin Schneeberger + 1 Yilin Wei + 1 willerf +``` From 4029577068587ddc523e84fa39849446f83c0ea6 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 28 Mar 2024 12:10:26 +0100 Subject: [PATCH 199/371] Release 3.4.2-RC1 --- project/Build.scala | 4 ++-- tasty/src/dotty/tools/tasty/TastyFormat.scala | 4 ++-- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index e3b4ed789a1f..a5569c0d8888 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -83,7 +83,7 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.4.0" + val referenceVersion = "3.4.1" val baseVersion = "3.4.2-RC1" @@ -104,7 +104,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.4.0" + val previousDottyVersion = "3.4.1" /** Version against which we check binary compatibility. */ val ltsDottyVersion = "3.3.0" diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index e17c98234691..b5ca6f45f594 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -318,7 +318,7 @@ object TastyFormat { * compatibility, but remains backwards compatible, with all * preceding `MinorVersion`. */ - final val MinorVersion: Int = 5 + final val MinorVersion: Int = 4 /** Natural Number. The `ExperimentalVersion` allows for * experimentation with changes to TASTy without committing @@ -334,7 +334,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. * From cbd8408aa874682ce4b4a98a40d69cf493b72f28 Mon Sep 17 00:00:00 2001 From: Raphael Jolly Date: Fri, 5 Apr 2024 15:34:58 +0200 Subject: [PATCH 200/371] Improve documentation of implicit conversions --- .../reference/changed-features/implicit-conversions-spec.md | 5 +++-- docs/_spec/07-implicits.md | 4 ++-- 2 files changed, 5 insertions(+), 4 deletions(-) diff --git a/docs/_docs/reference/changed-features/implicit-conversions-spec.md b/docs/_docs/reference/changed-features/implicit-conversions-spec.md index dc19e10c8b8f..8f14e69fd214 100644 --- a/docs/_docs/reference/changed-features/implicit-conversions-spec.md +++ b/docs/_docs/reference/changed-features/implicit-conversions-spec.md @@ -43,8 +43,9 @@ Views are applied in three situations: `v` which is applicable to `e` and whose result contains a method `m` which is applicable to `args` is searched. The search proceeds as in the case of implicit parameters, where the implicit scope is - the one of `T`. If such a view is found, the application - `e.m(args)` is converted to `v(e).m(args)`. + the one of `T => pt`, with `pt` being the structural type + `{ def m(args: T_1 , ... , T_n): U }`. If such a view is found, + the application `e.m(args)` is converted to `v(e).m(args)`. 
# Differences with Scala 2 implicit conversions diff --git a/docs/_spec/07-implicits.md b/docs/_spec/07-implicits.md index 2cd80f227cd4..29cfbdc24107 100644 --- a/docs/_spec/07-implicits.md +++ b/docs/_spec/07-implicits.md @@ -63,7 +63,7 @@ The _parts_ of a type ´T´ are: - if ´T´ is a type projection `´S´#´U´`, the parts of ´S´ as well as ´T´ itself; - if ´T´ is a type alias, the parts of its expansion; - if ´T´ is an abstract type, the parts of its upper bound; -- if ´T´ denotes an implicit conversion to a type with a method with argument types ´T_1, ..., T_n´ and result type ´U´, the union of the parts of ´T_1, ..., T_n´ and ´U´; +- if ´T´ is a structural type with a method with argument types ´T_1, ..., T_n´ and result type ´U´, the union of the parts of ´T_1, ..., T_n´ and ´U´; - in all other cases, just ´T´ itself. Note that packages are internally represented as classes with companion modules to hold the package members. @@ -288,7 +288,7 @@ The search proceeds as in the case of implicit parameters, where the implicit sc If such a view is found, the selection ´e.m´ is converted to `´v´(´e´).´m´`. 1. In a selection ´e.m(\mathit{args})´ with ´e´ of type ´T´, if the selector ´m´ denotes some member(s) of ´T´, but none of these members is applicable to the arguments ´\mathit{args}´. In this case a view ´v´ is searched which is applicable to ´e´ and whose result contains a method ´m´ which is applicable to ´\mathit{args}´. -The search proceeds as in the case of implicit parameters, where the implicit scope is the one of ´T´. If such a view is found, the selection ´e.m´ is converted to `´v´(´e´).´m(\mathit{args})´`. +The search proceeds as in the case of implicit parameters, where the implicit scope is the one of `´T´ => ´\mathit{pt}´`, with ´\mathit{pt}´ being the structural type ´{ def m(\mathit{args}: T_1 , ... , T_n): U }´. If such a view is found, the selection ´e.m´ is converted to `´v´(´e´).´m(\mathit{args})´`. The implicit view, if it is found, can accept its argument ´e´ as a call-by-value or as a call-by-name parameter. However, call-by-value implicits take precedence over call-by-name implicits. From 56b276f974b97387209feb5cb1ef70fccc1ccb19 Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Wed, 10 Apr 2024 14:13:27 +0200 Subject: [PATCH 201/371] Implement match type amendment: extractors follow aliases and singletons This implements the change proposed in https://github.com/scala/improvement-proposals/pull/84. The added pos test case presents motivating examples, the added neg test cases demonstrate that errors are correctly reported when cycles are present. The potential for cycle is no worse than with the existing extraction logic as demonstrated by the existing test in `tests/neg/mt-deskolemize.scala`. 
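For a concrete feel of the amendment, here is a minimal sketch (hypothetical names, modelled on the `Prim` case in the pos test added below): a type-member extractor can now reduce even when the member is defined through an alias, because the extractor dereferences the alias instead of leaving a reference to the skolemized scrutinee behind. The same mechanism lets stable term members (singletons) be followed, as exercised by the `ProdExpr` cases in that test.

```scala
trait Box:
  type Out
object Box:
  type Of[T] = Box { type Out = T }

// Type-member extractor: binds `t` to the `Out` member of `B`.
type OutOf[B <: Box] = B match
  case Box.Of[t] => t

class IntBox extends Box:
  type Out = Alias // defined through an alias...
  type Alias = Int // ...which the extractor now follows to `Int`

val x: OutOf[IntBox] = 42 // reduces to Int under the amendment
```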
[Cherry-picked 1a235c6719f56e1597241dc38eeda49087b323e8] --- .../dotty/tools/dotc/core/TypeComparer.scala | 65 +++++++++++++++++-- tests/neg/mt-deskolemize.scala | 42 ++++++++++++ tests/pos/mt-deskolemize.scala | 55 ++++++++++++++++ 3 files changed, 157 insertions(+), 5 deletions(-) create mode 100644 tests/pos/mt-deskolemize.scala diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index cee1ec7fffa8..dad159ace55f 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -3518,20 +3518,75 @@ class MatchReducer(initctx: Context) extends TypeComparer(initctx) { false case MatchTypeCasePattern.TypeMemberExtractor(typeMemberName, capture) => + /** Try to remove references to `skolem` from a type in accordance with the spec. + * + * If any reference to `skolem` remains in the result type, + * `refersToSkolem` is set to true. + */ + class DropSkolemMap(skolem: SkolemType) extends TypeMap: + var refersToSkolem = false + def apply(tp: Type): Type = + tp match + case `skolem` => + refersToSkolem = true + tp + case tp: NamedType => + var savedRefersToSkolem = refersToSkolem + refersToSkolem = false + try + val pre1 = apply(tp.prefix) + if refersToSkolem then + tp match + case tp: TermRef => tp.info.widenExpr.dealias match + case info: SingletonType => + refersToSkolem = false + apply(info) + case _ => + tp.derivedSelect(pre1) + case tp: TypeRef => tp.info match + case info: AliasingBounds => + refersToSkolem = false + apply(info.alias) + case _ => + tp.derivedSelect(pre1) + else + tp.derivedSelect(pre1) + finally + refersToSkolem |= savedRefersToSkolem + case tp: LazyRef => + // By default, TypeMap maps LazyRefs lazily. We need to + // force it for `refersToSkolem` to be correctly set. + apply(tp.ref) + case _ => + mapOver(tp) + end DropSkolemMap + /** Try to remove references to `skolem` from `u` in accordance with the spec. + * + * If any reference to `skolem` remains in the result type, return + * NoType instead. 
+ */ + def dropSkolem(u: Type, skolem: SkolemType): Type = + val dmap = DropSkolemMap(skolem) + val res = dmap(u) + if dmap.refersToSkolem then NoType else res + val stableScrut: SingletonType = scrut match case scrut: SingletonType => scrut case _ => SkolemType(scrut) + stableScrut.member(typeMemberName) match case denot: SingleDenotation if denot.exists => val info = denot.info match case alias: AliasingBounds => alias.alias // Extract the alias case ClassInfo(prefix, cls, _, _, _) => prefix.select(cls) // Re-select the class from the prefix case info => info // Notably, RealTypeBounds, which will eventually give a MatchResult.NoInstances - val infoRefersToSkolem = stableScrut.isInstanceOf[SkolemType] && stableScrut.occursIn(info) - val info1 = info match - case info: TypeBounds => info // Will already trigger a MatchResult.NoInstances - case _ if infoRefersToSkolem => RealTypeBounds(info, info) // Explicitly trigger a MatchResult.NoInstances - case _ => info // We have a match + val info1 = stableScrut match + case skolem: SkolemType => + dropSkolem(info, skolem).orElse: + info match + case info: TypeBounds => info // Will already trigger a MatchResult.NoInstances + case _ => RealTypeBounds(info, info) // Explicitly trigger a MatchResult.NoInstances + case _ => info rec(capture, info1, variance = 0, scrutIsWidenedAbstract) case _ => false diff --git a/tests/neg/mt-deskolemize.scala b/tests/neg/mt-deskolemize.scala index 0a58d5db7bc4..505e47637ac4 100644 --- a/tests/neg/mt-deskolemize.scala +++ b/tests/neg/mt-deskolemize.scala @@ -14,3 +14,45 @@ class SimpleLoop2 extends Expr: object Test1: val x: ExtractValue[SimpleLoop1] = 1 // error + +trait Description: + type Elem <: Tuple + +class PrimBroken extends Expr: + type Value = Alias + type Alias = Value // error + +class Prim extends Expr: + type Value = BigInt + +class VecExpr[E <: Expr] extends Expr: + type Value = Vector[ExtractValue[E]] + +trait ProdExpr extends Expr: + val description: Description + type Value = Tuple.Map[description.Elem, [X] =>> ExtractValue[X & Expr]] + + +class MyExpr1 extends ProdExpr: + final val description = new Description: + type Elem = (VecExpr[Prim], MyExpr2) + +class MyExpr2 extends ProdExpr: + final val description = new Description: + type Elem = (VecExpr[VecExpr[MyExpr1]], Prim) + +trait Constable[E <: Expr]: + def lit(v: ExtractValue[E]): E +object Constable: + given [E <: Expr]: Constable[E] = ??? 
+ +object Test2: + def fromLiteral[E <: Expr : Constable](v: ExtractValue[E]): E = + summon[Constable[E]].lit(v) + val x0: ExtractValue[Prim] = "" // error + val x1: ExtractValue[PrimBroken] = 1 // error + + val foo: MyExpr2 = new MyExpr2 + val v: foo.Value = (Vector(Vector()), 1) // error: Recursion limit exceeded + val c: MyExpr2 = fromLiteral: + (Vector(Vector()), 1) // error: Recursion limit exceeded diff --git a/tests/pos/mt-deskolemize.scala b/tests/pos/mt-deskolemize.scala new file mode 100644 index 000000000000..34f38289b24d --- /dev/null +++ b/tests/pos/mt-deskolemize.scala @@ -0,0 +1,55 @@ +trait Expr: + type Value + +object Expr: + type Of[V] = Expr { type Value = V } + type ExtractValue[F <: Expr] = F match + case Expr.Of[v] => v +import Expr.ExtractValue + +class Prim extends Expr: + type Value = Alias + type Alias = BigInt + +class VecExpr[E <: Expr] extends Expr: + type Value = Vector[ExtractValue[E]] + +trait Description: + type Elem <: Tuple + +trait ProdExpr extends Expr: + val description: Description + type Value = Tuple.Map[description.Elem, [X] =>> ExtractValue[X & Expr]] + +class MyExpr1 extends ProdExpr: + final val description = new Description: + type Elem = (VecExpr[Prim], Prim) + +class MyExpr2 extends ProdExpr: + final val description = new Description: + type Elem = (VecExpr[VecExpr[MyExpr1]], Prim) + +trait ProdExprAlt[T <: Tuple] extends Expr: + type Value = Tuple.Map[T, [X] =>> ExtractValue[X & Expr]] + +class MyExpr3 extends ProdExprAlt[(Prim, VecExpr[Prim], Prim)] + +trait Constable[E <: Expr]: + def lit(v: ExtractValue[E]): E +object Constable: + given [E <: Expr]: Constable[E] = ??? + +object Test: + def fromLiteral[E <: Expr : Constable](v: ExtractValue[E]): E = + summon[Constable[E]].lit(v) + val a: Prim = fromLiteral(1) + val b: VecExpr[Prim] = fromLiteral(Vector(1)) + val c: MyExpr1 = fromLiteral((Vector(1), 1)) + val d: MyExpr2 = fromLiteral(Vector(Vector((Vector(1), 1))), 2) + val e: MyExpr3 = fromLiteral((1, Vector(1), 1)) + val f: ProdExprAlt[(MyExpr1, VecExpr[MyExpr3])] = fromLiteral: + ( + (Vector(1), 1), + Vector((1, Vector(1), 1), (2, Vector(1), 2)) + ) + val g: Expr { type Alias = Int; type Value = Alias } = fromLiteral(1) From bfa18520c76820e16b2274ed9b8850c663c15ab0 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 9 May 2024 16:24:14 +0200 Subject: [PATCH 202/371] Move logic under feature.experimental.betterMatchTypesExtractors This way we can merge this PR without waiting for the SIP committee to approve it. 
[Cherry-picked 61b5a7b6a52f32c68a4f3aa8842f6c4850349b87][modified] --- .../src/dotty/tools/dotc/config/Feature.scala | 3 + .../dotty/tools/dotc/core/TypeComparer.scala | 11 +++- .../runtime/stdLibPatches/language.scala | 7 +++ tests/neg/mt-deskolemize-2.scala | 60 +++++++++++++++++++ tests/neg/mt-deskolemize.scala | 42 ------------- tests/pos/mt-deskolemize.scala | 2 + 6 files changed, 80 insertions(+), 45 deletions(-) create mode 100644 tests/neg/mt-deskolemize-2.scala diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index 1fe9cae936c9..5c27f20fcba1 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -34,6 +34,7 @@ object Feature: val captureChecking = experimental("captureChecking") val into = experimental("into") val namedTuples = experimental("namedTuples") + val betterMatchTypeExtractors = experimental("betterMatchTypeExtractors") def experimentalAutoEnableFeatures(using Context): List[TermName] = defn.languageExperimentalFeatures @@ -88,6 +89,8 @@ object Feature: def scala2ExperimentalMacroEnabled(using Context) = enabled(scala2macros) + def betterMatchTypeExtractorsEnabled(using Context) = enabled(betterMatchTypeExtractors) + /** Is pureFunctions enabled for this compilation unit? */ def pureFunsEnabled(using Context) = enabledBySetting(pureFunctions) diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index dad159ace55f..3ce98e5447a2 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -10,7 +10,7 @@ import TypeOps.refineUsingParent import collection.mutable import util.{Stats, NoSourcePosition, EqHashMap} import config.Config -import config.Feature.{migrateTo3, sourceVersion} +import config.Feature.{betterMatchTypeExtractorsEnabled, migrateTo3, sourceVersion} import config.Printers.{subtyping, gadts, matchTypes, noPrinter} import config.SourceVersion import TypeErasure.{erasedLub, erasedGlb} @@ -3519,6 +3519,11 @@ class MatchReducer(initctx: Context) extends TypeComparer(initctx) { case MatchTypeCasePattern.TypeMemberExtractor(typeMemberName, capture) => /** Try to remove references to `skolem` from a type in accordance with the spec. + * + * If `betterMatchTypeExtractorsEnabled` is enabled then references + * to `skolem` occuring are avoided by following aliases and + * singletons, otherwise no attempt made to avoid references to + * `skolem`. * * If any reference to `skolem` remains in the result type, * `refersToSkolem` is set to true. @@ -3530,7 +3535,7 @@ class MatchReducer(initctx: Context) extends TypeComparer(initctx) { case `skolem` => refersToSkolem = true tp - case tp: NamedType => + case tp: NamedType if betterMatchTypeExtractorsEnabled => var savedRefersToSkolem = refersToSkolem refersToSkolem = false try @@ -3553,7 +3558,7 @@ class MatchReducer(initctx: Context) extends TypeComparer(initctx) { tp.derivedSelect(pre1) finally refersToSkolem |= savedRefersToSkolem - case tp: LazyRef => + case tp: LazyRef if betterMatchTypeExtractorsEnabled => // By default, TypeMap maps LazyRefs lazily. We need to // force it for `refersToSkolem` to be correctly set. 
apply(tp.ref) diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index b2bd4b791423..78755b8df757 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -105,6 +105,13 @@ object language: @compileTimeOnly("`relaxedExtensionImports` can only be used at compile time in import statements") @deprecated("The experimental.relaxedExtensionImports language import is no longer needed since the feature is now standard", since = "3.4") object relaxedExtensionImports + + /** Enhance match type extractors to follow aliases and singletons. + * + * @see [[https://github.com/scala/improvement-proposals/pull/84]] + */ + @compileTimeOnly("`betterMatchTypeExtractors` can only be used at compile time in import statements") + object betterMatchTypeExtractors end experimental /** The deprecated object contains features that are no longer officially suypported in Scala. diff --git a/tests/neg/mt-deskolemize-2.scala b/tests/neg/mt-deskolemize-2.scala new file mode 100644 index 000000000000..90d506a42e6f --- /dev/null +++ b/tests/neg/mt-deskolemize-2.scala @@ -0,0 +1,60 @@ +//> using options -language:experimental.betterMatchTypeExtractors + +trait Expr: + type Value +object Expr: + type Of[V] = Expr { type Value = V } + type ExtractValue[F <: Expr] = F match + case Expr.Of[v] => v +import Expr.ExtractValue + +class SimpleLoop1 extends Expr: + type Value = ExtractValue[SimpleLoop2] + +class SimpleLoop2 extends Expr: + type Value = ExtractValue[SimpleLoop1] + +object Test1: + val x: ExtractValue[SimpleLoop1] = 1 // error + +trait Description: + type Elem <: Tuple + +class PrimBroken extends Expr: + type Value = Alias + type Alias = Value // error + +class Prim extends Expr: + type Value = BigInt + +class VecExpr[E <: Expr] extends Expr: + type Value = Vector[ExtractValue[E]] + +trait ProdExpr extends Expr: + val description: Description + type Value = Tuple.Map[description.Elem, [X] =>> ExtractValue[X & Expr]] + + +class MyExpr1 extends ProdExpr: + final val description = new Description: + type Elem = (VecExpr[Prim], MyExpr2) + +class MyExpr2 extends ProdExpr: + final val description = new Description: + type Elem = (VecExpr[VecExpr[MyExpr1]], Prim) + +trait Constable[E <: Expr]: + def lit(v: ExtractValue[E]): E +object Constable: + given [E <: Expr]: Constable[E] = ??? 
+ +object Test2: + def fromLiteral[E <: Expr : Constable](v: ExtractValue[E]): E = + summon[Constable[E]].lit(v) + val x0: ExtractValue[Prim] = "" // error + val x1: ExtractValue[PrimBroken] = 1 // error + + val foo: MyExpr2 = new MyExpr2 + val v: foo.Value = (Vector(Vector()), 1) // error: Recursion limit exceeded + val c: MyExpr2 = fromLiteral: + (Vector(Vector()), 1) // error: Recursion limit exceeded diff --git a/tests/neg/mt-deskolemize.scala b/tests/neg/mt-deskolemize.scala index 505e47637ac4..0a58d5db7bc4 100644 --- a/tests/neg/mt-deskolemize.scala +++ b/tests/neg/mt-deskolemize.scala @@ -14,45 +14,3 @@ class SimpleLoop2 extends Expr: object Test1: val x: ExtractValue[SimpleLoop1] = 1 // error - -trait Description: - type Elem <: Tuple - -class PrimBroken extends Expr: - type Value = Alias - type Alias = Value // error - -class Prim extends Expr: - type Value = BigInt - -class VecExpr[E <: Expr] extends Expr: - type Value = Vector[ExtractValue[E]] - -trait ProdExpr extends Expr: - val description: Description - type Value = Tuple.Map[description.Elem, [X] =>> ExtractValue[X & Expr]] - - -class MyExpr1 extends ProdExpr: - final val description = new Description: - type Elem = (VecExpr[Prim], MyExpr2) - -class MyExpr2 extends ProdExpr: - final val description = new Description: - type Elem = (VecExpr[VecExpr[MyExpr1]], Prim) - -trait Constable[E <: Expr]: - def lit(v: ExtractValue[E]): E -object Constable: - given [E <: Expr]: Constable[E] = ??? - -object Test2: - def fromLiteral[E <: Expr : Constable](v: ExtractValue[E]): E = - summon[Constable[E]].lit(v) - val x0: ExtractValue[Prim] = "" // error - val x1: ExtractValue[PrimBroken] = 1 // error - - val foo: MyExpr2 = new MyExpr2 - val v: foo.Value = (Vector(Vector()), 1) // error: Recursion limit exceeded - val c: MyExpr2 = fromLiteral: - (Vector(Vector()), 1) // error: Recursion limit exceeded diff --git a/tests/pos/mt-deskolemize.scala b/tests/pos/mt-deskolemize.scala index 34f38289b24d..abd61d9d55e6 100644 --- a/tests/pos/mt-deskolemize.scala +++ b/tests/pos/mt-deskolemize.scala @@ -1,3 +1,5 @@ +//> using options -language:experimental.betterMatchTypeExtractors + trait Expr: type Value From 3e1c4defc3cb1db0367f5e0fbece1db01279fc7b Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Tue, 7 May 2024 12:34:30 +0200 Subject: [PATCH 203/371] DropSkolemMap: simplify logic No need to save the value of `refersToSkolem`: if it's true before we enter `NamedType` it will be true after and `dropSkolem` will return `NoType`. The previous logic could still be useful if we want to give more easily actionable error messages in the future by only keeping in the type the skolems we couldn't remove. 
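To illustrate the invariant in isolation, here is a toy analogue (not the compiler's `TypeMap`): once the flag is set, the mapped result is discarded by the caller anyway, so the traversal can simply stop rewriting instead of saving and restoring the flag.

```scala
// Toy model of dropSkolem: rewrite a structure while remembering whether
// the marker was ever encountered; if it was, the caller discards the result.
enum Ty:
  case Marker                        // plays the role of the skolem
  case Ref(name: String, prefix: Ty)
  case Leaf

class DropMarker:
  var seen = false
  def apply(t: Ty): Ty =
    if seen then return t            // result no longer matters, stop early
    t match
      case Ty.Marker =>
        seen = true
        t
      case Ty.Ref(n, pre) => Ty.Ref(n, apply(pre))
      case Ty.Leaf => t

def dropMarker(t: Ty): Option[Ty] =
  val m = DropMarker()
  val res = m(t)
  if m.seen then None else Some(res) // mirrors dropSkolem returning NoType
```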
[Cherry-picked a1930c4ca38673885a4ebc2ce95689e9e65d08be] --- .../dotty/tools/dotc/core/TypeComparer.scala | 41 +++++++++---------- 1 file changed, 19 insertions(+), 22 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index 3ce98e5447a2..27dd4b7134a9 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -3531,33 +3531,30 @@ class MatchReducer(initctx: Context) extends TypeComparer(initctx) { class DropSkolemMap(skolem: SkolemType) extends TypeMap: var refersToSkolem = false def apply(tp: Type): Type = + if refersToSkolem then + return tp tp match case `skolem` => refersToSkolem = true tp case tp: NamedType if betterMatchTypeExtractorsEnabled => - var savedRefersToSkolem = refersToSkolem - refersToSkolem = false - try - val pre1 = apply(tp.prefix) - if refersToSkolem then - tp match - case tp: TermRef => tp.info.widenExpr.dealias match - case info: SingletonType => - refersToSkolem = false - apply(info) - case _ => - tp.derivedSelect(pre1) - case tp: TypeRef => tp.info match - case info: AliasingBounds => - refersToSkolem = false - apply(info.alias) - case _ => - tp.derivedSelect(pre1) - else - tp.derivedSelect(pre1) - finally - refersToSkolem |= savedRefersToSkolem + val pre1 = apply(tp.prefix) + if refersToSkolem then + tp match + case tp: TermRef => tp.info.widenExpr.dealias match + case info: SingletonType => + refersToSkolem = false + apply(info) + case _ => + tp.derivedSelect(pre1) + case tp: TypeRef => tp.info match + case info: AliasingBounds => + refersToSkolem = false + apply(info.alias) + case _ => + tp.derivedSelect(pre1) + else + tp.derivedSelect(pre1) case tp: LazyRef if betterMatchTypeExtractorsEnabled => // By default, TypeMap maps LazyRefs lazily. We need to // force it for `refersToSkolem` to be correctly set. From 11f01d2fc199db95d59139d78f552a8e8fed7341 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 8 May 2024 14:57:35 +0200 Subject: [PATCH 204/371] Deprecate `StandardPlugin.init` in favor of `initialize` method taking implicit Context (#20330) We do deprecate `StandardPlugin.init` in favour of `StandardPlugin.initialize` method tak takes additional `Context` parameter - it would e.g. allow to use reporting mechanism when parsing compiler plugin options. 
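For illustration, a minimal plugin written against the new entry point; the plugin name and option handling below are made up, but the `initialize` signature matches the one introduced here, and reporting diagnostics while parsing options is exactly the use case mentioned above (a sketch, not code from this change):

```scala
import dotty.tools.dotc.core.Contexts.Context
import dotty.tools.dotc.plugins.{PluginPhase, StandardPlugin}
import dotty.tools.dotc.report

class MyPlugin extends StandardPlugin:
  val name: String = "myPlugin"
  override val description: String = "demo of the new initialize entry point"

  // The Context is now in scope, so unrecognized options can be reported
  // as diagnostics instead of being silently ignored or throwing.
  override def initialize(options: List[String])(using Context): List[PluginPhase] =
    for opt <- options do
      if !opt.startsWith("mode=") then
        report.warning(s"$name: ignoring unknown option '$opt'")
    Nil // this sketch contributes no phases
```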
Introduces changes to akka/akka fork used in Community Build [Cherry-picked 1276034e48114b9422ae5c5f1b25708e62517d45] --- community-build/community-projects/akka | 2 +- .../src/dotty/tools/dotc/plugins/Plugin.scala | 16 +++++++++++++++- .../src/dotty/tools/dotc/plugins/Plugins.scala | 2 +- .../changed-features/compiler-plugins.md | 4 ++-- .../changed-features/compiler-plugins.md | 4 ++-- .../analyzer-plugin/plugin/Analyzer.scala | 2 +- .../compiler-plugin/plugin/DivideZero.scala | 3 ++- tests/plugins/custom/analyzer/Analyzer_1.scala | 2 +- tests/plugins/neg/divideZero/plugin_1.scala | 2 +- 9 files changed, 26 insertions(+), 11 deletions(-) diff --git a/community-build/community-projects/akka b/community-build/community-projects/akka index 7f5115ebc9cd..79b294048f89 160000 --- a/community-build/community-projects/akka +++ b/community-build/community-projects/akka @@ -1 +1 @@ -Subproject commit 7f5115ebc9cde408433040f11834f5218b4a3357 +Subproject commit 79b294048f893d9d6b9332618f7aebedce9a5340 diff --git a/compiler/src/dotty/tools/dotc/plugins/Plugin.scala b/compiler/src/dotty/tools/dotc/plugins/Plugin.scala index ce77a5b9d97a..fdb41fc56689 100644 --- a/compiler/src/dotty/tools/dotc/plugins/Plugin.scala +++ b/compiler/src/dotty/tools/dotc/plugins/Plugin.scala @@ -13,6 +13,7 @@ import java.io.InputStream import java.util.Properties import scala.util.{ Try, Success, Failure } +import scala.annotation.nowarn trait PluginPhase extends MiniPhase { def runsBefore: Set[String] = Set.empty @@ -50,7 +51,20 @@ trait StandardPlugin extends Plugin { * @param options commandline options to the plugin. * @return a list of phases to be added to the phase plan */ - def init(options: List[String]): List[PluginPhase] + @deprecatedOverriding("Method 'init' does not allow to access 'Context', use 'initialize' instead.", since = "Scala 3.5.0") + @deprecated("Use 'initialize' instead.", since = "Scala 3.5.0") + def init(options: List[String]): List[PluginPhase] = Nil + + /** Non-research plugins should override this method to return the phases + * + * The phases returned must be freshly constructed (not reused + * and returned again on subsequent calls). + * + * @param options commandline options to the plugin. 
+ * @return a list of phases to be added to the phase plan + */ + @nowarn("cat=deprecation") + def initialize(options: List[String])(using Context): List[PluginPhase] = init(options) } /** A research plugin may customize the compilation pipeline freely diff --git a/compiler/src/dotty/tools/dotc/plugins/Plugins.scala b/compiler/src/dotty/tools/dotc/plugins/Plugins.scala index 31176bb2fb2c..a6672d475129 100644 --- a/compiler/src/dotty/tools/dotc/plugins/Plugins.scala +++ b/compiler/src/dotty/tools/dotc/plugins/Plugins.scala @@ -125,7 +125,7 @@ trait Plugins { } // schedule plugins according to ordering constraints - val pluginPhases = plugins.collect { case p: StandardPlugin => p }.flatMap { plug => plug.init(options(plug)) } + val pluginPhases = plugins.collect { case p: StandardPlugin => p }.flatMap { plug => plug.initialize(options(plug)) } val updatedPlan = Plugins.schedule(plan, pluginPhases) // add research plugins diff --git a/docs/_docs/reference/changed-features/compiler-plugins.md b/docs/_docs/reference/changed-features/compiler-plugins.md index 6be8a62c7ac4..c0bfccec8172 100644 --- a/docs/_docs/reference/changed-features/compiler-plugins.md +++ b/docs/_docs/reference/changed-features/compiler-plugins.md @@ -67,7 +67,7 @@ class DivideZero extends StandardPlugin: val name: String = "divideZero" override val description: String = "divide zero check" - def init(options: List[String]): List[PluginPhase] = + override def initialize(options: List[String])(using Context): List[PluginPhase] = (new DivideZeroPhase) :: Nil class DivideZeroPhase extends PluginPhase: @@ -90,7 +90,7 @@ end DivideZeroPhase ``` The plugin main class (`DivideZero`) must extend the trait `StandardPlugin` -and implement the method `init` that takes the plugin's options as argument +and implement the method `initialize` that takes the plugin's options as argument and returns a list of `PluginPhase`s to be inserted into the compilation pipeline. Our plugin adds one compiler phase to the pipeline. A compiler phase must extend diff --git a/docs/_spec/TODOreference/changed-features/compiler-plugins.md b/docs/_spec/TODOreference/changed-features/compiler-plugins.md index 20bdb7f49836..719e204fc803 100644 --- a/docs/_spec/TODOreference/changed-features/compiler-plugins.md +++ b/docs/_spec/TODOreference/changed-features/compiler-plugins.md @@ -67,7 +67,7 @@ class DivideZero extends StandardPlugin: val name: String = "divideZero" override val description: String = "divide zero check" - def init(options: List[String]): List[PluginPhase] = + override def initialize(options: List[String])(using Context): List[PluginPhase] = (new DivideZeroPhase) :: Nil class DivideZeroPhase extends PluginPhase: @@ -90,7 +90,7 @@ end DivideZeroPhase ``` The plugin main class (`DivideZero`) must extend the trait `StandardPlugin` -and implement the method `init` that takes the plugin's options as argument +and implement the method `initialize` that takes the plugin's options as argument and returns a list of `PluginPhase`s to be inserted into the compilation pipeline. Our plugin adds one compiler phase to the pipeline. 
A compiler phase must extend diff --git a/sbt-test/sbt-dotty/analyzer-plugin/plugin/Analyzer.scala b/sbt-test/sbt-dotty/analyzer-plugin/plugin/Analyzer.scala index c1fab5c13f42..01aa57d7a971 100644 --- a/sbt-test/sbt-dotty/analyzer-plugin/plugin/Analyzer.scala +++ b/sbt-test/sbt-dotty/analyzer-plugin/plugin/Analyzer.scala @@ -21,7 +21,7 @@ class InitPlugin extends StandardPlugin { val name: String = "initPlugin" override val description: String = "checks that under -Yretain-trees we may get tree for all symbols" - def init(options: List[String]): List[PluginPhase] = + override def initialize(options: List[String])(using Context): List[PluginPhase] = (new SetDefTree) :: (new InitChecker) :: Nil } diff --git a/sbt-test/sbt-dotty/compiler-plugin/plugin/DivideZero.scala b/sbt-test/sbt-dotty/compiler-plugin/plugin/DivideZero.scala index c6fac6b796c0..3d1698250e5d 100644 --- a/sbt-test/sbt-dotty/compiler-plugin/plugin/DivideZero.scala +++ b/sbt-test/sbt-dotty/compiler-plugin/plugin/DivideZero.scala @@ -22,7 +22,8 @@ class DivideZero extends PluginPhase with StandardPlugin { override val runsAfter = Set(Pickler.name) override val runsBefore = Set(Staging.name) - def init(options: List[String]): List[PluginPhase] = this :: Nil + // We keep using deprecated variant here just to ensure it still works correctly + override def init(options: List[String]): List[PluginPhase] = this :: Nil private def isNumericDivide(sym: Symbol)(implicit ctx: Context): Boolean = { def test(tpe: String): Boolean = diff --git a/tests/plugins/custom/analyzer/Analyzer_1.scala b/tests/plugins/custom/analyzer/Analyzer_1.scala index 0e1cc53290d0..d611972e0e48 100644 --- a/tests/plugins/custom/analyzer/Analyzer_1.scala +++ b/tests/plugins/custom/analyzer/Analyzer_1.scala @@ -52,7 +52,7 @@ class InitChecker extends PluginPhase with StandardPlugin { override val runsAfter = Set(SetDefTree.name) override val runsBefore = Set(FirstTransform.name) - def init(options: List[String]): List[PluginPhase] = this :: (new SetDefTree) :: Nil + override def initialize(options: List[String])(using Context): List[PluginPhase] = this :: (new SetDefTree) :: Nil private def checkDef(tree: Tree)(implicit ctx: Context): Tree = { if (tree.symbol.defTree.isEmpty) diff --git a/tests/plugins/neg/divideZero/plugin_1.scala b/tests/plugins/neg/divideZero/plugin_1.scala index ef8e077fd14d..68b2a8eae478 100644 --- a/tests/plugins/neg/divideZero/plugin_1.scala +++ b/tests/plugins/neg/divideZero/plugin_1.scala @@ -20,7 +20,7 @@ class DivideZero extends PluginPhase with StandardPlugin { override val runsAfter = Set(Pickler.name) override val runsBefore = Set(PickleQuotes.name) - override def init(options: List[String]): List[PluginPhase] = this :: Nil + override def initialize(options: List[String])(using Context): List[PluginPhase] = this :: Nil private def isNumericDivide(sym: Symbol)(implicit ctx: Context): Boolean = { def test(tpe: String): Boolean = From 5fdfb977114c6593f99117b79316012ec2747c19 Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 7 Jan 2024 13:22:06 +0100 Subject: [PATCH 205/371] New modularity language import [Cherry-picked 34f17b753ad8dc5fcc038d592a8fc1c748ec62b4] --- .../src/scala/runtime/stdLibPatches/language.scala | 13 ++++++++++++- 1 file changed, 12 insertions(+), 1 deletion(-) diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index 78755b8df757..e9c480919902 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ 
b/library/src/scala/runtime/stdLibPatches/language.scala @@ -96,7 +96,18 @@ object language: * @see [[https://dotty.epfl.ch/docs/reference/experimental/into-modifier]] */ @compileTimeOnly("`namedTuples` can only be used at compile time in import statements") - object namedTuples + object namedTupleas + + /** Experimental support for new features for better modularity, including + * - better tracking of dependencies through classes + * - better usability of context bounds + * - better syntax and conventions for type classes + * - ability to merge exported types in intersections + * + * @see [[https://dotty.epfl.ch/docs/reference/experimental/modularity]] + */ + @compileTimeOnly("`modularity` can only be used at compile time in import statements") + object modularity /** Was needed to add support for relaxed imports of extension methods. * The language import is no longer needed as this is now a standard feature since SIP was accepted. From 813af6907362b8ba06bc9bd4ef9914a4d5804b51 Mon Sep 17 00:00:00 2001 From: odersky Date: Sat, 18 Nov 2023 15:10:34 +0100 Subject: [PATCH 206/371] Allow vals in using clauses of givens [Cherry-picked 31c9e8a850e3f40dd797dc9e3669dcadb020586d] --- .../dotty/tools/dotc/parsing/Parsers.scala | 25 +++++++++++++------ 1 file changed, 17 insertions(+), 8 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index 60b2a2b1d3cf..8d5c50d6d608 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -62,7 +62,7 @@ object Parsers { case ExtensionFollow // extension clause, following extension parameter def isClass = // owner is a class - this == Class || this == CaseClass + this == Class || this == CaseClass || this == Given def takesOnlyUsingClauses = // only using clauses allowed for this owner this == Given || this == ExtensionFollow def acceptsVariance = @@ -3372,7 +3372,7 @@ object Parsers { val isAbstractOwner = paramOwner == ParamOwner.Type || paramOwner == ParamOwner.TypeParam val start = in.offset var mods = annotsAsMods() | Param - if paramOwner == ParamOwner.Class || paramOwner == ParamOwner.CaseClass then + if paramOwner.isClass then mods |= PrivateLocal if isIdent(nme.raw.PLUS) && checkVarianceOK() then mods |= Covariant @@ -4100,6 +4100,14 @@ object Parsers { val nameStart = in.offset val name = if isIdent && followingIsGivenSig() then ident() else EmptyTermName + // TODO Change syntax description + def adjustDefParams(paramss: List[ParamClause]): List[ParamClause] = + paramss.nestedMap: param => + if !param.mods.isAllOf(PrivateLocal) then + syntaxError(em"method parameter ${param.name} may not be `a val`", param.span) + param.withMods(param.mods &~ (AccessFlags | ParamAccessor | Mutable) | Param) + .asInstanceOf[List[ParamClause]] + val gdef = val tparams = typeParamClauseOpt(ParamOwner.Given) newLineOpt() @@ -4121,16 +4129,17 @@ object Parsers { mods1 |= Lazy ValDef(name, parents.head, subExpr()) else - DefDef(name, joinParams(tparams, vparamss), parents.head, subExpr()) + DefDef(name, adjustDefParams(joinParams(tparams, vparamss)), parents.head, subExpr()) else if (isStatSep || isStatSeqEnd) && parentsIsType then if name.isEmpty then syntaxError(em"anonymous given cannot be abstract") - DefDef(name, joinParams(tparams, vparamss), parents.head, EmptyTree) + DefDef(name, adjustDefParams(joinParams(tparams, vparamss)), parents.head, EmptyTree) else - val tparams1 = tparams.map(tparam => 
tparam.withMods(tparam.mods | PrivateLocal)) - val vparamss1 = vparamss.map(_.map(vparam => - vparam.withMods(vparam.mods &~ Param | ParamAccessor | Protected))) - val constr = makeConstructor(tparams1, vparamss1) + val vparamss1 = vparamss.nestedMap: vparam => + if vparam.mods.is(Private) + then vparam.withMods(vparam.mods &~ PrivateLocal | Protected) + else vparam + val constr = makeConstructor(tparams, vparamss1) val templ = if isStatSep || isStatSeqEnd then Template(constr, parents, Nil, EmptyValDef, Nil) else withTemplate(constr, parents) From b5d48fda4954567d5a0851723213ffdb8d4cd844 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Thu, 9 May 2024 17:06:45 +0200 Subject: [PATCH 207/371] A relaxation concerning exported type aliases The rules for export forwarders are changed as follows. Previously, all export forwarders were declared `final`. Now, only term members are declared `final`. Type aliases left aside. This makes it possible to export the same type member into several traits and then mix these traits in the same class. `typeclass-aggregates.scala` shows why this is essential to be able to combine multiple givens with type members. The change does not lose safety since different type aliases would in any case lead to uninstantiatable classes. [Cherry-picked 84655ca3409c3ec2c1645b0c8f56ff7d17cc304d][modified] --- .../src/dotty/tools/dotc/config/Feature.scala | 1 + .../src/dotty/tools/dotc/core/Flags.scala | 2 - .../src/dotty/tools/dotc/typer/Namer.scala | 6 ++- .../reference/other-new-features/export.md | 16 +++++-- tests/neg/i0248-inherit-refined.check | 12 +++++ tests/pos/typeclass-aggregates.scala | 47 +++++++++++++++++++ 6 files changed, 77 insertions(+), 7 deletions(-) create mode 100644 tests/neg/i0248-inherit-refined.check create mode 100644 tests/pos/typeclass-aggregates.scala diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index 5c27f20fcba1..0d551094da4d 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -34,6 +34,7 @@ object Feature: val captureChecking = experimental("captureChecking") val into = experimental("into") val namedTuples = experimental("namedTuples") + val modularity = experimental("modularity") val betterMatchTypeExtractors = experimental("betterMatchTypeExtractors") def experimentalAutoEnableFeatures(using Context): List[TermName] = diff --git a/compiler/src/dotty/tools/dotc/core/Flags.scala b/compiler/src/dotty/tools/dotc/core/Flags.scala index 8110bc769d4f..98c57a96a5c0 100644 --- a/compiler/src/dotty/tools/dotc/core/Flags.scala +++ b/compiler/src/dotty/tools/dotc/core/Flags.scala @@ -543,8 +543,6 @@ object Flags { /** Flags retained in type export forwarders */ val RetainedExportTypeFlags = Infix - val MandatoryExportTypeFlags = Exported | Final - /** Flags that apply only to classes */ val ClassOnlyFlags = Sealed | Open | Abstract.toTypeFlags diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 72ca6a35bf4b..d2121ede2a67 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -26,7 +26,7 @@ import Nullables.* import transform.ValueClasses.* import TypeErasure.erasure import reporting.* -import config.Feature.sourceVersion +import config.Feature.{sourceVersion, modularity} import config.SourceVersion.* import scala.compiletime.uninitialized @@ -1203,7 +1203,9 @@ class 
Namer { typer: Typer => target = target.etaExpand newSymbol( cls, forwarderName, - MandatoryExportTypeFlags | (sym.flags & RetainedExportTypeFlags), + Exported + | (sym.flags & RetainedExportTypeFlags) + | (if Feature.enabled(modularity) then EmptyFlags else Final), TypeAlias(target), coord = span) // Note: This will always create unparameterzied aliases. So even if the original type is diff --git a/docs/_docs/reference/other-new-features/export.md b/docs/_docs/reference/other-new-features/export.md index 98e9a7d3d711..e21d369b6b5e 100644 --- a/docs/_docs/reference/other-new-features/export.md +++ b/docs/_docs/reference/other-new-features/export.md @@ -37,7 +37,12 @@ final def print(bits: BitMap): Unit = printUnit.print(bits) final type PrinterType = printUnit.PrinterType ``` -They can be accessed inside `Copier` as well as from outside: +With the experimental `modularity` language import, only exported methods and values are final, whereas the generated `PrinterType` would be a simple type alias +```scala + type PrinterType = printUnit.PrinterType +``` + +These aliases can be accessed inside `Copier` as well as from outside: ```scala val copier = new Copier @@ -90,12 +95,17 @@ export O.* ``` Export aliases copy the type and value parameters of the members they refer to. -Export aliases are always `final`. Aliases of given instances are again defined as givens (and aliases of old-style implicits are `implicit`). Aliases of extensions are again defined as extensions. Aliases of inline methods or values are again defined `inline`. There are no other modifiers that can be given to an alias. This has the following consequences for overriding: +Export aliases of term members are always `final`. Aliases of given instances are again defined as givens (and aliases of old-style implicits are `implicit`). Aliases of extensions are again defined as extensions. Aliases of inline methods or values are again defined `inline`. There are no other modifiers that can be given to an alias. This has the following consequences for overriding: - - Export aliases cannot be overridden, since they are final. + - Export aliases of methods or fields cannot be overridden, since they are final. - Export aliases cannot override concrete members in base classes, since they are not marked `override`. - However, export aliases can implement deferred members of base classes. + - Export type aliases are normally also final, except when the experimental + language import `modularity` is present. The general + rules for type aliases ensure in any case that if there are several type aliases in a class, + they must agree on their right hand sides, or the class could not be instantiated. + So dropping the `final` for export type aliases is safe. 
Export aliases for public value definitions that are accessed without referring to private values in the qualifier path diff --git a/tests/neg/i0248-inherit-refined.check b/tests/neg/i0248-inherit-refined.check new file mode 100644 index 000000000000..4e14c3c6f14b --- /dev/null +++ b/tests/neg/i0248-inherit-refined.check @@ -0,0 +1,12 @@ +-- [E170] Type Error: tests/neg/i0248-inherit-refined.scala:8:18 ------------------------------------------------------- +8 | class C extends Y // error + | ^ + | test.A & test.B is not a class type + | + | longer explanation available when compiling with `-explain` +-- [E170] Type Error: tests/neg/i0248-inherit-refined.scala:10:18 ------------------------------------------------------ +10 | class D extends Z // error + | ^ + | test.A | test.B is not a class type + | + | longer explanation available when compiling with `-explain` diff --git a/tests/pos/typeclass-aggregates.scala b/tests/pos/typeclass-aggregates.scala new file mode 100644 index 000000000000..77b0f1a9f04a --- /dev/null +++ b/tests/pos/typeclass-aggregates.scala @@ -0,0 +1,47 @@ +//> using options -source future -language:experimental.modularity +trait Ord: + type This + extension (x: This) + def compareTo(y: This): Int + def < (y: This): Boolean = compareTo(y) < 0 + def > (y: This): Boolean = compareTo(y) > 0 + + trait OrdProxy extends Ord: + export Ord.this.* + +trait SemiGroup: + type This + extension (x: This) def combine(y: This): This + + trait SemiGroupProxy extends SemiGroup: + export SemiGroup.this.* + +trait Monoid extends SemiGroup: + def unit: This + + trait MonoidProxy extends Monoid: + export Monoid.this.* + +def ordWithMonoid(ord: Ord, monoid: Monoid{ type This = ord.This }): Ord & Monoid = + new ord.OrdProxy with monoid.MonoidProxy {} + +trait OrdWithMonoid extends Ord, Monoid + +def ordWithMonoid2(ord: Ord, monoid: Monoid{ type This = ord.This }) = //: OrdWithMonoid { type This = ord.This} = + new OrdWithMonoid with ord.OrdProxy with monoid.MonoidProxy {} + +given intOrd: Ord { type This = Int } = ??? +given intMonoid: Monoid { type This = Int } = ??? + +//given (using ord: Ord, monoid: Monoid{ type This = ord.This }): (Ord & Monoid { type This = ord.This}) = +// ordWithMonoid2(ord, monoid) + +val x = summon[Ord & Monoid { type This = Int}] +val y: Int = ??? : x.This + +// given [A, B](using ord: A is Ord, monoid: A is Monoid) => A is Ord & Monoid = +// new ord.OrdProxy with monoid.MonoidProxy {} + +given [A](using ord: Ord { type This = A }, monoid: Monoid { type This = A}): (Ord & Monoid) { type This = A} = + new ord.OrdProxy with monoid.MonoidProxy {} + From 48e2aa7329d85f754f2d4aaec3d7ea638f3fd83d Mon Sep 17 00:00:00 2001 From: odersky Date: Wed, 13 Dec 2023 10:54:15 +0100 Subject: [PATCH 208/371] Allow class parents to be refined types. Refinements of a class parent are added as synthetic members to the inheriting class. 
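Concretely, adapted from the added `tests/pos/parent-refinement.scala` (and relying on the experimental `modularity` mode introduced earlier in this series): the refined parent is split into its class part and its refinements, and each refinement becomes a synthetic member of the inheriting class.

```scala
//> using options -source future -language:experimental.modularity

trait Id { type Value }
type IdOf[T] = Id { type Value = T }

// `IdOf[Int]` is a refined type; its refinement `type Value = Int` is
// entered into Year as a synthetic member, so `Value` is usable below.
case class Year(value: Int) extends IdOf[Int]:
  val x: Value = 2
```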
[Cherry-picked 48944142182932b0bb1f97d7261d6033aa96888a] --- .../src/dotty/tools/dotc/core/NamerOps.scala | 21 +++++ .../tools/dotc/core/tasty/TreeUnpickler.scala | 2 +- .../tools/dotc/transform/init/Util.scala | 1 + .../src/dotty/tools/dotc/typer/Namer.scala | 37 +++++++-- .../src/dotty/tools/dotc/typer/Typer.scala | 30 +++++-- tests/neg/i0248-inherit-refined.scala | 6 +- tests/neg/parent-refinement-access.check | 7 ++ tests/neg/parent-refinement-access.scala | 6 ++ tests/neg/parent-refinement.check | 29 ++++++- tests/neg/parent-refinement.scala | 20 ++++- tests/pos/parent-refinement.scala | 48 +++++++++++ tests/pos/typeclasses.scala | 79 ++++--------------- 12 files changed, 200 insertions(+), 86 deletions(-) create mode 100644 tests/neg/parent-refinement-access.check create mode 100644 tests/neg/parent-refinement-access.scala create mode 100644 tests/pos/parent-refinement.scala diff --git a/compiler/src/dotty/tools/dotc/core/NamerOps.scala b/compiler/src/dotty/tools/dotc/core/NamerOps.scala index 75a135826785..8d096913e285 100644 --- a/compiler/src/dotty/tools/dotc/core/NamerOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NamerOps.scala @@ -5,6 +5,7 @@ package core import Contexts.*, Symbols.*, Types.*, Flags.*, Scopes.*, Decorators.*, Names.*, NameOps.* import SymDenotations.{LazyType, SymDenotation}, StdNames.nme import TypeApplications.EtaExpansion +import collection.mutable /** Operations that are shared between Namer and TreeUnpickler */ object NamerOps: @@ -18,6 +19,26 @@ object NamerOps: case TypeSymbols(tparams) :: _ => ctor.owner.typeRef.appliedTo(tparams.map(_.typeRef)) case _ => ctor.owner.typeRef + /** Split dependent class refinements off parent type. Add them to `refinements`, + * unless it is null. + */ + extension (tp: Type) + def separateRefinements(cls: ClassSymbol, refinements: mutable.LinkedHashMap[Name, Type] | Null)(using Context): Type = + tp match + case RefinedType(tp1, rname, rinfo) => + try tp1.separateRefinements(cls, refinements) + finally + if refinements != null then + refinements(rname) = refinements.get(rname) match + case Some(tp) => tp & rinfo + case None => rinfo + case tp @ AnnotatedType(tp1, ann) => + tp.derivedAnnotatedType(tp1.separateRefinements(cls, refinements), ann) + case tp: RecType => + tp.parent.substRecThis(tp, cls.thisType).separateRefinements(cls, refinements) + case tp => + tp + /** If isConstructor, make sure it has at least one non-implicit parameter list * This is done by adding a () in front of a leading old style implicit parameter, * or by adding a () as last -- or only -- parameter list if the constructor has diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala index 04d19f2f8821..f6fa9faf0114 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala @@ -1074,7 +1074,7 @@ class TreeUnpickler(reader: TastyReader, } val parentReader = fork val parents = readParents(withArgs = false)(using parentCtx) - val parentTypes = parents.map(_.tpe.dealias) + val parentTypes = parents.map(_.tpe.dealiasKeepAnnots.separateRefinements(cls, null)) if cls.is(JavaDefined) && parentTypes.exists(_.derivesFrom(defn.JavaAnnotationClass)) then cls.setFlag(JavaAnnotation) val self = diff --git a/compiler/src/dotty/tools/dotc/transform/init/Util.scala b/compiler/src/dotty/tools/dotc/transform/init/Util.scala index 756fd1a0a8e7..e11d0e1e21a5 100644 --- 
a/compiler/src/dotty/tools/dotc/transform/init/Util.scala +++ b/compiler/src/dotty/tools/dotc/transform/init/Util.scala @@ -20,6 +20,7 @@ object Util: def typeRefOf(tp: Type)(using Context): TypeRef = tp.dealias.typeConstructor match case tref: TypeRef => tref + case RefinedType(parent, _, _) => typeRefOf(parent) case hklambda: HKTypeLambda => typeRefOf(hklambda.resType) diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index d2121ede2a67..530423fd2613 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -55,11 +55,12 @@ class Namer { typer: Typer => import untpd.* - val TypedAhead : Property.Key[tpd.Tree] = new Property.Key - val ExpandedTree : Property.Key[untpd.Tree] = new Property.Key - val ExportForwarders: Property.Key[List[tpd.MemberDef]] = new Property.Key - val SymOfTree : Property.Key[Symbol] = new Property.Key - val AttachedDeriver : Property.Key[Deriver] = new Property.Key + val TypedAhead : Property.Key[tpd.Tree] = new Property.Key + val ExpandedTree : Property.Key[untpd.Tree] = new Property.Key + val ExportForwarders : Property.Key[List[tpd.MemberDef]] = new Property.Key + val ParentRefinements: Property.Key[List[Symbol]] = new Property.Key + val SymOfTree : Property.Key[Symbol] = new Property.Key + val AttachedDeriver : Property.Key[Deriver] = new Property.Key // was `val Deriver`, but that gave shadowing problems with constructor proxies /** A partial map from unexpanded member and pattern defs and to their expansions. @@ -1515,6 +1516,7 @@ class Namer { typer: Typer => /** The type signature of a ClassDef with given symbol */ override def completeInCreationContext(denot: SymDenotation): Unit = { val parents = impl.parents + val parentRefinements = new mutable.LinkedHashMap[Name, Type] /* The type of a parent constructor. Types constructor arguments * only if parent type contains uninstantiated type parameters. @@ -1569,8 +1571,13 @@ class Namer { typer: Typer => val ptype = parentType(parent)(using completerCtx.superCallContext).dealiasKeepAnnots if (cls.isRefinementClass) ptype else { - val pt = checkClassType(ptype, parent.srcPos, - traitReq = parent ne parents.head, stablePrefixReq = !isJava) + val pt = checkClassType( + if Feature.enabled(modularity) + then ptype.separateRefinements(cls, parentRefinements) + else ptype, + parent.srcPos, + traitReq = parent ne parents.head, + stablePrefixReq = !isJava) if (pt.derivesFrom(cls)) { val addendum = parent match { case Select(qual: Super, _) if Feature.migrateTo3 => @@ -1597,6 +1604,21 @@ class Namer { typer: Typer => } } + /** Enter all parent refinements as public class members, unless a definition + * with the same name already exists in the class. 
+ */ + def enterParentRefinementSyms(refinements: List[(Name, Type)]) = + val refinedSyms = mutable.ListBuffer[Symbol]() + for (name, tp) <- refinements do + if decls.lookupEntry(name) == null then + val flags = tp match + case tp: MethodOrPoly => Method | Synthetic | Deferred + case _ => Synthetic | Deferred + refinedSyms += newSymbol(cls, name, flags, tp, coord = original.rhs.span.startPos).entered + if refinedSyms.nonEmpty then + typr.println(i"parent refinement symbols: ${refinedSyms.toList}") + original.pushAttachment(ParentRefinements, refinedSyms.toList) + /** If `parents` contains references to traits that have supertraits with implicit parameters * add those supertraits in linearization order unless they are already covered by other * parent types. For instance, in @@ -1667,6 +1689,7 @@ class Namer { typer: Typer => cls.invalidateMemberCaches() // we might have checked for a member when parents were not known yet. cls.setNoInitsFlags(parentsKind(parents), untpd.bodyKind(rest)) cls.setStableConstructor() + enterParentRefinementSyms(parentRefinements.toList) processExports(using localCtx) defn.patchStdLibClass(cls) addConstructorProxies(cls) diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 46982cf1406d..c5b6faf455f7 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -40,8 +40,7 @@ import annotation.tailrec import Implicits.* import util.Stats.record import config.Printers.{gadts, typr} -import config.Feature -import config.Feature.{sourceVersion, migrateTo3} +import config.Feature, Feature.{sourceVersion, migrateTo3, modularity} import config.SourceVersion.* import rewrites.Rewrites, Rewrites.patch import staging.StagingLevel @@ -1004,10 +1003,11 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer tp.exists && !tp.typeSymbol.is(Final) && (!tp.isTopType || tp.isAnyRef) // Object is the only toplevel class that can be instantiated - if (templ1.parents.isEmpty && - isFullyDefined(pt, ForceDegree.flipBottom) && - isSkolemFree(pt) && - isEligible(pt.underlyingClassRef(refinementOK = false))) + if templ1.parents.isEmpty + && isFullyDefined(pt, ForceDegree.flipBottom) + && isSkolemFree(pt) + && isEligible(pt.underlyingClassRef(refinementOK = Feature.enabled(modularity))) + then templ1 = cpy.Template(templ)(parents = untpd.TypeTree(pt) :: Nil) for case parent: RefTree <- templ1.parents do typedAhead(parent, tree => inferTypeParams(typedType(tree), pt)) @@ -2871,6 +2871,19 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer } } + /** Add all parent refinement symbols as declarations to this class */ + def addParentRefinements(body: List[Tree])(using Context): List[Tree] = + cdef.getAttachment(ParentRefinements) match + case Some(refinedSyms) => + val refinements = refinedSyms.map: sym => + ( if sym.isType then TypeDef(sym.asType) + else if sym.is(Method) then DefDef(sym.asTerm) + else ValDef(sym.asTerm) + ).withSpan(impl.span.startPos) + body ++ refinements + case None => + body + ensureCorrectSuperClass() completeAnnotations(cdef, cls) val constr1 = typed(constr).asInstanceOf[DefDef] @@ -2891,7 +2904,10 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer cdef.withType(UnspecifiedErrorType) else { val dummy = localDummy(cls, impl) - val body1 = addAccessorDefs(cls, typedStats(impl.body, dummy)(using ctx.inClassContext(self1.symbol))._1) + val body1 = + addParentRefinements( + addAccessorDefs(cls, + 
typedStats(impl.body, dummy)(using ctx.inClassContext(self1.symbol))._1)) checkNoDoubleDeclaration(cls) val impl1 = cpy.Template(impl)(constr1, parents1, Nil, self1, body1) diff --git a/tests/neg/i0248-inherit-refined.scala b/tests/neg/i0248-inherit-refined.scala index 97b6f5cdab73..f7cd6375afc9 100644 --- a/tests/neg/i0248-inherit-refined.scala +++ b/tests/neg/i0248-inherit-refined.scala @@ -1,10 +1,12 @@ +//> using options -source future -language:experimental.modularity + object test { class A { type T } type X = A { type T = Int } - class B extends X // error + class B extends X // was error, now OK type Y = A & B class C extends Y // error type Z = A | B class D extends Z // error - abstract class E extends ({ val x: Int }) // error + abstract class E extends ({ val x: Int }) // was error, now OK } diff --git a/tests/neg/parent-refinement-access.check b/tests/neg/parent-refinement-access.check new file mode 100644 index 000000000000..5cde9d51558f --- /dev/null +++ b/tests/neg/parent-refinement-access.check @@ -0,0 +1,7 @@ +-- [E164] Declaration Error: tests/neg/parent-refinement-access.scala:6:6 ---------------------------------------------- +6 |trait Year2(private[Year2] val value: Int) extends (Gen { val x: Int }) // error + | ^ + | error overriding value x in trait Year2 of type Int; + | value x in trait Gen of type Any has weaker access privileges; it should be public + | (Note that value x in trait Year2 of type Int is abstract, + | and is therefore overridden by concrete value x in trait Gen of type Any) diff --git a/tests/neg/parent-refinement-access.scala b/tests/neg/parent-refinement-access.scala new file mode 100644 index 000000000000..57d45f4fb201 --- /dev/null +++ b/tests/neg/parent-refinement-access.scala @@ -0,0 +1,6 @@ +//> using options -source future -language:experimental.modularity + +trait Gen: + private[Gen] val x: Any = () + +trait Year2(private[Year2] val value: Int) extends (Gen { val x: Int }) // error diff --git a/tests/neg/parent-refinement.check b/tests/neg/parent-refinement.check index 550430bd35a7..cf9a57bc7821 100644 --- a/tests/neg/parent-refinement.check +++ b/tests/neg/parent-refinement.check @@ -1,4 +1,25 @@ --- Error: tests/neg/parent-refinement.scala:5:2 ------------------------------------------------------------------------ -5 | with Ordered[Year] { // error - | ^^^^ - | end of toplevel definition expected but 'with' found +-- Error: tests/neg/parent-refinement.scala:11:6 ----------------------------------------------------------------------- +11 |class Bar extends IdOf[Int], (X { type Value = String }) // error + | ^^^ + |class Bar cannot be instantiated since it has a member Value with possibly conflicting bounds Int | String <: ... 
<: Int & String +-- [E007] Type Mismatch Error: tests/neg/parent-refinement.scala:15:17 ------------------------------------------------- +15 | val x: Value = 0 // error + | ^ + | Found: (0 : Int) + | Required: Baz.this.Value + | + | longer explanation available when compiling with `-explain` +-- [E007] Type Mismatch Error: tests/neg/parent-refinement.scala:21:6 -------------------------------------------------- +21 | foo(2) // error + | ^ + | Found: (2 : Int) + | Required: Boolean + | + | longer explanation available when compiling with `-explain` +-- [E007] Type Mismatch Error: tests/neg/parent-refinement.scala:17:22 ------------------------------------------------- +17 |val x: IdOf[Int] = Baz() // error + | ^^^^^ + | Found: Baz + | Required: IdOf[Int] + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/parent-refinement.scala b/tests/neg/parent-refinement.scala index ca2b88a75fd8..868747faba57 100644 --- a/tests/neg/parent-refinement.scala +++ b/tests/neg/parent-refinement.scala @@ -1,7 +1,21 @@ +//> using options -source future -language:experimental.modularity trait Id { type Value } +trait X { type Value } +type IdOf[T] = Id { type Value = T } + case class Year(value: Int) extends AnyVal - with Id { type Value = Int } - with Ordered[Year] { // error + with (Id { type Value = Int }) + with Ordered[Year] + +class Bar extends IdOf[Int], (X { type Value = String }) // error + +class Baz extends IdOf[Int]: + type Value = String + val x: Value = 0 // error + +val x: IdOf[Int] = Baz() // error -} \ No newline at end of file +object Clash extends ({ def foo(x: Int): Int }): + def foo(x: Boolean): Int = 1 + foo(2) // error diff --git a/tests/pos/parent-refinement.scala b/tests/pos/parent-refinement.scala new file mode 100644 index 000000000000..eaa74228c5d6 --- /dev/null +++ b/tests/pos/parent-refinement.scala @@ -0,0 +1,48 @@ +//> using options -source future -language:experimental.modularity + +class A +class B extends A +class C extends B + +trait Id { type Value } +type IdOf[T] = Id { type Value = T } +trait X { type Value } + +case class Year(value: Int) extends IdOf[Int]: + val x: Value = 2 + +type Between[Lo, Hi] = X { type Value >: Lo <: Hi } + +class Foo() extends IdOf[B], Between[C, A]: + val x: Value = B() + +trait Bar extends IdOf[Int], (X { type Value = String }) + +class Baz extends IdOf[Int]: + type Value = String + val x: Value = "" + +trait Gen: + type T + val x: T + +type IntInst = Gen: + type T = Int + val x: 0 + +trait IntInstTrait extends IntInst + +abstract class IntInstClass extends IntInstTrait, IntInst + +object obj1 extends IntInstTrait: + val x = 0 + +object obj2 extends IntInstClass: + val x = 0 + +def main = + val x: obj1.T = 2 - obj2.x + val y: obj2.T = 2 - obj1.x + + + diff --git a/tests/pos/typeclasses.scala b/tests/pos/typeclasses.scala index 07fe5a31ce5d..2bf7f76f0804 100644 --- a/tests/pos/typeclasses.scala +++ b/tests/pos/typeclasses.scala @@ -1,7 +1,6 @@ -class Common: +//> using options -source future -language:experimental.modularity - // this should go in Predef - infix type at [A <: { type This}, B] = A { type This = B } +class Common: trait Ord: type This @@ -26,41 +25,23 @@ class Common: extension [A](x: This[A]) def flatMap[B](f: A => This[B]): This[B] def map[B](f: A => B) = x.flatMap(f `andThen` pure) + + infix type is[A <: AnyKind, B <: {type This <: AnyKind}] = B { type This = A } + end Common object Instances extends Common: -/* - instance Int: Ord as intOrd with - extension (x: Int) - def compareTo(y: 
Int) = - if x < y then -1 - else if x > y then +1 - else 0 -*/ - given intOrd: Ord with + given intOrd: (Int is Ord) with type This = Int extension (x: Int) def compareTo(y: Int) = if x < y then -1 else if x > y then +1 else 0 -/* - instance List[T: Ord]: Ord as listOrd with - extension (xs: List[T]) def compareTo(ys: List[T]): Int = (xs, ys) match - case (Nil, Nil) => 0 - case (Nil, _) => -1 - case (_, Nil) => +1 - case (x :: xs1, y :: ys1) => - val fst = x.compareTo(y) - if (fst != 0) fst else xs1.compareTo(ys1) -*/ - // Proposed short syntax: - // given listOrd[T: Ord as ord]: Ord at T with - given listOrd[T](using ord: Ord { type This = T}): Ord with - type This = List[T] + given listOrd[T](using ord: T is Ord): (List[T] is Ord) with extension (xs: List[T]) def compareTo(ys: List[T]): Int = (xs, ys) match case (Nil, Nil) => 0 case (Nil, _) => -1 @@ -70,32 +51,18 @@ object Instances extends Common: if (fst != 0) fst else xs1.compareTo(ys1) end listOrd -/* - instance List: Monad as listMonad with + given listMonad: (List is Monad) with extension [A](xs: List[A]) def flatMap[B](f: A => List[B]): List[B] = xs.flatMap(f) def pure[A](x: A): List[A] = List(x) -*/ - given listMonad: Monad with - type This[A] = List[A] - extension [A](xs: List[A]) def flatMap[B](f: A => List[B]): List[B] = - xs.flatMap(f) - def pure[A](x: A): List[A] = - List(x) -/* - type Reader[Ctx] = X =>> Ctx => X - instance Reader[Ctx: _]: Monad as readerMonad with - extension [A](r: Ctx => A) def flatMap[B](f: A => Ctx => B): Ctx => B = - ctx => f(r(ctx))(ctx) - def pure[A](x: A): Ctx => A = - ctx => x -*/ + type Reader[Ctx] = [X] =>> Ctx => X - given readerMonad[Ctx]: Monad with - type This[X] = Ctx => X + //given [Ctx] => Reader[Ctx] is Monad as readerMonad: + + given readerMonad[Ctx]: (Reader[Ctx] is Monad) with extension [A](r: Ctx => A) def flatMap[B](f: A => Ctx => B): Ctx => B = ctx => f(r(ctx))(ctx) def pure[A](x: A): Ctx => A = @@ -110,29 +77,17 @@ object Instances extends Common: def second = xs.tail.head def third = xs.tail.tail.head - //Proposed short syntax: - //extension [M: Monad as m, A](xss: M[M[A]]) - // def flatten: M[A] = - // xs.flatMap(identity) - extension [M, A](using m: Monad)(xss: m.This[m.This[A]]) def flatten: m.This[A] = xss.flatMap(identity) - // Proposed short syntax: - //def maximum[T: Ord](xs: List[T]: T = - def maximum[T](xs: List[T])(using Ord at T): T = + def maximum[T](xs: List[T])(using T is Ord): T = xs.reduceLeft((x, y) => if (x < y) y else x) - // Proposed short syntax: - // def descending[T: Ord as asc]: Ord at T = new Ord: - def descending[T](using asc: Ord at T): Ord at T = new Ord: - type This = T + def descending[T](using asc: T is Ord): T is Ord = new: extension (x: T) def compareTo(y: T) = asc.compareTo(y)(x) - // Proposed short syntax: - // def minimum[T: Ord](xs: List[T]) = - def minimum[T](xs: List[T])(using Ord at T) = + def minimum[T](xs: List[T])(using T is Ord) = maximum(xs)(using descending) def test(): Unit = @@ -177,10 +132,10 @@ instance Sheep: Animal with override def talk(): Unit = println(s"$name pauses briefly... $noise") */ +import Instances.is // Implement the `Animal` trait for `Sheep`. 
-given Animal with - type This = Sheep +given (Sheep is Animal) with def apply(name: String) = Sheep(name) extension (self: This) def name: String = self.name From 96c76e91ff4a9a43e020baf84fd40c7ffdafa387 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 1 Apr 2024 19:11:11 +0200 Subject: [PATCH 209/371] Introduce tracked class parameters For a tracked class parameter we add a refinement in the constructor type that the class member is the same as the parameter. E.g. ```scala class C { type T } class D(tracked val x: C) { type T = x.T } ``` This will generate the constructor type: ```scala (x1: C): D { val x: x1.type } ``` Without `tracked` the refinement would not be added. This can solve several problems with dependent class types where previously we lost track of type dependencies. [Cherry-picked 5189e6854ad1dacc3454542c2f124f5bcb7e2a9c] --- .../src/dotty/tools/dotc/ast/Desugar.scala | 13 +- compiler/src/dotty/tools/dotc/ast/untpd.scala | 2 + .../src/dotty/tools/dotc/core/Flags.scala | 9 +- .../src/dotty/tools/dotc/core/NamerOps.scala | 17 +- .../dotc/core/PatternTypeConstrainer.scala | 9 +- .../src/dotty/tools/dotc/core/StdNames.scala | 1 + .../tools/dotc/core/SymDenotations.scala | 12 +- .../src/dotty/tools/dotc/core/TypeUtils.scala | 15 +- .../tools/dotc/core/tasty/TreePickler.scala | 1 + .../tools/dotc/core/tasty/TreeUnpickler.scala | 6 +- .../dotty/tools/dotc/parsing/Parsers.scala | 15 +- .../tools/dotc/printing/PlainPrinter.scala | 2 +- .../dotty/tools/dotc/printing/Printer.scala | 5 +- .../tools/dotc/transform/PostTyper.scala | 16 +- .../src/dotty/tools/dotc/typer/Checking.scala | 13 +- .../src/dotty/tools/dotc/typer/Namer.scala | 55 +++-- .../dotty/tools/dotc/typer/RefChecks.scala | 17 +- .../src/dotty/tools/dotc/typer/Typer.scala | 2 +- .../test/dotc/pos-test-pickling.blacklist | 5 + docs/_docs/internals/syntax.md | 2 +- .../reference/experimental/modularity.md | 189 ++++++++++++++++++ docs/sidebar.yml | 1 + project/MiMaFilters.scala | 3 + tasty/src/dotty/tools/tasty/TastyFormat.scala | 5 +- tests/neg/i3964.scala | 12 ++ tests/neg/tracked.check | 50 +++++ tests/neg/tracked.scala | 20 ++ tests/neg/tracked2.scala | 1 + tests/new/tracked-mixin-traits.scala | 16 ++ tests/pos/depclass-1.scala | 19 ++ tests/pos/i3920.scala | 32 +++ tests/pos/i3964.scala | 32 +++ tests/pos/i3964a/Defs_1.scala | 18 ++ tests/pos/i3964a/Uses_2.scala | 16 ++ tests/pos/parsercombinators-expanded.scala | 64 ++++++ tests/pos/parsercombinators-givens-2.scala | 52 +++++ tests/pos/parsercombinators-givens.scala | 54 +++++ tests/run/i3920.scala | 26 +++ 38 files changed, 758 insertions(+), 69 deletions(-) create mode 100644 docs/_docs/reference/experimental/modularity.md create mode 100644 tests/neg/i3964.scala create mode 100644 tests/neg/tracked.check create mode 100644 tests/neg/tracked.scala create mode 100644 tests/neg/tracked2.scala create mode 100644 tests/new/tracked-mixin-traits.scala create mode 100644 tests/pos/depclass-1.scala create mode 100644 tests/pos/i3920.scala create mode 100644 tests/pos/i3964.scala create mode 100644 tests/pos/i3964a/Defs_1.scala create mode 100644 tests/pos/i3964a/Uses_2.scala create mode 100644 tests/pos/parsercombinators-expanded.scala create mode 100644 tests/pos/parsercombinators-givens-2.scala create mode 100644 tests/pos/parsercombinators-givens.scala create mode 100644 tests/run/i3920.scala diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index 1801a7fada7c..c3a0c05088cb 100644 --- 
a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -429,13 +429,13 @@ object desugar { private def toDefParam(tparam: TypeDef, keepAnnotations: Boolean): TypeDef = { var mods = tparam.rawMods if (!keepAnnotations) mods = mods.withAnnotations(Nil) - tparam.withMods(mods & (EmptyFlags | Sealed) | Param) + tparam.withMods(mods & EmptyFlags | Param) } private def toDefParam(vparam: ValDef, keepAnnotations: Boolean, keepDefault: Boolean): ValDef = { var mods = vparam.rawMods if (!keepAnnotations) mods = mods.withAnnotations(Nil) val hasDefault = if keepDefault then HasDefault else EmptyFlags - vparam.withMods(mods & (GivenOrImplicit | Erased | hasDefault) | Param) + vparam.withMods(mods & (GivenOrImplicit | Erased | hasDefault | Tracked) | Param) } def mkApply(fn: Tree, paramss: List[ParamClause])(using Context): Tree = @@ -860,9 +860,8 @@ object desugar { // implicit wrapper is typechecked in same scope as constructor, so // we can reuse the constructor parameters; no derived params are needed. DefDef( - className.toTermName, joinParams(constrTparams, defParamss), - classTypeRef, creatorExpr) - .withMods(companionMods | mods.flags.toTermFlags & (GivenOrImplicit | Inline) | finalFlag) + className.toTermName, joinParams(constrTparams, defParamss), classTypeRef, creatorExpr + ) .withMods(companionMods | mods.flags.toTermFlags & (GivenOrImplicit | Inline) | finalFlag) .withSpan(cdef.span) :: Nil } @@ -890,7 +889,9 @@ object desugar { } if mods.isAllOf(Given | Inline | Transparent) then report.error("inline given instances cannot be trasparent", cdef) - val classMods = if mods.is(Given) then mods &~ (Inline | Transparent) | Synthetic else mods + var classMods = if mods.is(Given) then mods &~ (Inline | Transparent) | Synthetic else mods + if vparamAccessors.exists(_.mods.is(Tracked)) then + classMods |= Dependent cpy.TypeDef(cdef: TypeDef)( name = className, rhs = cpy.Template(impl)(constr, parents1, clsDerived, self1, diff --git a/compiler/src/dotty/tools/dotc/ast/untpd.scala b/compiler/src/dotty/tools/dotc/ast/untpd.scala index 0dfe52c421d9..91ef462bcf05 100644 --- a/compiler/src/dotty/tools/dotc/ast/untpd.scala +++ b/compiler/src/dotty/tools/dotc/ast/untpd.scala @@ -230,6 +230,8 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { case class Infix()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Infix) + case class Tracked()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Tracked) + /** Used under pureFunctions to mark impure function types `A => B` in `FunctionWithMods` */ case class Impure()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Impure) } diff --git a/compiler/src/dotty/tools/dotc/core/Flags.scala b/compiler/src/dotty/tools/dotc/core/Flags.scala index 98c57a96a5c0..2bc7610bb0ce 100644 --- a/compiler/src/dotty/tools/dotc/core/Flags.scala +++ b/compiler/src/dotty/tools/dotc/core/Flags.scala @@ -377,6 +377,9 @@ object Flags { /** Symbol cannot be found as a member during typer */ val (Invisible @ _, _, _) = newFlags(45, "") + /** Tracked modifier for class parameter / a class with some tracked parameters */ + val (Tracked @ _, _, Dependent @ _) = newFlags(46, "tracked") + // ------------ Flags following this one are not pickled ---------------------------------- /** Symbol is not a member of its owner */ @@ -452,7 +455,7 @@ object Flags { CommonSourceModifierFlags.toTypeFlags | Abstract | Sealed | Opaque | Open val TermSourceModifierFlags: FlagSet = - 
CommonSourceModifierFlags.toTermFlags | Inline | AbsOverride | Lazy + CommonSourceModifierFlags.toTermFlags | Inline | AbsOverride | Lazy | Tracked /** Flags representing modifiers that can appear in trees */ val ModifierFlags: FlagSet = @@ -466,7 +469,7 @@ object Flags { val FromStartFlags: FlagSet = commonFlags( Module, Package, Deferred, Method, Case, Enum, Param, ParamAccessor, Scala2SpecialFlags, MutableOrOpen, Opaque, Touched, JavaStatic, - OuterOrCovariant, LabelOrContravariant, CaseAccessor, + OuterOrCovariant, LabelOrContravariant, CaseAccessor, Tracked, Extension, NonMember, Implicit, Given, Permanent, Synthetic, Exported, SuperParamAliasOrScala2x, Inline, Macro, ConstructorProxy, Invisible) @@ -477,7 +480,7 @@ object Flags { */ val AfterLoadFlags: FlagSet = commonFlags( FromStartFlags, AccessFlags, Final, AccessorOrSealed, - Abstract, LazyOrTrait, SelfName, JavaDefined, JavaAnnotation, Transparent) + Abstract, LazyOrTrait, SelfName, JavaDefined, JavaAnnotation, Transparent, Tracked) /** A value that's unstable unless complemented with a Stable flag */ val UnstableValueFlags: FlagSet = Mutable | Method diff --git a/compiler/src/dotty/tools/dotc/core/NamerOps.scala b/compiler/src/dotty/tools/dotc/core/NamerOps.scala index 8d096913e285..af03573da4a8 100644 --- a/compiler/src/dotty/tools/dotc/core/NamerOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NamerOps.scala @@ -16,8 +16,21 @@ object NamerOps: */ def effectiveResultType(ctor: Symbol, paramss: List[List[Symbol]])(using Context): Type = paramss match - case TypeSymbols(tparams) :: _ => ctor.owner.typeRef.appliedTo(tparams.map(_.typeRef)) - case _ => ctor.owner.typeRef + case TypeSymbols(tparams) :: rest => + addParamRefinements(ctor.owner.typeRef.appliedTo(tparams.map(_.typeRef)), rest) + case _ => + addParamRefinements(ctor.owner.typeRef, paramss) + + /** Given a method with tracked term-parameters `p1, ..., pn`, and result type `R`, add the + * refinements R { p1 = p1' } ... { pn = pn' }, where pi' is the term parameter ref + * of the parameter and pi is its name. This matters only under experimental.modularity, + * since wothout it there are no tracked parameters. Parameter refinements are added for + * constructors and given companion methods. + */ + def addParamRefinements(resType: Type, paramss: List[List[Symbol]])(using Context): Type = + paramss.flatten.foldLeft(resType): (rt, param) => + if param.is(Tracked) then RefinedType(rt, param.name, param.termRef) + else rt /** Split dependent class refinements off parent type. Add them to `refinements`, * unless it is null. 
diff --git a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala index 6d6a47cf6a1e..9baf0c40a80b 100644 --- a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala +++ b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala @@ -88,11 +88,6 @@ trait PatternTypeConstrainer { self: TypeComparer => } } - def stripRefinement(tp: Type): Type = tp match { - case tp: RefinedOrRecType => stripRefinement(tp.parent) - case tp => tp - } - def tryConstrainSimplePatternType(pat: Type, scrut: Type) = { val patCls = pat.classSymbol val scrCls = scrut.classSymbol @@ -182,14 +177,14 @@ trait PatternTypeConstrainer { self: TypeComparer => case AndType(scrut1, scrut2) => constrainPatternType(pat, scrut1) && constrainPatternType(pat, scrut2) case scrut: RefinedOrRecType => - constrainPatternType(pat, stripRefinement(scrut)) + constrainPatternType(pat, scrut.stripRefinement) case scrut => dealiasDropNonmoduleRefs(pat) match { case OrType(pat1, pat2) => either(constrainPatternType(pat1, scrut), constrainPatternType(pat2, scrut)) case AndType(pat1, pat2) => constrainPatternType(pat1, scrut) && constrainPatternType(pat2, scrut) case pat: RefinedOrRecType => - constrainPatternType(stripRefinement(pat), scrut) + constrainPatternType(pat.stripRefinement, scrut) case pat => tryConstrainSimplePatternType(pat, scrut) || classesMayBeCompatible && constrainUpcasted(scrut) diff --git a/compiler/src/dotty/tools/dotc/core/StdNames.scala b/compiler/src/dotty/tools/dotc/core/StdNames.scala index 62d7afa22ed2..7545cf5c4ba1 100644 --- a/compiler/src/dotty/tools/dotc/core/StdNames.scala +++ b/compiler/src/dotty/tools/dotc/core/StdNames.scala @@ -629,6 +629,7 @@ object StdNames { val toString_ : N = "toString" val toTypeConstructor: N = "toTypeConstructor" val tpe : N = "tpe" + val tracked: N = "tracked" val transparent : N = "transparent" val tree : N = "tree" val true_ : N = "true" diff --git a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala index 09d45dbdf06b..49c466f0bfd5 100644 --- a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala +++ b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala @@ -1187,21 +1187,25 @@ object SymDenotations { final def isExtensibleClass(using Context): Boolean = isClass && !isOneOf(FinalOrModuleClass) && !isAnonymousClass - /** A symbol is effectively final if it cannot be overridden in a subclass */ + /** A symbol is effectively final if it cannot be overridden */ final def isEffectivelyFinal(using Context): Boolean = isOneOf(EffectivelyFinalFlags) || is(Inline, butNot = Deferred) || is(JavaDefinedVal, butNot = Method) || isConstructor - || !owner.isExtensibleClass + || !owner.isExtensibleClass && !is(Deferred) + // Deferred symbols can arise through parent refinements. + // For them, the overriding relationship reverses anyway, so + // being in a final class does not mean the symbol cannot be + // implemented concretely in a superclass. /** A class is effectively sealed if has the `final` or `sealed` modifier, or it * is defined in Scala 3 and is neither abstract nor open. 
*/ final def isEffectivelySealed(using Context): Boolean = isOneOf(FinalOrSealed) - || isClass && (!isOneOf(EffectivelyOpenFlags) - || isLocalToCompilationUnit) + || isClass + && (!isOneOf(EffectivelyOpenFlags) || isLocalToCompilationUnit) final def isLocalToCompilationUnit(using Context): Boolean = is(Private) diff --git a/compiler/src/dotty/tools/dotc/core/TypeUtils.scala b/compiler/src/dotty/tools/dotc/core/TypeUtils.scala index d4be03e9aae4..dd881bb1adf6 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeUtils.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeUtils.scala @@ -7,12 +7,13 @@ import Types.*, Contexts.*, Symbols.*, Flags.*, Decorators.* import Names.{Name, TermName} import Constants.Constant -class TypeUtils { +import Names.Name +class TypeUtils: /** A decorator that provides methods on types * that are needed in the transformer pipeline. */ - extension (self: Type) { + extension (self: Type) def isErasedValueType(using Context): Boolean = self.isInstanceOf[ErasedValueType] @@ -178,5 +179,11 @@ class TypeUtils { def isThisTypeOf(cls: Symbol)(using Context) = self match case self: Types.ThisType => self.cls == cls case _ => false - } -} + + /** Strip all outer refinements off this type */ + def stripRefinement: Type = self match + case self: RefinedOrRecType => self.parent.stripRefinement + case seld => self + +end TypeUtils + diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala index 186e039c4d74..8d1eca8fb5f0 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala @@ -867,6 +867,7 @@ class TreePickler(pickler: TastyPickler, attributes: Attributes) { if (flags.is(Exported)) writeModTag(EXPORTED) if (flags.is(Given)) writeModTag(GIVEN) if (flags.is(Implicit)) writeModTag(IMPLICIT) + if (flags.is(Tracked)) writeModTag(TRACKED) if (isTerm) { if (flags.is(Lazy, butNot = Module)) writeModTag(LAZY) if (flags.is(AbsOverride)) { writeModTag(ABSTRACT); writeModTag(OVERRIDE) } diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala index f6fa9faf0114..15f58956fbe3 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala @@ -31,7 +31,8 @@ import util.{SourceFile, Property} import ast.{Trees, tpd, untpd} import Trees.* import Decorators.* -import dotty.tools.dotc.quoted.QuotePatterns +import config.Feature +import quoted.QuotePatterns import dotty.tools.tasty.{TastyBuffer, TastyReader} import TastyBuffer.* @@ -755,6 +756,7 @@ class TreeUnpickler(reader: TastyReader, case INVISIBLE => addFlag(Invisible) case TRANSPARENT => addFlag(Transparent) case INFIX => addFlag(Infix) + case TRACKED => addFlag(Tracked) case PRIVATEqualified => readByte() privateWithin = readWithin @@ -922,6 +924,8 @@ class TreeUnpickler(reader: TastyReader, val resType = if name == nme.CONSTRUCTOR then effectiveResultType(sym, paramss) + else if sym.isAllOf(Given | Method) && Feature.enabled(Feature.modularity) then + addParamRefinements(tpt.tpe, paramss) else tpt.tpe sym.info = methodType(paramss, resType) diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index 8d5c50d6d608..94814457523e 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -3189,6 
+3189,7 @@ object Parsers { case nme.open => Mod.Open() case nme.transparent => Mod.Transparent() case nme.infix => Mod.Infix() + case nme.tracked => Mod.Tracked() } } @@ -3255,7 +3256,8 @@ object Parsers { * | AccessModifier * | override * | opaque - * LocalModifier ::= abstract | final | sealed | open | implicit | lazy | inline | transparent | infix | erased + * LocalModifier ::= abstract | final | sealed | open | implicit | lazy | erased | + * inline | transparent | infix */ def modifiers(allowed: BitSet = modifierTokens, start: Modifiers = Modifiers()): Modifiers = { @tailrec @@ -3408,8 +3410,8 @@ object Parsers { /** ClsTermParamClause ::= ‘(’ ClsParams ‘)’ | UsingClsTermParamClause * UsingClsTermParamClause::= ‘(’ ‘using’ [‘erased’] (ClsParams | ContextTypes) ‘)’ * ClsParams ::= ClsParam {‘,’ ClsParam} - * ClsParam ::= {Annotation} [{Modifier} (‘val’ | ‘var’)] Param - * + * ClsParam ::= {Annotation} + * [{Modifier | ‘tracked’} (‘val’ | ‘var’)] Param * TypelessClause ::= DefTermParamClause * | UsingParamClause * @@ -3445,6 +3447,8 @@ object Parsers { if isErasedKw then mods = addModifier(mods) if paramOwner.isClass then + if isIdent(nme.tracked) && in.featureEnabled(Feature.modularity) && !in.lookahead.isColon then + mods = addModifier(mods) mods = addFlag(modifiers(start = mods), ParamAccessor) mods = if in.token == VAL then @@ -3516,7 +3520,8 @@ object Parsers { val isParams = !impliedMods.is(Given) || startParamTokens.contains(in.token) - || isIdent && (in.name == nme.inline || in.lookahead.isColon) + || isIdent + && (in.name == nme.inline || in.name == nme.tracked || in.lookahead.isColon) (mods, isParams) (if isParams then commaSeparated(() => param()) else contextTypes(paramOwner, numLeadParams, impliedMods)) match { @@ -4104,7 +4109,7 @@ object Parsers { def adjustDefParams(paramss: List[ParamClause]): List[ParamClause] = paramss.nestedMap: param => if !param.mods.isAllOf(PrivateLocal) then - syntaxError(em"method parameter ${param.name} may not be `a val`", param.span) + syntaxError(em"method parameter ${param.name} may not be a `val`", param.span) param.withMods(param.mods &~ (AccessFlags | ParamAccessor | Mutable) | Param) .asInstanceOf[List[ParamClause]] diff --git a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala index 87f7c88e0407..5808707326a0 100644 --- a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala +++ b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala @@ -113,7 +113,7 @@ class PlainPrinter(_ctx: Context) extends Printer { protected def refinementNameString(tp: RefinedType): String = nameString(tp.refinedName) /** String representation of a refinement */ - protected def toTextRefinement(rt: RefinedType): Text = + def toTextRefinement(rt: RefinedType): Text = val keyword = rt.refinedInfo match { case _: ExprType | _: MethodOrPoly => "def " case _: TypeBounds => "type " diff --git a/compiler/src/dotty/tools/dotc/printing/Printer.scala b/compiler/src/dotty/tools/dotc/printing/Printer.scala index 8687925ed5fb..297dc31ea94a 100644 --- a/compiler/src/dotty/tools/dotc/printing/Printer.scala +++ b/compiler/src/dotty/tools/dotc/printing/Printer.scala @@ -4,7 +4,7 @@ package printing import core.* import Texts.*, ast.Trees.* -import Types.{Type, SingletonType, LambdaParam, NamedType}, +import Types.{Type, SingletonType, LambdaParam, NamedType, RefinedType}, Symbols.Symbol, Scopes.Scope, Constants.Constant, Names.Name, Denotations._, Annotations.Annotation, Contexts.Context import 
typer.Implicits.* @@ -104,6 +104,9 @@ abstract class Printer { /** Textual representation of a prefix of some reference, ending in `.` or `#` */ def toTextPrefixOf(tp: NamedType): Text + /** textual representation of a refinement, with no enclosing {...} */ + def toTextRefinement(rt: RefinedType): Text + /** Textual representation of a reference in a capture set */ def toTextCaptureRef(tp: Type): Text diff --git a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala index d107de31829f..954b08c24ac1 100644 --- a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +++ b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala @@ -369,11 +369,15 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => case Select(nu: New, nme.CONSTRUCTOR) if isCheckable(nu) => // need to check instantiability here, because the type of the New itself // might be a type constructor. - ctx.typer.checkClassType(tree.tpe, tree.srcPos, traitReq = false, stablePrefixReq = true) + def checkClassType(tpe: Type, stablePrefixReq: Boolean) = + ctx.typer.checkClassType(tpe, tree.srcPos, + traitReq = false, stablePrefixReq = stablePrefixReq, + refinementOK = Feature.enabled(Feature.modularity)) + checkClassType(tree.tpe, true) if !nu.tpe.isLambdaSub then // Check the constructor type as well; it could be an illegal singleton type // which would not be reflected as `tree.tpe` - ctx.typer.checkClassType(nu.tpe, tree.srcPos, traitReq = false, stablePrefixReq = false) + checkClassType(nu.tpe, false) Checking.checkInstantiable(tree.tpe, nu.tpe, nu.srcPos) withNoCheckNews(nu :: Nil)(app1) case _ => @@ -448,8 +452,12 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => // Constructor parameters are in scope when typing a parent. // While they can safely appear in a parent tree, to preserve // soundness we need to ensure they don't appear in a parent - // type (#16270). - val illegalRefs = parent.tpe.namedPartsWith(p => p.symbol.is(ParamAccessor) && (p.symbol.owner eq sym)) + // type (#16270). We can strip any refinement of a parent type since + // these refinements are split off from the parent type constructor + // application `parent` in Namer and don't show up as parent types + // of the class. + val illegalRefs = parent.tpe.dealias.stripRefinement.namedPartsWith: + p => p.symbol.is(ParamAccessor) && (p.symbol.owner eq sym) if illegalRefs.nonEmpty then report.error( em"The type of a class parent cannot refer to constructor parameters, but ${parent.tpe} refers to ${illegalRefs.map(_.name.show).mkString(",")}", parent.srcPos) diff --git a/compiler/src/dotty/tools/dotc/typer/Checking.scala b/compiler/src/dotty/tools/dotc/typer/Checking.scala index 7745c620312c..5839ec1766af 100644 --- a/compiler/src/dotty/tools/dotc/typer/Checking.scala +++ b/compiler/src/dotty/tools/dotc/typer/Checking.scala @@ -33,8 +33,7 @@ import Applications.UnapplyArgs import Inferencing.isFullyDefined import transform.patmat.SpaceEngine.{isIrrefutable, isIrrefutableQuotePattern} import transform.ValueClasses.underlyingOfValueClass -import config.Feature -import config.Feature.sourceVersion +import config.Feature, Feature.{sourceVersion, modularity} import config.SourceVersion.* import config.MigrationVersion import printing.Formatting.hlAsKeyword @@ -198,7 +197,7 @@ object Checking { * and that the instance conforms to the self type of the created class. 
*/ def checkInstantiable(tp: Type, srcTp: Type, pos: SrcPos)(using Context): Unit = - tp.underlyingClassRef(refinementOK = false) match + tp.underlyingClassRef(refinementOK = Feature.enabled(modularity)) match case tref: TypeRef => val cls = tref.symbol if (cls.isOneOf(AbstractOrTrait)) { @@ -601,6 +600,7 @@ object Checking { // The issue with `erased inline` is that the erased semantics get lost // as the code is inlined and the reference is removed before the erased usage check. checkCombination(Erased, Inline) + checkNoConflict(Tracked, Mutable, em"mutable variables may not be `tracked`") checkNoConflict(Lazy, ParamAccessor, em"parameter may not be `lazy`") } @@ -1067,8 +1067,8 @@ trait Checking { * check that class prefix is stable. * @return `tp` itself if it is a class or trait ref, ObjectType if not. */ - def checkClassType(tp: Type, pos: SrcPos, traitReq: Boolean, stablePrefixReq: Boolean)(using Context): Type = - tp.underlyingClassRef(refinementOK = false) match { + def checkClassType(tp: Type, pos: SrcPos, traitReq: Boolean, stablePrefixReq: Boolean, refinementOK: Boolean = false)(using Context): Type = + tp.underlyingClassRef(refinementOK) match case tref: TypeRef => if (traitReq && !tref.symbol.is(Trait)) report.error(TraitIsExpected(tref.symbol), pos) if (stablePrefixReq && ctx.phase <= refchecksPhase) checkStable(tref.prefix, pos, "class prefix") @@ -1076,7 +1076,6 @@ trait Checking { case _ => report.error(NotClassType(tp), pos) defn.ObjectType - } /** If `sym` is an old-style implicit conversion, check that implicit conversions are enabled. * @pre sym.is(GivenOrImplicit) @@ -1626,7 +1625,7 @@ trait NoChecking extends ReChecking { override def checkNonCyclic(sym: Symbol, info: TypeBounds, reportErrors: Boolean)(using Context): Type = info override def checkNonCyclicInherited(joint: Type, parents: List[Type], decls: Scope, pos: SrcPos)(using Context): Unit = () override def checkStable(tp: Type, pos: SrcPos, kind: String)(using Context): Unit = () - override def checkClassType(tp: Type, pos: SrcPos, traitReq: Boolean, stablePrefixReq: Boolean)(using Context): Type = tp + override def checkClassType(tp: Type, pos: SrcPos, traitReq: Boolean, stablePrefixReq: Boolean, refinementOK: Boolean)(using Context): Type = tp override def checkImplicitConversionDefOK(sym: Symbol)(using Context): Unit = () override def checkImplicitConversionUseOK(tree: Tree, expected: Type)(using Context): Unit = () override def checkFeasibleParent(tp: Type, pos: SrcPos, where: => String = "")(using Context): Type = tp diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 530423fd2613..e48c2fdf5066 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -122,7 +122,8 @@ class Namer { typer: Typer => /** Record `sym` as the symbol defined by `tree` */ def recordSym(sym: Symbol, tree: Tree)(using Context): Symbol = { - for (refs <- tree.removeAttachment(References); ref <- refs) ref.watching(sym) + for refs <- tree.removeAttachment(References); ref <- refs do + ref.watching(sym) tree.pushAttachment(SymOfTree, sym) sym } @@ -295,12 +296,15 @@ class Namer { typer: Typer => createOrRefine[Symbol](tree, name, flags, ctx.owner, _ => info, (fs, _, pwithin) => newSymbol(ctx.owner, name, fs, info, pwithin, tree.nameSpan)) case tree: Import => - recordSym(newImportSymbol(ctx.owner, Completer(tree)(ctx), tree.span), tree) + recordSym(newImportSym(tree), tree) case _ => NoSymbol } } + private 
def newImportSym(imp: Import)(using Context): Symbol = + newImportSymbol(ctx.owner, Completer(imp)(ctx), imp.span) + /** If `sym` exists, enter it in effective scope. Check that * package members are not entered twice in the same run. */ @@ -525,11 +529,9 @@ class Namer { typer: Typer => } /** Transfer all references to `from` to `to` */ - def transferReferences(from: ValDef, to: ValDef): Unit = { - val fromRefs = from.removeAttachment(References).getOrElse(Nil) - val toRefs = to.removeAttachment(References).getOrElse(Nil) - to.putAttachment(References, fromRefs ++ toRefs) - } + def transferReferences(from: ValDef, to: ValDef): Unit = + for ref <- from.removeAttachment(References).getOrElse(Nil) do + ref.watching(to) /** Merge the module class `modCls` in the expanded tree of `mdef` with the * body and derived clause of the synthetic module class `fromCls`. @@ -707,7 +709,18 @@ class Namer { typer: Typer => enterSymbol(companion) end addAbsentCompanions - stats.foreach(expand) + /** Expand each statement, keeping track of language imports in the context. This is + * necessary since desugaring might depend on language imports. + */ + def expandTopLevel(stats: List[Tree])(using Context): Unit = stats match + case (imp @ Import(qual, _)) :: stats1 if untpd.languageImport(qual).isDefined => + expandTopLevel(stats1)(using ctx.importContext(imp, newImportSym(imp))) + case stat :: stats1 => + expand(stat) + expandTopLevel(stats1) + case Nil => + + expandTopLevel(stats) mergeCompanionDefs() val ctxWithStats = stats.foldLeft(ctx)((ctx, stat) => indexExpanded(stat)(using ctx)) inContext(ctxWithStats) { @@ -1530,8 +1543,9 @@ class Namer { typer: Typer => core match case Select(New(tpt), nme.CONSTRUCTOR) => val targs1 = targs map (typedAheadType(_)) - val ptype = typedAheadType(tpt).tpe appliedTo targs1.tpes - if (ptype.typeParams.isEmpty) ptype + val ptype = typedAheadType(tpt).tpe.appliedTo(targs1.tpes) + if ptype.typeParams.isEmpty && !ptype.dealias.typeSymbol.is(Dependent) then + ptype else if (denot.is(ModuleClass) && denot.sourceModule.isOneOf(GivenOrImplicit)) missingType(denot.symbol, "parent ")(using creationContext) @@ -1612,7 +1626,8 @@ class Namer { typer: Typer => for (name, tp) <- refinements do if decls.lookupEntry(name) == null then val flags = tp match - case tp: MethodOrPoly => Method | Synthetic | Deferred + case tp: MethodOrPoly => Method | Synthetic | Deferred | Tracked + case _ if name.isTermName => Synthetic | Deferred | Tracked case _ => Synthetic | Deferred refinedSyms += newSymbol(cls, name, flags, tp, coord = original.rhs.span.startPos).entered if refinedSyms.nonEmpty then @@ -1660,11 +1675,9 @@ class Namer { typer: Typer => val parentTypes = defn.adjustForTuple(cls, cls.typeParams, defn.adjustForBoxedUnit(cls, - addUsingTraits( - locally: - val isJava = ctx.isJava - ensureFirstIsClass(cls, parents.map(checkedParentType(_, isJava))) - ) + addUsingTraits: + val isJava = ctx.isJava + ensureFirstIsClass(cls, parents.map(checkedParentType(_, isJava))) ) ) typr.println(i"completing $denot, parents = $parents%, %, parentTypes = $parentTypes%, %") @@ -1824,7 +1837,7 @@ class Namer { typer: Typer => } /** The type signature of a DefDef with given symbol */ - def defDefSig(ddef: DefDef, sym: Symbol, completer: Namer#Completer)(using Context): Type = { + def defDefSig(ddef: DefDef, sym: Symbol, completer: Namer#Completer)(using Context): Type = // Beware: ddef.name need not match sym.name if sym was freshened! 
val isConstructor = sym.name == nme.CONSTRUCTOR @@ -1863,13 +1876,19 @@ class Namer { typer: Typer => def wrapMethType(restpe: Type): Type = instantiateDependent(restpe, paramSymss) methodType(paramSymss, restpe, ddef.mods.is(JavaDefined)) + + def wrapRefinedMethType(restpe: Type): Type = + wrapMethType(addParamRefinements(restpe, paramSymss)) + if isConstructor then // set result type tree to unit, but take the current class as result type of the symbol typedAheadType(ddef.tpt, defn.UnitType) wrapMethType(effectiveResultType(sym, paramSymss)) + else if sym.isAllOf(Given | Method) && Feature.enabled(modularity) then + valOrDefDefSig(ddef, sym, paramSymss, wrapRefinedMethType) else valOrDefDefSig(ddef, sym, paramSymss, wrapMethType) - } + end defDefSig def inferredResultType( mdef: ValOrDefDef, diff --git a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala index 2bf4b959ebca..7cd1d67e9aa5 100644 --- a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala @@ -610,8 +610,13 @@ object RefChecks { overrideError("is not inline, cannot implement an inline method") else if (other.isScala2Macro && !member.isScala2Macro) // (1.11) overrideError("cannot be used here - only Scala-2 macros can override Scala-2 macros") - else if (!compatTypes(memberTp(self), otherTp(self)) && - !compatTypes(memberTp(upwardsSelf), otherTp(upwardsSelf))) + else if !compatTypes(memberTp(self), otherTp(self)) + && !compatTypes(memberTp(upwardsSelf), otherTp(upwardsSelf)) + && !member.is(Tracked) + // Tracked members need to be excluded since they are abstract type members with + // singleton types. Concrete overrides usually have a wider type. + // TODO: Should we exclude all refinements inherited from parents? + then overrideError("has incompatible type", compareTypes = true) else if (member.targetName != other.targetName) if (other.targetName != other.name) @@ -620,7 +625,9 @@ object RefChecks { overrideError("cannot have a @targetName annotation since external names would be different") else if intoOccurrences(memberTp(self)) != intoOccurrences(otherTp(self)) then overrideError("has different occurrences of `into` modifiers", compareTypes = true) - else if other.is(ParamAccessor) && !isInheritedAccessor(member, other) then // (1.12) + else if other.is(ParamAccessor) && !isInheritedAccessor(member, other) + && !member.is(Tracked) + then // (1.12) report.errorOrMigrationWarning( em"cannot override val parameter ${other.showLocated}", member.srcPos, @@ -670,6 +677,10 @@ object RefChecks { mbr.isType || mbr.isSuperAccessor // not yet synthesized || mbr.is(JavaDefined) && hasJavaErasedOverriding(mbr) + || mbr.is(Tracked) + // Tracked members correspond to existing val parameters, so they don't + // count as deferred. The val parameter could not implement the tracked + // refinement since it usually has a wider type. 
def isImplemented(mbr: Symbol) = val mbrDenot = mbr.asSeenFrom(clazz.thisType) diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index c5b6faf455f7..8f2b7ce95785 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -4416,7 +4416,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer cpy.Ident(qual)(qual.symbol.name.sourceModuleName.toTypeName) case _ => errorTree(tree, em"cannot convert from $tree to an instance creation expression") - val tycon = ctorResultType.underlyingClassRef(refinementOK = false) + val tycon = ctorResultType.underlyingClassRef(refinementOK = true) typed( untpd.Select( untpd.New(untpd.TypedSplice(tpt.withType(tycon))), diff --git a/compiler/test/dotc/pos-test-pickling.blacklist b/compiler/test/dotc/pos-test-pickling.blacklist index a856a5b84d92..ad9befa72f5f 100644 --- a/compiler/test/dotc/pos-test-pickling.blacklist +++ b/compiler/test/dotc/pos-test-pickling.blacklist @@ -124,3 +124,8 @@ i19955a.scala i19955b.scala i20053b.scala +# alias types at different levels of dereferencing +parsercombinators-givens.scala +parsercombinators-givens-2.scala + + diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index 8cc070d5dbc5..c711d5f63db8 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -372,7 +372,7 @@ ClsParamClause ::= [nl] ‘(’ ClsParams ‘)’ | [nl] ‘(’ ‘using’ (ClsParams | FunArgTypes) ‘)’ ClsParams ::= ClsParam {‘,’ ClsParam} ClsParam ::= {Annotation} ValDef(mods, id, tpe, expr) -- point of mods on val/var - [{Modifier} (‘val’ | ‘var’)] Param + [{Modifier | ‘tracked’} (‘val’ | ‘var’)] Param DefParamClauses ::= DefParamClause { DefParamClause } -- and two DefTypeParamClause cannot be adjacent DefParamClause ::= DefTypeParamClause diff --git a/docs/_docs/reference/experimental/modularity.md b/docs/_docs/reference/experimental/modularity.md new file mode 100644 index 000000000000..2062c4d5eda2 --- /dev/null +++ b/docs/_docs/reference/experimental/modularity.md @@ -0,0 +1,189 @@ +--- +layout: doc-page +title: "Modularity Improvements" +nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/modularity.html +--- + +# Modularity Improvements + +Martin Odersky, 7.1.2024 + +Scala is a language in the SML tradition, in the sense that it has +abstract and alias types as members of modules (which in Scala take the form of objects and classes). This leads to a simple dependently +typed system, where dependencies in types are on paths instead of full terms. + +So far, some key ingredients were lacking which meant that module composition with functors is harder in Scala than in SML. In particular, one often needs to resort to the infamous `Aux` pattern that lifts type members into type parameters so that they can be tracked across class instantiations. This makes modular, dependently typed programs +much harder to write and read, and makes such programming only accessible to experts. + +In this note I propose some small changes to Scala's dependent typing that make +modular programming much more straightforward. + +The suggested improvements have been implemented and are available +in source version `future` if the additional experimental language import `modularity` is present.
For instance, using the following command: + +``` + scala compile -source:future -language:experimental.modularity +``` + +## Tracked Parameters + +Scala is dependently typed for functions, but unfortunately not for classes. +For instance, consider the following definitions: + +```scala + class C: + type T + ... + + def f(x: C): x.T = ... + + val y: C { type T = Int } +``` +Then `f(y)` would have type `Int`, since the compiler will substitute the +concrete parameter reference `y` for the formal parameter `x` in the result +type of `f`, and `y.T = Int` + +However, if we use a class `F` instead of a method `f`, things go wrong. + +```scala + class F(val x: C): + val result: x.T = ... +``` +Now `F(y).result` would not have type `Int` but instead the rather less useful type `?1.T` where `?1` is a so-called skolem constant of type `C` (a skolem represents an unknown value). + +This shortcoming means that classes cannot really be used for advanced +modularity constructs that rely on dependent typing. + +**Proposal:** Introduce a `tracked` modifier that can be added to +a `val` parameter of a class or trait. For every tracked class parameter of a class `C`, add a refinement in the constructor type of `C` that the class member is the same as the parameter. + +**Example:** In the setting above, assume `F` is instead declared like this: +```scala + class F(tracked val x: C): + val result: x.T = ... +``` +Then the constructor `F` would get roughly the following type: +```scala + F(x1: C): F { val x: x1.type } +``` +_Aside:_ More precisely, both parameter and refinement would apply to the same name `x` but the refinement still refers to the parameter. We unfortunately can't express that in source, however, so we chose the new name `x1` for the parameter in the explanation. + +With the new constructor type, the expression `F(y).result` would now have the type `Int`, as hoped for. The reasoning to get there is as follows: + + - The result of the constructor `F(y)` has type `F { val x: y.type }` by + the standard typing for dependent functions. + - The type of `result` inside `F` is `x.T`. + - Hence, the type of `result` as a member of `F { val x: y.type }` is `y.T`, which is equal to `Int`. + +The addition of tracked parameters makes classes suitable as a fundamental modularity construct supporting dependent typing. Here is an example, taken from issue #3920: + +```scala +trait Ordering: + type T + def compare(t1:T, t2: T): Int + +class SetFunctor(tracked val ord: Ordering): + type Set = List[ord.T] + + def empty: Set = Nil + + extension (s: Set) + def add(x: ord.T): Set = x :: remove(x) + def remove(x: ord.T): Set = s.filter(e => ord.compare(x, e) != 0) + def contains(x: ord.T): Boolean = s.exists(e => ord.compare(x, e) == 0) + +object intOrdering extends Ordering: + type T = Int + def compare(t1: T, t2: T): Int = t1 - t2 + +val IntSet = new SetFunctor(intOrdering) + +@main def Test = + import IntSet.* + val set = IntSet.empty.add(6).add(8).add(23) + assert(!set.contains(7)) + assert(set.contains(8)) +``` +This works as it should now. Without the addition of `tracked` to the +parameter of `SetFunctor` typechecking would immediately lose track of +the element type `T` after an `add`, and would therefore fail. + +**Syntax Change** + +``` +ClsParam ::= {Annotation} [{Modifier | ‘tracked’} (‘val’ | ‘var’)] Param +``` + +The (soft) `tracked` modifier is only allowed for `val` parameters of classes. + +**Discussion** + +Since `tracked` is so useful, why not assume it by default? 
First, `tracked` makes sense only for `val` parameters. If a class parameter is not also a field declared using `val` then there's nothing to refine in the constructor result type. One could think of at least making all `val` parameters tracked by default, but that would be a backwards incompatible change. For instance, the following code would break: + +```scala +case class Foo(x: Int) +var foo = Foo(1) +if someCondition then foo = Foo(2) +``` +If we assume `tracked` for parameter `x` (which is implicitly a `val`), +then `foo` would get inferred type `Foo { val x: 1 }`, so it could not +be reassigned to a value of type `Foo { val x: 2 }` on the next line. + +Another approach might be to assume `tracked` for a `val` parameter `x` +only if the class refers to a type member of `x`. But it turns out that this +scheme is unimplementable since it would quickly lead to cyclic references +when typechecking recursive class graphs. So an explicit `tracked` looks like the best available option. + +## Allow Class Parents to be Refined Types + +Since `tracked` parameters create refinements in constructor types, +it is now possible that a class has a parent that is a refined type. +Previously such types were not permitted, since we were not quite sure how to handle them. But with tracked parameters it becomes pressing to +admit such types. + +**Proposal** Allow refined types as parent types of classes. All refinements that are inherited in this way become synthetic members of the class. + +**Example** + +```scala +class C: + type T + def m(): T + +type R = C: + type T = Int + def m(): 22 + +class D extends R: + def next(): D +``` +This code now compiles. The definition of `D` is expanded as follows: + +```scala +class D extends C: + def next(): D + /*synthetic*/ type T = Int + /*synthetic*/ def m(): 22 +``` +Note how class refinements are moved from the parent constructor of `D` into the body of class `D` itself. + +This change does not entail a syntax change. Syntactically, parent types cannot be refined types themselves. So the following would be illegal: +```scala +class D extends C { type T = Int; def m(): 22 }: // error + def next(): D +``` +If a refined type should be used directly as a parent type of a class, it needs to come in parentheses: +```scala +class D extends (C { type T = Int; def m(): 22 }) // ok + def next(): D +``` + +## A Small Relaxation To Export Rules + +The rules for export forwarders are changed as follows. + +Previously, all export forwarders were declared `final`. Now, only term members are declared `final`. Type aliases are left aside. + +This makes it possible to export the same type member into several traits and then mix these traits in the same class. The test file `tests/pos/typeclass-aggregates.scala` shows why this is essential if we want to combine multiple givens with type members in a new given that aggregates all these givens in an intersection type. + +The change does not lose safety since different type aliases would in any case lead to uninstantiatable classes.
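
As a rough illustration of the relaxed rule (the names `Ordering` and `SortedBag` below are invented for this sketch and are not taken from the test file), an export now generates a `final` forwarder only for term members, while an exported type member becomes a plain, non-final alias:

```scala
trait Ordering:
  type T
  def compare(x: T, y: T): Int

class SortedBag(val ord: Ordering):
  // Forwarders generated by the export clause:
  //   - `compare` is a method forwarder and stays `final`, as before
  //   - `T` becomes the alias `type T = ord.T`, which is no longer `final`,
  //     so the same alias may also be inherited from another trait that exports it
  export ord.{T, compare}
```
Whether several such inherited aliases can actually be combined in one class still depends on them agreeing, which is the safety argument made above.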
\ No newline at end of file diff --git a/docs/sidebar.yml b/docs/sidebar.yml index b38e057f06b1..160698f1f44b 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -155,6 +155,7 @@ subsection: - page: reference/experimental/purefuns.md - page: reference/experimental/tupled-function.md - page: reference/experimental/named-tuples.md + - page: reference/experimental/modularity.md - page: reference/syntax.md - title: Language Versions index: reference/language-versions/language-versions.md diff --git a/project/MiMaFilters.scala b/project/MiMaFilters.scala index 40a3918b5943..3b28733226a0 100644 --- a/project/MiMaFilters.scala +++ b/project/MiMaFilters.scala @@ -18,6 +18,8 @@ object MiMaFilters { ProblemFilters.exclude[DirectMissingMethodProblem]("scala.runtime.Tuples.fromIArray"), ProblemFilters.exclude[MissingFieldProblem]("scala.runtime.stdLibPatches.language#experimental.namedTuples"), ProblemFilters.exclude[MissingClassProblem]("scala.runtime.stdLibPatches.language$experimental$namedTuples$"), + ProblemFilters.exclude[MissingFieldProblem]("scala.runtime.stdLibPatches.language#experimental.modularity"), + ProblemFilters.exclude[MissingClassProblem]("scala.runtime.stdLibPatches.language$experimental$modularity$"), ), // Additions since last LTS @@ -95,6 +97,7 @@ object MiMaFilters { // Additions that require a new minor version of tasty core Build.mimaPreviousDottyVersion -> Seq( ProblemFilters.exclude[DirectMissingMethodProblem]("dotty.tools.tasty.TastyFormat.FLEXIBLEtype") + ProblemFilters.exclude[DirectMissingMethodProblem]("dotty.tools.tasty.TastyFormat.TRACKED"), ), // Additions since last LTS diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index 164243d3b469..c29ea99bcd8d 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -228,6 +228,7 @@ Standard-Section: "ASTs" TopLevelStat* EXPORTED -- An export forwarder OPEN -- an open class INVISIBLE -- invisible during typechecking + TRACKED -- a tracked class parameter / a dependent class Annotation Variance = STABLE -- invariant @@ -509,6 +510,7 @@ object TastyFormat { final val INVISIBLE = 44 final val EMPTYCLAUSE = 45 final val SPLITCLAUSE = 46 + final val TRACKED = 47 // Tree Cat. 
2: tag Nat final val firstNatTreeTag = SHAREDterm @@ -700,7 +702,8 @@ object TastyFormat { | INVISIBLE | ANNOTATION | PRIVATEqualified - | PROTECTEDqualified => true + | PROTECTEDqualified + | TRACKED => true case _ => false } diff --git a/tests/neg/i3964.scala b/tests/neg/i3964.scala new file mode 100644 index 000000000000..eaf3953bc230 --- /dev/null +++ b/tests/neg/i3964.scala @@ -0,0 +1,12 @@ +//> using options -source future -language:experimental.modularity +trait Animal +class Dog extends Animal +class Cat extends Animal + +object Test1: + + abstract class Bar { val x: Animal } + val bar: Bar { val x: Cat } = new Bar { val x = new Cat } // error, but should work + + trait Foo { val x: Animal } + val foo: Foo { val x: Cat } = new Foo { val x = new Cat } // error, but should work diff --git a/tests/neg/tracked.check b/tests/neg/tracked.check new file mode 100644 index 000000000000..ae734e7aa0b4 --- /dev/null +++ b/tests/neg/tracked.check @@ -0,0 +1,50 @@ +-- Error: tests/neg/tracked.scala:2:16 --------------------------------------------------------------------------------- +2 |class C(tracked x: Int) // error + | ^ + | `val` or `var` expected +-- [E040] Syntax Error: tests/neg/tracked.scala:7:18 ------------------------------------------------------------------- +7 | def foo(tracked a: Int) = // error + | ^ + | ':' expected, but identifier found +-- Error: tests/neg/tracked.scala:8:12 --------------------------------------------------------------------------------- +8 | tracked val b: Int = 2 // error + | ^^^ + | end of statement expected but 'val' found +-- Error: tests/neg/tracked.scala:11:10 -------------------------------------------------------------------------------- +11 | tracked object Foo // error // error + | ^^^^^^ + | end of statement expected but 'object' found +-- Error: tests/neg/tracked.scala:14:10 -------------------------------------------------------------------------------- +14 | tracked class D // error // error + | ^^^^^ + | end of statement expected but 'class' found +-- Error: tests/neg/tracked.scala:17:10 -------------------------------------------------------------------------------- +17 | tracked type T = Int // error // error + | ^^^^ + | end of statement expected but 'type' found +-- Error: tests/neg/tracked.scala:20:29 -------------------------------------------------------------------------------- +20 | given g2(using tracked val x: Int): C = C(x) // error + | ^^^^^^^^^^^^^^^^^^ + | method parameter x may not be a `val` +-- Error: tests/neg/tracked.scala:4:21 --------------------------------------------------------------------------------- +4 |class C2(tracked var x: Int) // error + | ^ + | mutable variables may not be `tracked` +-- [E006] Not Found Error: tests/neg/tracked.scala:11:2 ---------------------------------------------------------------- +11 | tracked object Foo // error // error + | ^^^^^^^ + | Not found: tracked + | + | longer explanation available when compiling with `-explain` +-- [E006] Not Found Error: tests/neg/tracked.scala:14:2 ---------------------------------------------------------------- +14 | tracked class D // error // error + | ^^^^^^^ + | Not found: tracked + | + | longer explanation available when compiling with `-explain` +-- [E006] Not Found Error: tests/neg/tracked.scala:17:2 ---------------------------------------------------------------- +17 | tracked type T = Int // error // error + | ^^^^^^^ + | Not found: tracked + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/tracked.scala 
b/tests/neg/tracked.scala new file mode 100644 index 000000000000..8d315a7b89ac --- /dev/null +++ b/tests/neg/tracked.scala @@ -0,0 +1,20 @@ +//> using options -source future -language:experimental.modularity +class C(tracked x: Int) // error + +class C2(tracked var x: Int) // error + +object A: + def foo(tracked a: Int) = // error + tracked val b: Int = 2 // error + +object B: + tracked object Foo // error // error + +object C: + tracked class D // error // error + +object D: + tracked type T = Int // error // error + +object E: + given g2(using tracked val x: Int): C = C(x) // error diff --git a/tests/neg/tracked2.scala b/tests/neg/tracked2.scala new file mode 100644 index 000000000000..2e6fa8cf6045 --- /dev/null +++ b/tests/neg/tracked2.scala @@ -0,0 +1 @@ +class C(tracked val x: Int) // error diff --git a/tests/new/tracked-mixin-traits.scala b/tests/new/tracked-mixin-traits.scala new file mode 100644 index 000000000000..21d890d44f42 --- /dev/null +++ b/tests/new/tracked-mixin-traits.scala @@ -0,0 +1,16 @@ +trait A: + type T +object a extends A: + type T = Int + +trait B(tracked val b: A): + type T = b.T + +trait C(tracked val c: A): + type T = c.T + +class D extends B(a), C(a): + val x: T = 2 + + + diff --git a/tests/pos/depclass-1.scala b/tests/pos/depclass-1.scala new file mode 100644 index 000000000000..38daef85ae98 --- /dev/null +++ b/tests/pos/depclass-1.scala @@ -0,0 +1,19 @@ +//> using options -source future -language:experimental.modularity +class A(tracked val source: String) + +class B(x: Int, tracked val source1: String) extends A(source1) + +class C(tracked val source2: String) extends B(1, source2) + +//class D(source1: String) extends C(source1) +val x = C("hello") +val _: A{ val source: "hello" } = x + +class Vec[Elem](tracked val size: Int) +class Vec8 extends Vec[Float](8) + +val v = Vec[Float](10) +val v2 = Vec8() +val xx: 10 = v.size +val x2: 8 = v2.size + diff --git a/tests/pos/i3920.scala b/tests/pos/i3920.scala new file mode 100644 index 000000000000..6cd74187098f --- /dev/null +++ b/tests/pos/i3920.scala @@ -0,0 +1,32 @@ +//> using options -source future -language:experimental.modularity +trait Ordering { + type T + def compare(t1:T, t2: T): Int +} + +class SetFunctor(tracked val ord: Ordering) { + type Set = List[ord.T] + def empty: Set = Nil + + implicit class helper(s: Set) { + def add(x: ord.T): Set = x :: remove(x) + def remove(x: ord.T): Set = s.filter(e => ord.compare(x, e) != 0) + def member(x: ord.T): Boolean = s.exists(e => ord.compare(x, e) == 0) + } +} + +object Test { + val orderInt = new Ordering { + type T = Int + def compare(t1: T, t2: T): Int = t1 - t2 + } + + val IntSet = new SetFunctor(orderInt) + import IntSet.* + + def main(args: Array[String]) = { + val set = IntSet.empty.add(6).add(8).add(23) + assert(!set.member(7)) + assert(set.member(8)) + } +} \ No newline at end of file diff --git a/tests/pos/i3964.scala b/tests/pos/i3964.scala new file mode 100644 index 000000000000..42412b910899 --- /dev/null +++ b/tests/pos/i3964.scala @@ -0,0 +1,32 @@ +//> using options -source future -language:experimental.modularity +trait Animal +class Dog extends Animal +class Cat extends Animal + +object Test2: + class Bar(tracked val x: Animal) + val b = new Bar(new Cat) + val bar: Bar { val x: Cat } = new Bar(new Cat) // ok + + trait Foo(tracked val x: Animal) + val foo: Foo { val x: Cat } = new Foo(new Cat) {} // ok + +object Test3: + trait Vec(tracked val size: Int) + class Vec8 extends Vec(8) + + abstract class Lst(tracked val size: Int) + class Lst8 
extends Lst(8) + + val v8a: Vec { val size: 8 } = new Vec8 + val v8b: Vec { val size: 8 } = new Vec(8) {} + + val l8a: Lst { val size: 8 } = new Lst8 + val l8b: Lst { val size: 8 } = new Lst(8) {} + + class VecN(tracked val n: Int) extends Vec(n) + class Vec9 extends VecN(9) + val v9a = VecN(9) + val _: Vec { val size: 9 } = v9a + val v9b = Vec9() + val _: Vec { val size: 9 } = v9b diff --git a/tests/pos/i3964a/Defs_1.scala b/tests/pos/i3964a/Defs_1.scala new file mode 100644 index 000000000000..7dcc89f7003e --- /dev/null +++ b/tests/pos/i3964a/Defs_1.scala @@ -0,0 +1,18 @@ +//> using options -source future -language:experimental.modularity +trait Animal +class Dog extends Animal +class Cat extends Animal + +object Test2: + class Bar(tracked val x: Animal) + val b = new Bar(new Cat) + val bar: Bar { val x: Cat } = new Bar(new Cat) // ok + + trait Foo(tracked val x: Animal) + val foo: Foo { val x: Cat } = new Foo(new Cat) {} // ok + +package coll: + trait Vec(tracked val size: Int) + class Vec8 extends Vec(8) + + abstract class Lst(tracked val size: Int) \ No newline at end of file diff --git a/tests/pos/i3964a/Uses_2.scala b/tests/pos/i3964a/Uses_2.scala new file mode 100644 index 000000000000..9d1b6ebaa58b --- /dev/null +++ b/tests/pos/i3964a/Uses_2.scala @@ -0,0 +1,16 @@ +//> using options -source future -language:experimental.modularity +import coll.* +class Lst8 extends Lst(8) + +val v8a: Vec { val size: 8 } = new Vec8 +val v8b: Vec { val size: 8 } = new Vec(8) {} + +val l8a: Lst { val size: 8 } = new Lst8 +val l8b: Lst { val size: 8 } = new Lst(8) {} + +class VecN(tracked val n: Int) extends Vec(n) +class Vec9 extends VecN(9) +val v9a = VecN(9) +val _: Vec { val size: 9 } = v9a +val v9b = Vec9() +val _: Vec { val size: 9 } = v9b diff --git a/tests/pos/parsercombinators-expanded.scala b/tests/pos/parsercombinators-expanded.scala new file mode 100644 index 000000000000..cf8137bfe8eb --- /dev/null +++ b/tests/pos/parsercombinators-expanded.scala @@ -0,0 +1,64 @@ +//> using options -source future -language:experimental.modularity + +import collection.mutable + +/// A parser combinator. +trait Combinator[T]: + + /// The context from which elements are being parsed, typically a stream of tokens. + type Context + /// The element being parsed. + type Element + + extension (self: T) + /// Parses and returns an element from `context`. + def parse(context: Context): Option[Element] +end Combinator + +final case class Apply[C, E](action: C => Option[E]) +final case class Combine[A, B](first: A, second: B) + +object test: + + class apply[C, E] extends Combinator[Apply[C, E]]: + type Context = C + type Element = E + extension(self: Apply[C, E]) + def parse(context: C): Option[E] = self.action(context) + + def apply[C, E]: apply[C, E] = new apply[C, E] + + class combine[A, B]( + tracked val f: Combinator[A], + tracked val s: Combinator[B] { type Context = f.Context} + ) extends Combinator[Combine[A, B]]: + type Context = f.Context + type Element = (f.Element, s.Element) + extension(self: Combine[A, B]) + def parse(context: Context): Option[Element] = ??? + + def combine[A, B]( + _f: Combinator[A], + _s: Combinator[B] { type Context = _f.Context} + ) = new combine[A, B](_f, _s) + // cast is needed since the type of new combine[A, B](_f, _s) + // drops the required refinement. 
+ + extension [A] (buf: mutable.ListBuffer[A]) def popFirst() = + if buf.isEmpty then None + else try Some(buf.head) finally buf.remove(0) + + @main def hello: Unit = { + val source = (0 to 10).toList + val stream = source.to(mutable.ListBuffer) + + val n = Apply[mutable.ListBuffer[Int], Int](s => s.popFirst()) + val m = Combine(n, n) + + val c = combine( + apply[mutable.ListBuffer[Int], Int], + apply[mutable.ListBuffer[Int], Int] + ) + val r = c.parse(m)(stream) // was type mismatch, now OK + val rc: Option[(Int, Int)] = r + } diff --git a/tests/pos/parsercombinators-givens-2.scala b/tests/pos/parsercombinators-givens-2.scala new file mode 100644 index 000000000000..8349d69a30af --- /dev/null +++ b/tests/pos/parsercombinators-givens-2.scala @@ -0,0 +1,52 @@ +//> using options -source future -language:experimental.modularity + +import collection.mutable + +/// A parser combinator. +trait Combinator[T]: + + /// The context from which elements are being parsed, typically a stream of tokens. + type Context + /// The element being parsed. + type Element + + extension (self: T) + /// Parses and returns an element from `context`. + def parse(context: Context): Option[Element] +end Combinator + +final case class Apply[C, E](action: C => Option[E]) +final case class Combine[A, B](first: A, second: B) + +given apply[C, E]: Combinator[Apply[C, E]] with { + type Context = C + type Element = E + extension(self: Apply[C, E]) { + def parse(context: C): Option[E] = self.action(context) + } +} + +given combine[A, B, C](using + f: Combinator[A] { type Context = C }, + s: Combinator[B] { type Context = C } +): Combinator[Combine[A, B]] with { + type Context = f.Context + type Element = (f.Element, s.Element) + extension(self: Combine[A, B]) { + def parse(context: Context): Option[Element] = ??? + } +} + +extension [A] (buf: mutable.ListBuffer[A]) def popFirst() = + if buf.isEmpty then None + else try Some(buf.head) finally buf.remove(0) + +@main def hello: Unit = { + val source = (0 to 10).toList + val stream = source.to(mutable.ListBuffer) + + val n = Apply[mutable.ListBuffer[Int], Int](s => s.popFirst()) + val m = Combine(n, n) + + val r = m.parse(stream) // works, but Element type is not resolved correctly +} diff --git a/tests/pos/parsercombinators-givens.scala b/tests/pos/parsercombinators-givens.scala new file mode 100644 index 000000000000..5b5588c93840 --- /dev/null +++ b/tests/pos/parsercombinators-givens.scala @@ -0,0 +1,54 @@ +//> using options -source future -language:experimental.modularity + +import collection.mutable + +/// A parser combinator. +trait Combinator[T]: + + /// The context from which elements are being parsed, typically a stream of tokens. + type Context + /// The element being parsed. + type Element + + extension (self: T) + /// Parses and returns an element from `context`. + def parse(context: Context): Option[Element] +end Combinator + +final case class Apply[C, E](action: C => Option[E]) +final case class Combine[A, B](first: A, second: B) + +given apply[C, E]: Combinator[Apply[C, E]] with { + type Context = C + type Element = E + extension(self: Apply[C, E]) { + def parse(context: C): Option[E] = self.action(context) + } +} + +given combine[A, B](using + tracked val f: Combinator[A], + tracked val s: Combinator[B] { type Context = f.Context } +): Combinator[Combine[A, B]] with { + type Context = f.Context + type Element = (f.Element, s.Element) + extension(self: Combine[A, B]) { + def parse(context: Context): Option[Element] = ??? 
+ } +} + +extension [A] (buf: mutable.ListBuffer[A]) def popFirst() = + if buf.isEmpty then None + else try Some(buf.head) finally buf.remove(0) + +@main def hello: Unit = { + val source = (0 to 10).toList + val stream = source.to(mutable.ListBuffer) + + val n = Apply[mutable.ListBuffer[Int], Int](s => s.popFirst()) + val m = Combine(n, n) + + val r = m.parse(stream) // error: type mismatch, found `mutable.ListBuffer[Int]`, required `?1.Context` + val rc: Option[(Int, Int)] = r + // it would be great if this worked +} diff --git a/tests/run/i3920.scala b/tests/run/i3920.scala new file mode 100644 index 000000000000..c66fd8908976 --- /dev/null +++ b/tests/run/i3920.scala @@ -0,0 +1,26 @@ +//> using options -source future -language:experimental.modularity +trait Ordering: + type T + def compare(t1:T, t2: T): Int + +class SetFunctor(tracked val ord: Ordering): + type Set = List[ord.T] + + def empty: Set = Nil + + extension (s: Set) + def add(x: ord.T): Set = x :: remove(x) + def remove(x: ord.T): Set = s.filter(e => ord.compare(x, e) != 0) + def contains(x: ord.T): Boolean = s.exists(e => ord.compare(x, e) == 0) + +object intOrdering extends Ordering: + type T = Int + def compare(t1: T, t2: T): Int = t1 - t2 + +val IntSet = new SetFunctor(intOrdering) + +@main def Test = + import IntSet.* + val set = IntSet.empty.add(6).add(8).add(23) + assert(!set.contains(7)) + assert(set.contains(8)) \ No newline at end of file From 70fb91cfe257a2c2cbe98c53a6cbba9e779a7bc2 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 1 Apr 2024 19:44:35 +0200 Subject: [PATCH 210/371] Make explicit arguments for context bounds an error from 3.5 [Cherry-picked ea3c688b94d9982cceda7b63969cd7e2a1887a46] --- compiler/src/dotty/tools/dotc/typer/ReTyper.scala | 1 + compiler/src/dotty/tools/dotc/typer/Typer.scala | 3 +++ tests/warn/context-bounds-migration.scala | 9 +++++++++ 3 files changed, 13 insertions(+) create mode 100644 tests/warn/context-bounds-migration.scala diff --git a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala index 9741a366da89..7a5c838848ac 100644 --- a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala +++ b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala @@ -182,4 +182,5 @@ class ReTyper(nestingLevel: Int = 0) extends Typer(nestingLevel) with ReChecking override protected def checkEqualityEvidence(tree: tpd.Tree, pt: Type)(using Context): Unit = () override protected def matchingApply(methType: MethodOrPoly, pt: FunProto)(using Context): Boolean = true override protected def typedScala2MacroBody(call: untpd.Tree)(using Context): Tree = promote(call) + override protected def migrate[T](migration: => T, disabled: => T = ()): T = disabled } diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 8f2b7ce95785..17a2cba25019 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -183,6 +183,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // Overridden in derived typers def newLikeThis(nestingLevel: Int): Typer = new Typer(nestingLevel) + // Overridden to do nothing in derived typers + protected def migrate[T](migration: => T, disabled: => T = ()): T = migration + /** Find the type of an identifier with given `name` in given context `ctx`. 
* @param name the name of the identifier * @param pt the expected type diff --git a/tests/warn/context-bounds-migration.scala b/tests/warn/context-bounds-migration.scala new file mode 100644 index 000000000000..cdd3eca62b5c --- /dev/null +++ b/tests/warn/context-bounds-migration.scala @@ -0,0 +1,9 @@ + +class C[T] +def foo[X: C] = () + +given [T]: C[T] = C[T]() + +def Test = + foo(C[Int]()) // warning + foo(using C[Int]()) // ok From 90e84b96a9e53d8e8203b09efc56d3cf0679783e Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 1 Apr 2024 20:02:13 +0200 Subject: [PATCH 211/371] Drop restriction against typedefs at level * only Allow the RHS of a type def to be higher-kinded. But keep the restriction for opaque type aliases; their RHS must be fully applied. I am not sure why the restriction applies to them, but there was a test specifically about that, so there night be a reason. # Conflicts: # compiler/src/dotty/tools/dotc/typer/Typer.scala # Conflicts: # compiler/src/dotty/tools/dotc/typer/Typer.scala # tests/pos/typeclasses-this.scala [Cherry-picked f96a769b17f362d14d2265693e72ad7311301172] --- .../src/dotty/tools/dotc/typer/Checking.scala | 16 ++++++++-------- compiler/src/dotty/tools/dotc/typer/Typer.scala | 5 +++-- tests/neg/i12456.scala | 2 +- tests/neg/i13757-match-type-anykind.scala | 2 +- tests/neg/i9328.scala | 2 +- tests/neg/parser-stability-12.scala | 2 +- tests/neg/unapplied-types.scala | 7 ------- tests/pos/unapplied-types.scala | 7 +++++++ 8 files changed, 22 insertions(+), 21 deletions(-) delete mode 100644 tests/neg/unapplied-types.scala create mode 100644 tests/pos/unapplied-types.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Checking.scala b/compiler/src/dotty/tools/dotc/typer/Checking.scala index 5839ec1766af..073055ba5b58 100644 --- a/compiler/src/dotty/tools/dotc/typer/Checking.scala +++ b/compiler/src/dotty/tools/dotc/typer/Checking.scala @@ -1331,20 +1331,20 @@ trait Checking { } /** Check that user-defined (result) type is fully applied */ - def checkFullyAppliedType(tree: Tree)(using Context): Unit = tree match + def checkFullyAppliedType(tree: Tree, prefix: String)(using Context): Unit = tree match case TypeBoundsTree(lo, hi, alias) => - checkFullyAppliedType(lo) - checkFullyAppliedType(hi) - checkFullyAppliedType(alias) + checkFullyAppliedType(lo, prefix) + checkFullyAppliedType(hi, prefix) + checkFullyAppliedType(alias, prefix) case Annotated(arg, annot) => - checkFullyAppliedType(arg) + checkFullyAppliedType(arg, prefix) case LambdaTypeTree(_, body) => - checkFullyAppliedType(body) + checkFullyAppliedType(body, prefix) case _: TypeTree => case _ => if tree.tpe.typeParams.nonEmpty then val what = if tree.symbol.exists then tree.symbol.show else i"type $tree" - report.error(em"$what takes type parameters", tree.srcPos) + report.error(em"$prefix$what takes type parameters", tree.srcPos) /** Check that we are in an inline context (inside an inline method or in inline code) */ def checkInInlineContext(what: String, pos: SrcPos)(using Context): Unit = @@ -1609,7 +1609,7 @@ trait ReChecking extends Checking { override def checkEnumParent(cls: Symbol, firstParent: Symbol)(using Context): Unit = () override def checkEnum(cdef: untpd.TypeDef, cls: Symbol, firstParent: Symbol)(using Context): Unit = () override def checkRefsLegal(tree: tpd.Tree, badOwner: Symbol, allowed: (Name, Symbol) => Boolean, where: String)(using Context): Unit = () - override def checkFullyAppliedType(tree: Tree)(using Context): Unit = () + override def checkFullyAppliedType(tree: Tree, 
prefix: String)(using Context): Unit = () override def checkEnumCaseRefsLegal(cdef: TypeDef, enumCtx: Context)(using Context): Unit = () override def checkAnnotApplicable(annot: Tree, sym: Symbol)(using Context): Boolean = true override def checkMatchable(tp: Type, pos: SrcPos, pattern: Boolean)(using Context): Unit = () diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 17a2cba25019..a357f06e4ee8 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2780,8 +2780,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer typeIndexedLambdaTypeTree(rhs, tparams, body) case rhs => typedType(rhs) - checkFullyAppliedType(rhs1) - if sym.isOpaqueAlias then checkNoContextFunctionType(rhs1) + if sym.isOpaqueAlias then + checkFullyAppliedType(rhs1, "Opaque type alias must be fully applied, but ") + checkNoContextFunctionType(rhs1) assignType(cpy.TypeDef(tdef)(name, rhs1), sym) } diff --git a/tests/neg/i12456.scala b/tests/neg/i12456.scala index b9fb0283dcd7..c1a3ada5a420 100644 --- a/tests/neg/i12456.scala +++ b/tests/neg/i12456.scala @@ -1 +1 @@ -object F { type T[G[X] <: X, F <: G[F]] } // error // error +object F { type T[G[X] <: X, F <: G[F]] } // error diff --git a/tests/neg/i13757-match-type-anykind.scala b/tests/neg/i13757-match-type-anykind.scala index a80e8b2b289b..998c54292b15 100644 --- a/tests/neg/i13757-match-type-anykind.scala +++ b/tests/neg/i13757-match-type-anykind.scala @@ -8,7 +8,7 @@ object Test: type AnyKindMatchType3[X <: AnyKind] = X match // error: the scrutinee of a match type cannot be higher-kinded case _ => Int - type AnyKindMatchType4[X <: Option] = X match // error // error: the scrutinee of a match type cannot be higher-kinded + type AnyKindMatchType4[X <: Option] = X match // error: the scrutinee of a match type cannot be higher-kinded case _ => Int type AnyKindMatchType5[X[_]] = X match // error: the scrutinee of a match type cannot be higher-kinded diff --git a/tests/neg/i9328.scala b/tests/neg/i9328.scala index dabde498e1dc..c13d33e103b9 100644 --- a/tests/neg/i9328.scala +++ b/tests/neg/i9328.scala @@ -3,7 +3,7 @@ type Id[T] = T match { case _ => T } -class Foo2[T <: Id[T]] // error // error +class Foo2[T <: Id[T]] // error object Foo { // error object Foo { } diff --git a/tests/neg/parser-stability-12.scala b/tests/neg/parser-stability-12.scala index 78ff178d010c..17a611d70e34 100644 --- a/tests/neg/parser-stability-12.scala +++ b/tests/neg/parser-stability-12.scala @@ -1,4 +1,4 @@ trait x0[]: // error - trait x1[x1 <:x0] // error: type x0 takes type parameters + trait x1[x1 <:x0] extends x1[ // error // error \ No newline at end of file diff --git a/tests/neg/unapplied-types.scala b/tests/neg/unapplied-types.scala deleted file mode 100644 index 2f2339baa026..000000000000 --- a/tests/neg/unapplied-types.scala +++ /dev/null @@ -1,7 +0,0 @@ -trait T { - type L[X] = List[X] - type T1 <: L // error: takes type parameters - type T2 = L // error: takes type parameters - type T3 = List // error: takes type parameters - type T4 <: List // error: takes type parameters -} diff --git a/tests/pos/unapplied-types.scala b/tests/pos/unapplied-types.scala new file mode 100644 index 000000000000..604e63deb8ad --- /dev/null +++ b/tests/pos/unapplied-types.scala @@ -0,0 +1,7 @@ +trait T { + type L[X] = List[X] + type T1 <: L // was error: takes type parameters + type T2 = L // was error: takes type parameters + type T3 = List // was error: 
takes type parameters + type T4 <: List // was error: takes type parameters +} From 62eed876ca35d942a3fa84ee7ffbb1999f917f6a Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 1 Apr 2024 20:10:36 +0200 Subject: [PATCH 212/371] Allow types in given definitions to be infix types A type implemented in a given definition can now be an infix type, without enclosing parens being necessary. By contrast, it cannot anymore be a refined type. Refined types have to be enclosed in parens. This second point aligns the dotty parser with the published syntax and the scala meta parser. # Conflicts: # tests/pos/typeclasses-this.scala [Cherry-picked ef71dcb45a0f31b72c5fe05fc48764865e1cea8e] --- .../dotty/tools/dotc/parsing/Parsers.scala | 26 +++++++++++++------ docs/_docs/internals/syntax.md | 4 ++- docs/_docs/reference/syntax.md | 9 ++++--- tests/neg/i12348.check | 16 ++++++------ tests/neg/i12348.scala | 3 +-- tests/neg/i7045.scala | 7 +++++ tests/pos/i7045.scala | 9 ------- tests/pos/typeclass-aggregates.scala | 6 ++--- 8 files changed, 45 insertions(+), 35 deletions(-) create mode 100644 tests/neg/i7045.scala delete mode 100644 tests/pos/i7045.scala diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index 94814457523e..6c0f19de3dd1 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -1806,8 +1806,8 @@ object Parsers { */ def infixType(): Tree = infixTypeRest(refinedType()) - def infixTypeRest(t: Tree): Tree = - infixOps(t, canStartInfixTypeTokens, refinedTypeFn, Location.ElseWhere, ParseKind.Type, + def infixTypeRest(t: Tree, operand: Location => Tree = refinedTypeFn): Tree = + infixOps(t, canStartInfixTypeTokens, operand, Location.ElseWhere, ParseKind.Type, isOperator = !followingIsVararg() && !isPureArrow && nextCanFollowOperator(canStartInfixTypeTokens)) @@ -1872,6 +1872,10 @@ object Parsers { */ def annotType(): Tree = annotTypeRest(simpleType()) + /** AnnotType1 ::= SimpleType1 {Annotation} + */ + def annotType1(): Tree = annotTypeRest(simpleType1()) + def annotTypeRest(t: Tree): Tree = if (in.token == AT) annotTypeRest(atSpan(startOffset(t)) { @@ -4097,8 +4101,10 @@ object Parsers { syntaxError(em"extension clause can only define methods", stat.span) } - /** GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) - * GivenSig ::= [id] [DefTypeParamClause] {UsingParamClauses} ‘:’ + /** GivenDef ::= [GivenSig] (GivenType [‘=’ Expr] | StructuralInstance) + * GivenSig ::= [id] [DefTypeParamClause] {UsingParamClauses} ‘:’ + * GivenType ::= AnnotType1 {id [nl] AnnotType1} + * StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ WithTemplateBody] */ def givenDef(start: Offset, mods: Modifiers, givenMod: Mod) = atSpan(start, nameStart) { var mods1 = addMod(mods, givenMod) @@ -4124,8 +4130,12 @@ object Parsers { val noParams = tparams.isEmpty && vparamss.isEmpty if !(name.isEmpty && noParams) then acceptColon() val parents = - if isSimpleLiteral then rejectWildcardType(annotType()) :: Nil - else refinedTypeRest(constrApp()) :: withConstrApps() + if isSimpleLiteral then + rejectWildcardType(annotType()) :: Nil + else constrApp() match + case parent: Apply => parent :: withConstrApps() + case parent if in.isIdent => infixTypeRest(parent, _ => annotType1()) :: Nil + case parent => parent :: withConstrApps() val parentsIsType = parents.length == 1 && parents.head.isType if in.token == EQUALS && parentsIsType then accept(EQUALS) @@ -4219,10 +4229,10 @@ 
object Parsers { /* -------- TEMPLATES ------------------------------------------- */ - /** ConstrApp ::= SimpleType1 {Annotation} {ParArgumentExprs} + /** ConstrApp ::= AnnotType1 {ParArgumentExprs} */ val constrApp: () => Tree = () => - val t = rejectWildcardType(annotTypeRest(simpleType1()), + val t = rejectWildcardType(annotType1(), fallbackTree = Ident(tpnme.ERROR)) // Using Ident(tpnme.ERROR) to avoid causing cascade errors on non-user-written code if in.token == LPAREN then parArgumentExprss(wrapNew(t)) else t diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index c711d5f63db8..6ef346ab22cc 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -191,6 +191,7 @@ MatchType ::= InfixType `match` <<< TypeCaseClauses >>> InfixType ::= RefinedType {id [nl] RefinedType} InfixOp(t1, op, t2) RefinedType ::= AnnotType {[nl] Refinement} RefinedTypeTree(t, ds) AnnotType ::= SimpleType {Annotation} Annotated(t, annot) +AnnotType1 ::= SimpleType1 {Annotation} Annotated(t, annot) SimpleType ::= SimpleLiteral SingletonTypeTree(l) | ‘?’ TypeBounds @@ -466,8 +467,9 @@ ClassConstr ::= [ClsTypeParamClause] [ConstrMods] ClsParamClauses ConstrMods ::= {Annotation} [AccessModifier] ObjectDef ::= id [Template] ModuleDef(mods, name, template) // no constructor EnumDef ::= id ClassConstr InheritClauses EnumBody -GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) +GivenDef ::= [GivenSig] (GivenType [‘=’ Expr] | StructuralInstance) GivenSig ::= [id] [DefTypeParamClause] {UsingParamClause} ‘:’ -- one of `id`, `DefTypeParamClause`, `UsingParamClause` must be present +GivenType ::= AnnotType1 {id [nl] AnnotType1} StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ WithTemplateBody] Extension ::= ‘extension’ [DefTypeParamClause] {UsingParamClause} ‘(’ DefTermParam ‘)’ {UsingParamClause} ExtMethods diff --git a/docs/_docs/reference/syntax.md b/docs/_docs/reference/syntax.md index ae541b65d8c4..66cf5a18fac9 100644 --- a/docs/_docs/reference/syntax.md +++ b/docs/_docs/reference/syntax.md @@ -200,8 +200,8 @@ SimpleType ::= SimpleLiteral | Singleton ‘.’ ‘type’ | ‘(’ [Types] ‘)’ | Refinement - | SimpleType1 TypeArgs - | SimpleType1 ‘#’ id + | SimpleType TypeArgs + | SimpleType ‘#’ id Singleton ::= SimpleRef | SimpleLiteral | Singleton ‘.’ id @@ -392,7 +392,7 @@ LocalModifier ::= ‘abstract’ AccessModifier ::= (‘private’ | ‘protected’) [AccessQualifier] AccessQualifier ::= ‘[’ id ‘]’ -Annotation ::= ‘@’ SimpleType1 {ParArgumentExprs} +Annotation ::= ‘@’ SimpleType {ParArgumentExprs} Import ::= ‘import’ ImportExpr {‘,’ ImportExpr} Export ::= ‘export’ ImportExpr {‘,’ ImportExpr} @@ -444,6 +444,7 @@ ObjectDef ::= id [Template] EnumDef ::= id ClassConstr InheritClauses EnumBody GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) GivenSig ::= [id] [DefTypeParamClause] {UsingParamClause} ‘:’ -- one of `id`, `DefTypeParamClause`, `UsingParamClause` must be present +GivenType ::= AnnotType {id [nl] AnnotType} StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ WithTemplateBody] Extension ::= ‘extension’ [DefTypeParamClause] {UsingParamClause} ‘(’ DefTermParam ‘)’ {UsingParamClause} ExtMethods @@ -453,7 +454,7 @@ ExtMethod ::= {Annotation [nl]} {Modifier} ‘def’ DefDef Template ::= InheritClauses [TemplateBody] InheritClauses ::= [‘extends’ ConstrApps] [‘derives’ QualId {‘,’ QualId}] ConstrApps ::= ConstrApp ({‘,’ ConstrApp} | {‘with’ ConstrApp}) -ConstrApp ::= SimpleType1 {Annotation} {ParArgumentExprs} +ConstrApp ::= 
SimpleType {Annotation} {ParArgumentExprs} ConstrExpr ::= SelfInvocation | <<< SelfInvocation {semi BlockStat} >>> SelfInvocation ::= ‘this’ ArgumentExprs {ArgumentExprs} diff --git a/tests/neg/i12348.check b/tests/neg/i12348.check index ccc2b9f7ed00..eded51f70f31 100644 --- a/tests/neg/i12348.check +++ b/tests/neg/i12348.check @@ -1,8 +1,8 @@ --- [E040] Syntax Error: tests/neg/i12348.scala:2:15 -------------------------------------------------------------------- -2 | given inline x: Int = 0 // error - | ^ - | 'with' expected, but identifier found --- [E040] Syntax Error: tests/neg/i12348.scala:3:10 -------------------------------------------------------------------- -3 |} // error - | ^ - | '}' expected, but eof found +-- [E040] Syntax Error: tests/neg/i12348.scala:2:16 -------------------------------------------------------------------- +2 | given inline x: Int = 0 // error // error + | ^ + | an identifier expected, but ':' found +-- [E067] Syntax Error: tests/neg/i12348.scala:2:8 --------------------------------------------------------------------- +2 | given inline x: Int = 0 // error // error + | ^ + |Declaration of given instance given_x_inline_ not allowed here: only classes can have declared but undefined members diff --git a/tests/neg/i12348.scala b/tests/neg/i12348.scala index 69fc77fb532e..43daf9a2801b 100644 --- a/tests/neg/i12348.scala +++ b/tests/neg/i12348.scala @@ -1,3 +1,2 @@ object A { - given inline x: Int = 0 // error -} // error \ No newline at end of file + given inline x: Int = 0 // error // error diff --git a/tests/neg/i7045.scala b/tests/neg/i7045.scala new file mode 100644 index 000000000000..b4c6d60cd35a --- /dev/null +++ b/tests/neg/i7045.scala @@ -0,0 +1,7 @@ +trait Bar { type Y } +trait Foo { type X } + +class Test: + given a1(using b: Bar): Foo = new Foo { type X = b.Y } // ok + given a2(using b: Bar): (Foo { type X = b.Y }) = new Foo { type X = b.Y } // ok + given a3(using b: Bar): Foo { type X = b.Y } = new Foo { type X = b.Y } // error \ No newline at end of file diff --git a/tests/pos/i7045.scala b/tests/pos/i7045.scala deleted file mode 100644 index e683654dd5c3..000000000000 --- a/tests/pos/i7045.scala +++ /dev/null @@ -1,9 +0,0 @@ -trait Bar { type Y } -trait Foo { type X } - -class Test: - given a1(using b: Bar): Foo = new Foo { type X = b.Y } - - given a2(using b: Bar): Foo { type X = b.Y } = new Foo { type X = b.Y } - - given a3(using b: Bar): (Foo { type X = b.Y }) = new Foo { type X = b.Y } diff --git a/tests/pos/typeclass-aggregates.scala b/tests/pos/typeclass-aggregates.scala index 77b0f1a9f04a..9bb576603b7b 100644 --- a/tests/pos/typeclass-aggregates.scala +++ b/tests/pos/typeclass-aggregates.scala @@ -30,8 +30,8 @@ trait OrdWithMonoid extends Ord, Monoid def ordWithMonoid2(ord: Ord, monoid: Monoid{ type This = ord.This }) = //: OrdWithMonoid { type This = ord.This} = new OrdWithMonoid with ord.OrdProxy with monoid.MonoidProxy {} -given intOrd: Ord { type This = Int } = ??? -given intMonoid: Monoid { type This = Int } = ??? +given intOrd: (Ord { type This = Int }) = ??? +given intMonoid: (Monoid { type This = Int }) = ??? //given (using ord: Ord, monoid: Monoid{ type This = ord.This }): (Ord & Monoid { type This = ord.This}) = // ordWithMonoid2(ord, monoid) @@ -42,6 +42,6 @@ val y: Int = ??? 
: x.This // given [A, B](using ord: A is Ord, monoid: A is Monoid) => A is Ord & Monoid = // new ord.OrdProxy with monoid.MonoidProxy {} -given [A](using ord: Ord { type This = A }, monoid: Monoid { type This = A}): (Ord & Monoid) { type This = A} = +given [A](using ord: Ord { type This = A }, monoid: Monoid { type This = A}): ((Ord & Monoid) { type This = A}) = new ord.OrdProxy with monoid.MonoidProxy {} From 305dd2ea526b0693a4808f9467d12dc46a23a072 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 1 Apr 2024 20:44:59 +0200 Subject: [PATCH 213/371] New syntax for given defs given [A: Ord] => A is Ord: ... given [A: Ord] => A is Ord as listOrd: ... [Cherry-picked 2f58cbc145dec06679b571f8b90b8729fc2a1094] --- .../dotty/tools/dotc/parsing/Parsers.scala | 70 +++++++-- .../test/dotc/pos-test-pickling.blacklist | 2 + docs/_docs/internals/syntax.md | 9 +- tests/pos/typeclasses-arrow0.scala | 136 ++++++++++++++++++ 4 files changed, 201 insertions(+), 16 deletions(-) create mode 100644 tests/pos/typeclasses-arrow0.scala diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index 6c0f19de3dd1..a5b33994d4a9 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -976,12 +976,14 @@ object Parsers { * i.e. an identifier followed by type and value parameters, followed by `:`? * @pre The current token is an identifier */ - def followingIsGivenSig() = + def followingIsOldStyleGivenSig() = val lookahead = in.LookaheadScanner() if lookahead.isIdent then lookahead.nextToken() + var paramsSeen = false def skipParams(): Unit = if lookahead.token == LPAREN || lookahead.token == LBRACKET then + paramsSeen = true lookahead.skipParens() skipParams() else if lookahead.isNewLine then @@ -989,6 +991,16 @@ object Parsers { skipParams() skipParams() lookahead.isColon + && { + !in.featureEnabled(Feature.modularity) + || { // with modularity language import, a `:` at EOL after an identifier represents a single identifier given + // Example: + // given C: + // def f = ... 
+ lookahead.nextToken() + !lookahead.isAfterLineEnd + } + } def followingIsExtension() = val next = in.lookahead.token @@ -1808,7 +1820,9 @@ object Parsers { def infixTypeRest(t: Tree, operand: Location => Tree = refinedTypeFn): Tree = infixOps(t, canStartInfixTypeTokens, operand, Location.ElseWhere, ParseKind.Type, - isOperator = !followingIsVararg() && !isPureArrow + isOperator = !followingIsVararg() + && !isPureArrow + && !(isIdent(nme.as) && in.featureEnabled(Feature.modularity)) && nextCanFollowOperator(canStartInfixTypeTokens)) /** RefinedType ::= WithType {[nl] Refinement} [`^` CaptureSet] @@ -4101,15 +4115,30 @@ object Parsers { syntaxError(em"extension clause can only define methods", stat.span) } - /** GivenDef ::= [GivenSig] (GivenType [‘=’ Expr] | StructuralInstance) - * GivenSig ::= [id] [DefTypeParamClause] {UsingParamClauses} ‘:’ - * GivenType ::= AnnotType1 {id [nl] AnnotType1} + /** GivenDef ::= OldGivenDef | NewGivenDef + * OldGivenDef ::= [OldGivenSig] (GivenType [‘=’ Expr] | StructuralInstance) + * OldGivenSig ::= [id] [DefTypeParamClause] {UsingParamClauses} ‘:’ * StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ WithTemplateBody] + * + * NewGivenDef ::= [GivenConditional '=>'] NewGivenSig + * GivenConditional ::= [DefTypeParamClause | UsingParamClause] {UsingParamClause} + * NewGivenSig ::= GivenType ['as' id] ([‘=’ Expr] | TemplateBody) + * | ConstrApps ['as' id] TemplateBody + * + * GivenType ::= AnnotType1 {id [nl] AnnotType1} */ def givenDef(start: Offset, mods: Modifiers, givenMod: Mod) = atSpan(start, nameStart) { var mods1 = addMod(mods, givenMod) val nameStart = in.offset - val name = if isIdent && followingIsGivenSig() then ident() else EmptyTermName + var name = if isIdent && followingIsOldStyleGivenSig() then ident() else EmptyTermName + var newSyntaxAllowed = in.featureEnabled(Feature.modularity) + + def moreConstrApps() = + if newSyntaxAllowed && in.token == COMMA then + in.nextToken() + constrApps() + else // need to be careful with last `with` + withConstrApps() // TODO Change syntax description def adjustDefParams(paramss: List[ParamClause]): List[ParamClause] = @@ -4128,14 +4157,24 @@ object Parsers { else Nil newLinesOpt() val noParams = tparams.isEmpty && vparamss.isEmpty - if !(name.isEmpty && noParams) then acceptColon() + if !(name.isEmpty && noParams) then + if in.isColon then + newSyntaxAllowed = false + in.nextToken() + else if newSyntaxAllowed then accept(ARROW) + else acceptColon() val parents = if isSimpleLiteral then rejectWildcardType(annotType()) :: Nil else constrApp() match - case parent: Apply => parent :: withConstrApps() - case parent if in.isIdent => infixTypeRest(parent, _ => annotType1()) :: Nil - case parent => parent :: withConstrApps() + case parent: Apply => parent :: moreConstrApps() + case parent if in.isIdent => + infixTypeRest(parent, _ => annotType1()) :: Nil + case parent => parent :: moreConstrApps() + if newSyntaxAllowed && in.isIdent(nme.as) then + in.nextToken() + name = ident() + val parentsIsType = parents.length == 1 && parents.head.isType if in.token == EQUALS && parentsIsType then accept(EQUALS) @@ -4145,7 +4184,7 @@ object Parsers { ValDef(name, parents.head, subExpr()) else DefDef(name, adjustDefParams(joinParams(tparams, vparamss)), parents.head, subExpr()) - else if (isStatSep || isStatSeqEnd) && parentsIsType then + else if (isStatSep || isStatSeqEnd) && parentsIsType && !newSyntaxAllowed then if name.isEmpty then syntaxError(em"anonymous given cannot be abstract") DefDef(name, 
adjustDefParams(joinParams(tparams, vparamss)), parents.head, EmptyTree) @@ -4156,8 +4195,13 @@ object Parsers { else vparam val constr = makeConstructor(tparams, vparamss1) val templ = - if isStatSep || isStatSeqEnd then Template(constr, parents, Nil, EmptyValDef, Nil) - else withTemplate(constr, parents) + if isStatSep || isStatSeqEnd then + Template(constr, parents, Nil, EmptyValDef, Nil) + else if !newSyntaxAllowed || in.token == WITH then + withTemplate(constr, parents) + else + possibleTemplateStart() + templateBodyOpt(constr, parents, Nil) if noParams && !mods.is(Inline) then ModuleDef(name, templ) else TypeDef(name.toTypeName, templ) end gdef diff --git a/compiler/test/dotc/pos-test-pickling.blacklist b/compiler/test/dotc/pos-test-pickling.blacklist index ad9befa72f5f..3b14ce28569d 100644 --- a/compiler/test/dotc/pos-test-pickling.blacklist +++ b/compiler/test/dotc/pos-test-pickling.blacklist @@ -127,5 +127,7 @@ i20053b.scala # alias types at different levels of dereferencing parsercombinators-givens.scala parsercombinators-givens-2.scala +parsercombinators-arrow.scala + diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index 6ef346ab22cc..db858ba05fbc 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -467,10 +467,13 @@ ClassConstr ::= [ClsTypeParamClause] [ConstrMods] ClsParamClauses ConstrMods ::= {Annotation} [AccessModifier] ObjectDef ::= id [Template] ModuleDef(mods, name, template) // no constructor EnumDef ::= id ClassConstr InheritClauses EnumBody -GivenDef ::= [GivenSig] (GivenType [‘=’ Expr] | StructuralInstance) -GivenSig ::= [id] [DefTypeParamClause] {UsingParamClause} ‘:’ -- one of `id`, `DefTypeParamClause`, `UsingParamClause` must be present + +GivenDef ::= [GivenConditional '=>'] GivenSig +GivenConditional ::= [DefTypeParamClause | UsingParamClause] {UsingParamClause} +GivenSig ::= GivenType ['as' id] ([‘=’ Expr] | TemplateBody) + | ConstrApps ['as' id] TemplateBody GivenType ::= AnnotType1 {id [nl] AnnotType1} -StructuralInstance ::= ConstrApp {‘with’ ConstrApp} [‘with’ WithTemplateBody] + Extension ::= ‘extension’ [DefTypeParamClause] {UsingParamClause} ‘(’ DefTermParam ‘)’ {UsingParamClause} ExtMethods ExtMethods ::= ExtMethod | [nl] <<< ExtMethod {semi ExtMethod} >>> diff --git a/tests/pos/typeclasses-arrow0.scala b/tests/pos/typeclasses-arrow0.scala new file mode 100644 index 000000000000..22d84fe6478d --- /dev/null +++ b/tests/pos/typeclasses-arrow0.scala @@ -0,0 +1,136 @@ +//> using options -language:experimental.modularity -source future + +class Common: + + trait Ord[A]: + extension (x: A) + def compareTo(y: A): Int + def < (y: A): Boolean = compareTo(y) < 0 + def > (y: A): Boolean = compareTo(y) > 0 + def <= (y: A): Boolean = compareTo(y) <= 0 + def >= (y: A): Boolean = compareTo(y) >= 0 + def max(y: A): A = if x < y then y else x + + trait Show[A]: + extension (x: A) def show: String + + trait SemiGroup[A]: + extension (x: A) def combine(y: A): A + + trait Monoid[A] extends SemiGroup[A]: + def unit: A + + trait Functor[F[_]]: + extension [A](x: F[A]) def map[B](f: A => B): F[B] + + trait Monad[F[_]] extends Functor[F]: + def pure[A](x: A): F[A] + extension [A](x: F[A]) + def flatMap[B](f: A => F[B]): F[B] + def map[B](f: A => B) = x.flatMap(f `andThen` pure) +end Common + +object Instances extends Common: + + given Ord[Int] as intOrd: + extension (x: Int) + def compareTo(y: Int) = + if x < y then -1 + else if x > y then +1 + else 0 + + given [T: Ord] => Ord[List[T]]: + extension (xs: List[T]) 
def compareTo(ys: List[T]): Int = (xs, ys) match + case (Nil, Nil) => 0 + case (Nil, _) => -1 + case (_, Nil) => +1 + case (x :: xs1, y :: ys1) => + val fst = x.compareTo(y) + if (fst != 0) fst else xs1.compareTo(ys1) + + given Monad[List] as listMonad: + extension [A](xs: List[A]) def flatMap[B](f: A => List[B]): List[B] = + xs.flatMap(f) + def pure[A](x: A): List[A] = + List(x) + + type Reader[Ctx] = [X] =>> Ctx => X + + given [Ctx] => Monad[Reader[Ctx]] as readerMonad: + extension [A](r: Ctx => A) def flatMap[B](f: A => Ctx => B): Ctx => B = + ctx => f(r(ctx))(ctx) + def pure[A](x: A): Ctx => A = + ctx => x + + extension (xs: Seq[String]) + def longestStrings: Seq[String] = + val maxLength = xs.map(_.length).max + xs.filter(_.length == maxLength) + + extension [T](xs: List[T]) + def second = xs.tail.head + def third = xs.tail.tail.head + + extension [M[_]: Monad, A](xss: M[M[A]]) + def flatten: M[A] = + xss.flatMap(identity) + + def maximum[T: Ord](xs: List[T]): T = + xs.reduce(_ `max` _) + + given [T: Ord] => Ord[T] as descending: + extension (x: T) def compareTo(y: T) = summon[Ord[T]].compareTo(y)(x) + + def minimum[T: Ord](xs: List[T]) = + maximum(xs)(using descending) + + def test(): Unit = + val xs = List(1, 2, 3) + println(maximum(xs)) + println(maximum(xs)(using descending)) + println(maximum(xs)(using descending(using intOrd))) + println(minimum(xs)) + +// Adapted from the Rust by Example book: https://doc.rust-lang.org/rust-by-example/trait.html +// +// lines words chars +// wc Scala: 28 105 793 +// wc Rust : 57 193 1466 + +trait Animal[Self]: + + // Associated function signature; `Self` refers to the implementor type. + def apply(name: String): Self + + // Method signatures; these will return a string. + extension (self: Self) + def name: String + def noise: String + def talk(): Unit = println(s"$name, $noise") +end Animal + +class Sheep(val name: String): + var isNaked = false + def shear() = + if isNaked then + println(s"$name is already naked...") + else + println(s"$name gets a haircut!") + isNaked = true + +given Animal[Sheep]: + def apply(name: String) = Sheep(name) + extension (self: Sheep) + def name: String = self.name + def noise: String = if self.isNaked then "baaaaah?" else "baaaaah!" + override def talk(): Unit = + println(s"$name pauses briefly... $noise") + +/* + + - In a type pattern, A <: T, A >: T, A: T, A: _ are all allowed and mean + T is a fresh type variable (T can start with a capital letter). 
+ - instance definitions + - `as m` syntax in context bounds and instance definitions + +*/ From 22b681c3b5749b0e3ea58fd426b2b6c2ec9ab8c7 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 1 Apr 2024 21:59:15 +0200 Subject: [PATCH 214/371] Allow multiple context bounds in `{...}` [Cherry-picked 598c6adff60179e1533a3dd0226d58363ea19d29] --- .../src/dotty/tools/dotc/ast/Desugar.scala | 2 ++ compiler/src/dotty/tools/dotc/ast/untpd.scala | 8 +++++++ .../dotty/tools/dotc/parsing/Parsers.scala | 11 +++++++--- .../tools/dotc/printing/RefinedPrinter.scala | 21 ++++++++++++++----- .../src/dotty/tools/dotc/typer/Typer.scala | 11 ++++++++++ tests/neg/i9330.scala | 2 +- tests/pos/FromString-typeparam.scala | 13 ++++++++++++ tests/semanticdb/expect/Methods.expect.scala | 2 +- .../semanticdb/expect/Synthetic.expect.scala | 2 +- tests/semanticdb/metac.expect | 9 ++++---- 10 files changed, 66 insertions(+), 15 deletions(-) create mode 100644 tests/pos/FromString-typeparam.scala diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index c3a0c05088cb..774e77aa4b44 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -1144,6 +1144,8 @@ object desugar { case tree: TypeDef => tree.name.toString case tree: AppliedTypeTree if followArgs && tree.args.nonEmpty => s"${apply(x, tree.tpt)}_${extractArgs(tree.args)}" + case ContextBoundTypeTree(tycon, paramName, _) => + s"${apply(x, tycon)}_$paramName" case InfixOp(left, op, right) => if followArgs then s"${op.name}_${extractArgs(List(left, right))}" else op.name.toString diff --git a/compiler/src/dotty/tools/dotc/ast/untpd.scala b/compiler/src/dotty/tools/dotc/ast/untpd.scala index 91ef462bcf05..0486e2e6d3d7 100644 --- a/compiler/src/dotty/tools/dotc/ast/untpd.scala +++ b/compiler/src/dotty/tools/dotc/ast/untpd.scala @@ -118,6 +118,7 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { case class ContextBounds(bounds: TypeBoundsTree, cxBounds: List[Tree])(implicit @constructorOnly src: SourceFile) extends TypTree case class PatDef(mods: Modifiers, pats: List[Tree], tpt: Tree, rhs: Tree)(implicit @constructorOnly src: SourceFile) extends DefTree case class ExtMethods(paramss: List[ParamClause], methods: List[Tree])(implicit @constructorOnly src: SourceFile) extends Tree + case class ContextBoundTypeTree(tycon: Tree, paramName: TypeName, ownName: TermName)(implicit @constructorOnly src: SourceFile) extends Tree case class MacroTree(expr: Tree)(implicit @constructorOnly src: SourceFile) extends Tree case class ImportSelector(imported: Ident, renamed: Tree = EmptyTree, bound: Tree = EmptyTree)(implicit @constructorOnly src: SourceFile) extends Tree { @@ -677,6 +678,9 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { def ExtMethods(tree: Tree)(paramss: List[ParamClause], methods: List[Tree])(using Context): Tree = tree match case tree: ExtMethods if (paramss eq tree.paramss) && (methods == tree.methods) => tree case _ => finalize(tree, untpd.ExtMethods(paramss, methods)(tree.source)) + def ContextBoundTypeTree(tree: Tree)(tycon: Tree, paramName: TypeName, ownName: TermName)(using Context): Tree = tree match + case tree: ContextBoundTypeTree if (tycon eq tree.tycon) && paramName == tree.paramName && ownName == tree.ownName => tree + case _ => finalize(tree, untpd.ContextBoundTypeTree(tycon, paramName, ownName)(tree.source)) def ImportSelector(tree: Tree)(imported: Ident, renamed: Tree, bound: Tree)(using Context): 
Tree = tree match { case tree: ImportSelector if (imported eq tree.imported) && (renamed eq tree.renamed) && (bound eq tree.bound) => tree case _ => finalize(tree, untpd.ImportSelector(imported, renamed, bound)(tree.source)) @@ -742,6 +746,8 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { cpy.PatDef(tree)(mods, transform(pats), transform(tpt), transform(rhs)) case ExtMethods(paramss, methods) => cpy.ExtMethods(tree)(transformParamss(paramss), transformSub(methods)) + case ContextBoundTypeTree(tycon, paramName, ownName) => + cpy.ContextBoundTypeTree(tree)(transform(tycon), paramName, ownName) case ImportSelector(imported, renamed, bound) => cpy.ImportSelector(tree)(transformSub(imported), transform(renamed), transform(bound)) case Number(_, _) | TypedSplice(_) => @@ -797,6 +803,8 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { this(this(this(x, pats), tpt), rhs) case ExtMethods(paramss, methods) => this(paramss.foldLeft(x)(apply), methods) + case ContextBoundTypeTree(tycon, paramName, ownName) => + this(x, tycon) case ImportSelector(imported, renamed, bound) => this(this(this(x, imported), renamed), bound) case Number(_, _) => diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index a5b33994d4a9..8680ba8c1335 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -2205,11 +2205,16 @@ object Parsers { else atSpan((t.span union cbs.head.span).start) { ContextBounds(t, cbs) } } + /** ContextBound ::= Type [`as` id] */ + def contextBound(pname: TypeName): Tree = + ContextBoundTypeTree(toplevelTyp(), pname, EmptyTermName) + def contextBounds(pname: TypeName): List[Tree] = if in.isColon then - atSpan(in.skipToken()) { - AppliedTypeTree(toplevelTyp(), Ident(pname)) - } :: contextBounds(pname) + in.nextToken() + if in.token == LBRACE && in.featureEnabled(Feature.modularity) + then inBraces(commaSeparated(() => contextBound(pname))) + else contextBound(pname) :: contextBounds(pname) else if in.token == VIEWBOUND then report.errorOrMigrationWarning( em"view bounds `<%' are no longer supported, use a context bound `:' instead", diff --git a/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala b/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala index 0329f0639d87..1ff4c8cae339 100644 --- a/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala +++ b/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala @@ -386,7 +386,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { changePrec(GlobalPrec) { keywordStr("for ") ~ Text(enums map enumText, "; ") ~ sep ~ toText(expr) } def cxBoundToText(bound: untpd.Tree): Text = bound match { // DD - case AppliedTypeTree(tpt, _) => " : " ~ toText(tpt) + case ContextBoundTypeTree(tpt, _, _) => " : " ~ toText(tpt) case untpd.Function(_, tpt) => " <% " ~ toText(tpt) } @@ -658,7 +658,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { def toTextAnnot = toTextLocal(arg) ~~ annotText(annot.symbol.enclosingClass, annot) def toTextRetainsAnnot = - try changePrec(GlobalPrec)(toText(arg) ~ "^" ~ toTextCaptureSet(captureSet)) + try changePrec(GlobalPrec)(toTextLocal(arg) ~ "^" ~ toTextCaptureSet(captureSet)) catch case ex: IllegalCaptureRef => toTextAnnot if annot.symbol.maybeOwner.isRetains && Feature.ccEnabled && !printDebug @@ -747,9 +747,18 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { case GenAlias(pat, expr) 
=> toText(pat) ~ " = " ~ toText(expr) case ContextBounds(bounds, cxBounds) => - cxBounds.foldLeft(toText(bounds)) {(t, cxb) => - t ~ cxBoundToText(cxb) - } + if Feature.enabled(Feature.modularity) then + def boundsText(bounds: Tree) = bounds match + case ContextBoundTypeTree(tpt, _, ownName) => + toText(tpt) ~ (" as " ~ toText(ownName) `provided` !ownName.isEmpty) + case bounds => toText(bounds) + cxBounds match + case bound :: Nil => ": " ~ boundsText(bound) + case _ => ": {" ~ Text(cxBounds.map(boundsText), ", ") ~ "}" + else + cxBounds.foldLeft(toText(bounds)) {(t, cxb) => + t ~ cxBoundToText(cxb) + } case PatDef(mods, pats, tpt, rhs) => modText(mods, NoSymbol, keywordStr("val"), isType = false) ~~ toText(pats, ", ") ~ optAscription(tpt) ~ optText(rhs)(" = " ~ _) @@ -794,6 +803,8 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { prefix ~~ idx.toString ~~ "|" ~~ tpeText ~~ "|" ~~ argsText ~~ "|" ~~ contentText ~~ postfix case CapturesAndResult(refs, parent) => changePrec(GlobalPrec)("^{" ~ Text(refs.map(toText), ", ") ~ "}" ~ toText(parent)) + case ContextBoundTypeTree(tycon, pname, ownName) => + toText(pname) ~ " : " ~ toText(tycon) ~ (" as " ~ toText(ownName) `provided` !ownName.isEmpty) case _ => tree.fallbackToText(this) } diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index a357f06e4ee8..b90b742aa0ec 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2284,6 +2284,16 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer tree.tpFun(tsyms, vsyms) completeTypeTree(InferredTypeTree(), tp, tree) + def typedContextBoundTypeTree(tree: untpd.ContextBoundTypeTree)(using Context): Tree = + val tycon = typedType(tree.tycon) + val tyconSplice = untpd.TypedSplice(tycon) + val tparam = untpd.Ident(tree.paramName).withSpan(tree.span) + if tycon.tpe.typeParams.nonEmpty then + typed(untpd.AppliedTypeTree(tyconSplice, tparam :: Nil)) + else + errorTree(tree, + em"""Illegal context bound: ${tycon.tpe} does not take type parameters.""") + def typedSingletonTypeTree(tree: untpd.SingletonTypeTree)(using Context): SingletonTypeTree = { val ref1 = typedExpr(tree.ref, SingletonTypeProto) checkStable(ref1.tpe, tree.srcPos, "singleton type") @@ -3269,6 +3279,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case tree: untpd.UnApply => typedUnApply(tree, pt) case tree: untpd.Tuple => typedTuple(tree, pt) case tree: untpd.InLambdaTypeTree => typedInLambdaTypeTree(tree, pt) + case tree: untpd.ContextBoundTypeTree => typedContextBoundTypeTree(tree) case tree: untpd.InfixOp => typedInfixOp(tree, pt) case tree: untpd.ParsedTry => typedTry(tree, pt) case tree @ untpd.PostfixOp(qual, Ident(nme.WILDCARD)) => typedAsFunction(tree, pt) diff --git a/tests/neg/i9330.scala b/tests/neg/i9330.scala index ca25582ef7e8..6ba57c033473 100644 --- a/tests/neg/i9330.scala +++ b/tests/neg/i9330.scala @@ -1,4 +1,4 @@ val x = { - () == "" // error + () == "" implicit def foo[A: A] // error // error // error } diff --git a/tests/pos/FromString-typeparam.scala b/tests/pos/FromString-typeparam.scala new file mode 100644 index 000000000000..893bcfd3decc --- /dev/null +++ b/tests/pos/FromString-typeparam.scala @@ -0,0 +1,13 @@ +//> using options -language:experimental.modularity -source future + +trait FromString[A]: + def fromString(s: String): A + +given FromString[Int] = _.toInt + +given FromString[Double] = _.toDouble + +def add[N: {FromString, 
Numeric}](a: String, b: String): N = + val num = summon[Numeric[N]] + val N = summon[FromString[N]] + num.plus(N.fromString(a), N.fromString(b)) diff --git a/tests/semanticdb/expect/Methods.expect.scala b/tests/semanticdb/expect/Methods.expect.scala index f34c657b2f6d..4ec723ad584e 100644 --- a/tests/semanticdb/expect/Methods.expect.scala +++ b/tests/semanticdb/expect/Methods.expect.scala @@ -15,7 +15,7 @@ class Methods/*<-example::Methods#*/[T/*<-example::Methods#[T]*/] { def m6/*<-example::Methods#m6().*/(x/*<-example::Methods#m6().(x)*/: Int/*->scala::Int#*/) = ???/*->scala::Predef.`???`().*/ def m6/*<-example::Methods#m6(+1).*/(x/*<-example::Methods#m6(+1).(x)*/: List/*->example::Methods#List#*/[T/*->example::Methods#[T]*/]) = ???/*->scala::Predef.`???`().*/ def m6/*<-example::Methods#m6(+2).*/(x/*<-example::Methods#m6(+2).(x)*/: scala.List/*->scala::package.List#*/[T/*->example::Methods#[T]*/]) = ???/*->scala::Predef.`???`().*/ - def m7/*<-example::Methods#m7().*/[U/*<-example::Methods#m7().[U]*//*<-example::Methods#m7().(evidence$1)*/: Ordering/*->scala::math::Ordering#*/](c/*<-example::Methods#m7().(c)*/: Methods/*->example::Methods#*/[T/*->example::Methods#[T]*/], l/*<-example::Methods#m7().(l)*/: List/*->example::Methods#List#*/[U/*->example::Methods#m7().[U]*/]) = ???/*->scala::Predef.`???`().*/ + def m7/*<-example::Methods#m7().*/[U/*<-example::Methods#m7().[U]*/: Ordering/*->example::Methods#m7().[U]*//*<-example::Methods#m7().(evidence$1)*/](c/*<-example::Methods#m7().(c)*/: Methods/*->example::Methods#*/[T/*->example::Methods#[T]*/], l/*<-example::Methods#m7().(l)*/: List/*->example::Methods#List#*/[U/*->example::Methods#m7().[U]*/]) = ???/*->scala::Predef.`???`().*/ def `m8()./*<-example::Methods#`m8().`().*/`() = ???/*->scala::Predef.`???`().*/ class `m9()./*<-example::Methods#`m9().`#*/` def m9/*<-example::Methods#m9().*/(x/*<-example::Methods#m9().(x)*/: `m9().`/*->example::Methods#`m9().`#*/) = ???/*->scala::Predef.`???`().*/ diff --git a/tests/semanticdb/expect/Synthetic.expect.scala b/tests/semanticdb/expect/Synthetic.expect.scala index a4419aa8bd82..4d797ce2b856 100644 --- a/tests/semanticdb/expect/Synthetic.expect.scala +++ b/tests/semanticdb/expect/Synthetic.expect.scala @@ -30,7 +30,7 @@ class Synthetic/*<-example::Synthetic#*/ { null.asInstanceOf/*->scala::Any#asInstanceOf().*/[Int/*->scala::Int#*/ => Int/*->scala::Int#*/](2) } - class J/*<-example::Synthetic#J#*/[T/*<-example::Synthetic#J#[T]*//*<-example::Synthetic#J#evidence$1.*/: Manifest/*->scala::Predef.Manifest#*/] { val arr/*<-example::Synthetic#J#arr.*/ = Array/*->scala::Array.*/.empty/*->scala::Array.empty().*/[T/*->example::Synthetic#J#[T]*/] } + class J/*<-example::Synthetic#J#*/[T/*<-example::Synthetic#J#[T]*/: /*<-example::Synthetic#J#evidence$1.*/Manifest/*->scala::Predef.Manifest#*//*->example::Synthetic#J#[T]*/] { val arr/*<-example::Synthetic#J#arr.*/ = Array/*->scala::Array.*/.empty/*->scala::Array.empty().*/[T/*->example::Synthetic#J#[T]*/] } class F/*<-example::Synthetic#F#*/ implicit val ordering/*<-example::Synthetic#ordering.*/: Ordering/*->scala::package.Ordering#*/[F/*->example::Synthetic#F#*/] = ???/*->scala::Predef.`???`().*/ diff --git a/tests/semanticdb/metac.expect b/tests/semanticdb/metac.expect index 2120cc633da8..84c3e7c6a110 100644 --- a/tests/semanticdb/metac.expect +++ b/tests/semanticdb/metac.expect @@ -2732,8 +2732,8 @@ Occurrences: [16:29..16:32): ??? -> scala/Predef.`???`(). [17:6..17:8): m7 <- example/Methods#m7(). 
[17:9..17:10): U <- example/Methods#m7().[U] -[17:10..17:10): <- example/Methods#m7().(evidence$1) -[17:12..17:20): Ordering -> scala/math/Ordering# +[17:12..17:20): Ordering -> example/Methods#m7().[U] +[17:12..17:12): <- example/Methods#m7().(evidence$1) [17:22..17:23): c <- example/Methods#m7().(c) [17:25..17:32): Methods -> example/Methods# [17:33..17:34): T -> example/Methods#[T] @@ -3533,7 +3533,7 @@ Uri => Synthetic.scala Text => empty Language => Scala Symbols => 52 entries -Occurrences => 136 entries +Occurrences => 137 entries Synthetics => 39 entries Symbols: @@ -3659,8 +3659,9 @@ Occurrences: [32:8..32:9): J <- example/Synthetic#J# [32:9..32:9): <- example/Synthetic#J#``(). [32:10..32:11): T <- example/Synthetic#J#[T] -[32:11..32:11): <- example/Synthetic#J#evidence$1. +[32:13..32:13): <- example/Synthetic#J#evidence$1. [32:13..32:21): Manifest -> scala/Predef.Manifest# +[32:13..32:21): Manifest -> example/Synthetic#J#[T] [32:29..32:32): arr <- example/Synthetic#J#arr. [32:35..32:40): Array -> scala/Array. [32:41..32:46): empty -> scala/Array.empty(). From a57a512663c383506999c833e1756d949b1e7cfb Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 12:27:52 +0200 Subject: [PATCH 215/371] Allow renamings `as N` in context bounds Also, provide the possibility to use the parameter name for single context bounds. This is controlled by a Config setting, which is off by default. [Cherry-picked a61d2bc7b5c4ba97c037a2e46856fb8290594310] --- .../src/dotty/tools/dotc/ast/Desugar.scala | 180 +++++++++++------- .../src/dotty/tools/dotc/config/Config.scala | 7 + .../dotty/tools/dotc/parsing/Parsers.scala | 16 +- docs/_docs/internals/syntax.md | 8 +- tests/pos/FromString-named.scala | 11 ++ 5 files changed, 146 insertions(+), 76 deletions(-) create mode 100644 tests/pos/FromString-named.scala diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index 774e77aa4b44..04fd1afca8be 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -10,7 +10,7 @@ import Annotations.Annotation import NameKinds.{UniqueName, ContextBoundParamName, ContextFunctionParamName, DefaultGetterName, WildcardParamName} import typer.{Namer, Checking} import util.{Property, SourceFile, SourcePosition, SrcPos, Chars} -import config.Feature.{sourceVersion, migrateTo3, enabled} +import config.{Feature, Config} import config.SourceVersion.* import collection.mutable import reporting.* @@ -46,6 +46,11 @@ object desugar { */ val UntupledParam: Property.Key[Unit] = Property.StickyKey() + /** An attachment key to indicate that a ValDef is an evidence parameter + * for a context bound. + */ + val ContextBoundParam: Property.Key[Unit] = Property.StickyKey() + /** What static check should be applied to a Match? 
*/ enum MatchCheck { case None, Exhaustive, IrrefutablePatDef, IrrefutableGenFrom @@ -195,17 +200,6 @@ object desugar { else vdef1 end valDef - def makeImplicitParameters( - tpts: List[Tree], implicitFlag: FlagSet, - mkParamName: Int => TermName, - forPrimaryConstructor: Boolean = false - )(using Context): List[ValDef] = - for (tpt, i) <- tpts.zipWithIndex yield { - val paramFlags: FlagSet = if (forPrimaryConstructor) LocalParamAccessor else Param - val epname = mkParamName(i) - ValDef(epname, tpt, EmptyTree).withFlags(paramFlags | implicitFlag) - } - def mapParamss(paramss: List[ParamClause]) (mapTypeParam: TypeDef => TypeDef) (mapTermParam: ValDef => ValDef)(using Context): List[ParamClause] = @@ -232,34 +226,57 @@ object desugar { private def defDef(meth: DefDef, isPrimaryConstructor: Boolean = false)(using Context): Tree = addDefaultGetters(elimContextBounds(meth, isPrimaryConstructor)) - private def elimContextBounds(meth: DefDef, isPrimaryConstructor: Boolean)(using Context): DefDef = - val DefDef(_, paramss, tpt, rhs) = meth - val evidenceParamBuf = mutable.ListBuffer[ValDef]() + private def desugarContextBounds( + tdef: TypeDef, + evidenceBuf: mutable.ListBuffer[ValDef], + flags: FlagSet, + freshName: untpd.Tree => TermName, + allParamss: List[ParamClause])(using Context): TypeDef = - var seenContextBounds: Int = 0 - def desugarContextBounds(rhs: Tree): Tree = rhs match + val evidenceNames = mutable.ListBuffer[TermName]() + + def desugarRhs(rhs: Tree): Tree = rhs match case ContextBounds(tbounds, cxbounds) => - val iflag = if sourceVersion.isAtLeast(`future`) then Given else Implicit - evidenceParamBuf ++= makeImplicitParameters( - cxbounds, iflag, - // Just like with `makeSyntheticParameter` on nameless parameters of - // using clauses, we only need names that are unique among the - // parameters of the method since shadowing does not affect - // implicit resolution in Scala 3. - mkParamName = i => - val index = seenContextBounds + 1 // Start at 1 like FreshNameCreator. - val ret = ContextBoundParamName(EmptyTermName, index) - seenContextBounds += 1 - ret, - forPrimaryConstructor = isPrimaryConstructor) + for bound <- cxbounds do + val evidenceName = bound match + case ContextBoundTypeTree(_, _, ownName) if !ownName.isEmpty => + ownName + case _ if Config.nameSingleContextBounds && cxbounds.tail.isEmpty + && Feature.enabled(Feature.modularity) => + tdef.name.toTermName + case _ => + freshName(bound) + evidenceNames += evidenceName + val evidenceParam = ValDef(evidenceName, bound, EmptyTree).withFlags(flags) + evidenceParam.pushAttachment(ContextBoundParam, ()) + evidenceBuf += evidenceParam tbounds case LambdaTypeTree(tparams, body) => - cpy.LambdaTypeTree(rhs)(tparams, desugarContextBounds(body)) + cpy.LambdaTypeTree(rhs)(tparams, desugarRhs(body)) case _ => rhs + + cpy.TypeDef(tdef)(rhs = desugarRhs(tdef.rhs)) + end desugarContextBounds + + private def elimContextBounds(meth: DefDef, isPrimaryConstructor: Boolean)(using Context): DefDef = + val DefDef(_, paramss, tpt, rhs) = meth + val evidenceParamBuf = mutable.ListBuffer[ValDef]() + + var seenContextBounds: Int = 0 + def freshName(unused: Tree) = + seenContextBounds += 1 // Start at 1 like FreshNameCreator. + ContextBoundParamName(EmptyTermName, seenContextBounds) + // Just like with `makeSyntheticParameter` on nameless parameters of + // using clauses, we only need names that are unique among the + // parameters of the method since shadowing does not affect + // implicit resolution in Scala 3. 
+ val paramssNoContextBounds = + val iflag = if Feature.sourceVersion.isAtLeast(`future`) then Given else Implicit + val flags = if isPrimaryConstructor then iflag | LocalParamAccessor else iflag | Param mapParamss(paramss) { - tparam => cpy.TypeDef(tparam)(rhs = desugarContextBounds(tparam.rhs)) + tparam => desugarContextBounds(tparam, evidenceParamBuf, flags, freshName, paramss) }(identity) rhs match @@ -399,43 +416,70 @@ object desugar { (Nil, tree) /** Add all evidence parameters in `params` as implicit parameters to `meth`. - * If the parameters of `meth` end in an implicit parameter list or using clause, - * evidence parameters are added in front of that list. Otherwise they are added - * as a separate parameter clause. + * The position of the added parameters is determined as follows: + * + * - If there is an existing parameter list that refers to one of the added + * parameters in one of its parameter types, add the new parameters + * in front of the first such parameter list. + * - Otherwise, if the last parameter list consists implicit or using parameters, + * join the new parameters in front of this parameter list, creating one + * parameter list (this is equilavent to Scala 2's scheme). + * - Otherwise, add the new parameter list at the end as a separate parameter clause. */ private def addEvidenceParams(meth: DefDef, params: List[ValDef])(using Context): DefDef = - params match + if params.isEmpty then return meth + + val boundNames = params.map(_.name).toSet + + //println(i"add ev params ${meth.name}, ${boundNames.toList}") + + def references(vdef: ValDef): Boolean = + vdef.tpt.existsSubTree: + case Ident(name: TermName) => boundNames.contains(name) + case _ => false + + def recur(mparamss: List[ParamClause]): List[ParamClause] = mparamss match + case ValDefs(mparams) :: _ if mparams.exists(references) => + params :: mparamss + case ValDefs(mparams @ (mparam :: _)) :: Nil if mparam.mods.isOneOf(GivenOrImplicit) => + (params ++ mparams) :: Nil + case mparams :: mparamss1 => + mparams :: recur(mparamss1) case Nil => - meth - case evidenceParams => - val paramss1 = meth.paramss.reverse match - case ValDefs(vparams @ (vparam :: _)) :: rparamss if vparam.mods.isOneOf(GivenOrImplicit) => - ((evidenceParams ++ vparams) :: rparamss).reverse - case _ => - meth.paramss :+ evidenceParams - cpy.DefDef(meth)(paramss = paramss1) + params :: Nil + + cpy.DefDef(meth)(paramss = recur(meth.paramss)) + end addEvidenceParams /** The parameters generated from the contextual bounds of `meth`, as generated by `desugar.defDef` */ private def evidenceParams(meth: DefDef)(using Context): List[ValDef] = meth.paramss.reverse match { case ValDefs(vparams @ (vparam :: _)) :: _ if vparam.mods.isOneOf(GivenOrImplicit) => - vparams.takeWhile(_.name.is(ContextBoundParamName)) + vparams.takeWhile(_.hasAttachment(ContextBoundParam)) case _ => Nil } @sharable private val synthetic = Modifiers(Synthetic) - private def toDefParam(tparam: TypeDef, keepAnnotations: Boolean): TypeDef = { - var mods = tparam.rawMods - if (!keepAnnotations) mods = mods.withAnnotations(Nil) + /** Filter annotations in `mods` according to `keep` */ + private def filterAnnots(mods: Modifiers, keep: Boolean)(using Context) = + if keep then mods else mods.withAnnotations(Nil) + + private def toDefParam(tparam: TypeDef, keepAnnotations: Boolean)(using Context): TypeDef = + val mods = filterAnnots(tparam.rawMods, keepAnnotations) tparam.withMods(mods & EmptyFlags | Param) - } - private def toDefParam(vparam: ValDef, keepAnnotations: Boolean, 
keepDefault: Boolean): ValDef = { - var mods = vparam.rawMods - if (!keepAnnotations) mods = mods.withAnnotations(Nil) + + private def toDefParam(vparam: ValDef, keepAnnotations: Boolean, keepDefault: Boolean)(using Context): ValDef = { + val mods = filterAnnots(vparam.rawMods, keepAnnotations) val hasDefault = if keepDefault then HasDefault else EmptyFlags - vparam.withMods(mods & (GivenOrImplicit | Erased | hasDefault | Tracked) | Param) + // Need to ensure that tree is duplicated since term parameters can be watched + // and cloning a term parameter will copy its watchers to the clone, which means + // we'd get cross-talk between the original parameter and the clone. + ValDef(vparam.name, vparam.tpt, vparam.rhs) + .withSpan(vparam.span) + .withAttachmentsFrom(vparam) + .withMods(mods & (GivenOrImplicit | Erased | hasDefault | Tracked) | Param) } def mkApply(fn: Tree, paramss: List[ParamClause])(using Context): Tree = @@ -609,6 +653,11 @@ object desugar { case _ => false } + def isRepeated(tree: Tree): Boolean = stripByNameType(tree) match { + case PostfixOp(_, Ident(tpnme.raw.STAR)) => true + case _ => false + } + def appliedRef(tycon: Tree, tparams: List[TypeDef] = constrTparams, widenHK: Boolean = false) = { val targs = for (tparam <- tparams) yield { val targ = refOfDef(tparam) @@ -625,11 +674,6 @@ object desugar { appliedTypeTree(tycon, targs) } - def isRepeated(tree: Tree): Boolean = stripByNameType(tree) match { - case PostfixOp(_, Ident(tpnme.raw.STAR)) => true - case _ => false - } - // a reference to the class type bound by `cdef`, with type parameters coming from the constructor val classTypeRef = appliedRef(classTycon) @@ -667,7 +711,7 @@ object desugar { } ensureApplied(nu) - val copiedAccessFlags = if migrateTo3 then EmptyFlags else AccessFlags + val copiedAccessFlags = if Feature.migrateTo3 then EmptyFlags else AccessFlags // Methods to add to a case class C[..](p1: T1, ..., pN: Tn)(moreParams) // def _1: T1 = this.p1 @@ -850,12 +894,11 @@ object desugar { Nil } else { - val defParamss = constrVparamss match { + val defParamss = constrVparamss match case Nil :: paramss => paramss // drop leading () that got inserted by class // TODO: drop this once we do not silently insert empty class parameters anymore case paramss => paramss - } val finalFlag = if ctx.settings.YcompileScala2Library.value then EmptyFlags else Final // implicit wrapper is typechecked in same scope as constructor, so // we can reuse the constructor parameters; no derived params are needed. 
@@ -1681,14 +1724,13 @@ object desugar { .collect: case vd: ValDef => vd - def makeContextualFunction(formals: List[Tree], paramNamesOrNil: List[TermName], body: Tree, erasedParams: List[Boolean])(using Context): Function = { - val mods = Given - val params = makeImplicitParameters(formals, mods, - mkParamName = i => - if paramNamesOrNil.isEmpty then ContextFunctionParamName.fresh() - else paramNamesOrNil(i)) - FunctionWithMods(params, body, Modifiers(mods), erasedParams) - } + def makeContextualFunction(formals: List[Tree], paramNamesOrNil: List[TermName], body: Tree, erasedParams: List[Boolean])(using Context): Function = + val paramNames = + if paramNamesOrNil.nonEmpty then paramNamesOrNil + else formals.map(_ => ContextFunctionParamName.fresh()) + val params = for (tpt, pname) <- formals.zip(paramNames) yield + ValDef(pname, tpt, EmptyTree).withFlags(Given | Param) + FunctionWithMods(params, body, Modifiers(Given), erasedParams) private def derivedValDef(original: Tree, named: NameTree, tpt: Tree, rhs: Tree, mods: Modifiers)(using Context) = { val vdef = ValDef(named.name.asTermName, tpt, rhs) diff --git a/compiler/src/dotty/tools/dotc/config/Config.scala b/compiler/src/dotty/tools/dotc/config/Config.scala index 2746476261e5..293044c245ef 100644 --- a/compiler/src/dotty/tools/dotc/config/Config.scala +++ b/compiler/src/dotty/tools/dotc/config/Config.scala @@ -235,4 +235,11 @@ object Config { */ inline val checkLevelsOnConstraints = false inline val checkLevelsOnInstantiation = true + + /** If a type parameter `X` has a single context bound `X: C`, should the + * witness parameter be named `X`? This would prevent the creation of a + * context bound companion. + */ + inline val nameSingleContextBounds = false } + diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index 8680ba8c1335..bbc4096f266b 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -2196,9 +2196,9 @@ object Parsers { if (in.token == tok) { in.nextToken(); toplevelTyp() } else EmptyTree - /** TypeParamBounds ::= TypeBounds {`<%' Type} {`:' Type} + /** TypeAndCtxBounds ::= TypeBounds [`:` ContextBounds] */ - def typeParamBounds(pname: TypeName): Tree = { + def typeAndCtxBounds(pname: TypeName): Tree = { val t = typeBounds() val cbs = contextBounds(pname) if (cbs.isEmpty) t @@ -2207,8 +2207,16 @@ object Parsers { /** ContextBound ::= Type [`as` id] */ def contextBound(pname: TypeName): Tree = - ContextBoundTypeTree(toplevelTyp(), pname, EmptyTermName) + val t = toplevelTyp() + val ownName = + if isIdent(nme.as) && in.featureEnabled(Feature.modularity) then + in.nextToken() + ident() + else EmptyTermName + ContextBoundTypeTree(t, pname, ownName) + /** ContextBounds ::= ContextBound | `{` ContextBound {`,` ContextBound} `}` + */ def contextBounds(pname: TypeName): List[Tree] = if in.isColon then in.nextToken() @@ -3411,7 +3419,7 @@ object Parsers { } else ident().toTypeName val hkparams = typeParamClauseOpt(ParamOwner.Type) - val bounds = if (isAbstractOwner) typeBounds() else typeParamBounds(name) + val bounds = if (isAbstractOwner) typeBounds() else typeAndCtxBounds(name) TypeDef(name, lambdaAbstract(hkparams, bounds)).withMods(mods) } } diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index db858ba05fbc..e123fa900258 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -221,7 +221,9 @@ IntoTargetType ::= Type TypeArgs ::= ‘[’ 
Types ‘]’ ts Refinement ::= :<<< [RefineDcl] {semi [RefineDcl]} >>> ds TypeBounds ::= [‘>:’ Type] [‘<:’ Type] TypeBoundsTree(lo, hi) -TypeParamBounds ::= TypeBounds {‘:’ Type} ContextBounds(typeBounds, tps) +TypeAndCtxBounds ::= TypeBounds [‘:’ ContextBounds] ContextBounds(typeBounds, tps) +ContextBounds ::= ContextBound | '{' ContextBound {',' ContextBound} '}' +ContextBound ::= Type ['as' id] Types ::= Type {‘,’ Type} NamesAndTypes ::= NameAndType {‘,’ NameAndType} NameAndType ::= id ':' Type @@ -359,7 +361,7 @@ ArgumentPatterns ::= ‘(’ [Patterns] ‘)’ ```ebnf ClsTypeParamClause::= ‘[’ ClsTypeParam {‘,’ ClsTypeParam} ‘]’ ClsTypeParam ::= {Annotation} [‘+’ | ‘-’] TypeDef(Modifiers, name, tparams, bounds) - id [HkTypeParamClause] TypeParamBounds Bound(below, above, context) + id [HkTypeParamClause] TypeAndCtxBounds Bound(below, above, context) TypTypeParamClause::= ‘[’ TypTypeParam {‘,’ TypTypeParam} ‘]’ TypTypeParam ::= {Annotation} id [HkTypeParamClause] TypeBounds @@ -384,7 +386,7 @@ TypelessClause ::= DefTermParamClause | UsingParamClause DefTypeParamClause::= [nl] ‘[’ DefTypeParam {‘,’ DefTypeParam} ‘]’ -DefTypeParam ::= {Annotation} id [HkTypeParamClause] TypeParamBounds +DefTypeParam ::= {Annotation} id [HkTypeParamClause] TypeAndCtxBounds DefTermParamClause::= [nl] ‘(’ [DefTermParams] ‘)’ UsingParamClause ::= [nl] ‘(’ ‘using’ (DefTermParams | FunArgTypes) ‘)’ DefImplicitClause ::= [nl] ‘(’ ‘implicit’ DefTermParams ‘)’ diff --git a/tests/pos/FromString-named.scala b/tests/pos/FromString-named.scala new file mode 100644 index 000000000000..efa0882ae347 --- /dev/null +++ b/tests/pos/FromString-named.scala @@ -0,0 +1,11 @@ +//> using options -language:experimental.modularity -source future + +trait FromString[A]: + def fromString(s: String): A + +given FromString[Int] = _.toInt + +given FromString[Double] = _.toDouble + +def add[N: {FromString as N, Numeric as num}](a: String, b: String): N = + num.plus(N.fromString(a), N.fromString(b)) From 9a96cf0ee5f7cb511d32679fab85dd18e575c00e Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 13:38:52 +0200 Subject: [PATCH 216/371] Implement `deferred` givens A definition like `given T = deferred` in a trait will be expanded to an abstract given in the trait that is implemented automatically in all classes inheriting the trait. 
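For reference, the intended usage is shown by the tests added in this patch; the following is a condensed sketch of `tests/pos/deferredSummon.scala` (it needs the experimental `modularity` language feature):

```scala
//> using options -language:experimental.modularity -source future
import compiletime.deferred

trait Ord[Self]:
  def less(x: Self, y: Self): Boolean

trait A:
  type Elem
  given Ord[Elem] = deferred   // expanded to an abstract given with the HasDefault flag
  def foo = summon[Ord[Elem]]

object Inst:
  given Ord[Int]:
    def less(x: Int, y: Int) = x < y

object Test:
  import Inst.given
  class C extends A:           // the compiler synthesizes `given Ord[Int]` here
    type Elem = Int
```
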
[Cherry-picked b48fb99fd607bd3955477db8c1d94ceec295b1a1] --- .../dotty/tools/dotc/core/Definitions.scala | 1 + .../src/dotty/tools/dotc/core/Flags.scala | 1 + .../src/dotty/tools/dotc/core/StdNames.scala | 1 + .../dotty/tools/dotc/transform/Erasure.scala | 8 +- .../dotty/tools/dotc/typer/Implicits.scala | 4 +- .../src/dotty/tools/dotc/typer/Namer.scala | 12 + .../dotty/tools/dotc/typer/RefChecks.scala | 2 +- .../src/dotty/tools/dotc/typer/Typer.scala | 81 +++- .../test/dotc/pos-test-pickling.blacklist | 4 +- library/src/scala/compiletime/package.scala | 13 + tests/neg/deferred-givens.check | 13 + tests/neg/deferred-givens.scala | 30 ++ tests/neg/deferredSummon.check | 17 + tests/neg/deferredSummon.scala | 19 + tests/pos/deferred-givens.scala | 26 ++ tests/pos/deferredSummon.scala | 30 ++ .../pos/hylolib-deferred-given-extract.scala | 19 + .../AnyCollection.scala | 69 ++++ .../pos/hylolib-deferred-given/AnyValue.scala | 76 ++++ .../pos/hylolib-deferred-given/BitArray.scala | 375 ++++++++++++++++++ .../hylolib-deferred-given/Collection.scala | 281 +++++++++++++ .../hylolib-deferred-given/CoreTraits.scala | 57 +++ tests/pos/hylolib-deferred-given/Hasher.scala | 38 ++ .../pos/hylolib-deferred-given/HyArray.scala | 224 +++++++++++ .../pos/hylolib-deferred-given/Integers.scala | 58 +++ tests/pos/hylolib-deferred-given/Range.scala | 37 ++ tests/pos/hylolib-deferred-given/Slice.scala | 49 +++ .../StringConvertible.scala | 14 + 28 files changed, 1545 insertions(+), 14 deletions(-) create mode 100644 tests/neg/deferred-givens.check create mode 100644 tests/neg/deferred-givens.scala create mode 100644 tests/neg/deferredSummon.check create mode 100644 tests/neg/deferredSummon.scala create mode 100644 tests/pos/deferred-givens.scala create mode 100644 tests/pos/deferredSummon.scala create mode 100644 tests/pos/hylolib-deferred-given-extract.scala create mode 100644 tests/pos/hylolib-deferred-given/AnyCollection.scala create mode 100644 tests/pos/hylolib-deferred-given/AnyValue.scala create mode 100644 tests/pos/hylolib-deferred-given/BitArray.scala create mode 100644 tests/pos/hylolib-deferred-given/Collection.scala create mode 100644 tests/pos/hylolib-deferred-given/CoreTraits.scala create mode 100644 tests/pos/hylolib-deferred-given/Hasher.scala create mode 100644 tests/pos/hylolib-deferred-given/HyArray.scala create mode 100644 tests/pos/hylolib-deferred-given/Integers.scala create mode 100644 tests/pos/hylolib-deferred-given/Range.scala create mode 100644 tests/pos/hylolib-deferred-given/Slice.scala create mode 100644 tests/pos/hylolib-deferred-given/StringConvertible.scala diff --git a/compiler/src/dotty/tools/dotc/core/Definitions.scala b/compiler/src/dotty/tools/dotc/core/Definitions.scala index 15880207b3c8..9ee5891f1606 100644 --- a/compiler/src/dotty/tools/dotc/core/Definitions.scala +++ b/compiler/src/dotty/tools/dotc/core/Definitions.scala @@ -240,6 +240,7 @@ class Definitions { @tu lazy val Compiletime_codeOf: Symbol = CompiletimePackageClass.requiredMethod("codeOf") @tu lazy val Compiletime_erasedValue : Symbol = CompiletimePackageClass.requiredMethod("erasedValue") @tu lazy val Compiletime_uninitialized: Symbol = CompiletimePackageClass.requiredMethod("uninitialized") + @tu lazy val Compiletime_deferred : Symbol = CompiletimePackageClass.requiredMethod("deferred") @tu lazy val Compiletime_error : Symbol = CompiletimePackageClass.requiredMethod(nme.error) @tu lazy val Compiletime_requireConst : Symbol = CompiletimePackageClass.requiredMethod("requireConst") @tu lazy val 
Compiletime_constValue : Symbol = CompiletimePackageClass.requiredMethod("constValue") diff --git a/compiler/src/dotty/tools/dotc/core/Flags.scala b/compiler/src/dotty/tools/dotc/core/Flags.scala index 2bc7610bb0ce..e17834d61fdc 100644 --- a/compiler/src/dotty/tools/dotc/core/Flags.scala +++ b/compiler/src/dotty/tools/dotc/core/Flags.scala @@ -573,6 +573,7 @@ object Flags { val DeferredOrLazyOrMethod: FlagSet = Deferred | Lazy | Method val DeferredOrTermParamOrAccessor: FlagSet = Deferred | ParamAccessor | TermParam // term symbols without right-hand sides val DeferredOrTypeParam: FlagSet = Deferred | TypeParam // type symbols without right-hand sides + val DeferredGivenFlags = Deferred | Given | HasDefault val EnumValue: FlagSet = Enum | StableRealizable // A Scala enum value val FinalOrInline: FlagSet = Final | Inline val FinalOrModuleClass: FlagSet = Final | ModuleClass // A module class or a final class diff --git a/compiler/src/dotty/tools/dotc/core/StdNames.scala b/compiler/src/dotty/tools/dotc/core/StdNames.scala index 7545cf5c4ba1..c0eb8a690eb4 100644 --- a/compiler/src/dotty/tools/dotc/core/StdNames.scala +++ b/compiler/src/dotty/tools/dotc/core/StdNames.scala @@ -455,6 +455,7 @@ object StdNames { val create: N = "create" val currentMirror: N = "currentMirror" val curried: N = "curried" + val deferred: N = "deferred" val definitions: N = "definitions" val delayedInit: N = "delayedInit" val delayedInitArg: N = "delayedInit$body" diff --git a/compiler/src/dotty/tools/dotc/transform/Erasure.scala b/compiler/src/dotty/tools/dotc/transform/Erasure.scala index 8bfbb90a0700..a25a2fcb5c6d 100644 --- a/compiler/src/dotty/tools/dotc/transform/Erasure.scala +++ b/compiler/src/dotty/tools/dotc/transform/Erasure.scala @@ -567,7 +567,13 @@ object Erasure { case Some(annot) => val message = annot.argumentConstant(0) match case Some(c) => - c.stringValue.toMessage + val addendum = tree match + case tree: RefTree + if tree.symbol == defn.Compiletime_deferred && tree.name != nme.deferred => + i".\nNote that `deferred` can only be used under its own name when implementing a given in a trait; `${tree.name}` is not accepted." 
+ case _ => + "" + (c.stringValue ++ addendum).toMessage case _ => em"""Reference to ${tree.symbol.showLocated} should not have survived, |it should have been processed and eliminated during expansion of an enclosing macro or term erasure.""" diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index bc19e97b85d8..5ac12ce1aa0c 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -924,10 +924,10 @@ trait Implicits: /** Search an implicit argument and report error if not found */ - def implicitArgTree(formal: Type, span: Span)(using Context): Tree = { + def implicitArgTree(formal: Type, span: Span, where: => String = "")(using Context): Tree = { val arg = inferImplicitArg(formal, span) if (arg.tpe.isInstanceOf[SearchFailureType]) - report.error(missingArgMsg(arg, formal, ""), ctx.source.atSpan(span)) + report.error(missingArgMsg(arg, formal, where), ctx.source.atSpan(span)) arg } diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index e48c2fdf5066..22a12ed0f468 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -1830,6 +1830,18 @@ class Namer { typer: Typer => case _ => WildcardType } + + // translate `given T = deferred` to an abstract given with HasDefault flag + if sym.is(Given) then + mdef.rhs match + case rhs: RefTree + if rhs.name == nme.deferred + && typedAheadExpr(rhs).symbol == defn.Compiletime_deferred + && sym.maybeOwner.is(Trait) => + sym.resetFlag(Final) + sym.setFlag(Deferred | HasDefault) + case _ => + val mbrTpe = paramFn(checkSimpleKinded(typedAheadType(mdef.tpt, tptProto)).tpe) if (ctx.explicitNulls && mdef.mods.is(JavaDefined)) JavaNullInterop.nullifyMember(sym, mbrTpe, mdef.mods.isAllOf(JavaEnumValue)) diff --git a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala index 7cd1d67e9aa5..266b69d029c1 100644 --- a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala @@ -552,7 +552,7 @@ object RefChecks { overrideError("is an extension method, cannot override a normal method") else if (other.is(ExtensionMethod) && !member.is(ExtensionMethod)) // (1.3) overrideError("is a normal method, cannot override an extension method") - else if !other.is(Deferred) + else if (!other.is(Deferred) || other.isAllOf(Given | HasDefault)) && !member.is(Deferred) && !other.name.is(DefaultGetterName) && !member.isAnyOverride diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index b90b742aa0ec..c467a4507730 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2649,12 +2649,17 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val ValDef(name, tpt, _) = vdef checkNonRootName(vdef.name, vdef.nameSpan) completeAnnotations(vdef, sym) - if (sym.isOneOf(GivenOrImplicit)) checkImplicitConversionDefOK(sym) + if sym.is(Implicit) then checkImplicitConversionDefOK(sym) if sym.is(Module) then checkNoModuleClash(sym) val tpt1 = checkSimpleKinded(typedType(tpt)) val rhs1 = vdef.rhs match { - case rhs @ Ident(nme.WILDCARD) => rhs withType tpt1.tpe - case rhs => typedExpr(rhs, tpt1.tpe.widenExpr) + case rhs @ Ident(nme.WILDCARD) => + rhs.withType(tpt1.tpe) + case rhs: RefTree + if rhs.name == 
nme.deferred && sym.isAllOf(DeferredGivenFlags, butNot = Param) => + EmptyTree + case rhs => + typedExpr(rhs, tpt1.tpe.widenExpr) } val vdef1 = assignType(cpy.ValDef(vdef)(name, tpt1, rhs1), sym) postProcessInfo(vdef1, sym) @@ -2715,9 +2720,13 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if sym.isInlineMethod then rhsCtx.addMode(Mode.InlineableBody) if sym.is(ExtensionMethod) then rhsCtx.addMode(Mode.InExtensionMethod) - val rhs1 = PrepareInlineable.dropInlineIfError(sym, - if sym.isScala2Macro then typedScala2MacroBody(ddef.rhs)(using rhsCtx) - else typedExpr(ddef.rhs, tpt1.tpe.widenExpr)(using rhsCtx)) + val rhs1 = ddef.rhs match + case Ident(nme.deferred) if sym.isAllOf(DeferredGivenFlags) => + EmptyTree + case rhs => + PrepareInlineable.dropInlineIfError(sym, + if sym.isScala2Macro then typedScala2MacroBody(ddef.rhs)(using rhsCtx) + else typedExpr(ddef.rhs, tpt1.tpe.widenExpr)(using rhsCtx)) if sym.isInlineMethod then if StagingLevel.level > 0 then @@ -2898,6 +2907,59 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case None => body + /** Implement givens that were declared with a `deferred` rhs. + * The a given value matching the declared type is searched in a + * context directly enclosing the current class, in which all given + * parameters of the current class are also defined. + */ + def implementDeferredGivens(body: List[Tree]): List[Tree] = + if cls.is(Trait) || ctx.isAfterTyper then body + else + def isGivenValue(mbr: TermRef) = + val dcl = mbr.symbol + if dcl.is(Method) then + report.error( + em"""Cannnot infer the implementation of the deferred ${dcl.showLocated} + |since that given is parameterized. An implementing given needs to be written explicitly.""", + cdef.srcPos) + false + else true + + def givenImpl(mbr: TermRef): ValDef = + val dcl = mbr.symbol + val target = dcl.info.asSeenFrom(cls.thisType, dcl.owner) + val constr = cls.primaryConstructor + val usingParamAccessors = cls.paramAccessors.filter(_.is(Given)) + val paramScope = newScopeWith(usingParamAccessors*) + val searchCtx = ctx.outer.fresh.setScope(paramScope) + val rhs = implicitArgTree(target, cdef.span, + where = i"inferring the implementation of the deferred ${dcl.showLocated}" + )(using searchCtx) + + val impl = dcl.copy(cls, + flags = dcl.flags &~ (HasDefault | Deferred) | Final | Override, + info = target, + coord = rhs.span).entered.asTerm + + def anchorParams = new TreeMap: + override def transform(tree: Tree)(using Context): Tree = tree match + case id: Ident if usingParamAccessors.contains(id.symbol) => + cpy.Select(id)(This(cls), id.name) + case _ => + super.transform(tree) + ValDef(impl, anchorParams.transform(rhs)) + end givenImpl + + val givenImpls = + cls.thisType.implicitMembers + //.showing(i"impl def givens for $cls/$result") + .filter(_.symbol.isAllOf(DeferredGivenFlags, butNot = Param)) + //.showing(i"impl def filtered givens for $cls/$result") + .filter(isGivenValue) + .map(givenImpl) + body ++ givenImpls + end implementDeferredGivens + ensureCorrectSuperClass() completeAnnotations(cdef, cls) val constr1 = typed(constr).asInstanceOf[DefDef] @@ -2919,9 +2981,10 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer else { val dummy = localDummy(cls, impl) val body1 = - addParentRefinements( - addAccessorDefs(cls, - typedStats(impl.body, dummy)(using ctx.inClassContext(self1.symbol))._1)) + implementDeferredGivens( + addParentRefinements( + addAccessorDefs(cls, + typedStats(impl.body, dummy)(using 
ctx.inClassContext(self1.symbol))._1))) checkNoDoubleDeclaration(cls) val impl1 = cpy.Template(impl)(constr1, parents1, Nil, self1, body1) diff --git a/compiler/test/dotc/pos-test-pickling.blacklist b/compiler/test/dotc/pos-test-pickling.blacklist index 3b14ce28569d..5c715faa504b 100644 --- a/compiler/test/dotc/pos-test-pickling.blacklist +++ b/compiler/test/dotc/pos-test-pickling.blacklist @@ -103,7 +103,7 @@ i13842.scala # Position change under captureChecking boxmap-paper.scala -# Function types print differnt after unpickling since test mispredicts Feature.preFundsEnabled +# Function types print different after unpickling since test mispredicts Feature.preFundsEnabled caps-universal.scala # GADT cast applied to singleton type difference @@ -128,6 +128,8 @@ i20053b.scala parsercombinators-givens.scala parsercombinators-givens-2.scala parsercombinators-arrow.scala +hylolib-deferred-given + diff --git a/library/src/scala/compiletime/package.scala b/library/src/scala/compiletime/package.scala index 3eca997554a0..be76941a680b 100644 --- a/library/src/scala/compiletime/package.scala +++ b/library/src/scala/compiletime/package.scala @@ -42,6 +42,19 @@ def erasedValue[T]: T = erasedValue[T] @compileTimeOnly("`uninitialized` can only be used as the right hand side of a mutable field definition") def uninitialized: Nothing = ??? +/** Used as the right hand side of a given in a trait, like this + * + * ``` + * given T = deferred + * ``` + * + * This signifies that the given will get a synthesized definition in all classes + * that implement the enclosing trait and that do not contain an explicit overriding + * definition of that given. + */ +@compileTimeOnly("`deferred` can only be used as the right hand side of a given definition in a trait") +def deferred: Nothing = ??? + /** The error method is used to produce user-defined compile errors during inline expansion. * If an inline expansion results in a call error(msgStr) the compiler produces an error message containing the given msgStr. * diff --git a/tests/neg/deferred-givens.check b/tests/neg/deferred-givens.check new file mode 100644 index 000000000000..cc15901d087f --- /dev/null +++ b/tests/neg/deferred-givens.check @@ -0,0 +1,13 @@ +-- [E172] Type Error: tests/neg/deferred-givens.scala:11:6 ------------------------------------------------------------- +11 |class B extends A // error + |^^^^^^^^^^^^^^^^^ + |No given instance of type Ctx was found for inferring the implementation of the deferred given instance ctx in trait A +-- [E172] Type Error: tests/neg/deferred-givens.scala:13:15 ------------------------------------------------------------ +13 |abstract class C extends A // error + |^^^^^^^^^^^^^^^^^^^^^^^^^^ + |No given instance of type Ctx was found for inferring the implementation of the deferred given instance ctx in trait A +-- Error: tests/neg/deferred-givens.scala:26:8 ------------------------------------------------------------------------- +26 | class E extends A2 // error, can't summon polymorphic given + | ^^^^^^^^^^^^^^^^^^ + | Cannnot infer the implementation of the deferred given instance given_Ctx3_T in trait A2 + | since that given is parameterized. An implementing given needs to be written explicitly. 
diff --git a/tests/neg/deferred-givens.scala b/tests/neg/deferred-givens.scala new file mode 100644 index 000000000000..7ff67d784714 --- /dev/null +++ b/tests/neg/deferred-givens.scala @@ -0,0 +1,30 @@ +//> using options -language:experimental.modularity -source future +import compiletime.deferred + +class Ctx +class Ctx2 + +trait A: + given Ctx as ctx = deferred + given Ctx2 = deferred + +class B extends A // error + +abstract class C extends A // error + +class D extends A: + given Ctx as ctx = Ctx() // ok, was implemented + given Ctx2 = Ctx2() // ok + +class Ctx3[T] + +trait A2: + given [T] => Ctx3[T] = deferred + +object O: + given [T] => Ctx3[T] = Ctx3[T]() + class E extends A2 // error, can't summon polymorphic given + +class E extends A2: + given [T] => Ctx3[T] = Ctx3[T]() // ok + diff --git a/tests/neg/deferredSummon.check b/tests/neg/deferredSummon.check new file mode 100644 index 000000000000..bd76ad73467e --- /dev/null +++ b/tests/neg/deferredSummon.check @@ -0,0 +1,17 @@ +-- Error: tests/neg/deferredSummon.scala:4:26 -------------------------------------------------------------------------- +4 | given Int = compiletime.deferred // error + | ^^^^^^^^^^^^^^^^^^^^ + | `deferred` can only be used as the right hand side of a given definition in a trait +-- Error: tests/neg/deferredSummon.scala:7:26 -------------------------------------------------------------------------- +7 | given Int = compiletime.deferred // error + | ^^^^^^^^^^^^^^^^^^^^ + | `deferred` can only be used as the right hand side of a given definition in a trait +-- Error: tests/neg/deferredSummon.scala:12:16 ------------------------------------------------------------------------- +12 | given Int = deferred // error + | ^^^^^^^^ + | `deferred` can only be used as the right hand side of a given definition in a trait +-- Error: tests/neg/deferredSummon.scala:16:14 ------------------------------------------------------------------------- +16 | given Int = defered // error + | ^^^^^^^ + |`deferred` can only be used as the right hand side of a given definition in a trait. + |Note that `deferred` can only be used under its own name when implementing a given in a trait; `defered` is not accepted. diff --git a/tests/neg/deferredSummon.scala b/tests/neg/deferredSummon.scala new file mode 100644 index 000000000000..cddde82535fb --- /dev/null +++ b/tests/neg/deferredSummon.scala @@ -0,0 +1,19 @@ +//> using options -language:experimental.modularity + +object Test: + given Int = compiletime.deferred // error + +abstract class C: + given Int = compiletime.deferred // error + +trait A: + import compiletime.deferred + locally: + given Int = deferred // error + +trait B: + import compiletime.deferred as defered + given Int = defered // error + + + diff --git a/tests/pos/deferred-givens.scala b/tests/pos/deferred-givens.scala new file mode 100644 index 000000000000..51fa43866d1e --- /dev/null +++ b/tests/pos/deferred-givens.scala @@ -0,0 +1,26 @@ +//> using options -language:experimental.modularity -source future +import compiletime.* +class Ord[Elem] + +given Ord[Double] + +trait B: + type Elem + given Ord[Elem] = deferred + def foo = summon[Ord[Elem]] + +class C extends B: + type Elem = String + override given Ord[Elem] = ??? 
+ +def bar(using Ord[String]) = 1 + +class D(using Ord[String]) extends B: + type Elem = String + +class E(using x: Ord[String]) extends B: + type Elem = String + override given Ord[Elem] = x + +class F[X: Ord] extends B: + type Elem = X diff --git a/tests/pos/deferredSummon.scala b/tests/pos/deferredSummon.scala new file mode 100644 index 000000000000..d12a98e52736 --- /dev/null +++ b/tests/pos/deferredSummon.scala @@ -0,0 +1,30 @@ +//> using options -language:experimental.modularity -source future +import compiletime.deferred + +trait Ord[Self]: + def less(x: Self, y: Self): Boolean + +trait A: + type Elem + given Ord[Elem] = deferred + def foo = summon[Ord[Elem]] + +object Inst: + given Ord[Int]: + def less(x: Int, y: Int) = x < y + +object Test: + import Inst.given + class C extends A: + type Elem = Int + object E extends A: + type Elem = Int + given A: + type Elem = Int + +class D[T: Ord] extends A: + type Elem = T + + + + diff --git a/tests/pos/hylolib-deferred-given-extract.scala b/tests/pos/hylolib-deferred-given-extract.scala new file mode 100644 index 000000000000..02d889dc9aac --- /dev/null +++ b/tests/pos/hylolib-deferred-given-extract.scala @@ -0,0 +1,19 @@ +//> using options -language:experimental.modularity -source future +package hylotest +import compiletime.deferred + +trait Value[Self] + +/** A collection of elements accessible by their position. */ +trait Collection[Self]: + + /** The type of the elements in the collection. */ + type Element + given elementIsValue: Value[Element] = compiletime.deferred + +class BitArray + +given Value[Boolean] {} + +given Collection[BitArray] with + type Element = Boolean diff --git a/tests/pos/hylolib-deferred-given/AnyCollection.scala b/tests/pos/hylolib-deferred-given/AnyCollection.scala new file mode 100644 index 000000000000..55e453d6dc87 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/AnyCollection.scala @@ -0,0 +1,69 @@ +package hylo + +/** A type-erased collection. + * + * A `AnyCollection` forwards its operations to a wrapped value, hiding its implementation. + */ +final class AnyCollection[Element] private ( + val _start: () => AnyValue, + val _end: () => AnyValue, + val _after: (AnyValue) => AnyValue, + val _at: (AnyValue) => Element +) + +object AnyCollection { + + /** Creates an instance forwarding its operations to `base`. */ + def apply[Base](using b: Collection[Base])(base: Base): AnyCollection[b.Element] = + // NOTE: This evidence is redefined so the compiler won't report ambiguity between `intIsValue` + // and `anyValueIsValue` when the method is called on a collection of `Int`s. None of these + // choices is even correct! Note also that the ambiguity is suppressed if the constructor of + // `AnyValue` is declared with a context bound rather than an implicit parameter. 
+ given Value[b.Position] = b.positionIsValue + + def start(): AnyValue = + AnyValue(base.startPosition) + + def end(): AnyValue = + AnyValue(base.endPosition) + + def after(p: AnyValue): AnyValue = + AnyValue(base.positionAfter(p.unsafelyUnwrappedAs[b.Position])) + + def at(p: AnyValue): b.Element = + base.at(p.unsafelyUnwrappedAs[b.Position]) + + new AnyCollection[b.Element]( + _start = start, + _end = end, + _after = after, + _at = at + ) + +} + +given anyCollectionIsCollection[T](using tIsValue: Value[T]): Collection[AnyCollection[T]] with { + + type Element = T + //given elementIsValue: Value[Element] = tIsValue + + type Position = AnyValue + given positionIsValue: Value[Position] = anyValueIsValue + + extension (self: AnyCollection[T]) { + + def startPosition = + self._start() + + def endPosition = + self._end() + + def positionAfter(p: Position) = + self._after(p) + + def at(p: Position) = + self._at(p) + + } + +} diff --git a/tests/pos/hylolib-deferred-given/AnyValue.scala b/tests/pos/hylolib-deferred-given/AnyValue.scala new file mode 100644 index 000000000000..b9d39869c09a --- /dev/null +++ b/tests/pos/hylolib-deferred-given/AnyValue.scala @@ -0,0 +1,76 @@ +package hylo + +/** A wrapper around an object providing a reference API. */ +private final class Ref[T](val value: T) { + + override def toString: String = + s"Ref($value)" + +} + +/** A type-erased value. + * + * An `AnyValue` forwards its operations to a wrapped value, hiding its implementation. + */ +final class AnyValue private ( + private val wrapped: AnyRef, + private val _copy: (AnyRef) => AnyValue, + private val _eq: (AnyRef, AnyRef) => Boolean, + private val _hashInto: (AnyRef, Hasher) => Hasher +) { + + /** Returns a copy of `this`. */ + def copy(): AnyValue = + _copy(this.wrapped) + + /** Returns `true` iff `this` and `other` have an equivalent value. */ + def eq(other: AnyValue): Boolean = + _eq(this.wrapped, other.wrapped) + + /** Hashes the salient parts of `this` into `hasher`. */ + def hashInto(hasher: Hasher): Hasher = + _hashInto(this.wrapped, hasher) + + /** Returns the value wrapped in `this` as an instance of `T`. */ + def unsafelyUnwrappedAs[T]: T = + wrapped.asInstanceOf[Ref[T]].value + + /** Returns a textual description of `this`. */ + override def toString: String = + wrapped.toString + +} + +object AnyValue { + + /** Creates an instance wrapping `wrapped`. */ + def apply[T](using Value[T])(wrapped: T): AnyValue = + def copy(a: AnyRef): AnyValue = + AnyValue(a.asInstanceOf[Ref[T]].value.copy()) + + def eq(a: AnyRef, b: AnyRef): Boolean = + a.asInstanceOf[Ref[T]].value `eq` b.asInstanceOf[Ref[T]].value + + def hashInto(a: AnyRef, hasher: Hasher): Hasher = + a.asInstanceOf[Ref[T]].value.hashInto(hasher) + + new AnyValue(Ref(wrapped), copy, eq, hashInto) + +} + +given anyValueIsValue: Value[AnyValue] with { + + extension (self: AnyValue) { + + def copy(): AnyValue = + self.copy() + + def eq(other: AnyValue): Boolean = + self `eq` other + + def hashInto(hasher: Hasher): Hasher = + self.hashInto(hasher) + + } + +} diff --git a/tests/pos/hylolib-deferred-given/BitArray.scala b/tests/pos/hylolib-deferred-given/BitArray.scala new file mode 100644 index 000000000000..485f30472847 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/BitArray.scala @@ -0,0 +1,375 @@ +package hylo + +import scala.collection.mutable + +/** An array of bit values represented as Booleans, where `true` indicates that the bit is on. 
*/ +final class BitArray private ( + private var _bits: HyArray[Int], + private var _count: Int +) { + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + _count == 0 + + /** Returns the number of elements in `this`. */ + def count: Int = + _count + + /** The number of bits that the array can contain before allocating new storage. */ + def capacity: Int = + _bits.capacity << 5 + + /** Reserves enough storage to store `n` elements in `this`. */ + def reserveCapacity(n: Int, assumeUniqueness: Boolean = false): BitArray = + if (n == 0) { + this + } else { + val k = 1 + ((n - 1) >> 5) + if (assumeUniqueness) { + _bits = _bits.reserveCapacity(k, assumeUniqueness) + this + } else { + new BitArray(_bits.reserveCapacity(k), _count) + } + } + + /** Adds a new element at the end of the array. */ + def append(bit: Boolean, assumeUniqueness: Boolean = false): BitArray = + val result = if assumeUniqueness && (count < capacity) then this else copy(count + 1) + val p = BitArray.Position(count) + if (p.bucket >= _bits.count) { + result._bits = _bits.append(if bit then 1 else 0) + } else { + result.setValue(bit, p) + } + result._count += 1 + result + + /** Removes and returns the last element, or returns `None` if the array is empty. */ + def popLast(assumeUniqueness: Boolean = false): (BitArray, Option[Boolean]) = + if (isEmpty) { + (this, None) + } else { + val result = if assumeUniqueness then this else copy() + val bit = result.at(BitArray.Position(count)) + result._count -= 1 + (result, Some(bit)) + } + + /** Removes all elements in the array, keeping allocated storage iff `keepStorage` is true. */ + def removeAll( + keepStorage: Boolean = false, + assumeUniqueness: Boolean = false + ): BitArray = + if (isEmpty) { + this + } else if (keepStorage) { + val result = if assumeUniqueness then this else copy() + result._bits.removeAll(keepStorage, assumeUniqueness = true) + result._count = 0 + result + } else { + BitArray() + } + + /** Returns `true` iff all elements in `this` are `false`. */ + def allFalse: Boolean = + if (isEmpty) { + true + } else { + val k = (count - 1) >> 5 + def loop(i: Int): Boolean = + if (i == k) { + val m = (1 << (count & 31)) - 1 + (_bits.at(k) & m) == 0 + } else if (_bits.at(i) != 0) { + false + } else { + loop(i + 1) + } + loop(0) + } + + /** Returns `true` iff all elements in `this` are `true`. */ + def allTrue: Boolean = + if (isEmpty) { + true + } else { + val k = (count - 1) >> 5 + def loop(i: Int): Boolean = + if (i == k) { + val m = (1 << (count & 31)) - 1 + (_bits.at(k) & m) == m + } else if (_bits.at(i) != ~0) { + false + } else { + loop(i + 1) + } + loop(0) + } + + /** Returns the bitwise OR of `this` and `other`. */ + def | (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ | _, assumeUniqueness = true) + + /** Returns the bitwise AND of `this` and `other`. */ + def & (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ & _, assumeUniqueness = true) + + /** Returns the bitwise XOR of `this` and `other`. */ + def ^ (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ ^ _, assumeUniqueness = true) + + /** Assigns each bits in `this` to the result of `operation` applied on those bits and their + * corresponding bits in `other`. + * + * @requires + * `self.count == other.count`. 
+ */ + private def applyBitwise( + other: BitArray, + operation: (Int, Int) => Int, + assumeUniqueness: Boolean = false + ): BitArray = + require(this.count == other.count) + if (isEmpty) { + this + } else { + val result = if assumeUniqueness then this else copy() + var u = assumeUniqueness + val k = (count - 1) >> 5 + + for (i <- 0 until k) { + result._bits = result._bits.modifyAt( + i, (n) => operation(n, other._bits.at(n)), + assumeUniqueness = u + ) + u = true + } + val m = (1 << (count & 31)) - 1 + result._bits = result._bits.modifyAt( + k, (n) => operation(n & m, other._bits.at(k) & m), + assumeUniqueness = u + ) + + result + } + + /** Returns the position of `this`'s first element', or `endPosition` if `this` is empty. + * + * @complexity + * O(1). + */ + def startPosition: BitArray.Position = + BitArray.Position(0) + + /** Returns the "past the end" position in `this`, that is, the position immediately after the + * last element in `this`. + * + * @complexity + * O(1). + */ + def endPosition: BitArray.Position = + BitArray.Position(count) + + /** Returns the position immediately after `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def positionAfter(p: BitArray.Position): BitArray.Position = + if (p.offsetInBucket == 63) { + BitArray.Position(p.bucket + 1, 0) + } else { + BitArray.Position(p.bucket, p.offsetInBucket + 1) + } + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: BitArray.Position): Boolean = + val m = 1 << p.offsetInBucket + val b: Int = _bits.at(p.bucket) + (b & m) == m + + /** Accesses the `i`-th element of `this`. + * + * @requires + * `i` is greater than or equal to 0, and less than `count`. + * @complexity + * O(1). + */ + def atIndex(i: Int): Boolean = + at(BitArray.Position(i)) + + /** Calls `transform` on the element at `p` to update its value. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def modifyAt( + p: BitArray.Position, + transform: (Boolean) => Boolean, + assumeUniqueness: Boolean = false + ): BitArray = + val result = if assumeUniqueness then this else copy() + result.setValue(transform(result.at(p)), p) + result + + /** Calls `transform` on `i`-th element of `this` to update its value. + * + * @requires + * `i` is greater than or equal to 0, and less than `count`. + * @complexity + * O(1). + */ + def modifyAtIndex( + i: Int, + transform: (Boolean) => Boolean, + assumeUniqueness: Boolean = false + ): BitArray = + modifyAt(BitArray.Position(i), transform, assumeUniqueness) + + /** Returns an independent copy of `this`. */ + def copy(minimumCapacity: Int = 0): BitArray = + if (minimumCapacity > capacity) { + // If the requested capacity on the copy is greater than what we have, `reserveCapacity` will + // create an independent value. + reserveCapacity(minimumCapacity) + } else { + val k = 1 + ((minimumCapacity - 1) >> 5) + val newBits = _bits.copy(k) + new BitArray(newBits, _count) + } + + /** Returns a textual description of `this`. */ + override def toString: String = + _bits.toString + + /** Sets the value `b` for the bit at position `p`. + * + * @requires + * `this` is uniquely referenced and `p` is a valid position in `this`. 
+ */ + private def setValue(b: Boolean, p: BitArray.Position): Unit = + val m = 1 << p.offsetInBucket + _bits = _bits.modifyAt( + p.bucket, + (e) => if b then e | m else e & ~m, + assumeUniqueness = true + ) + +} + +object BitArray { + + /** A position in a `BitArray`. + * + * @param bucket + * The bucket containing `this`. + * @param offsetInBucket + * The offset of `this` in its containing bucket. + */ + final class Position( + private[BitArray] val bucket: Int, + private[BitArray] val offsetInBucket: Int + ) { + + /** Creates a position from an index. */ + private[BitArray] def this(index: Int) = + this(index >> 5, index & 31) + + /** Returns the index corresponding to this position. */ + private def index: Int = + (bucket >> 5) + offsetInBucket + + /** Returns a copy of `this`. */ + def copy(): Position = + new Position(bucket, offsetInBucket) + + /** Returns `true` iff `this` and `other` have an equivalent value. */ + def eq(other: Position): Boolean = + (this.bucket == other.bucket) && (this.offsetInBucket == other.offsetInBucket) + + /** Hashes the salient parts of `self` into `hasher`. */ + def hashInto(hasher: Hasher): Hasher = + hasher.combine(bucket) + hasher.combine(offsetInBucket) + + } + + /** Creates an array with the given `bits`. */ + def apply[T](bits: Boolean*): BitArray = + var result = new BitArray(HyArray[Int](), 0) + for (b <- bits) result = result.append(b, assumeUniqueness = true) + result + +} + +given bitArrayPositionIsValue: Value[BitArray.Position] with { + + extension (self: BitArray.Position) { + + def copy(): BitArray.Position = + self.copy() + + def eq(other: BitArray.Position): Boolean = + self.eq(other) + + def hashInto(hasher: Hasher): Hasher = + self.hashInto(hasher) + + } + +} + +given bitArrayIsCollection: Collection[BitArray] with { + + type Element = Boolean + //given elementIsValue: Value[Boolean] = booleanIsValue + + type Position = BitArray.Position + given positionIsValue: Value[BitArray.Position] = bitArrayPositionIsValue + + extension (self: BitArray) { + + override def count: Int = + self.count + + def startPosition: BitArray.Position = + self.startPosition + + def endPosition: BitArray.Position = + self.endPosition + + def positionAfter(p: BitArray.Position): BitArray.Position = + self.positionAfter(p) + + def at(p: BitArray.Position): Boolean = + self.at(p) + + } + +} + +given bitArrayIsStringConvertible: StringConvertible[BitArray] with { + + extension (self: BitArray) + override def description: String = + var contents = mutable.StringBuilder() + self.forEach((e) => { contents += (if e then '1' else '0'); true }) + contents.mkString + +} diff --git a/tests/pos/hylolib-deferred-given/Collection.scala b/tests/pos/hylolib-deferred-given/Collection.scala new file mode 100644 index 000000000000..6b5e7a762dc8 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/Collection.scala @@ -0,0 +1,281 @@ +//> using options -language:experimental.modularity -source future +package hylo + +/** A collection of elements accessible by their position. */ +trait Collection[Self] { + + /** The type of the elements in the collection. */ + type Element + given elementIsValue: Value[Element] = compiletime.deferred + + /** The type of a position in the collection. */ + type Position + given positionIsValue: Value[Position] + + extension (self: Self) { + + /** Returns `true` iff `self` is empty. */ + def isEmpty: Boolean = + startPosition `eq` endPosition + + /** Returns the number of elements in `self`. 
+ * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def count: Int = + val e = endPosition + def _count(p: Position, n: Int): Int = + if p `eq` e then n else _count(self.positionAfter(p), n + 1) + _count(startPosition, 0) + + /** Returns the position of `self`'s first element', or `endPosition` if `self` is empty. + * + * @complexity + * O(1) + */ + def startPosition: Position + + /** Returns the "past the end" position in `self`, that is, the position immediately after the + * last element in `self`. + * + * @complexity + * O(1). + */ + def endPosition: Position + + /** Returns the position immediately after `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def positionAfter(p: Position): Position + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: Position): Element + + /** Returns `true` iff `i` precedes `j`. + * + * @requires + * `i` and j` are valid positions in `self`. + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def isBefore(i: Position, j: Position): Boolean = + val e = self.endPosition + if (i.eq(e)) { + false + } else if (j.eq(e)) { + true + } else { + def _isBefore(n: Position): Boolean = + if (n.eq(j)) { + true + } else if (n.eq(e)) { + false + } else { + _isBefore(self.positionAfter(n)) + } + _isBefore(self.positionAfter(i)) + } + + } + +} + +extension [Self](self: Self)(using s: Collection[Self]) { + + /** Returns the first element of `self` along with a slice containing the suffix after this + * element, or `None` if `self` is empty. + * + * @complexity + * O(1) + */ + def headAndTail: Option[(s.Element, Slice[Self])] = + if (self.isEmpty) { + None + } else { + val p = self.startPosition + val q = self.positionAfter(p) + val t = Slice(self, Range(q, self.endPosition, (a, b) => (a `eq` b) || self.isBefore(a, b))) + Some((self.at(p), t)) + } + + /** Applies `combine` on `partialResult` and each element of `self`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def reduce[T](partialResult: T, combine: (T, s.Element) => T): T = + val e = self.endPosition + def loop(p: s.Position, r: T): T = + if (p.eq(e)) { + r + } else { + loop(self.positionAfter(p), combine(r, self.at(p))) + } + loop(self.startPosition, partialResult) + + /** Applies `action` on each element of `self`, in order, until `action` returns `false`, and + * returns `false` iff `action` did. + * + * You can return `false` from `action` to emulate a `continue` statement as found in traditional + * imperative languages (e.g., C). + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def forEach(action: (s.Element) => Boolean): Boolean = + val e = self.endPosition + def loop(p: s.Position): Boolean = + if (p.eq(e)) { + true + } else if (!action(self.at(p))) { + false + } else { + loop(self.positionAfter(p)) + } + loop(self.startPosition) + + /** Returns a collection with the elements of `self` transformed by `transform`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def map[T](using Value[T])(transform: (s.Element) => T): HyArray[T] = + self.reduce( + HyArray[T](), + (r, e) => r.append(transform(e), assumeUniqueness = true) + ) + + /** Returns a collection with the elements of `self` satisfying `isInclude`, in order. 
+ * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def filter(isIncluded: (s.Element) => Boolean): HyArray[s.Element] = + self.reduce( + HyArray[s.Element](), + (r, e) => if (isIncluded(e)) then r.append(e, assumeUniqueness = true) else r + ) + + /** Returns `true` if `self` contains an element satisfying `predicate`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def containsWhere(predicate: (s.Element) => Boolean): Boolean = + self.firstPositionWhere(predicate) != None + + /** Returns `true` if all elements in `self` satisfy `predicate`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def allSatisfy(predicate: (s.Element) => Boolean): Boolean = + self.firstPositionWhere(predicate) == None + + /** Returns the position of the first element of `self` satisfying `predicate`, or `None` if no + * such element exists. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def firstPositionWhere(predicate: (s.Element) => Boolean): Option[s.Position] = + val e = self.endPosition + def loop(p: s.Position): Option[s.Position] = + if (p.eq(e)) { + None + } else if (predicate(self.at(p))) { + Some(p) + } else { + loop(self.positionAfter(p)) + } + loop(self.startPosition) + + /** Returns the minimum element in `self`, using `isLessThan` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def minElement(isLessThan: (s.Element, s.Element) => Boolean): Option[s.Element] = + self.leastElement(isLessThan) + + // NOTE: I can't find a reasonable way to call this method. + /** Returns the minimum element in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def minElement()(using Comparable[s.Element]): Option[s.Element] = + self.minElement(isLessThan = _ `lt` _) + + /** Returns the maximum element in `self`, using `isGreaterThan` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def maxElement(isGreaterThan: (s.Element, s.Element) => Boolean): Option[s.Element] = + self.leastElement(isGreaterThan) + + /** Returns the maximum element in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def maxElement()(using Comparable[s.Element]): Option[s.Element] = + self.maxElement(isGreaterThan = _ `gt` _) + + /** Returns the maximum element in `self`, using `isOrderedBefore` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def leastElement(isOrderedBefore: (s.Element, s.Element) => Boolean): Option[s.Element] = + if (self.isEmpty) { + None + } else { + val e = self.endPosition + def _least(p: s.Position, least: s.Element): s.Element = + if (p.eq(e)) { + least + } else { + val x = self.at(p) + val y = if isOrderedBefore(x, least) then x else least + _least(self.positionAfter(p), y) + } + + val b = self.startPosition + Some(_least(self.positionAfter(b), self.at(b))) + } + +} + +extension [Self](self: Self)(using + s: Collection[Self], + e: Value[s.Element] +) { + + /** Returns `true` if `self` contains the same elements as `other`, in the same order. 
*/ + def elementsEqual[T](using o: Collection[T] { type Element = s.Element })(other: T): Boolean = + def loop(i: s.Position, j: o.Position): Boolean = + if (i `eq` self.endPosition) { + j `eq` other.endPosition + } else if (j `eq` other.endPosition) { + false + } else if (self.at(i) `neq` other.at(j)) { + false + } else { + loop(self.positionAfter(i), other.positionAfter(j)) + } + loop(self.startPosition, other.startPosition) + +} diff --git a/tests/pos/hylolib-deferred-given/CoreTraits.scala b/tests/pos/hylolib-deferred-given/CoreTraits.scala new file mode 100644 index 000000000000..01b2c5242af9 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/CoreTraits.scala @@ -0,0 +1,57 @@ +package hylo + +/** A type whose instance can be treated as independent values. + * + * The data structure of and algorithms of Hylo's standard library operate "notional values" rather + * than arbitrary references. This trait defines the basis operations of all values. + */ +trait Value[Self] { + + extension (self: Self) { + + /** Returns a copy of `self`. */ + def copy(): Self + + /** Returns `true` iff `self` and `other` have an equivalent value. */ + def eq(other: Self): Boolean + + /** Hashes the salient parts of `self` into `hasher`. */ + def hashInto(hasher: Hasher): Hasher + + } + +} + +extension [Self: Value](self: Self) def neq(other: Self): Boolean = !self.eq(other) + +// ---------------------------------------------------------------------------- +// Comparable +// ---------------------------------------------------------------------------- + +trait Comparable[Self] extends Value[Self] { + + extension (self: Self) { + + /** Returns `true` iff `self` is ordered before `other`. */ + def lt(other: Self): Boolean + + /** Returns `true` iff `self` is ordered after `other`. */ + def gt(other: Self): Boolean = other.lt(self) + + /** Returns `true` iff `self` is equal to or ordered before `other`. */ + def le(other: Self): Boolean = !other.lt(self) + + /** Returns `true` iff `self` is equal to or ordered after `other`. */ + def ge(other: Self): Boolean = !self.lt(other) + + } + +} + +/** Returns the lesser of `x` and `y`. */ +def min[T: Comparable](x: T, y: T): T = + if y.lt(x) then y else x + +/** Returns the greater of `x` and `y`. */ +def max[T: Comparable](x: T, y: T): T = + if x.lt(y) then y else x diff --git a/tests/pos/hylolib-deferred-given/Hasher.scala b/tests/pos/hylolib-deferred-given/Hasher.scala new file mode 100644 index 000000000000..ef6813df6b60 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/Hasher.scala @@ -0,0 +1,38 @@ +package hylo + +import scala.util.Random + +/** A universal hash function. */ +final class Hasher private (private val hash: Int = Hasher.offsetBasis) { + + /** Returns the computed hash value. */ + def finalizeHash(): Int = + hash + + /** Adds `n` to the computed hash value. */ + def combine(n: Int): Hasher = + var h = hash + h = h ^ n + h = h * Hasher.prime + new Hasher(h) +} + +object Hasher { + + private val offsetBasis = 0x811c9dc5 + private val prime = 0x01000193 + + /** A random seed ensuring different hashes across multiple runs. */ + private lazy val seed = scala.util.Random.nextInt() + + /** Creates an instance with the given `seed`. */ + def apply(): Hasher = + val h = new Hasher() + h.combine(seed) + h + + /** Returns the hash of `v`. 
*/ + def hash[T: Value](v: T): Int = + v.hashInto(Hasher()).finalizeHash() + +} diff --git a/tests/pos/hylolib-deferred-given/HyArray.scala b/tests/pos/hylolib-deferred-given/HyArray.scala new file mode 100644 index 000000000000..98632dcb65bc --- /dev/null +++ b/tests/pos/hylolib-deferred-given/HyArray.scala @@ -0,0 +1,224 @@ +package hylo + +import java.util.Arrays +import scala.collection.mutable + +/** An ordered, random-access collection. */ +final class HyArray[Element] private (using + elementIsValue: Value[Element] +)( + private var _storage: scala.Array[AnyRef | Null] | Null, + private var _count: Int // NOTE: where do I document private fields +) { + + // NOTE: The fact that we need Array[AnyRef] is diappointing and difficult to discover + // The compiler error sent me on a wild goose chase with ClassTag. + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + _count == 0 + + /** Returns the number of elements in `this`. */ + def count: Int = + _count + + /** Returns the number of elements that `this` can contain before allocating new storage. */ + def capacity: Int = + if _storage == null then 0 else _storage.length + + /** Reserves enough storage to store `n` elements in `this`. */ + def reserveCapacity(n: Int, assumeUniqueness: Boolean = false): HyArray[Element] = + if (n <= capacity) { + this + } else { + var newCapacity = max(1, capacity) + while (newCapacity < n) { newCapacity = newCapacity << 1 } + + val newStorage = new scala.Array[AnyRef | Null](newCapacity) + val s = _storage.asInstanceOf[scala.Array[AnyRef | Null]] + var i = 0 + while (i < count) { + newStorage(i) = _storage(i).asInstanceOf[Element].copy().asInstanceOf[AnyRef] + i += 1 + } + + if (assumeUniqueness) { + _storage = newStorage + this + } else { + new HyArray(newStorage, count) + } + } + + /** Adds a new element at the end of the array. */ + def append(source: Element, assumeUniqueness: Boolean = false): HyArray[Element] = + val result = if assumeUniqueness && (count < capacity) then this else copy(count + 1) + result._storage(count) = source.asInstanceOf[AnyRef] + result._count += 1 + result + + // NOTE: Can't refine `C.Element` without renaming the generic parameter of `HyArray`. + // /** Adds the contents of `source` at the end of the array. */ + // def appendContents[C](using + // s: Collection[C] + // )( + // source: C { type Element = Element }, + // assumeUniqueness: Boolean = false + // ): HyArray[Element] = + // val result = if (assumeUniqueness) { this } else { copy(count + source.count) } + // source.reduce(result, (r, e) => r.append(e, assumeUniqueness = true)) + + /** Removes and returns the last element, or returns `None` if the array is empty. */ + def popLast(assumeUniqueness: Boolean = false): (HyArray[Element], Option[Element]) = + if (isEmpty) { + (this, None) + } else { + val result = if assumeUniqueness then this else copy() + result._count -= 1 + (result, Some(result._storage(result._count).asInstanceOf[Element])) + } + + /** Removes all elements in the array, keeping allocated storage iff `keepStorage` is true. */ + def removeAll( + keepStorage: Boolean = false, + assumeUniqueness: Boolean = false + ): HyArray[Element] = + if (isEmpty) { + this + } else if (keepStorage) { + val result = if assumeUniqueness then this else copy() + Arrays.fill(result._storage, null) + result._count = 0 + result + } else { + HyArray() + } + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). 
+ */ + def at(p: Int): Element = + _storage(p).asInstanceOf[Element] + + /** Calls `transform` on the element at `p` to update its value. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def modifyAt( + p: Int, + transform: (Element) => Element, + assumeUniqueness: Boolean = false + ): HyArray[Element] = + val result = if assumeUniqueness then this else copy() + result._storage(p) = transform(at(p)).asInstanceOf[AnyRef] + result + + /** Returns a textual description of `this`. */ + override def toString: String = + var s = "[" + var i = 0 + while (i < count) { + if (i > 0) { s += ", " } + s += s"${at(i)}" + i += 1 + } + s + "]" + + /** Returns an independent copy of `this`, capable of storing `minimumCapacity` elements before + * allocating new storage. + */ + def copy(minimumCapacity: Int = 0): HyArray[Element] = + if (minimumCapacity > capacity) { + // If the requested capacity on the copy is greater than what we have, `reserveCapacity` will + // create an independent value. + reserveCapacity(minimumCapacity) + } else { + val clone = HyArray[Element]().reserveCapacity(max(minimumCapacity, count)) + var i = 0 + while (i < count) { + clone._storage(i) = _storage(i).asInstanceOf[Element].copy().asInstanceOf[AnyRef] + i += 1 + } + clone._count = count + clone + } + +} + +object HyArray { + + /** Creates an array with the given `elements`. */ + def apply[T](using t: Value[T])(elements: T*): HyArray[T] = + var a = new HyArray[T](null, 0) + for (e <- elements) a = a.append(e, assumeUniqueness = true) + a + +} + +given hyArrayIsValue[T](using tIsValue: Value[T]): Value[HyArray[T]] with { + + extension (self: HyArray[T]) { + + def copy(): HyArray[T] = + self.copy() + + def eq(other: HyArray[T]): Boolean = + self.elementsEqual(other) + + def hashInto(hasher: Hasher): Hasher = + self.reduce(hasher, (h, e) => e.hashInto(h)) + + } + +} + +given hyArrayIsCollection[T](using tIsValue: Value[T]): Collection[HyArray[T]] with { + + type Element = T + //given elementIsValue: Value[T] = tIsValue + + type Position = Int + given positionIsValue: Value[Int] = intIsValue + + extension (self: HyArray[T]) { + + // NOTE: Having to explicitly override means that primary declaration can't automatically + // specialize trait requirements. + override def isEmpty: Boolean = self.isEmpty + + override def count: Int = self.count + + def startPosition = 0 + + def endPosition = self.count + + def positionAfter(p: Int) = p + 1 + + def at(p: Int) = self.at(p) + + } + +} + +// NOTE: This should work. +// given hyArrayIsStringConvertible[T](using +// tIsValue: Value[T], +// tIsStringConvertible: StringConvertible[T] +// ): StringConvertible[HyArray[T]] with { +// +// given Collection[HyArray[T]] = hyArrayIsCollection[T] +// +// extension (self: HyArray[T]) +// override def description: String = +// var contents = mutable.StringBuilder() +// self.forEach((e) => { contents ++= e.description; true }) +// s"[${contents.mkString(", ")}]" +// +// } diff --git a/tests/pos/hylolib-deferred-given/Integers.scala b/tests/pos/hylolib-deferred-given/Integers.scala new file mode 100644 index 000000000000..b9bc203a88ea --- /dev/null +++ b/tests/pos/hylolib-deferred-given/Integers.scala @@ -0,0 +1,58 @@ +package hylo + +given booleanIsValue: Value[Boolean] with { + + extension (self: Boolean) { + + def copy(): Boolean = + // Note: Scala's `Boolean` has value semantics already. 
+ self + + def eq(other: Boolean): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(if self then 1 else 0) + + } + +} + +given intIsValue: Value[Int] with { + + extension (self: Int) { + + def copy(): Int = + // Note: Scala's `Int` has value semantics already. + self + + def eq(other: Int): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(self) + + } + +} + +given intIsComparable: Comparable[Int] with { + + extension (self: Int) { + + def copy(): Int = + self + + def eq(other: Int): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(self) + + def lt(other: Int): Boolean = self < other + + } + +} + +given intIsStringConvertible: StringConvertible[Int] with {} diff --git a/tests/pos/hylolib-deferred-given/Range.scala b/tests/pos/hylolib-deferred-given/Range.scala new file mode 100644 index 000000000000..1f597652ead1 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/Range.scala @@ -0,0 +1,37 @@ +package hylo + +/** A half-open interval from a lower bound up to, but not including, an uppor bound. */ +final class Range[Bound] private (val lowerBound: Bound, val upperBound: Bound) { + + /** Returns a textual description of `this`. */ + override def toString: String = + s"[${lowerBound}, ${upperBound})" + +} + +object Range { + + /** Creates a half-open interval [`lowerBound`, `upperBound`), using `isLessThanOrEqual` to ensure + * that the bounds are well-formed. + * + * @requires + * `lowerBound` is lesser than or equal to `upperBound`. + */ + def apply[Bound]( + lowerBound: Bound, + upperBound: Bound, + isLessThanOrEqual: (Bound, Bound) => Boolean + ) = + require(isLessThanOrEqual(lowerBound, upperBound)) + new Range(lowerBound, upperBound) + + /** Creates a half-open interval [`lowerBound`, `upperBound`). + * + * @requires + * `lowerBound` is lesser than or equal to `upperBound`. + */ + def apply[Bound](lowerBound: Bound, upperBound: Bound)(using Comparable[Bound]) = + require(lowerBound `le` upperBound) + new Range(lowerBound, upperBound) + +} diff --git a/tests/pos/hylolib-deferred-given/Slice.scala b/tests/pos/hylolib-deferred-given/Slice.scala new file mode 100644 index 000000000000..57cdb38f6e53 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/Slice.scala @@ -0,0 +1,49 @@ +package hylo + +/** A view into a collection. */ +final class Slice[Base](using + val b: Collection[Base] +)( + val base: Base, + val bounds: Range[b.Position] +) { + + /** Returns `true` iff `this` is empty. 
*/ + def isEmpty: Boolean = + bounds.lowerBound.eq(bounds.upperBound) + + def startPosition: b.Position = + bounds.lowerBound + + def endPosition: b.Position = + bounds.upperBound + + def positionAfter(p: b.Position): b.Position = + base.positionAfter(p) + + def at(p: b.Position): b.Element = + base.at(p) + +} + +given sliceIsCollection[T](using c: Collection[T]): Collection[Slice[T]] with { + + type Element = c.Element + //given elementIsValue: Value[Element] = c.elementIsValue + + type Position = c.Position + given positionIsValue: Value[Position] = c.positionIsValue + + extension (self: Slice[T]) { + + def startPosition = self.bounds.lowerBound.asInstanceOf[Position] // NOTE: Ugly hack + + def endPosition = self.bounds.upperBound.asInstanceOf[Position] + + def positionAfter(p: Position) = self.base.positionAfter(p) + + def at(p: Position) = self.base.at(p) + + } + +} diff --git a/tests/pos/hylolib-deferred-given/StringConvertible.scala b/tests/pos/hylolib-deferred-given/StringConvertible.scala new file mode 100644 index 000000000000..0702f79f2794 --- /dev/null +++ b/tests/pos/hylolib-deferred-given/StringConvertible.scala @@ -0,0 +1,14 @@ +package hylo + +/** A type whose instances can be described by a character string. */ +trait StringConvertible[Self] { + + extension (self: Self) { + + /** Returns a textual description of `self`. */ + def description: String = + self.toString + + } + +} From 6016ce99a66aac5e6ff6da801f3c01743ed35cf2 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 17:06:18 +0200 Subject: [PATCH 217/371] FIX: Allow ContextBoundParamNames to be unmangled. Also, fix the unmangling of UniqueExtNames, which seemingly never worked. [Cherry-picked 600293ee2a74e945ad8870b9034b416e2294c0e6] --- .../src/dotty/tools/dotc/core/NameKinds.scala | 37 +++++++++---------- 1 file changed, 17 insertions(+), 20 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/NameKinds.scala b/compiler/src/dotty/tools/dotc/core/NameKinds.scala index d4f009cbbbd5..74d440562824 100644 --- a/compiler/src/dotty/tools/dotc/core/NameKinds.scala +++ b/compiler/src/dotty/tools/dotc/core/NameKinds.scala @@ -182,13 +182,13 @@ object NameKinds { case DerivedName(underlying, info: this.NumberedInfo) => Some((underlying, info.num)) case _ => None } - protected def skipSeparatorAndNum(name: SimpleName, separator: String): Int = { + protected def skipSeparatorAndNum(name: SimpleName, separator: String): Int = var i = name.length - while (i > 0 && name(i - 1).isDigit) i -= 1 - if (i > separator.length && i < name.length && - name.slice(i - separator.length, i).toString == separator) i + while i > 0 && name(i - 1).isDigit do i -= 1 + if i >= separator.length && i < name.length + && name.slice(i - separator.length, i).toString == separator + then i else -1 - } numberedNameKinds(tag) = this: @unchecked } @@ -240,6 +240,16 @@ object NameKinds { } } + /** Unique names that can be unmangled */ + class UniqueNameKindWithUnmangle(separator: String) extends UniqueNameKind(separator): + override def unmangle(name: SimpleName): TermName = + val i = skipSeparatorAndNum(name, separator) + if i > 0 then + val index = name.drop(i).toString.toInt + val original = name.take(i - separator.length).asTermName + apply(original, index) + else name + /** Names of the form `prefix . name` */ val QualifiedName: QualifiedNameKind = new QualifiedNameKind(QUALIFIED, ".") @@ -288,7 +298,7 @@ object NameKinds { * * The "evidence$" prefix is a convention copied from Scala 2. 
*/ - val ContextBoundParamName: UniqueNameKind = new UniqueNameKind("evidence$") + val ContextBoundParamName: UniqueNameKind = new UniqueNameKindWithUnmangle("evidence$") /** The name of an inferred contextual function parameter: * @@ -323,20 +333,7 @@ object NameKinds { val InlineBinderName: UniqueNameKind = new UniqueNameKind("$proxy") val MacroNames: UniqueNameKind = new UniqueNameKind("$macro$") - /** A kind of unique extension methods; Unlike other unique names, these can be - * unmangled. - */ - val UniqueExtMethName: UniqueNameKind = new UniqueNameKind("$extension") { - override def unmangle(name: SimpleName): TermName = { - val i = skipSeparatorAndNum(name, separator) - if (i > 0) { - val index = name.drop(i).toString.toInt - val original = name.take(i - separator.length).asTermName - apply(original, index) - } - else name - } - } + val UniqueExtMethName: UniqueNameKind = new UniqueNameKindWithUnmangle("$extension") /** Kinds of unique names generated by the pattern matcher */ val PatMatStdBinderName: UniqueNameKind = new UniqueNameKind("x") From 81679fabee21c6777099021b125afe5f77f7709d Mon Sep 17 00:00:00 2001 From: odersky Date: Thu, 21 Dec 2023 11:32:24 +0100 Subject: [PATCH 218/371] Change rules for given prioritization Consider the following program: ```scala class A class B extends A class C extends A given A = A() given B = B() given C = C() def f(using a: A, b: B, c: C) = println(a.getClass) println(b.getClass) println(c.getClass) @main def Test = f ``` With the current rules, this would fail with an ambiguity error between B and C when trying to synthesize the A parameter. This is a problem without an easy remedy. We can fix this problem by flipping the priority for implicit arguments. Instead of requiring an argument to be most _specific_, we now require it to be most _general_ while still conforming to the formal parameter. There are three justifications for this change, which at first glance seems quite drastic: - It gives us a natural way to deal with inheritance triangles like the one in the code above. Such triangles are quite common. - Intuitively, we want to get the closest possible match between required formal parameter type and synthetisized argument. The "most general" rule provides that. - We already do a crucial part of this. Namely, with current rules we interpolate all type variables in an implicit argument downwards, no matter what their variance is. This makes no sense in theory, but solves hairy problems with contravariant typeclasses like `Comparable`. Instead of this hack, we now do something more principled, by flipping the direction everywhere, preferring general over specific, instead of just flipping contravariant type parameters. The behavior is dependent on the Scala version - Old behavior: up to 3.4 - New behavior: from 3.5, 3.5-migration warns on behavior change The CB builds under the new rules. One fix was needed for a shapeless 3 deriving test. There was a typo: mkInstances instead of mkProductInstances, which previously got healed by accident because of the most specific rule. Also: Don't flip contravariant type arguments for overloading resolution Flipping contravariant type arguments was needed for implicit search where it will be replaced by a more general scheme. But it makes no sense for overloading resolution. For overloading resolution, we want to pick the most specific alternative, analogous to us picking the most specific instantiation when we force a fully defined type. 
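To make the intended behavior concrete, here is a sketch of the opening
example together with an overload, annotated with the outcomes described
above. The annotations only restate this message's description (they are not
captured compiler output), and the `Probe` entry point and `g` overloads are
purely illustrative additions.

```scala
class A
class B extends A
class C extends A

given A = A()
given B = B()
given C = C()

def f(using a: A, b: B, c: C) =
  println(a.getClass)
  println(b.getClass)
  println(c.getClass)

// Overloading resolution is unchanged: the most specific applicable
// alternative still wins, so `g(B())` picks the second overload.
def g(x: A): String = "g(A)"
def g(x: B): String = "g(B)"

@main def Probe =
  // Implicit arguments from -source 3.5: the most general given conforming
  // to each formal parameter is preferred, so the `A` parameter of `f` is
  // bound to `given A`, while `b` and `c` take the only givens conforming
  // to `B` and `C`. Up to -source 3.4, synthesizing the `A` argument was
  // instead an ambiguity error between the `B` and `C` givens.
  f
  println(g(B()))
```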
Also: Disable implicit search everywhere for disambiaguation Previously, one disambiguation step missed that, whereas implicits were turned off everywhere else. [Cherry-picked 48000ee3f578201279094c7d76152a9fbf0992cc] --- compiler/src/dotty/tools/dotc/typer/Applications.scala | 2 +- compiler/src/dotty/tools/dotc/typer/Implicits.scala | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index 76d057f15408..63e86e3a321d 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -1886,7 +1886,7 @@ trait Applications extends Compatibility { then // Intermediate rules: better means specialize, but map all type arguments downwards // These are enabled for 3.0-3.5, and for all comparisons between old-style implicits, - // and in 3.5 amd 3.6-migration when we compare with previous rules. + // and in 3.5 and 3.6-migration when we compare with previous rules. val flip = new TypeMap: def apply(t: Type) = t match case t @ AppliedType(tycon, args) => diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 5ac12ce1aa0c..fd22f0ec5529 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -531,7 +531,7 @@ object Implicits: |must be more specific than $target""" :: Nil override def msg(using Context) = - super.msg.append("\nThe expected type $target is not specific enough, so no search was attempted") + super.msg.append(i"\nThe expected type $target is not specific enough, so no search was attempted") override def toString = s"TooUnspecific" end TooUnspecific From 555f67c800af263b0528cdc410f32b47bab9b7e3 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 15:54:37 +0200 Subject: [PATCH 219/371] Allow context bounds in type declarations Expand them to deferred givens [Cherry-picked d923cac0f70b357d75721daf0cf316b4393f2beb] --- .../src/dotty/tools/dotc/ast/Desugar.scala | 15 +- .../dotty/tools/dotc/parsing/Parsers.scala | 53 +-- .../test/dotc/pos-test-pickling.blacklist | 1 + docs/_docs/internals/syntax.md | 2 +- tests/pos/deferredSummon.scala | 21 +- tests/pos/dep-context-bounds.scala | 10 + tests/pos/hylolib-cb-extract.scala | 18 + tests/pos/hylolib-cb/AnyCollection.scala | 66 ++++ tests/pos/hylolib-cb/AnyValue.scala | 76 ++++ tests/pos/hylolib-cb/BitArray.scala | 372 ++++++++++++++++++ tests/pos/hylolib-cb/Collection.scala | 279 +++++++++++++ tests/pos/hylolib-cb/CoreTraits.scala | 57 +++ tests/pos/hylolib-cb/Hasher.scala | 38 ++ tests/pos/hylolib-cb/HyArray.scala | 221 +++++++++++ tests/pos/hylolib-cb/Integers.scala | 58 +++ tests/pos/hylolib-cb/Range.scala | 37 ++ tests/pos/hylolib-cb/Slice.scala | 46 +++ tests/pos/hylolib-cb/StringConvertible.scala | 14 + .../pos/hylolib-deferred-given/AnyValue.scala | 2 +- tests/pos/hylolib-deferred-given/Range.scala | 2 +- 20 files changed, 1355 insertions(+), 33 deletions(-) create mode 100644 tests/pos/dep-context-bounds.scala create mode 100644 tests/pos/hylolib-cb-extract.scala create mode 100644 tests/pos/hylolib-cb/AnyCollection.scala create mode 100644 tests/pos/hylolib-cb/AnyValue.scala create mode 100644 tests/pos/hylolib-cb/BitArray.scala create mode 100644 tests/pos/hylolib-cb/Collection.scala create mode 100644 tests/pos/hylolib-cb/CoreTraits.scala create mode 100644 tests/pos/hylolib-cb/Hasher.scala create mode 
100644 tests/pos/hylolib-cb/HyArray.scala create mode 100644 tests/pos/hylolib-cb/Integers.scala create mode 100644 tests/pos/hylolib-cb/Range.scala create mode 100644 tests/pos/hylolib-cb/Slice.scala create mode 100644 tests/pos/hylolib-cb/StringConvertible.scala diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index 04fd1afca8be..d6e442ed4a0c 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -237,12 +237,13 @@ object desugar { def desugarRhs(rhs: Tree): Tree = rhs match case ContextBounds(tbounds, cxbounds) => + val isMember = flags.isAllOf(DeferredGivenFlags) for bound <- cxbounds do val evidenceName = bound match case ContextBoundTypeTree(_, _, ownName) if !ownName.isEmpty => ownName - case _ if Config.nameSingleContextBounds && cxbounds.tail.isEmpty - && Feature.enabled(Feature.modularity) => + case _ if Config.nameSingleContextBounds && !isMember + && cxbounds.tail.isEmpty && Feature.enabled(Feature.modularity) => tdef.name.toTermName case _ => freshName(bound) @@ -492,6 +493,14 @@ object desugar { Apply(fn, params.map(refOfDef)) } + def typeDef(tdef: TypeDef)(using Context): Tree = + val evidenceBuf = new mutable.ListBuffer[ValDef] + val result = desugarContextBounds( + tdef, evidenceBuf, + (tdef.mods.flags.toTermFlags & AccessFlags) | Lazy | DeferredGivenFlags, + inventGivenOrExtensionName, Nil) + if evidenceBuf.isEmpty then result else Thicket(result :: evidenceBuf.toList) + /** The expansion of a class definition. See inline comments for what is involved */ def classDef(cdef: TypeDef)(using Context): Tree = { val impl @ Template(constr0, _, self, _) = cdef.rhs: @unchecked @@ -1426,7 +1435,7 @@ object desugar { case tree: TypeDef => if (tree.isClassDef) classDef(tree) else if (ctx.mode.isQuotedPattern) quotedPatternTypeDef(tree) - else tree + else typeDef(tree) case tree: DefDef => if (tree.name.isConstructorName) tree // was already handled by enclosing classDef else defDef(tree) diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index bbc4096f266b..f3d02dda5c48 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -3930,14 +3930,16 @@ object Parsers { argumentExprss(mkApply(Ident(nme.CONSTRUCTOR), argumentExprs())) } - /** TypeDef ::= id [TypeParamClause] {FunParamClause} TypeBounds [‘=’ Type] + /** TypeDef ::= id [TypeParamClause] {FunParamClause} TypeAndCtxBounds [‘=’ Type] */ def typeDefOrDcl(start: Offset, mods: Modifiers): Tree = { newLinesOpt() atSpan(start, nameStart) { val nameIdent = typeIdent() + val tname = nameIdent.name.asTypeName val tparams = typeParamClauseOpt(ParamOwner.Type) val vparamss = funParamClauses() + def makeTypeDef(rhs: Tree): Tree = { val rhs1 = lambdaAbstractAll(tparams :: vparamss, rhs) val tdef = TypeDef(nameIdent.name.toTypeName, rhs1) @@ -3945,36 +3947,37 @@ object Parsers { tdef.pushAttachment(Backquoted, ()) finalizeDef(tdef, mods, start) } + in.token match { case EQUALS => in.nextToken() makeTypeDef(toplevelTyp()) case SUBTYPE | SUPERTYPE => - val bounds = typeBounds() - if (in.token == EQUALS) { - val eqOffset = in.skipToken() - var rhs = toplevelTyp() - rhs match { - case mtt: MatchTypeTree => - bounds match { - case TypeBoundsTree(EmptyTree, upper, _) => - rhs = MatchTypeTree(upper, mtt.selector, mtt.cases) - case _ => - syntaxError(em"cannot combine lower bound and match 
type alias", eqOffset) - } - case _ => - if mods.is(Opaque) then - rhs = TypeBoundsTree(bounds.lo, bounds.hi, rhs) - else - syntaxError(em"cannot combine bound and alias", eqOffset) - } - makeTypeDef(rhs) - } - else makeTypeDef(bounds) + typeAndCtxBounds(tname) match + case bounds: TypeBoundsTree if in.token == EQUALS => + val eqOffset = in.skipToken() + var rhs = toplevelTyp() + rhs match { + case mtt: MatchTypeTree => + bounds match { + case TypeBoundsTree(EmptyTree, upper, _) => + rhs = MatchTypeTree(upper, mtt.selector, mtt.cases) + case _ => + syntaxError(em"cannot combine lower bound and match type alias", eqOffset) + } + case _ => + if mods.is(Opaque) then + rhs = TypeBoundsTree(bounds.lo, bounds.hi, rhs) + else + syntaxError(em"cannot combine bound and alias", eqOffset) + } + makeTypeDef(rhs) + case bounds => makeTypeDef(bounds) case SEMI | NEWLINE | NEWLINES | COMMA | RBRACE | OUTDENT | EOF => - makeTypeDef(typeBounds()) - case _ if (staged & StageKind.QuotedPattern) != 0 => - makeTypeDef(typeBounds()) + makeTypeDef(typeAndCtxBounds(tname)) + case _ if (staged & StageKind.QuotedPattern) != 0 + || in.featureEnabled(Feature.modularity) && in.isColon => + makeTypeDef(typeAndCtxBounds(tname)) case _ => syntaxErrorOrIncomplete(ExpectedTypeBoundOrEquals(in.token)) return EmptyTree // return to avoid setting the span to EmptyTree diff --git a/compiler/test/dotc/pos-test-pickling.blacklist b/compiler/test/dotc/pos-test-pickling.blacklist index 5c715faa504b..e58277bdc0e5 100644 --- a/compiler/test/dotc/pos-test-pickling.blacklist +++ b/compiler/test/dotc/pos-test-pickling.blacklist @@ -129,6 +129,7 @@ parsercombinators-givens.scala parsercombinators-givens-2.scala parsercombinators-arrow.scala hylolib-deferred-given +hylolib-cb diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index e123fa900258..05f89a344148 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -457,7 +457,7 @@ PatDef ::= ids [‘:’ Type] [‘=’ Expr] DefDef ::= DefSig [‘:’ Type] [‘=’ Expr] DefDef(_, name, paramss, tpe, expr) | ‘this’ TypelessClauses [DefImplicitClause] ‘=’ ConstrExpr DefDef(_, , vparamss, EmptyTree, expr | Block) DefSig ::= id [DefParamClauses] [DefImplicitClause] -TypeDef ::= id [TypeParamClause] {FunParamClause} TypeBounds TypeDefTree(_, name, tparams, bound +TypeDef ::= id [TypeParamClause] {FunParamClause} TypeAndCtxBounds TypeDefTree(_, name, tparams, bound [‘=’ Type] TmplDef ::= ([‘case’] ‘class’ | ‘trait’) ClassDef diff --git a/tests/pos/deferredSummon.scala b/tests/pos/deferredSummon.scala index d12a98e52736..31a9697eda6b 100644 --- a/tests/pos/deferredSummon.scala +++ b/tests/pos/deferredSummon.scala @@ -9,11 +9,15 @@ trait A: given Ord[Elem] = deferred def foo = summon[Ord[Elem]] +trait B: + type Elem: Ord + def foo = summon[Ord[Elem]] + object Inst: given Ord[Int]: def less(x: Int, y: Int) = x < y -object Test: +object Test1: import Inst.given class C extends A: type Elem = Int @@ -22,9 +26,22 @@ object Test: given A: type Elem = Int -class D[T: Ord] extends A: +class D1[T: Ord] extends B: + type Elem = T + +object Test2: + import Inst.given + class C extends B: + type Elem = Int + object E extends B: + type Elem = Int + given B: + type Elem = Int + +class D2[T: Ord] extends B: type Elem = T + diff --git a/tests/pos/dep-context-bounds.scala b/tests/pos/dep-context-bounds.scala new file mode 100644 index 000000000000..434805762622 --- /dev/null +++ b/tests/pos/dep-context-bounds.scala @@ -0,0 +1,10 @@ +//> using options 
-language:experimental.modularity -source future +trait A[X]: + type Self = X + +object Test2: + def foo[X: A as x](a: x.Self) = ??? + + def bar[X: A as x](a: Int) = ??? + + def baz[X: A as x](a: Int)(using String) = ??? diff --git a/tests/pos/hylolib-cb-extract.scala b/tests/pos/hylolib-cb-extract.scala new file mode 100644 index 000000000000..b80a88485a2b --- /dev/null +++ b/tests/pos/hylolib-cb-extract.scala @@ -0,0 +1,18 @@ +//> using options -language:experimental.modularity -source future +package hylotest +import compiletime.deferred + +trait Value[Self] + +/** A collection of elements accessible by their position. */ +trait Collection[Self]: + + /** The type of the elements in the collection. */ + type Element: Value + +class BitArray + +given Value[Boolean] {} + +given Collection[BitArray] with + type Element = Boolean diff --git a/tests/pos/hylolib-cb/AnyCollection.scala b/tests/pos/hylolib-cb/AnyCollection.scala new file mode 100644 index 000000000000..1a44344d0e51 --- /dev/null +++ b/tests/pos/hylolib-cb/AnyCollection.scala @@ -0,0 +1,66 @@ +package hylo + +/** A type-erased collection. + * + * A `AnyCollection` forwards its operations to a wrapped value, hiding its implementation. + */ +final class AnyCollection[Element] private ( + val _start: () => AnyValue, + val _end: () => AnyValue, + val _after: (AnyValue) => AnyValue, + val _at: (AnyValue) => Element +) + +object AnyCollection { + + /** Creates an instance forwarding its operations to `base`. */ + def apply[Base](using b: Collection[Base])(base: Base): AnyCollection[b.Element] = + // NOTE: This evidence is redefined so the compiler won't report ambiguity between `intIsValue` + // and `anyValueIsValue` when the method is called on a collection of `Int`s. None of these + // choices is even correct! Note also that the ambiguity is suppressed if the constructor of + // `AnyValue` is declared with a context bound rather than an implicit parameter. + given Value[b.Position] = b.positionIsValue + + def start(): AnyValue = + AnyValue(base.startPosition) + + def end(): AnyValue = + AnyValue(base.endPosition) + + def after(p: AnyValue): AnyValue = + AnyValue(base.positionAfter(p.unsafelyUnwrappedAs[b.Position])) + + def at(p: AnyValue): b.Element = + base.at(p.unsafelyUnwrappedAs[b.Position]) + + new AnyCollection[b.Element]( + _start = start, + _end = end, + _after = after, + _at = at + ) + +} + +given anyCollectionIsCollection[T](using tIsValue: Value[T]): Collection[AnyCollection[T]] with { + + type Element = T + type Position = AnyValue + + extension (self: AnyCollection[T]) { + + def startPosition = + self._start() + + def endPosition = + self._end() + + def positionAfter(p: Position) = + self._after(p) + + def at(p: Position) = + self._at(p) + + } + +} diff --git a/tests/pos/hylolib-cb/AnyValue.scala b/tests/pos/hylolib-cb/AnyValue.scala new file mode 100644 index 000000000000..b9d39869c09a --- /dev/null +++ b/tests/pos/hylolib-cb/AnyValue.scala @@ -0,0 +1,76 @@ +package hylo + +/** A wrapper around an object providing a reference API. */ +private final class Ref[T](val value: T) { + + override def toString: String = + s"Ref($value)" + +} + +/** A type-erased value. + * + * An `AnyValue` forwards its operations to a wrapped value, hiding its implementation. + */ +final class AnyValue private ( + private val wrapped: AnyRef, + private val _copy: (AnyRef) => AnyValue, + private val _eq: (AnyRef, AnyRef) => Boolean, + private val _hashInto: (AnyRef, Hasher) => Hasher +) { + + /** Returns a copy of `this`. 
*/ + def copy(): AnyValue = + _copy(this.wrapped) + + /** Returns `true` iff `this` and `other` have an equivalent value. */ + def eq(other: AnyValue): Boolean = + _eq(this.wrapped, other.wrapped) + + /** Hashes the salient parts of `this` into `hasher`. */ + def hashInto(hasher: Hasher): Hasher = + _hashInto(this.wrapped, hasher) + + /** Returns the value wrapped in `this` as an instance of `T`. */ + def unsafelyUnwrappedAs[T]: T = + wrapped.asInstanceOf[Ref[T]].value + + /** Returns a textual description of `this`. */ + override def toString: String = + wrapped.toString + +} + +object AnyValue { + + /** Creates an instance wrapping `wrapped`. */ + def apply[T](using Value[T])(wrapped: T): AnyValue = + def copy(a: AnyRef): AnyValue = + AnyValue(a.asInstanceOf[Ref[T]].value.copy()) + + def eq(a: AnyRef, b: AnyRef): Boolean = + a.asInstanceOf[Ref[T]].value `eq` b.asInstanceOf[Ref[T]].value + + def hashInto(a: AnyRef, hasher: Hasher): Hasher = + a.asInstanceOf[Ref[T]].value.hashInto(hasher) + + new AnyValue(Ref(wrapped), copy, eq, hashInto) + +} + +given anyValueIsValue: Value[AnyValue] with { + + extension (self: AnyValue) { + + def copy(): AnyValue = + self.copy() + + def eq(other: AnyValue): Boolean = + self `eq` other + + def hashInto(hasher: Hasher): Hasher = + self.hashInto(hasher) + + } + +} diff --git a/tests/pos/hylolib-cb/BitArray.scala b/tests/pos/hylolib-cb/BitArray.scala new file mode 100644 index 000000000000..3a0b4658f747 --- /dev/null +++ b/tests/pos/hylolib-cb/BitArray.scala @@ -0,0 +1,372 @@ +package hylo + +import scala.collection.mutable + +/** An array of bit values represented as Booleans, where `true` indicates that the bit is on. */ +final class BitArray private ( + private var _bits: HyArray[Int], + private var _count: Int +) { + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + _count == 0 + + /** Returns the number of elements in `this`. */ + def count: Int = + _count + + /** The number of bits that the array can contain before allocating new storage. */ + def capacity: Int = + _bits.capacity << 5 + + /** Reserves enough storage to store `n` elements in `this`. */ + def reserveCapacity(n: Int, assumeUniqueness: Boolean = false): BitArray = + if (n == 0) { + this + } else { + val k = 1 + ((n - 1) >> 5) + if (assumeUniqueness) { + _bits = _bits.reserveCapacity(k, assumeUniqueness) + this + } else { + new BitArray(_bits.reserveCapacity(k), _count) + } + } + + /** Adds a new element at the end of the array. */ + def append(bit: Boolean, assumeUniqueness: Boolean = false): BitArray = + val result = if assumeUniqueness && (count < capacity) then this else copy(count + 1) + val p = BitArray.Position(count) + if (p.bucket >= _bits.count) { + result._bits = _bits.append(if bit then 1 else 0) + } else { + result.setValue(bit, p) + } + result._count += 1 + result + + /** Removes and returns the last element, or returns `None` if the array is empty. */ + def popLast(assumeUniqueness: Boolean = false): (BitArray, Option[Boolean]) = + if (isEmpty) { + (this, None) + } else { + val result = if assumeUniqueness then this else copy() + val bit = result.at(BitArray.Position(count)) + result._count -= 1 + (result, Some(bit)) + } + + /** Removes all elements in the array, keeping allocated storage iff `keepStorage` is true. 
*/ + def removeAll( + keepStorage: Boolean = false, + assumeUniqueness: Boolean = false + ): BitArray = + if (isEmpty) { + this + } else if (keepStorage) { + val result = if assumeUniqueness then this else copy() + result._bits.removeAll(keepStorage, assumeUniqueness = true) + result._count = 0 + result + } else { + BitArray() + } + + /** Returns `true` iff all elements in `this` are `false`. */ + def allFalse: Boolean = + if (isEmpty) { + true + } else { + val k = (count - 1) >> 5 + def loop(i: Int): Boolean = + if (i == k) { + val m = (1 << (count & 31)) - 1 + (_bits.at(k) & m) == 0 + } else if (_bits.at(i) != 0) { + false + } else { + loop(i + 1) + } + loop(0) + } + + /** Returns `true` iff all elements in `this` are `true`. */ + def allTrue: Boolean = + if (isEmpty) { + true + } else { + val k = (count - 1) >> 5 + def loop(i: Int): Boolean = + if (i == k) { + val m = (1 << (count & 31)) - 1 + (_bits.at(k) & m) == m + } else if (_bits.at(i) != ~0) { + false + } else { + loop(i + 1) + } + loop(0) + } + + /** Returns the bitwise OR of `this` and `other`. */ + def | (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ | _, assumeUniqueness = true) + + /** Returns the bitwise AND of `this` and `other`. */ + def & (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ & _, assumeUniqueness = true) + + /** Returns the bitwise XOR of `this` and `other`. */ + def ^ (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ ^ _, assumeUniqueness = true) + + /** Assigns each bits in `this` to the result of `operation` applied on those bits and their + * corresponding bits in `other`. + * + * @requires + * `self.count == other.count`. + */ + private def applyBitwise( + other: BitArray, + operation: (Int, Int) => Int, + assumeUniqueness: Boolean = false + ): BitArray = + require(this.count == other.count) + if (isEmpty) { + this + } else { + val result = if assumeUniqueness then this else copy() + var u = assumeUniqueness + val k = (count - 1) >> 5 + + for (i <- 0 until k) { + result._bits = result._bits.modifyAt( + i, (n) => operation(n, other._bits.at(n)), + assumeUniqueness = u + ) + u = true + } + val m = (1 << (count & 31)) - 1 + result._bits = result._bits.modifyAt( + k, (n) => operation(n & m, other._bits.at(k) & m), + assumeUniqueness = u + ) + + result + } + + /** Returns the position of `this`'s first element', or `endPosition` if `this` is empty. + * + * @complexity + * O(1). + */ + def startPosition: BitArray.Position = + BitArray.Position(0) + + /** Returns the "past the end" position in `this`, that is, the position immediately after the + * last element in `this`. + * + * @complexity + * O(1). + */ + def endPosition: BitArray.Position = + BitArray.Position(count) + + /** Returns the position immediately after `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def positionAfter(p: BitArray.Position): BitArray.Position = + if (p.offsetInBucket == 63) { + BitArray.Position(p.bucket + 1, 0) + } else { + BitArray.Position(p.bucket, p.offsetInBucket + 1) + } + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: BitArray.Position): Boolean = + val m = 1 << p.offsetInBucket + val b: Int = _bits.at(p.bucket) + (b & m) == m + + /** Accesses the `i`-th element of `this`. 
+ * + * @requires + * `i` is greater than or equal to 0, and less than `count`. + * @complexity + * O(1). + */ + def atIndex(i: Int): Boolean = + at(BitArray.Position(i)) + + /** Calls `transform` on the element at `p` to update its value. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def modifyAt( + p: BitArray.Position, + transform: (Boolean) => Boolean, + assumeUniqueness: Boolean = false + ): BitArray = + val result = if assumeUniqueness then this else copy() + result.setValue(transform(result.at(p)), p) + result + + /** Calls `transform` on `i`-th element of `this` to update its value. + * + * @requires + * `i` is greater than or equal to 0, and less than `count`. + * @complexity + * O(1). + */ + def modifyAtIndex( + i: Int, + transform: (Boolean) => Boolean, + assumeUniqueness: Boolean = false + ): BitArray = + modifyAt(BitArray.Position(i), transform, assumeUniqueness) + + /** Returns an independent copy of `this`. */ + def copy(minimumCapacity: Int = 0): BitArray = + if (minimumCapacity > capacity) { + // If the requested capacity on the copy is greater than what we have, `reserveCapacity` will + // create an independent value. + reserveCapacity(minimumCapacity) + } else { + val k = 1 + ((minimumCapacity - 1) >> 5) + val newBits = _bits.copy(k) + new BitArray(newBits, _count) + } + + /** Returns a textual description of `this`. */ + override def toString: String = + _bits.toString + + /** Sets the value `b` for the bit at position `p`. + * + * @requires + * `this` is uniquely referenced and `p` is a valid position in `this`. + */ + private def setValue(b: Boolean, p: BitArray.Position): Unit = + val m = 1 << p.offsetInBucket + _bits = _bits.modifyAt( + p.bucket, + (e) => if b then e | m else e & ~m, + assumeUniqueness = true + ) + +} + +object BitArray { + + /** A position in a `BitArray`. + * + * @param bucket + * The bucket containing `this`. + * @param offsetInBucket + * The offset of `this` in its containing bucket. + */ + final class Position( + private[BitArray] val bucket: Int, + private[BitArray] val offsetInBucket: Int + ) { + + /** Creates a position from an index. */ + private[BitArray] def this(index: Int) = + this(index >> 5, index & 31) + + /** Returns the index corresponding to this position. */ + private def index: Int = + (bucket >> 5) + offsetInBucket + + /** Returns a copy of `this`. */ + def copy(): Position = + new Position(bucket, offsetInBucket) + + /** Returns `true` iff `this` and `other` have an equivalent value. */ + def eq(other: Position): Boolean = + (this.bucket == other.bucket) && (this.offsetInBucket == other.offsetInBucket) + + /** Hashes the salient parts of `self` into `hasher`. */ + def hashInto(hasher: Hasher): Hasher = + hasher.combine(bucket) + hasher.combine(offsetInBucket) + + } + + /** Creates an array with the given `bits`. 
*/ + def apply[T](bits: Boolean*): BitArray = + var result = new BitArray(HyArray[Int](), 0) + for (b <- bits) result = result.append(b, assumeUniqueness = true) + result + +} + +given bitArrayPositionIsValue: Value[BitArray.Position] with { + + extension (self: BitArray.Position) { + + def copy(): BitArray.Position = + self.copy() + + def eq(other: BitArray.Position): Boolean = + self.eq(other) + + def hashInto(hasher: Hasher): Hasher = + self.hashInto(hasher) + + } + +} + +given bitArrayIsCollection: Collection[BitArray] with { + + type Element = Boolean + type Position = BitArray.Position + + extension (self: BitArray) { + + override def count: Int = + self.count + + def startPosition: BitArray.Position = + self.startPosition + + def endPosition: BitArray.Position = + self.endPosition + + def positionAfter(p: BitArray.Position): BitArray.Position = + self.positionAfter(p) + + def at(p: BitArray.Position): Boolean = + self.at(p) + + } + +} + +given bitArrayIsStringConvertible: StringConvertible[BitArray] with { + + extension (self: BitArray) + override def description: String = + var contents = mutable.StringBuilder() + self.forEach((e) => { contents += (if e then '1' else '0'); true }) + contents.mkString + +} diff --git a/tests/pos/hylolib-cb/Collection.scala b/tests/pos/hylolib-cb/Collection.scala new file mode 100644 index 000000000000..073a99cdd16b --- /dev/null +++ b/tests/pos/hylolib-cb/Collection.scala @@ -0,0 +1,279 @@ +//> using options -language:experimental.modularity -source future +package hylo + +/** A collection of elements accessible by their position. */ +trait Collection[Self] { + + /** The type of the elements in the collection. */ + type Element: Value + + /** The type of a position in the collection. */ + type Position: Value as positionIsValue + + extension (self: Self) { + + /** Returns `true` iff `self` is empty. */ + def isEmpty: Boolean = + startPosition `eq` endPosition + + /** Returns the number of elements in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def count: Int = + val e = endPosition + def _count(p: Position, n: Int): Int = + if p `eq` e then n else _count(self.positionAfter(p), n + 1) + _count(startPosition, 0) + + /** Returns the position of `self`'s first element', or `endPosition` if `self` is empty. + * + * @complexity + * O(1) + */ + def startPosition: Position + + /** Returns the "past the end" position in `self`, that is, the position immediately after the + * last element in `self`. + * + * @complexity + * O(1). + */ + def endPosition: Position + + /** Returns the position immediately after `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def positionAfter(p: Position): Position + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: Position): Element + + /** Returns `true` iff `i` precedes `j`. + * + * @requires + * `i` and j` are valid positions in `self`. + * @complexity + * O(n) where n is the number of elements in `self`. 
+ */ + def isBefore(i: Position, j: Position): Boolean = + val e = self.endPosition + if (i.eq(e)) { + false + } else if (j.eq(e)) { + true + } else { + def _isBefore(n: Position): Boolean = + if (n.eq(j)) { + true + } else if (n.eq(e)) { + false + } else { + _isBefore(self.positionAfter(n)) + } + _isBefore(self.positionAfter(i)) + } + + } + +} + +extension [Self](self: Self)(using s: Collection[Self]) { + + /** Returns the first element of `self` along with a slice containing the suffix after this + * element, or `None` if `self` is empty. + * + * @complexity + * O(1) + */ + def headAndTail: Option[(s.Element, Slice[Self])] = + if (self.isEmpty) { + None + } else { + val p = self.startPosition + val q = self.positionAfter(p) + val t = Slice(self, Range(q, self.endPosition, (a, b) => (a `eq` b) || self.isBefore(a, b))) + Some((self.at(p), t)) + } + + /** Applies `combine` on `partialResult` and each element of `self`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def reduce[T](partialResult: T, combine: (T, s.Element) => T): T = + val e = self.endPosition + def loop(p: s.Position, r: T): T = + if (p.eq(e)) { + r + } else { + loop(self.positionAfter(p), combine(r, self.at(p))) + } + loop(self.startPosition, partialResult) + + /** Applies `action` on each element of `self`, in order, until `action` returns `false`, and + * returns `false` iff `action` did. + * + * You can return `false` from `action` to emulate a `continue` statement as found in traditional + * imperative languages (e.g., C). + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def forEach(action: (s.Element) => Boolean): Boolean = + val e = self.endPosition + def loop(p: s.Position): Boolean = + if (p.eq(e)) { + true + } else if (!action(self.at(p))) { + false + } else { + loop(self.positionAfter(p)) + } + loop(self.startPosition) + + /** Returns a collection with the elements of `self` transformed by `transform`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def map[T](using Value[T])(transform: (s.Element) => T): HyArray[T] = + self.reduce( + HyArray[T](), + (r, e) => r.append(transform(e), assumeUniqueness = true) + ) + + /** Returns a collection with the elements of `self` satisfying `isInclude`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def filter(isIncluded: (s.Element) => Boolean): HyArray[s.Element] = + self.reduce( + HyArray[s.Element](), + (r, e) => if (isIncluded(e)) then r.append(e, assumeUniqueness = true) else r + ) + + /** Returns `true` if `self` contains an element satisfying `predicate`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def containsWhere(predicate: (s.Element) => Boolean): Boolean = + self.firstPositionWhere(predicate) != None + + /** Returns `true` if all elements in `self` satisfy `predicate`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def allSatisfy(predicate: (s.Element) => Boolean): Boolean = + self.firstPositionWhere(predicate) == None + + /** Returns the position of the first element of `self` satisfying `predicate`, or `None` if no + * such element exists. + * + * @complexity + * O(n) where n is the number of elements in `self`. 
+ */ + def firstPositionWhere(predicate: (s.Element) => Boolean): Option[s.Position] = + val e = self.endPosition + def loop(p: s.Position): Option[s.Position] = + if (p.eq(e)) { + None + } else if (predicate(self.at(p))) { + Some(p) + } else { + loop(self.positionAfter(p)) + } + loop(self.startPosition) + + /** Returns the minimum element in `self`, using `isLessThan` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def minElement(isLessThan: (s.Element, s.Element) => Boolean): Option[s.Element] = + self.leastElement(isLessThan) + + // NOTE: I can't find a reasonable way to call this method. + /** Returns the minimum element in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def minElement()(using Comparable[s.Element]): Option[s.Element] = + self.minElement(isLessThan = _ `lt` _) + + /** Returns the maximum element in `self`, using `isGreaterThan` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def maxElement(isGreaterThan: (s.Element, s.Element) => Boolean): Option[s.Element] = + self.leastElement(isGreaterThan) + + /** Returns the maximum element in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def maxElement()(using Comparable[s.Element]): Option[s.Element] = + self.maxElement(isGreaterThan = _ `gt` _) + + /** Returns the maximum element in `self`, using `isOrderedBefore` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def leastElement(isOrderedBefore: (s.Element, s.Element) => Boolean): Option[s.Element] = + if (self.isEmpty) { + None + } else { + val e = self.endPosition + def _least(p: s.Position, least: s.Element): s.Element = + if (p.eq(e)) { + least + } else { + val x = self.at(p) + val y = if isOrderedBefore(x, least) then x else least + _least(self.positionAfter(p), y) + } + + val b = self.startPosition + Some(_least(self.positionAfter(b), self.at(b))) + } + +} + +extension [Self](self: Self)(using + s: Collection[Self], + e: Value[s.Element] +) { + + /** Returns `true` if `self` contains the same elements as `other`, in the same order. */ + def elementsEqual[T](using o: Collection[T] { type Element = s.Element })(other: T): Boolean = + def loop(i: s.Position, j: o.Position): Boolean = + if (i `eq` self.endPosition) { + j `eq` other.endPosition + } else if (j `eq` other.endPosition) { + false + } else if (self.at(i) `neq` other.at(j)) { + false + } else { + loop(self.positionAfter(i), other.positionAfter(j)) + } + loop(self.startPosition, other.startPosition) + +} diff --git a/tests/pos/hylolib-cb/CoreTraits.scala b/tests/pos/hylolib-cb/CoreTraits.scala new file mode 100644 index 000000000000..01b2c5242af9 --- /dev/null +++ b/tests/pos/hylolib-cb/CoreTraits.scala @@ -0,0 +1,57 @@ +package hylo + +/** A type whose instance can be treated as independent values. + * + * The data structure of and algorithms of Hylo's standard library operate "notional values" rather + * than arbitrary references. This trait defines the basis operations of all values. + */ +trait Value[Self] { + + extension (self: Self) { + + /** Returns a copy of `self`. */ + def copy(): Self + + /** Returns `true` iff `self` and `other` have an equivalent value. */ + def eq(other: Self): Boolean + + /** Hashes the salient parts of `self` into `hasher`. 
*/ + def hashInto(hasher: Hasher): Hasher + + } + +} + +extension [Self: Value](self: Self) def neq(other: Self): Boolean = !self.eq(other) + +// ---------------------------------------------------------------------------- +// Comparable +// ---------------------------------------------------------------------------- + +trait Comparable[Self] extends Value[Self] { + + extension (self: Self) { + + /** Returns `true` iff `self` is ordered before `other`. */ + def lt(other: Self): Boolean + + /** Returns `true` iff `self` is ordered after `other`. */ + def gt(other: Self): Boolean = other.lt(self) + + /** Returns `true` iff `self` is equal to or ordered before `other`. */ + def le(other: Self): Boolean = !other.lt(self) + + /** Returns `true` iff `self` is equal to or ordered after `other`. */ + def ge(other: Self): Boolean = !self.lt(other) + + } + +} + +/** Returns the lesser of `x` and `y`. */ +def min[T: Comparable](x: T, y: T): T = + if y.lt(x) then y else x + +/** Returns the greater of `x` and `y`. */ +def max[T: Comparable](x: T, y: T): T = + if x.lt(y) then y else x diff --git a/tests/pos/hylolib-cb/Hasher.scala b/tests/pos/hylolib-cb/Hasher.scala new file mode 100644 index 000000000000..ef6813df6b60 --- /dev/null +++ b/tests/pos/hylolib-cb/Hasher.scala @@ -0,0 +1,38 @@ +package hylo + +import scala.util.Random + +/** A universal hash function. */ +final class Hasher private (private val hash: Int = Hasher.offsetBasis) { + + /** Returns the computed hash value. */ + def finalizeHash(): Int = + hash + + /** Adds `n` to the computed hash value. */ + def combine(n: Int): Hasher = + var h = hash + h = h ^ n + h = h * Hasher.prime + new Hasher(h) +} + +object Hasher { + + private val offsetBasis = 0x811c9dc5 + private val prime = 0x01000193 + + /** A random seed ensuring different hashes across multiple runs. */ + private lazy val seed = scala.util.Random.nextInt() + + /** Creates an instance with the given `seed`. */ + def apply(): Hasher = + val h = new Hasher() + h.combine(seed) + h + + /** Returns the hash of `v`. */ + def hash[T: Value](v: T): Int = + v.hashInto(Hasher()).finalizeHash() + +} diff --git a/tests/pos/hylolib-cb/HyArray.scala b/tests/pos/hylolib-cb/HyArray.scala new file mode 100644 index 000000000000..9347f7eb12cc --- /dev/null +++ b/tests/pos/hylolib-cb/HyArray.scala @@ -0,0 +1,221 @@ +package hylo + +import java.util.Arrays +import scala.collection.mutable + +/** An ordered, random-access collection. */ +final class HyArray[Element] private (using + elementIsValue: Value[Element] +)( + private var _storage: scala.Array[AnyRef | Null] | Null, + private var _count: Int // NOTE: where do I document private fields +) { + + // NOTE: The fact that we need Array[AnyRef] is diappointing and difficult to discover + // The compiler error sent me on a wild goose chase with ClassTag. + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + _count == 0 + + /** Returns the number of elements in `this`. */ + def count: Int = + _count + + /** Returns the number of elements that `this` can contain before allocating new storage. */ + def capacity: Int = + if _storage == null then 0 else _storage.length + + /** Reserves enough storage to store `n` elements in `this`. 
*/ + def reserveCapacity(n: Int, assumeUniqueness: Boolean = false): HyArray[Element] = + if (n <= capacity) { + this + } else { + var newCapacity = max(1, capacity) + while (newCapacity < n) { newCapacity = newCapacity << 1 } + + val newStorage = new scala.Array[AnyRef | Null](newCapacity) + val s = _storage.asInstanceOf[scala.Array[AnyRef | Null]] + var i = 0 + while (i < count) { + newStorage(i) = _storage(i).asInstanceOf[Element].copy().asInstanceOf[AnyRef] + i += 1 + } + + if (assumeUniqueness) { + _storage = newStorage + this + } else { + new HyArray(newStorage, count) + } + } + + /** Adds a new element at the end of the array. */ + def append(source: Element, assumeUniqueness: Boolean = false): HyArray[Element] = + val result = if assumeUniqueness && (count < capacity) then this else copy(count + 1) + result._storage(count) = source.asInstanceOf[AnyRef] + result._count += 1 + result + + // NOTE: Can't refine `C.Element` without renaming the generic parameter of `HyArray`. + // /** Adds the contents of `source` at the end of the array. */ + // def appendContents[C](using + // s: Collection[C] + // )( + // source: C { type Element = Element }, + // assumeUniqueness: Boolean = false + // ): HyArray[Element] = + // val result = if (assumeUniqueness) { this } else { copy(count + source.count) } + // source.reduce(result, (r, e) => r.append(e, assumeUniqueness = true)) + + /** Removes and returns the last element, or returns `None` if the array is empty. */ + def popLast(assumeUniqueness: Boolean = false): (HyArray[Element], Option[Element]) = + if (isEmpty) { + (this, None) + } else { + val result = if assumeUniqueness then this else copy() + result._count -= 1 + (result, Some(result._storage(result._count).asInstanceOf[Element])) + } + + /** Removes all elements in the array, keeping allocated storage iff `keepStorage` is true. */ + def removeAll( + keepStorage: Boolean = false, + assumeUniqueness: Boolean = false + ): HyArray[Element] = + if (isEmpty) { + this + } else if (keepStorage) { + val result = if assumeUniqueness then this else copy() + Arrays.fill(result._storage, null) + result._count = 0 + result + } else { + HyArray() + } + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: Int): Element = + _storage(p).asInstanceOf[Element] + + /** Calls `transform` on the element at `p` to update its value. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def modifyAt( + p: Int, + transform: (Element) => Element, + assumeUniqueness: Boolean = false + ): HyArray[Element] = + val result = if assumeUniqueness then this else copy() + result._storage(p) = transform(at(p)).asInstanceOf[AnyRef] + result + + /** Returns a textual description of `this`. */ + override def toString: String = + var s = "[" + var i = 0 + while (i < count) { + if (i > 0) { s += ", " } + s += s"${at(i)}" + i += 1 + } + s + "]" + + /** Returns an independent copy of `this`, capable of storing `minimumCapacity` elements before + * allocating new storage. + */ + def copy(minimumCapacity: Int = 0): HyArray[Element] = + if (minimumCapacity > capacity) { + // If the requested capacity on the copy is greater than what we have, `reserveCapacity` will + // create an independent value. 
+ reserveCapacity(minimumCapacity) + } else { + val clone = HyArray[Element]().reserveCapacity(max(minimumCapacity, count)) + var i = 0 + while (i < count) { + clone._storage(i) = _storage(i).asInstanceOf[Element].copy().asInstanceOf[AnyRef] + i += 1 + } + clone._count = count + clone + } + +} + +object HyArray { + + /** Creates an array with the given `elements`. */ + def apply[T](using t: Value[T])(elements: T*): HyArray[T] = + var a = new HyArray[T](null, 0) + for (e <- elements) a = a.append(e, assumeUniqueness = true) + a + +} + +given hyArrayIsValue[T](using tIsValue: Value[T]): Value[HyArray[T]] with { + + extension (self: HyArray[T]) { + + def copy(): HyArray[T] = + self.copy() + + def eq(other: HyArray[T]): Boolean = + self.elementsEqual(other) + + def hashInto(hasher: Hasher): Hasher = + self.reduce(hasher, (h, e) => e.hashInto(h)) + + } + +} + +given hyArrayIsCollection[T](using tIsValue: Value[T]): Collection[HyArray[T]] with { + + type Element = T + type Position = Int + + extension (self: HyArray[T]) { + + // NOTE: Having to explicitly override means that primary declaration can't automatically + // specialize trait requirements. + override def isEmpty: Boolean = self.isEmpty + + override def count: Int = self.count + + def startPosition = 0 + + def endPosition = self.count + + def positionAfter(p: Int) = p + 1 + + def at(p: Int) = self.at(p) + + } + +} + +// NOTE: This should work. +// given hyArrayIsStringConvertible[T](using +// tIsValue: Value[T], +// tIsStringConvertible: StringConvertible[T] +// ): StringConvertible[HyArray[T]] with { +// +// given Collection[HyArray[T]] = hyArrayIsCollection[T] +// +// extension (self: HyArray[T]) +// override def description: String = +// var contents = mutable.StringBuilder() +// self.forEach((e) => { contents ++= e.description; true }) +// s"[${contents.mkString(", ")}]" +// +// } diff --git a/tests/pos/hylolib-cb/Integers.scala b/tests/pos/hylolib-cb/Integers.scala new file mode 100644 index 000000000000..b9bc203a88ea --- /dev/null +++ b/tests/pos/hylolib-cb/Integers.scala @@ -0,0 +1,58 @@ +package hylo + +given booleanIsValue: Value[Boolean] with { + + extension (self: Boolean) { + + def copy(): Boolean = + // Note: Scala's `Boolean` has value semantics already. + self + + def eq(other: Boolean): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(if self then 1 else 0) + + } + +} + +given intIsValue: Value[Int] with { + + extension (self: Int) { + + def copy(): Int = + // Note: Scala's `Int` has value semantics already. + self + + def eq(other: Int): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(self) + + } + +} + +given intIsComparable: Comparable[Int] with { + + extension (self: Int) { + + def copy(): Int = + self + + def eq(other: Int): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(self) + + def lt(other: Int): Boolean = self < other + + } + +} + +given intIsStringConvertible: StringConvertible[Int] with {} diff --git a/tests/pos/hylolib-cb/Range.scala b/tests/pos/hylolib-cb/Range.scala new file mode 100644 index 000000000000..1f597652ead1 --- /dev/null +++ b/tests/pos/hylolib-cb/Range.scala @@ -0,0 +1,37 @@ +package hylo + +/** A half-open interval from a lower bound up to, but not including, an uppor bound. */ +final class Range[Bound] private (val lowerBound: Bound, val upperBound: Bound) { + + /** Returns a textual description of `this`. 
*/ + override def toString: String = + s"[${lowerBound}, ${upperBound})" + +} + +object Range { + + /** Creates a half-open interval [`lowerBound`, `upperBound`), using `isLessThanOrEqual` to ensure + * that the bounds are well-formed. + * + * @requires + * `lowerBound` is lesser than or equal to `upperBound`. + */ + def apply[Bound]( + lowerBound: Bound, + upperBound: Bound, + isLessThanOrEqual: (Bound, Bound) => Boolean + ) = + require(isLessThanOrEqual(lowerBound, upperBound)) + new Range(lowerBound, upperBound) + + /** Creates a half-open interval [`lowerBound`, `upperBound`). + * + * @requires + * `lowerBound` is lesser than or equal to `upperBound`. + */ + def apply[Bound](lowerBound: Bound, upperBound: Bound)(using Comparable[Bound]) = + require(lowerBound `le` upperBound) + new Range(lowerBound, upperBound) + +} diff --git a/tests/pos/hylolib-cb/Slice.scala b/tests/pos/hylolib-cb/Slice.scala new file mode 100644 index 000000000000..2289ac2a085b --- /dev/null +++ b/tests/pos/hylolib-cb/Slice.scala @@ -0,0 +1,46 @@ +package hylo + +/** A view into a collection. */ +final class Slice[Base](using + val b: Collection[Base] +)( + val base: Base, + val bounds: Range[b.Position] +) { + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + bounds.lowerBound.eq(bounds.upperBound) + + def startPosition: b.Position = + bounds.lowerBound + + def endPosition: b.Position = + bounds.upperBound + + def positionAfter(p: b.Position): b.Position = + base.positionAfter(p) + + def at(p: b.Position): b.Element = + base.at(p) + +} + +given sliceIsCollection[T](using c: Collection[T]): Collection[Slice[T]] with { + + type Element = c.Element + type Position = c.Position + + extension (self: Slice[T]) { + + def startPosition = self.bounds.lowerBound.asInstanceOf[Position] // NOTE: Ugly hack + + def endPosition = self.bounds.upperBound.asInstanceOf[Position] + + def positionAfter(p: Position) = self.base.positionAfter(p) + + def at(p: Position) = self.base.at(p) + + } + +} diff --git a/tests/pos/hylolib-cb/StringConvertible.scala b/tests/pos/hylolib-cb/StringConvertible.scala new file mode 100644 index 000000000000..0702f79f2794 --- /dev/null +++ b/tests/pos/hylolib-cb/StringConvertible.scala @@ -0,0 +1,14 @@ +package hylo + +/** A type whose instances can be described by a character string. */ +trait StringConvertible[Self] { + + extension (self: Self) { + + /** Returns a textual description of `self`. */ + def description: String = + self.toString + + } + +} diff --git a/tests/pos/hylolib-deferred-given/AnyValue.scala b/tests/pos/hylolib-deferred-given/AnyValue.scala index b9d39869c09a..21f2965e102e 100644 --- a/tests/pos/hylolib-deferred-given/AnyValue.scala +++ b/tests/pos/hylolib-deferred-given/AnyValue.scala @@ -44,7 +44,7 @@ final class AnyValue private ( object AnyValue { /** Creates an instance wrapping `wrapped`. */ - def apply[T](using Value[T])(wrapped: T): AnyValue = + def apply[T: Value](wrapped: T): AnyValue = def copy(a: AnyRef): AnyValue = AnyValue(a.asInstanceOf[Ref[T]].value.copy()) diff --git a/tests/pos/hylolib-deferred-given/Range.scala b/tests/pos/hylolib-deferred-given/Range.scala index 1f597652ead1..b0f50dd55c8c 100644 --- a/tests/pos/hylolib-deferred-given/Range.scala +++ b/tests/pos/hylolib-deferred-given/Range.scala @@ -30,7 +30,7 @@ object Range { * @requires * `lowerBound` is lesser than or equal to `upperBound`. 
*/ - def apply[Bound](lowerBound: Bound, upperBound: Bound)(using Comparable[Bound]) = + def apply[Bound: Comparable](lowerBound: Bound, upperBound: Bound) = require(lowerBound `le` upperBound) new Range(lowerBound, upperBound) From 34375268ad921f612364b81d6dec63a0adc3aa7b Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 17:46:23 +0200 Subject: [PATCH 220/371] Make some context bound evidence params tracked Make context bound evidence params tracked if they have types with abstract type members. [Cherry-picked 4d62692a69e994b10a1386e8d1a73a06b1528b85] --- .../src/dotty/tools/dotc/core/Symbols.scala | 8 ++--- .../src/dotty/tools/dotc/typer/Namer.scala | 30 +++++++++++++++++++ tests/pos/hylolib-cb/AnyCollection.scala | 4 +-- tests/pos/hylolib-cb/Collection.scala | 9 +++--- tests/pos/hylolib-cb/HyArray.scala | 11 ++++--- tests/pos/hylolib-cb/Slice.scala | 6 ++-- 6 files changed, 47 insertions(+), 21 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/Symbols.scala b/compiler/src/dotty/tools/dotc/core/Symbols.scala index 0020efa5018d..da0ecac47b7d 100644 --- a/compiler/src/dotty/tools/dotc/core/Symbols.scala +++ b/compiler/src/dotty/tools/dotc/core/Symbols.scala @@ -312,7 +312,6 @@ object Symbols extends SymUtils { * With the given setup, all such calls will give implicit-not found errors */ final def symbol(implicit ev: DontUseSymbolOnSymbol): Nothing = unsupported("symbol") - type DontUseSymbolOnSymbol final def source(using Context): SourceFile = { def valid(src: SourceFile): SourceFile = @@ -402,13 +401,12 @@ object Symbols extends SymUtils { flags: FlagSet = this.flags, info: Type = this.info, privateWithin: Symbol = this.privateWithin, - coord: Coord = NoCoord, // Can be `= owner.coord` once we bootstrap - compUnitInfo: CompilationUnitInfo | Null = null // Can be `= owner.associatedFile` once we bootstrap + coord: Coord = NoCoord, // Can be `= owner.coord` once we have new default args + compUnitInfo: CompilationUnitInfo | Null = null // Can be `= owner.compilationUnitInfo` once we have new default args ): Symbol = { val coord1 = if (coord == NoCoord) owner.coord else coord val compilationUnitInfo1 = if (compilationUnitInfo == null) owner.compilationUnitInfo else compilationUnitInfo - if isClass then newClassSymbol(owner, name.asTypeName, flags, _ => info, privateWithin, coord1, compilationUnitInfo1) else @@ -936,6 +934,8 @@ object Symbols extends SymUtils { case (x: Symbol) :: _ if x.isType => Some(xs.asInstanceOf[List[TypeSymbol]]) case _ => None + type DontUseSymbolOnSymbol + // ----- Locating predefined symbols ---------------------------------------- def requiredPackage(path: PreName)(using Context): TermSymbol = { diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 22a12ed0f468..85678b9685f7 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -1885,6 +1885,28 @@ class Namer { typer: Typer => ddef.trailingParamss.foreach(completeParams) val paramSymss = normalizeIfConstructor(ddef.paramss.nestedMap(symbolOfTree), isConstructor) sym.setParamss(paramSymss) + + /** We add `tracked` to context bound witnesses that have abstract type members */ + def needsTracked(sym: Symbol, param: ValDef)(using Context) = + !sym.is(Tracked) + && param.hasAttachment(ContextBoundParam) + && sym.info.memberNames(abstractTypeNameFilter).nonEmpty + + /** Set every context bound evidence parameter of a class to be tracked, + * provided it has a type that has 
an abstract type member. Reset private and local flags + * so that the parameter becomes a `val`. + */ + def setTracked(param: ValDef): Unit = + val sym = symbolOfTree(param) + sym.maybeOwner.maybeOwner.infoOrCompleter match + case info: TempClassInfo if needsTracked(sym, param) => + typr.println(i"set tracked $param, $sym: ${sym.info} containing ${sym.info.memberNames(abstractTypeNameFilter).toList}") + for acc <- info.decls.lookupAll(sym.name) if acc.is(ParamAccessor) do + acc.resetFlag(PrivateLocal) + acc.setFlag(Tracked) + sym.setFlag(Tracked) + case _ => + def wrapMethType(restpe: Type): Type = instantiateDependent(restpe, paramSymss) methodType(paramSymss, restpe, ddef.mods.is(JavaDefined)) @@ -1893,10 +1915,18 @@ class Namer { typer: Typer => wrapMethType(addParamRefinements(restpe, paramSymss)) if isConstructor then + if sym.isPrimaryConstructor && Feature.enabled(modularity) then + ddef.termParamss.foreach(_.foreach(setTracked)) // set result type tree to unit, but take the current class as result type of the symbol typedAheadType(ddef.tpt, defn.UnitType) wrapMethType(effectiveResultType(sym, paramSymss)) else if sym.isAllOf(Given | Method) && Feature.enabled(modularity) then + // set every context bound evidence parameter of a given companion method + // to be tracked, provided it has a type that has an abstract type member. + // Add refinements for all tracked parameters to the result type. + for params <- ddef.termParamss; param <- params do + val psym = symbolOfTree(param) + if needsTracked(psym, param) then psym.setFlag(Tracked) valOrDefDefSig(ddef, sym, paramSymss, wrapRefinedMethType) else valOrDefDefSig(ddef, sym, paramSymss, wrapMethType) diff --git a/tests/pos/hylolib-cb/AnyCollection.scala b/tests/pos/hylolib-cb/AnyCollection.scala index 1a44344d0e51..50f4313e46ce 100644 --- a/tests/pos/hylolib-cb/AnyCollection.scala +++ b/tests/pos/hylolib-cb/AnyCollection.scala @@ -14,7 +14,7 @@ final class AnyCollection[Element] private ( object AnyCollection { /** Creates an instance forwarding its operations to `base`. */ - def apply[Base](using b: Collection[Base])(base: Base): AnyCollection[b.Element] = + def apply[Base: Collection as b](base: Base): AnyCollection[b.Element] = // NOTE: This evidence is redefined so the compiler won't report ambiguity between `intIsValue` // and `anyValueIsValue` when the method is called on a collection of `Int`s. None of these // choices is even correct! Note also that the ambiguity is suppressed if the constructor of @@ -42,7 +42,7 @@ object AnyCollection { } -given anyCollectionIsCollection[T](using tIsValue: Value[T]): Collection[AnyCollection[T]] with { +given anyCollectionIsCollection[T: Value]: Collection[AnyCollection[T]] with { type Element = T type Position = AnyValue diff --git a/tests/pos/hylolib-cb/Collection.scala b/tests/pos/hylolib-cb/Collection.scala index 073a99cdd16b..2fc04f02b9ac 100644 --- a/tests/pos/hylolib-cb/Collection.scala +++ b/tests/pos/hylolib-cb/Collection.scala @@ -89,7 +89,7 @@ trait Collection[Self] { } -extension [Self](self: Self)(using s: Collection[Self]) { +extension [Self: Collection as s](self: Self) { /** Returns the first element of `self` along with a slice containing the suffix after this * element, or `None` if `self` is empty. @@ -148,7 +148,7 @@ extension [Self](self: Self)(using s: Collection[Self]) { * @complexity * O(n) where n is the number of elements in `self`. 
*/ - def map[T](using Value[T])(transform: (s.Element) => T): HyArray[T] = + def map[T: Value](transform: (s.Element) => T): HyArray[T] = self.reduce( HyArray[T](), (r, e) => r.append(transform(e), assumeUniqueness = true) @@ -257,9 +257,8 @@ extension [Self](self: Self)(using s: Collection[Self]) { } -extension [Self](self: Self)(using - s: Collection[Self], - e: Value[s.Element] +extension [Self: Collection as s](self: Self)(using + Value[s.Element] ) { /** Returns `true` if `self` contains the same elements as `other`, in the same order. */ diff --git a/tests/pos/hylolib-cb/HyArray.scala b/tests/pos/hylolib-cb/HyArray.scala index 9347f7eb12cc..0fff45e744ec 100644 --- a/tests/pos/hylolib-cb/HyArray.scala +++ b/tests/pos/hylolib-cb/HyArray.scala @@ -1,12 +1,11 @@ +//> using options -language:experimental.modularity -source future package hylo import java.util.Arrays import scala.collection.mutable /** An ordered, random-access collection. */ -final class HyArray[Element] private (using - elementIsValue: Value[Element] -)( +final class HyArray[Element: Value as elementIsCValue]( private var _storage: scala.Array[AnyRef | Null] | Null, private var _count: Int // NOTE: where do I document private fields ) { @@ -155,14 +154,14 @@ final class HyArray[Element] private (using object HyArray { /** Creates an array with the given `elements`. */ - def apply[T](using t: Value[T])(elements: T*): HyArray[T] = + def apply[T: Value](elements: T*): HyArray[T] = var a = new HyArray[T](null, 0) for (e <- elements) a = a.append(e, assumeUniqueness = true) a } -given hyArrayIsValue[T](using tIsValue: Value[T]): Value[HyArray[T]] with { +given [T: Value] => Value[HyArray[T]] with { extension (self: HyArray[T]) { @@ -179,7 +178,7 @@ given hyArrayIsValue[T](using tIsValue: Value[T]): Value[HyArray[T]] with { } -given hyArrayIsCollection[T](using tIsValue: Value[T]): Collection[HyArray[T]] with { +given [T: Value] => Collection[HyArray[T]] with { type Element = T type Position = Int diff --git a/tests/pos/hylolib-cb/Slice.scala b/tests/pos/hylolib-cb/Slice.scala index 2289ac2a085b..b577ceeb3739 100644 --- a/tests/pos/hylolib-cb/Slice.scala +++ b/tests/pos/hylolib-cb/Slice.scala @@ -1,9 +1,7 @@ package hylo /** A view into a collection. */ -final class Slice[Base](using - val b: Collection[Base] -)( +final class Slice[Base: Collection as b]( val base: Base, val bounds: Range[b.Position] ) { @@ -26,7 +24,7 @@ final class Slice[Base](using } -given sliceIsCollection[T](using c: Collection[T]): Collection[Slice[T]] with { +given sliceIsCollection[T: Collection as c]: Collection[Slice[T]] with { type Element = c.Element type Position = c.Position From d856e50a0a9e061c28b361e00788fc925ec80ab0 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 18:01:56 +0200 Subject: [PATCH 221/371] FIX: Fix typing of RefinedTypes with watching parents If a refined type has a parent type watching some other type, the parent should not be mapped to Object. Previously, the parent counted as `isEmpty` which caused this mapping. 
Fixes #10929 [Cherry-picked 11d7fa39372c430220f1818632ff1fe0c25ba60d] --- .../src/dotty/tools/dotc/typer/Typer.scala | 2 +- tests/pos/hylolib-deferred-given/Hasher.scala | 1 + tests/pos/i10929.scala | 21 +++++++++++++++++++ tests/pos/i13580.scala | 13 ++++++++++++ 4 files changed, 36 insertions(+), 1 deletion(-) create mode 100644 tests/pos/i10929.scala create mode 100644 tests/pos/i13580.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index c467a4507730..f744eb392d7c 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2301,7 +2301,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer } def typedRefinedTypeTree(tree: untpd.RefinedTypeTree)(using Context): TypTree = { - val tpt1 = if (tree.tpt.isEmpty) TypeTree(defn.ObjectType) else typedAheadType(tree.tpt) + val tpt1 = if tree.tpt == EmptyTree then TypeTree(defn.ObjectType) else typedAheadType(tree.tpt) val refineClsDef = desugar.refinedTypeToClass(tpt1, tree.refinements).withSpan(tree.span) val refineCls = createSymbol(refineClsDef).asClass val TypeDef(_, impl: Template) = typed(refineClsDef): @unchecked diff --git a/tests/pos/hylolib-deferred-given/Hasher.scala b/tests/pos/hylolib-deferred-given/Hasher.scala index ef6813df6b60..ca45550ed002 100644 --- a/tests/pos/hylolib-deferred-given/Hasher.scala +++ b/tests/pos/hylolib-deferred-given/Hasher.scala @@ -1,3 +1,4 @@ +//> using options -language:experimental.modularity -source future package hylo import scala.util.Random diff --git a/tests/pos/i10929.scala b/tests/pos/i10929.scala new file mode 100644 index 000000000000..e916e4547e59 --- /dev/null +++ b/tests/pos/i10929.scala @@ -0,0 +1,21 @@ +//> using options -language:experimental.modularity -source future +infix abstract class TupleOf[T, +A]: + type Mapped[+A] <: Tuple + def map[B](x: T)(f: A => B): Mapped[B] + +object TupleOf: + + given TupleOf[EmptyTuple, Nothing] with + type Mapped[+A] = EmptyTuple + def map[B](x: EmptyTuple)(f: Nothing => B): Mapped[B] = x + + given [A, Rest <: Tuple](using tracked val tup: Rest TupleOf A): TupleOf[A *: Rest, A] with + type Mapped[+A] = A *: tup.Mapped[A] + def map[B](x: A *: Rest)(f: A => B): Mapped[B] = + (f(x.head) *: tup.map(x.tail)(f)) + +def foo[T](xs: T)(using tup: T TupleOf Int): tup.Mapped[Int] = tup.map(xs)(_ + 1) + +@main def test = + foo(EmptyTuple): EmptyTuple // ok + foo(1 *: EmptyTuple): Int *: EmptyTuple // now also ok \ No newline at end of file diff --git a/tests/pos/i13580.scala b/tests/pos/i13580.scala new file mode 100644 index 000000000000..c3c491a19dbe --- /dev/null +++ b/tests/pos/i13580.scala @@ -0,0 +1,13 @@ +//> using options -language:experimental.modularity -source future +trait IntWidth: + type Out +given IntWidth: + type Out = 155 + +trait IntCandidate: + type Out +given (using tracked val w: IntWidth) => IntCandidate: + type Out = w.Out + +val x = summon[IntCandidate] +val xx = summon[x.Out =:= 155] From 5fe6b5bb8e0428b3a32ee350e85f0709e1395d89 Mon Sep 17 00:00:00 2001 From: odersky Date: Sat, 6 Jan 2024 13:53:17 +0100 Subject: [PATCH 222/371] Also reduce term projections We already reduce `R { type A = T } # A` to `T` in most situations when we create types. We now also reduce `R { val x: S } # x` to `S` if `S` is a singleton type. This will simplify types as we go to more term-dependent typing. 
As a concrete benefit, it will avoid several test-pickling failures due to pickling differences when using dependent types. [Cherry-picked 96fbf2942a296df3f63b05e2503f6a1a904e28cf] --- .../src/dotty/tools/dotc/core/Types.scala | 66 +++++++++---------- 1 file changed, 32 insertions(+), 34 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index a6136a20cf32..ac3aef2a59d2 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -1642,17 +1642,19 @@ object Types extends TypeUtils { * * P { ... type T = / += / -= U ... } # T * - * to just U. Does not perform the reduction if the resulting type would contain - * a reference to the "this" of the current refined type, except in the following situation + * to just U. Analogously, `P { val x: S} # x` is reduced tp `S` is `S` + * is a singleton type. * - * (1) The "this" reference can be avoided by following an alias. Example: + * Does not perform the reduction if the resulting type would contain + * a reference to the "this" of the current refined type, except if the "this" + * reference can be avoided by following an alias. Example: * * P { type T = String, type R = P{...}.T } # R --> String * * (*) normalizes means: follow instantiated typevars and aliases. */ - def lookupRefined(name: Name)(using Context): Type = { - @tailrec def loop(pre: Type): Type = pre.stripTypeVar match { + def lookupRefined(name: Name)(using Context): Type = + @tailrec def loop(pre: Type): Type = pre match case pre: RefinedType => pre.refinedInfo match { case tp: AliasingBounds => @@ -1675,12 +1677,13 @@ object Types extends TypeUtils { case TypeAlias(alias) => loop(alias) case _ => NoType } + case pre: (TypeVar | AnnotatedType) => + loop(pre.underlying) case _ => NoType - } loop(this) - } + end lookupRefined /** The type , reduced if possible */ def select(name: Name)(using Context): Type = @@ -2820,35 +2823,30 @@ object Types extends TypeUtils { def derivedSelect(prefix: Type)(using Context): Type = if prefix eq this.prefix then this else if prefix.isExactlyNothing then prefix - else { - val res = - if (isType && currentValidSymbol.isAllOf(ClassTypeParam)) argForParam(prefix) + else + val reduced = + if isType && currentValidSymbol.isAllOf(ClassTypeParam) then argForParam(prefix) else prefix.lookupRefined(name) - if (res.exists) return res - if (isType) { - if (Config.splitProjections) - prefix match { - case prefix: AndType => - def isMissing(tp: Type) = tp match { - case tp: TypeRef => !tp.info.exists - case _ => false - } - val derived1 = derivedSelect(prefix.tp1) - val derived2 = derivedSelect(prefix.tp2) - return ( - if (isMissing(derived1)) derived2 - else if (isMissing(derived2)) derived1 - else prefix.derivedAndType(derived1, derived2)) - case prefix: OrType => - val derived1 = derivedSelect(prefix.tp1) - val derived2 = derivedSelect(prefix.tp2) - return prefix.derivedOrType(derived1, derived2) - case _ => - } - } - if (prefix.isInstanceOf[WildcardType]) WildcardType.sameKindAs(this) + if reduced.exists then return reduced + if Config.splitProjections && isType then + prefix match + case prefix: AndType => + def isMissing(tp: Type) = tp match + case tp: TypeRef => !tp.info.exists + case _ => false + val derived1 = derivedSelect(prefix.tp1) + val derived2 = derivedSelect(prefix.tp2) + return + if isMissing(derived1) then derived2 + else if isMissing(derived2) then derived1 + else prefix.derivedAndType(derived1, derived2) + case prefix: 
OrType => + val derived1 = derivedSelect(prefix.tp1) + val derived2 = derivedSelect(prefix.tp2) + return prefix.derivedOrType(derived1, derived2) + case _ => + if prefix.isInstanceOf[WildcardType] then WildcardType.sameKindAs(this) else withPrefix(prefix) - } /** A reference like this one, but with the given symbol, if it exists */ private def withSym(sym: Symbol)(using Context): ThisType = From becdf887a5a581b35386f605db702949326f1f6e Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 20:43:48 +0200 Subject: [PATCH 223/371] Implement context bound companions [Cherry-picked ce09ef3bc4a49c4f851b3f8ab3c4b3c2ba64bb7d] --- .../src/dotty/tools/dotc/ast/Desugar.scala | 50 ++++++++---- .../src/dotty/tools/dotc/ast/TreeInfo.scala | 31 ++++++++ .../src/dotty/tools/dotc/core/Contexts.scala | 13 ++-- .../dotty/tools/dotc/core/Definitions.scala | 9 +++ .../src/dotty/tools/dotc/core/NamerOps.scala | 53 +++++++++++++ .../src/dotty/tools/dotc/core/StdNames.scala | 2 + .../src/dotty/tools/dotc/core/SymUtils.scala | 3 + .../tools/dotc/core/tasty/TreeUnpickler.scala | 1 + .../tools/dotc/printing/PlainPrinter.scala | 4 +- .../tools/dotc/reporting/ErrorMessageID.scala | 4 +- .../dotty/tools/dotc/reporting/messages.scala | 36 +++++++++ .../tools/dotc/transform/PostTyper.scala | 22 ++++-- .../tools/dotc/transform/TreeChecker.scala | 21 ++--- .../src/dotty/tools/dotc/typer/Namer.scala | 35 +++++++-- .../src/dotty/tools/dotc/typer/Typer.scala | 76 +++++++++++++++++++ .../annotation/internal/WitnessNames.scala | 53 +++++++++++++ project/MiMaFilters.scala | 2 + tests/neg/cb-companion-leaks.check | 66 ++++++++++++++++ tests/neg/cb-companion-leaks.scala | 16 ++++ tests/pos-macros/i8325/Macro_1.scala | 4 +- tests/pos-macros/i8325/Test_2.scala | 2 +- tests/pos-macros/i8325b/Macro_1.scala | 4 +- tests/pos-macros/i8325b/Test_2.scala | 2 +- tests/pos/FromString-cb-companion.scala | 14 ++++ tests/pos/cb-companion-joins.scala | 21 +++++ 25 files changed, 496 insertions(+), 48 deletions(-) create mode 100644 library/src/scala/annotation/internal/WitnessNames.scala create mode 100644 tests/neg/cb-companion-leaks.check create mode 100644 tests/neg/cb-companion-leaks.scala create mode 100644 tests/pos/FromString-cb-companion.scala create mode 100644 tests/pos/cb-companion-joins.scala diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index d6e442ed4a0c..08953f1dec6b 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -257,7 +257,16 @@ object desugar { case _ => rhs - cpy.TypeDef(tdef)(rhs = desugarRhs(tdef.rhs)) + val tdef1 = cpy.TypeDef(tdef)(rhs = desugarRhs(tdef.rhs)) + if Feature.enabled(Feature.modularity) + && evidenceNames.nonEmpty + && !evidenceNames.contains(tdef.name.toTermName) + && !allParamss.nestedExists(_.name == tdef.name.toTermName) + then + tdef1.withAddedAnnotation: + WitnessNamesAnnot(evidenceNames.toList).withSpan(tdef.span) + else + tdef1 end desugarContextBounds private def elimContextBounds(meth: DefDef, isPrimaryConstructor: Boolean)(using Context): DefDef = @@ -323,9 +332,9 @@ object desugar { def getterParamss(n: Int): List[ParamClause] = mapParamss(takeUpTo(paramssNoRHS, n)) { - tparam => dropContextBounds(toDefParam(tparam, keepAnnotations = true)) + tparam => dropContextBounds(toDefParam(tparam, KeepAnnotations.All)) } { - vparam => toDefParam(vparam, keepAnnotations = true, keepDefault = false) + vparam => toDefParam(vparam, KeepAnnotations.All, keepDefault = 
false) } def defaultGetters(paramss: List[ParamClause], n: Int): List[DefDef] = paramss match @@ -430,7 +439,12 @@ object desugar { private def addEvidenceParams(meth: DefDef, params: List[ValDef])(using Context): DefDef = if params.isEmpty then return meth - val boundNames = params.map(_.name).toSet + var boundNames = params.map(_.name).toSet + for mparams <- meth.paramss; mparam <- mparams do + mparam match + case tparam: TypeDef if tparam.mods.annotations.exists(WitnessNamesAnnot.unapply(_).isDefined) => + boundNames += tparam.name.toTermName + case _ => //println(i"add ev params ${meth.name}, ${boundNames.toList}") @@ -463,16 +477,26 @@ object desugar { @sharable private val synthetic = Modifiers(Synthetic) + /** Which annotations to keep in derived parameters */ + private enum KeepAnnotations: + case None, All, WitnessOnly + /** Filter annotations in `mods` according to `keep` */ - private def filterAnnots(mods: Modifiers, keep: Boolean)(using Context) = - if keep then mods else mods.withAnnotations(Nil) + private def filterAnnots(mods: Modifiers, keep: KeepAnnotations)(using Context) = keep match + case KeepAnnotations.None => mods.withAnnotations(Nil) + case KeepAnnotations.All => mods + case KeepAnnotations.WitnessOnly => + mods.withAnnotations: + mods.annotations.filter: + case WitnessNamesAnnot(_) => true + case _ => false - private def toDefParam(tparam: TypeDef, keepAnnotations: Boolean)(using Context): TypeDef = - val mods = filterAnnots(tparam.rawMods, keepAnnotations) + private def toDefParam(tparam: TypeDef, keep: KeepAnnotations)(using Context): TypeDef = + val mods = filterAnnots(tparam.rawMods, keep) tparam.withMods(mods & EmptyFlags | Param) - private def toDefParam(vparam: ValDef, keepAnnotations: Boolean, keepDefault: Boolean)(using Context): ValDef = { - val mods = filterAnnots(vparam.rawMods, keepAnnotations) + private def toDefParam(vparam: ValDef, keep: KeepAnnotations, keepDefault: Boolean)(using Context): ValDef = { + val mods = filterAnnots(vparam.rawMods, keep) val hasDefault = if keepDefault then HasDefault else EmptyFlags // Need to ensure that tree is duplicated since term parameters can be watched // and cloning a term parameter will copy its watchers to the clone, which means @@ -573,7 +597,7 @@ object desugar { // Annotations on class _type_ parameters are set on the derived parameters // but not on the constructor parameters. The reverse is true for // annotations on class _value_ parameters. 
- val constrTparams = impliedTparams.map(toDefParam(_, keepAnnotations = false)) + val constrTparams = impliedTparams.map(toDefParam(_, KeepAnnotations.WitnessOnly)) val constrVparamss = if (originalVparamss.isEmpty) { // ensure parameter list is non-empty if (isCaseClass) @@ -584,7 +608,7 @@ object desugar { report.error(CaseClassMissingNonImplicitParamList(cdef), namePos) ListOfNil } - else originalVparamss.nestedMap(toDefParam(_, keepAnnotations = true, keepDefault = true)) + else originalVparamss.nestedMap(toDefParam(_, KeepAnnotations.All, keepDefault = true)) val derivedTparams = constrTparams.zipWithConserve(impliedTparams)((tparam, impliedParam) => derivedTypeParam(tparam).withAnnotations(impliedParam.mods.annotations)) @@ -606,7 +630,7 @@ object desugar { defDef( addEvidenceParams( cpy.DefDef(ddef)(paramss = joinParams(constrTparams, ddef.paramss)), - evidenceParams(constr1).map(toDefParam(_, keepAnnotations = false, keepDefault = false))))) + evidenceParams(constr1).map(toDefParam(_, KeepAnnotations.None, keepDefault = false))))) case stat => stat } diff --git a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala index 941e7b8f1219..990fb37f4e60 100644 --- a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala +++ b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala @@ -5,6 +5,8 @@ package ast import core.* import Flags.*, Trees.*, Types.*, Contexts.* import Names.*, StdNames.*, NameOps.*, Symbols.* +import Annotations.Annotation +import NameKinds.ContextBoundParamName import typer.ConstFold import reporting.trace @@ -380,6 +382,35 @@ trait TreeInfo[T <: Untyped] { self: Trees.Instance[T] => case _ => tree.tpe.isInstanceOf[ThisType] } + + /** Extractor for annotation.internal.WitnessNames(name_1, ..., name_n)` + * represented as an untyped or typed tree. 
+ */ + object WitnessNamesAnnot: + def apply(names0: List[TermName])(using Context): untpd.Tree = + untpd.TypedSplice(tpd.New( + defn.WitnessNamesAnnot.typeRef, + tpd.SeqLiteral(names0.map(n => tpd.Literal(Constant(n.toString))), tpd.TypeTree(defn.StringType)) :: Nil + )) + + def unapply(tree: Tree)(using Context): Option[List[TermName]] = + def isWitnessNames(tp: Type) = tp match + case tp: TypeRef => + tp.name == tpnme.WitnessNames && tp.symbol == defn.WitnessNamesAnnot + case _ => + false + unsplice(tree) match + case Apply( + Select(New(tpt: tpd.TypeTree), nme.CONSTRUCTOR), + SeqLiteral(elems, _) :: Nil + ) if isWitnessNames(tpt.tpe) => + Some: + elems.map: + case Literal(Constant(str: String)) => + ContextBoundParamName.unmangle(str.toTermName.asSimpleName) + case _ => + None + end WitnessNamesAnnot } trait UntypedTreeInfo extends TreeInfo[Untyped] { self: Trees.Instance[Untyped] => diff --git a/compiler/src/dotty/tools/dotc/core/Contexts.scala b/compiler/src/dotty/tools/dotc/core/Contexts.scala index d0c30a665289..a5b0e2dba254 100644 --- a/compiler/src/dotty/tools/dotc/core/Contexts.scala +++ b/compiler/src/dotty/tools/dotc/core/Contexts.scala @@ -12,6 +12,7 @@ import Symbols.* import Scopes.* import Uniques.* import ast.Trees.* +import Flags.ParamAccessor import ast.untpd import util.{NoSource, SimpleIdentityMap, SourceFile, HashSet, ReusableInstance} import typer.{Implicits, ImportInfo, SearchHistory, SearchRoot, TypeAssigner, Typer, Nullables} @@ -399,7 +400,8 @@ object Contexts { * * - as owner: The primary constructor of the class * - as outer context: The context enclosing the class context - * - as scope: The parameter accessors in the class context + * - as scope: type parameters, the parameter accessors, and + * the context bound companions in the class context, * * The reasons for this peculiar choice of attributes are as follows: * @@ -413,10 +415,11 @@ object Contexts { * context see the constructor parameters instead, but then we'd need a final substitution step * from constructor parameters to class parameter accessors. */ - def superCallContext: Context = { - val locals = newScopeWith(owner.typeParams ++ owner.asClass.paramAccessors*) - superOrThisCallContext(owner.primaryConstructor, locals) - } + def superCallContext: Context = + val locals = owner.typeParams + ++ owner.asClass.unforcedDecls.filter: sym => + sym.is(ParamAccessor) || sym.isContextBoundCompanion + superOrThisCallContext(owner.primaryConstructor, newScopeWith(locals*)) /** The context for the arguments of a this(...) constructor call. * The context is computed from the local auxiliary constructor context. 
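For orientation, here is a minimal user-level sketch of what context bound companions enable (hypothetical code in the spirit of the tests/pos/cb-companion-joins.scala test added later in this commit; `M`, `f` and `xs` are illustrative names, not part of the patch):

    //> using options -language:experimental.modularity -source future

    trait M[Self]:
      def unit: Self
      extension (x: Self) def combine(y: Self): Self

    def f[A: M](xs: List[A]): A =
      // `A` in term position refers to the synthesized context bound companion;
      // a selection such as `A.unit` is forwarded to the `M[A]` witness in scope.
      xs.foldLeft(A.unit)(_.combine(_))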
diff --git a/compiler/src/dotty/tools/dotc/core/Definitions.scala b/compiler/src/dotty/tools/dotc/core/Definitions.scala index 9ee5891f1606..b408883009ab 100644 --- a/compiler/src/dotty/tools/dotc/core/Definitions.scala +++ b/compiler/src/dotty/tools/dotc/core/Definitions.scala @@ -459,6 +459,13 @@ class Definitions { @tu lazy val andType: TypeSymbol = enterBinaryAlias(tpnme.AND, AndType(_, _)) @tu lazy val orType: TypeSymbol = enterBinaryAlias(tpnme.OR, OrType(_, _, soft = false)) + @tu lazy val CBCompanion: TypeSymbol = // type ``[-Refs] + enterPermanentSymbol(tpnme.CBCompanion, + TypeBounds(NothingType, + HKTypeLambda(tpnme.syntheticTypeParamName(0) :: Nil, Contravariant :: Nil)( + tl => TypeBounds.empty :: Nil, + tl => AnyType))).asType + /** Method representing a throw */ @tu lazy val throwMethod: TermSymbol = enterMethod(OpsPackageClass, nme.THROWkw, MethodType(List(ThrowableType), NothingType)) @@ -1062,6 +1069,7 @@ class Definitions { @tu lazy val RetainsByNameAnnot: ClassSymbol = requiredClass("scala.annotation.retainsByName") @tu lazy val RetainsArgAnnot: ClassSymbol = requiredClass("scala.annotation.retainsArg") @tu lazy val PublicInBinaryAnnot: ClassSymbol = requiredClass("scala.annotation.publicInBinary") + @tu lazy val WitnessNamesAnnot: ClassSymbol = requiredClass("scala.annotation.internal.WitnessNames") @tu lazy val JavaRepeatableAnnot: ClassSymbol = requiredClass("java.lang.annotation.Repeatable") @@ -2158,6 +2166,7 @@ class Definitions { NullClass, NothingClass, SingletonClass, + CBCompanion, MaybeCapabilityAnnot) @tu lazy val syntheticCoreClasses: List[Symbol] = syntheticScalaClasses ++ List( diff --git a/compiler/src/dotty/tools/dotc/core/NamerOps.scala b/compiler/src/dotty/tools/dotc/core/NamerOps.scala index af03573da4a8..58b4ad681c6f 100644 --- a/compiler/src/dotty/tools/dotc/core/NamerOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NamerOps.scala @@ -4,8 +4,10 @@ package core import Contexts.*, Symbols.*, Types.*, Flags.*, Scopes.*, Decorators.*, Names.*, NameOps.* import SymDenotations.{LazyType, SymDenotation}, StdNames.nme +import ContextOps.enter import TypeApplications.EtaExpansion import collection.mutable +import config.Printers.typr /** Operations that are shared between Namer and TreeUnpickler */ object NamerOps: @@ -256,4 +258,55 @@ object NamerOps: rhsCtx.gadtState.addBound(psym, tr, isUpper = true) } + /** Create a context-bound companion for type symbol `tsym`, which has a context + * bound that defines a set of witnesses with names `witnessNames`. + * + * @param parans If `tsym` is a type parameter, a list of parameter symbols + * that include all witnesses, otherwise the empty list. + * + * The context-bound companion has as name the name of `tsym` translated to + * a term name. We create a synthetic val of the form + * + * val A: ``[witnessRef1 | ... | witnessRefN] + * + * where + * + * is the CBCompanion type created in Definitions + * withnessRefK is a refence to the K'th witness. + * + * The companion has the same access flags as the original type. 
+ */ + def addContextBoundCompanionFor(tsym: Symbol, witnessNames: List[TermName], params: List[Symbol])(using Context): Unit = + val prefix = ctx.owner.thisType + val companionName = tsym.name.toTermName + val witnessRefs = + if params.nonEmpty then + witnessNames.map: witnessName => + prefix.select(params.find(_.name == witnessName).get) + else + witnessNames.map(TermRef(prefix, _)) + val cbtype = defn.CBCompanion.typeRef.appliedTo: + witnessRefs.reduce[Type](OrType(_, _, soft = false)) + val cbc = newSymbol( + ctx.owner, companionName, + (tsym.flagsUNSAFE & (AccessFlags)).toTermFlags | Synthetic, + cbtype) + typr.println(s"context bound companion created $cbc for $witnessNames in ${ctx.owner}") + ctx.enter(cbc) + end addContextBoundCompanionFor + + /** Add context bound companions to all context-bound types declared in + * this class. This assumes that these types already have their + * WitnessNames annotation set even before they are completed. This is + * the case for unpickling but currently not for Namer. So the method + * is only called during unpickling, and is not part of NamerOps. + */ + def addContextBoundCompanions(cls: ClassSymbol)(using Context): Unit = + for sym <- cls.info.decls do + if sym.isType && !sym.isClass then + for ann <- sym.annotationsUNSAFE do + if ann.symbol == defn.WitnessNamesAnnot then + ann.tree match + case ast.tpd.WitnessNamesAnnot(witnessNames) => + addContextBoundCompanionFor(sym, witnessNames, Nil) end NamerOps diff --git a/compiler/src/dotty/tools/dotc/core/StdNames.scala b/compiler/src/dotty/tools/dotc/core/StdNames.scala index c0eb8a690eb4..ab7e4eea0b46 100644 --- a/compiler/src/dotty/tools/dotc/core/StdNames.scala +++ b/compiler/src/dotty/tools/dotc/core/StdNames.scala @@ -288,6 +288,7 @@ object StdNames { // Compiler-internal val CAPTURE_ROOT: N = "cap" + val CBCompanion: N = "" val CONSTRUCTOR: N = "" val STATIC_CONSTRUCTOR: N = "" val EVT2U: N = "evt2u$" @@ -396,6 +397,7 @@ object StdNames { val TypeApply: N = "TypeApply" val TypeRef: N = "TypeRef" val UNIT : N = "UNIT" + val WitnessNames: N = "WitnessNames" val acc: N = "acc" val adhocExtensions: N = "adhocExtensions" val andThen: N = "andThen" diff --git a/compiler/src/dotty/tools/dotc/core/SymUtils.scala b/compiler/src/dotty/tools/dotc/core/SymUtils.scala index 65634241b790..3a97a0053dbd 100644 --- a/compiler/src/dotty/tools/dotc/core/SymUtils.scala +++ b/compiler/src/dotty/tools/dotc/core/SymUtils.scala @@ -87,6 +87,9 @@ class SymUtils: !d.isPrimitiveValueClass } + def isContextBoundCompanion(using Context): Boolean = + self.is(Synthetic) && self.infoOrCompleter.typeSymbol == defn.CBCompanion + /** Is this a case class for which a product mirror is generated? * Excluded are value classes, abstract classes and case classes with more than one * parameter section. 
diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala index 15f58956fbe3..91a5899146cc 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala @@ -1138,6 +1138,7 @@ class TreeUnpickler(reader: TastyReader, }) defn.patchStdLibClass(cls) NamerOps.addConstructorProxies(cls) + NamerOps.addContextBoundCompanions(cls) setSpan(start, untpd.Template(constr, mappedParents, self, lazyStats) .withType(localDummy.termRef)) diff --git a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala index 5808707326a0..c06b43cafe17 100644 --- a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala +++ b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala @@ -434,11 +434,11 @@ class PlainPrinter(_ctx: Context) extends Printer { sym.isEffectiveRoot || sym.isAnonymousClass || sym.name.isReplWrapperName /** String representation of a definition's type following its name, - * if symbol is completed, "?" otherwise. + * if symbol is completed, ": ?" otherwise. */ protected def toTextRHS(optType: Option[Type]): Text = optType match { case Some(tp) => toTextRHS(tp) - case None => "?" + case None => ": ?" } protected def decomposeLambdas(bounds: TypeBounds): (Text, TypeBounds) = diff --git a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala index e51f0a8b77ac..04380a7b8e4a 100644 --- a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala +++ b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala @@ -208,7 +208,9 @@ enum ErrorMessageID(val isActive: Boolean = true) extends java.lang.Enum[ErrorMe case UnstableInlineAccessorID // errorNumber: 192 case VolatileOnValID // errorNumber: 193 case ExtensionNullifiedByMemberID // errorNumber: 194 - case InlinedAnonClassWarningID // errorNumber: 195 + case ConstructorProxyNotValueID // errorNumber: 195 + case ContextBoundCompanionNotValueID // errorNumber: 196 + case InlinedAnonClassWarningID // errorNumber: 197 def errorNumber = ordinal - 1 diff --git a/compiler/src/dotty/tools/dotc/reporting/messages.scala b/compiler/src/dotty/tools/dotc/reporting/messages.scala index 51556a5c93ac..ceb8ecbc8e03 100644 --- a/compiler/src/dotty/tools/dotc/reporting/messages.scala +++ b/compiler/src/dotty/tools/dotc/reporting/messages.scala @@ -3203,3 +3203,39 @@ class VolatileOnVal()(using Context) extends SyntaxMsg(VolatileOnValID): protected def msg(using Context): String = "values cannot be volatile" protected def explain(using Context): String = "" + +class ConstructorProxyNotValue(sym: Symbol)(using Context) +extends TypeMsg(ConstructorProxyNotValueID): + protected def msg(using Context): String = + i"constructor proxy $sym cannot be used as a value" + protected def explain(using Context): String = + i"""A constructor proxy is a symbol made up by the compiler to represent a non-existent + |factory method of a class. For instance, in + | + | class C(x: Int) + | + |C does not have an apply method since it is not a case class. Yet one can + |still create instances with applications like `C(3)` which expand to `new C(3)`. + |The `C` in this call is a constructor proxy. 
It can only be used as applications + |but not as a stand-alone value.""" + +class ContextBoundCompanionNotValue(sym: Symbol)(using Context) +extends TypeMsg(ConstructorProxyNotValueID): + protected def msg(using Context): String = + i"context bound companion $sym cannot be used as a value" + protected def explain(using Context): String = + i"""A context bound companion is a symbol made up by the compiler to represent the + |witness or witnesses generated for the context bound(s) of a type parameter or type. + |For instance, in + | + | class Monoid extends SemiGroup: + | type Self + | def unit: Self + | + | type A: Monoid + | + |there is just a type `A` declared but not a value `A`. Nevertheless, one can write + |the selection `A.unit`, which works because the compiler created a context bound + |companion value with the (term-)name `A`. However, these context bound companions + |are not values themselves, they can only be referred to in selections.""" + diff --git a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala index 954b08c24ac1..a110ec53abc0 100644 --- a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +++ b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala @@ -279,9 +279,13 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => } } - def checkNoConstructorProxy(tree: Tree)(using Context): Unit = + def checkUsableAsValue(tree: Tree)(using Context): Unit = + def unusable(msg: Symbol => Message) = + report.error(msg(tree.symbol), tree.srcPos) if tree.symbol.is(ConstructorProxy) then - report.error(em"constructor proxy ${tree.symbol} cannot be used as a value", tree.srcPos) + unusable(ConstructorProxyNotValue(_)) + if tree.symbol.isContextBoundCompanion then + unusable(ContextBoundCompanionNotValue(_)) def checkStableSelection(tree: Tree)(using Context): Unit = def check(qual: Tree) = @@ -326,7 +330,7 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => if tree.isType then checkNotPackage(tree) else - checkNoConstructorProxy(tree) + checkUsableAsValue(tree) registerNeedsInlining(tree) tree.tpe match { case tpe: ThisType => This(tpe.cls).withSpan(tree.span) @@ -338,7 +342,7 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => Checking.checkRealizable(qual.tpe, qual.srcPos) withMode(Mode.Type)(super.transform(checkNotPackage(tree))) else - checkNoConstructorProxy(tree) + checkUsableAsValue(tree) transformSelect(tree, Nil) case tree: Apply => val methType = tree.fun.tpe.widen.asInstanceOf[MethodType] @@ -469,8 +473,14 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => val relativePath = util.SourceFile.relativePath(ctx.compilationUnit.source, reference) sym.addAnnotation(Annotation(defn.SourceFileAnnot, Literal(Constants.Constant(relativePath)), tree.span)) else - if !sym.is(Param) && !sym.owner.isOneOf(AbstractOrTrait) then - Checking.checkGoodBounds(tree.symbol) + if !sym.is(Param) then + if !sym.owner.isOneOf(AbstractOrTrait) then + Checking.checkGoodBounds(tree.symbol) + if sym.owner.isClass && sym.hasAnnotation(defn.WitnessNamesAnnot) then + val decls = sym.owner.info.decls + for cbCompanion <- decls.lookupAll(sym.name.toTermName) do + if cbCompanion.isContextBoundCompanion then + decls.openForMutations.unlink(cbCompanion) (tree.rhs, sym.info) match case (rhs: LambdaTypeTree, bounds: TypeBounds) => VarianceChecker.checkLambda(rhs, bounds) diff --git a/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala 
b/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala index 2ebe33a9a14f..c4e1c7892e8d 100644 --- a/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala +++ b/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala @@ -311,9 +311,11 @@ object TreeChecker { def assertDefined(tree: untpd.Tree)(using Context): Unit = if (tree.symbol.maybeOwner.isTerm) { val sym = tree.symbol + def isAllowed = // constructor proxies and context bound companions are flagged at PostTyper + isSymWithoutDef(sym) && ctx.phase.id < postTyperPhase.id assert( - nowDefinedSyms.contains(sym) || patBoundSyms.contains(sym), - i"undefined symbol ${sym} at line " + tree.srcPos.line + nowDefinedSyms.contains(sym) || patBoundSyms.contains(sym) || isAllowed, + i"undefined symbol ${sym} in ${sym.owner} at line " + tree.srcPos.line ) if (!ctx.phase.patternTranslated) @@ -384,6 +386,9 @@ object TreeChecker { case _ => } + def isSymWithoutDef(sym: Symbol)(using Context): Boolean = + sym.is(ConstructorProxy) || sym.isContextBoundCompanion + /** Exclude from double definition checks any erased symbols that were * made `private` in phase `UnlinkErasedDecls`. These symbols will be removed * completely in phase `Erasure` if they are defined in a currently compiled unit. @@ -614,14 +619,12 @@ object TreeChecker { val decls = cls.classInfo.decls.toList.toSet.filter(isNonMagicalMember) val defined = impl.body.map(_.symbol) - def isAllowed(sym: Symbol): Boolean = sym.is(ConstructorProxy) - - val symbolsNotDefined = (decls -- defined - constr.symbol).filterNot(isAllowed) + val symbolsMissingDefs = (decls -- defined - constr.symbol).filterNot(isSymWithoutDef) - assert(symbolsNotDefined.isEmpty, - i" $cls tree does not define members: ${symbolsNotDefined.toList}%, %\n" + - i"expected: ${decls.toList}%, %\n" + - i"defined: ${defined}%, %") + assert(symbolsMissingDefs.isEmpty, + i"""$cls tree does not define members: ${symbolsMissingDefs.toList}%, % + |expected: ${decls.toList}%, % + |defined: ${defined}%, %""") super.typedClassDef(cdef, cls) } diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 85678b9685f7..393b38c5ff57 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -406,6 +406,11 @@ class Namer { typer: Typer => enterSymbol(sym) setDocstring(sym, origStat) addEnumConstants(mdef, sym) + mdef match + case tdef: TypeDef if ctx.owner.isClass => + for case WitnessNamesAnnot(witnessNames) <- tdef.mods.annotations do + addContextBoundCompanionFor(symbolOfTree(tdef), witnessNames, Nil) + case _ => ctx case stats: Thicket => stats.toList.foreach(recur) @@ -1749,12 +1754,6 @@ class Namer { typer: Typer => val sym = tree.symbol if sym.isConstructor then sym.owner else sym - /** Enter and typecheck parameter list */ - def completeParams(params: List[MemberDef])(using Context): Unit = { - index(params) - for (param <- params) typedAheadExpr(param) - } - /** The signature of a module valdef. * This will compute the corresponding module class TypeRef immediately * without going through the defined type of the ValDef. This is necessary @@ -1853,6 +1852,30 @@ class Namer { typer: Typer => // Beware: ddef.name need not match sym.name if sym was freshened! 
val isConstructor = sym.name == nme.CONSTRUCTOR + val witnessNamesOfParam = mutable.Map[TypeDef, List[TermName]]() + if !ddef.name.is(DefaultGetterName) && !sym.is(Synthetic) then + for params <- ddef.paramss; case tdef: TypeDef <- params do + for case WitnessNamesAnnot(ws) <- tdef.mods.annotations do + witnessNamesOfParam(tdef) = ws + + /** Are all names in `wnames` defined by the longest prefix of all `params` + * that have been typed ahead (i.e. that carry the TypedAhead attachment)? + */ + def allParamsSeen(wnames: List[TermName], params: List[MemberDef]) = + (wnames.toSet[Name] -- params.takeWhile(_.hasAttachment(TypedAhead)).map(_.name)).isEmpty + + /** Enter and typecheck parameter list, add context companions as. + * Once all witness parameters for a context bound are seen, create a + * context bound companion for it. + */ + def completeParams(params: List[MemberDef])(using Context): Unit = + index(params) + for param <- params do + typedAheadExpr(param) + for (tdef, wnames) <- witnessNamesOfParam do + if wnames.contains(param.name) && allParamsSeen(wnames, params) then + addContextBoundCompanionFor(symbolOfTree(tdef), wnames, params.map(symbolOfTree)) + // The following 3 lines replace what was previously just completeParams(tparams). // But that can cause bad bounds being computed, as witnessed by // tests/pos/paramcycle.scala. The problematic sequence is this: diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index f744eb392d7c..37da51157e91 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -840,6 +840,12 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer return dynSelected.ensureConforms(fieldType) case _ => + // Otherwise, if the qualifier is a context bound companion, handle + // by selecting a witness in typedCBSelect + if qual.tpe.typeSymbol == defn.CBCompanion then + val witnessSelection = typedCBSelect(tree0, pt, qual) + if !witnessSelection.isEmpty then return witnessSelection + // Otherwise, report an error assignType(tree, rawType match @@ -849,6 +855,76 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer notAMemberErrorType(tree, qual, pt)) end typedSelect + /** Expand a selection A.m on a context bound companion A with type + * `[ref_1 | ... | ref_N]` as described by + * Step 3 of the doc comment of annotation.internal.WitnessNames. + * @return the best alternative if it exists, + * or EmptyTree if no witness admits selecting with the given name, + * or EmptyTree and report an ambiguity error of there are several + * possible witnesses and no selection is better than the other + * according to the critera given in Step 3. + */ + def typedCBSelect(tree: untpd.Select, pt: Type, qual: Tree)(using Context): Tree = + + type Alts = List[(/*prev: */Tree, /*prevState: */TyperState, /*prevWitness: */TermRef)] + + /** Compare two alternative selections `alt1` and `alt2` from witness types + * `wit1`, `wit2` according to the 3 criteria in the enclosing doc comment. I.e. 
+ * + * alt1 = qual1.m, alt2 = qual2.m, qual1: wit1, qual2: wit2 + * + * @return 1 if 1st alternative is preferred over 2nd + * -1 if 2nd alternative is preferred over 1st + * 0 if neither alternative is preferred over the other + */ + def compareAlts(alt1: Tree, alt2: Tree, wit1: TermRef, wit2: TermRef): Int = + val cmpPrefix = compare(wit1, wit2, preferGeneral = true) + typr.println(i"compare witnesses $wit1: ${wit1.info}, $wit2: ${wit2.info} = $cmpPrefix") + if cmpPrefix != 0 then cmpPrefix + else (alt1.tpe, alt2.tpe) match + case (tp1: TypeRef, tp2: TypeRef) => + if tp1.dealias == tp2.dealias then 1 else 0 + case (tp1: TermRef, tp2: TermRef) => + if tp1.info.isSingleton && (tp1 frozen_=:= tp2) then 1 + else compare(tp1, tp2, preferGeneral = false) + case (tp1: TermRef, _) => 1 + case (_, tp2: TermRef) => -1 + case _ => 0 + + /** Find the set of maximally preferred alternative among `prev` and the + * remaining alternatives generated from `witnesses` with is a union type + * of witness references. + */ + def tryAlts(prevs: Alts, witnesses: Type): Alts = witnesses match + case OrType(wit1, wit2) => + tryAlts(tryAlts(prevs, wit1), wit2) + case witness: TermRef => + val altQual = tpd.ref(witness).withSpan(qual.span) + val altCtx = ctx.fresh.setNewTyperState() + val alt = typedSelect(tree, pt, altQual)(using altCtx) + def current = (alt, altCtx.typerState, witness) + if altCtx.reporter.hasErrors then prevs + else + val cmps = prevs.map: (prevTree, prevState, prevWitness) => + compareAlts(prevTree, alt, prevWitness, witness) + if cmps.exists(_ == 1) then prevs + else current :: prevs.zip(cmps).collect{ case (prev, cmp) if cmp != -1 => prev } + + qual.tpe.widen match + case AppliedType(_, arg :: Nil) => + tryAlts(Nil, arg) match + case Nil => EmptyTree + case (best @ (bestTree, bestState, _)) :: Nil => + bestState.commit() + bestTree + case multiAlts => + report.error( + em"""Ambiguous witness reference. None of the following alternatives is more specific than the other: + |${multiAlts.map((alt, _, witness) => i"\n $witness.${tree.name}: ${alt.tpe.widen}")}""", + tree.srcPos) + EmptyTree + end typedCBSelect + def typedSelect(tree: untpd.Select, pt: Type)(using Context): Tree = { record("typedSelect") diff --git a/library/src/scala/annotation/internal/WitnessNames.scala b/library/src/scala/annotation/internal/WitnessNames.scala new file mode 100644 index 000000000000..f859cda96d06 --- /dev/null +++ b/library/src/scala/annotation/internal/WitnessNames.scala @@ -0,0 +1,53 @@ +package scala.annotation +package internal + +/** An annotation that is used for marking type definitions that should get + * context bound companions. The scheme is as follows: + * + * 1. When desugaring a context-bounded type A, add a @WitnessNames(n_1, ... , n_k) + * annotation to the type declaration node, where n_1, ..., n_k are the names of + * all the witnesses generated for the context bounds of A. This annotation will + * be pickled as usual. + * + * 2. During Namer or Unpickling, when encountering a type declaration A with + * a WitnessNames(n_1, ... , n_k) annotation, create a CB companion `val A` with + * rtype ``[ref_1 | ... | ref_k] where ref_i is a TermRef + * with the same prefix as A and name n_i. Except, don't do this if the type in + * question is a type parameter and there is already a term parameter with name A + * defined for the same method. 
+ * + * ContextBoundCompanion is defined as an internal abstract type like this: + * + * type ``[-Refs] + * + * The context bound companion's variance is negative, so that unons in the + * arguments are joined when encountering multiple definfitions and forming a glb. + * + * 3. Add a special case for typing a selection A.m on a value A of type + * ContextBoundCompanion[ref_1, ..., ref_k]. Namely, try to typecheck all + * selections ref_1.m, ..., ref_k.m with the expected type. There must be + * a unique selection ref_i.m that typechecks and such that for all other + * selections ref_j.m that also typecheck one of the following three criteria + * applies: + * + * 1. ref_i.m and ref_j.m are the same. This means: If they are types then + * ref_i.m is an alias of ref_j.m. If they are terms then they are both + * singleton types and ref_i.m =:= ref_j.m. + * 2. The underlying type (under widen) of ref_i is a true supertype of the + * underlying type of ref_j. + * 3. ref_i.m is a term, the underlying type of ref_j is not a strict subtype + * of the underlying type of ref_j, and the underlying type ref_i.m is a + * strict subtype of the underlying type of ref_j.m. + * + * If there is such a selection, map A.m to ref_i.m, otherwise report an error. + * + * (2) might surprise. It is the analogue of given disambiguation, where we also + * pick the most general candidate that matches the expected type. E.g. we have + * context bounds for Functor, Monad, and Applicable. In this case we want to + * select the `map` method of `Functor`. + * + * 4. At PostTyper, issue an error when encountering any reference to a CB companion. + */ +class WitnessNames(names: String*) extends StaticAnnotation + + diff --git a/project/MiMaFilters.scala b/project/MiMaFilters.scala index 3b28733226a0..6c3640eed12c 100644 --- a/project/MiMaFilters.scala +++ b/project/MiMaFilters.scala @@ -20,6 +20,8 @@ object MiMaFilters { ProblemFilters.exclude[MissingClassProblem]("scala.runtime.stdLibPatches.language$experimental$namedTuples$"), ProblemFilters.exclude[MissingFieldProblem]("scala.runtime.stdLibPatches.language#experimental.modularity"), ProblemFilters.exclude[MissingClassProblem]("scala.runtime.stdLibPatches.language$experimental$modularity$"), + ProblemFilters.exclude[DirectMissingMethodProblem]("scala.compiletime.package#package.deferred"), + ProblemFilters.exclude[MissingClassProblem]("scala.annotation.internal.WitnessNames"), ), // Additions since last LTS diff --git a/tests/neg/cb-companion-leaks.check b/tests/neg/cb-companion-leaks.check new file mode 100644 index 000000000000..156f8a7ab3ee --- /dev/null +++ b/tests/neg/cb-companion-leaks.check @@ -0,0 +1,66 @@ +-- [E194] Type Error: tests/neg/cb-companion-leaks.scala:9:23 ---------------------------------------------------------- +9 | def foo[A: {C, D}] = A // error + | ^ + | context bound companion value A cannot be used as a value + |--------------------------------------------------------------------------------------------------------------------- + | Explanation (enabled by `-explain`) + |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + | A context bound companion is a symbol made up by the compiler to represent the + | witness or witnesses generated for the context bound(s) of a type parameter or type. + | For instance, in + | + | class Monoid extends SemiGroup: + | type Self + | def unit: Self + | + | type A: Monoid + | + | there is just a type `A` declared but not a value `A`. 
Nevertheless, one can write + | the selection `A.unit`, which works because the compiler created a context bound + | companion value with the (term-)name `A`. However, these context bound companions + | are not values themselves, they can only be referred to in selections. + --------------------------------------------------------------------------------------------------------------------- +-- [E194] Type Error: tests/neg/cb-companion-leaks.scala:13:10 --------------------------------------------------------- +13 | val x = A // error + | ^ + | context bound companion value A cannot be used as a value + |-------------------------------------------------------------------------------------------------------------------- + | Explanation (enabled by `-explain`) + |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + | A context bound companion is a symbol made up by the compiler to represent the + | witness or witnesses generated for the context bound(s) of a type parameter or type. + | For instance, in + | + | class Monoid extends SemiGroup: + | type Self + | def unit: Self + | + | type A: Monoid + | + | there is just a type `A` declared but not a value `A`. Nevertheless, one can write + | the selection `A.unit`, which works because the compiler created a context bound + | companion value with the (term-)name `A`. However, these context bound companions + | are not values themselves, they can only be referred to in selections. + -------------------------------------------------------------------------------------------------------------------- +-- [E194] Type Error: tests/neg/cb-companion-leaks.scala:15:9 ---------------------------------------------------------- +15 | val y: A.type = ??? // error + | ^ + | context bound companion value A cannot be used as a value + |-------------------------------------------------------------------------------------------------------------------- + | Explanation (enabled by `-explain`) + |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + | A context bound companion is a symbol made up by the compiler to represent the + | witness or witnesses generated for the context bound(s) of a type parameter or type. + | For instance, in + | + | class Monoid extends SemiGroup: + | type Self + | def unit: Self + | + | type A: Monoid + | + | there is just a type `A` declared but not a value `A`. Nevertheless, one can write + | the selection `A.unit`, which works because the compiler created a context bound + | companion value with the (term-)name `A`. However, these context bound companions + | are not values themselves, they can only be referred to in selections. + -------------------------------------------------------------------------------------------------------------------- diff --git a/tests/neg/cb-companion-leaks.scala b/tests/neg/cb-companion-leaks.scala new file mode 100644 index 000000000000..07155edb05dc --- /dev/null +++ b/tests/neg/cb-companion-leaks.scala @@ -0,0 +1,16 @@ +//> using options -language:experimental.modularity -source future -explain + +class C[Self] + +class D[Self] + +trait Test: + + def foo[A: {C, D}] = A // error + + type A: C + + val x = A // error + + val y: A.type = ??? 
// error + diff --git a/tests/pos-macros/i8325/Macro_1.scala b/tests/pos-macros/i8325/Macro_1.scala index 18466e17b3df..92a54d21b00a 100644 --- a/tests/pos-macros/i8325/Macro_1.scala +++ b/tests/pos-macros/i8325/Macro_1.scala @@ -3,7 +3,7 @@ package a import scala.quoted.* -object A: +object O: inline def transform[A](inline expr: A): A = ${ transformImplExpr('expr) @@ -15,7 +15,7 @@ object A: import quotes.reflect.* expr.asTerm match { case Inlined(x,y,z) => transformImplExpr(z.asExpr.asInstanceOf[Expr[A]]) - case Apply(fun,args) => '{ A.pure(${Apply(fun,args).asExpr.asInstanceOf[Expr[A]]}) } + case Apply(fun,args) => '{ O.pure(${Apply(fun,args).asExpr.asInstanceOf[Expr[A]]}) } case other => expr } } diff --git a/tests/pos-macros/i8325/Test_2.scala b/tests/pos-macros/i8325/Test_2.scala index 8b0a74b11a08..90e88dfee341 100644 --- a/tests/pos-macros/i8325/Test_2.scala +++ b/tests/pos-macros/i8325/Test_2.scala @@ -3,7 +3,7 @@ package a class Test1 { def t1(): Unit = { - A.transform( + O.transform( s"a ${1} ${2}") } diff --git a/tests/pos-macros/i8325b/Macro_1.scala b/tests/pos-macros/i8325b/Macro_1.scala index 181efa260f9b..139abed94078 100644 --- a/tests/pos-macros/i8325b/Macro_1.scala +++ b/tests/pos-macros/i8325b/Macro_1.scala @@ -3,7 +3,7 @@ package a import scala.quoted.* -object A: +object O: inline def transform[A](inline expr: A): A = ${ transformImplExpr('expr) @@ -16,7 +16,7 @@ object A: expr.asTerm match { case Inlined(x,y,z) => transformImplExpr(z.asExpr.asInstanceOf[Expr[A]]) case r@Apply(fun,args) => '{ - A.pure(${r.asExpr.asInstanceOf[Expr[A]]}) } + O.pure(${r.asExpr.asInstanceOf[Expr[A]]}) } case other => expr } } diff --git a/tests/pos-macros/i8325b/Test_2.scala b/tests/pos-macros/i8325b/Test_2.scala index 8b0a74b11a08..90e88dfee341 100644 --- a/tests/pos-macros/i8325b/Test_2.scala +++ b/tests/pos-macros/i8325b/Test_2.scala @@ -3,7 +3,7 @@ package a class Test1 { def t1(): Unit = { - A.transform( + O.transform( s"a ${1} ${2}") } diff --git a/tests/pos/FromString-cb-companion.scala b/tests/pos/FromString-cb-companion.scala new file mode 100644 index 000000000000..d086420761ee --- /dev/null +++ b/tests/pos/FromString-cb-companion.scala @@ -0,0 +1,14 @@ +//> using options -language:experimental.modularity -source future + +trait FromString[Self]: + def fromString(s: String): Self + +given FromString[Int] = _.toInt + +given FromString[Double] = _.toDouble + +def add[N: {FromString, Numeric as num}](a: String, b: String): N = + N.plus( + num.plus(N.fromString(a), N.fromString(b)), + N.fromString(a) + ) \ No newline at end of file diff --git a/tests/pos/cb-companion-joins.scala b/tests/pos/cb-companion-joins.scala new file mode 100644 index 000000000000..97e0a8a7e4ac --- /dev/null +++ b/tests/pos/cb-companion-joins.scala @@ -0,0 +1,21 @@ +import language.experimental.modularity +import language.future + +trait M[Self]: + extension (x: Self) def combine (y: Self): String + def unit: Self + +trait Num[Self]: + def zero: Self + +trait A extends M[A] +trait B extends M[A] + +trait AA: + type X: M +trait BB: + type X: Num +class CC[X1: {M, Num}] extends AA, BB: + type X = X1 + X.zero + X.unit From 5f3ff9ff3295057b1344ade9c32932b8f8af4550 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 2 Apr 2024 23:33:36 +0200 Subject: [PATCH 224/371] Allow contecxt bounds with abstract `Self` types If a context bound type `T` for type parameter `A` does not have type parameters, demand evidence of type `T { type Self = A }` instead. 
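A minimal sketch of the rule stated above (hypothetical user code, close to the tests/pos/FromString.scala and tests/pos/deferredSummon.scala tests touched by this commit):

    //> using options -language:experimental.modularity -source future

    trait Ord:
      type Self
      def less(x: Self, y: Self): Boolean

    given Int is Ord:
      def less(x: Int, y: Int) = x < y

    // For `A: Ord`, the compiler now demands evidence of type Ord { type Self = A },
    // which can also be written `A is Ord` with the alias added to Predef below.
    def max[A: Ord as ord](x: A, y: A): A =
      if ord.less(x, y) then y else x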
[Cherry-picked c6388c2785f628b7e4a8680b6d4f1e7be0b0a925] --- .../src/dotty/tools/dotc/core/StdNames.scala | 1 + .../src/dotty/tools/dotc/typer/Typer.scala | 6 +- .../test/dotc/pos-test-pickling.blacklist | 7 +- .../scala/runtime/stdLibPatches/Predef.scala | 13 + tests/pos/FromString.scala | 15 + tests/pos/deferred-givens.scala | 12 +- tests/pos/deferredSummon.scala | 11 +- tests/pos/dep-context-bounds.scala | 11 +- tests/pos/hylolib-extract.scala | 29 ++ tests/pos/hylolib/AnyCollection.scala | 51 +++ tests/pos/hylolib/AnyValue.scala | 67 ++++ tests/pos/hylolib/AnyValueTests.scala | 15 + tests/pos/hylolib/BitArray.scala | 362 ++++++++++++++++++ tests/pos/hylolib/Collection.scala | 267 +++++++++++++ tests/pos/hylolib/CollectionTests.scala | 67 ++++ tests/pos/hylolib/CoreTraits.scala | 56 +++ tests/pos/hylolib/Hasher.scala | 39 ++ tests/pos/hylolib/HyArray.scala | 202 ++++++++++ tests/pos/hylolib/HyArrayTests.scala | 17 + tests/pos/hylolib/Integers.scala | 46 +++ tests/pos/hylolib/IntegersTests.scala | 14 + tests/pos/hylolib/Range.scala | 37 ++ tests/pos/hylolib/Slice.scala | 63 +++ tests/pos/hylolib/StringConvertible.scala | 9 + tests/pos/hylolib/Test.scala | 16 + tests/pos/i10929-new-syntax.scala | 22 ++ tests/pos/ord-over-tracked.scala | 15 + tests/pos/parsercombinators-arrow.scala | 48 +++ tests/pos/parsercombinators-ctx-bounds.scala | 49 +++ tests/pos/parsercombinators-new-syntax.scala | 45 +++ tests/pos/parsercombinators-this.scala | 53 +++ tests/pos/sets-tc.scala | 46 +++ tests/pos/typeclass-aggregates.scala | 32 +- tests/pos/typeclasses-arrow.scala | 140 +++++++ tests/pos/typeclasses-this.scala | 141 +++++++ tests/pos/typeclasses.scala | 47 ++- tests/run/for-desugar-strawman.scala | 96 +++++ tests/run/given-disambiguation.scala | 58 +++ tests/run/i15840.scala | 27 ++ 39 files changed, 2199 insertions(+), 53 deletions(-) create mode 100644 tests/pos/FromString.scala create mode 100644 tests/pos/hylolib-extract.scala create mode 100644 tests/pos/hylolib/AnyCollection.scala create mode 100644 tests/pos/hylolib/AnyValue.scala create mode 100644 tests/pos/hylolib/AnyValueTests.scala create mode 100644 tests/pos/hylolib/BitArray.scala create mode 100644 tests/pos/hylolib/Collection.scala create mode 100644 tests/pos/hylolib/CollectionTests.scala create mode 100644 tests/pos/hylolib/CoreTraits.scala create mode 100644 tests/pos/hylolib/Hasher.scala create mode 100644 tests/pos/hylolib/HyArray.scala create mode 100644 tests/pos/hylolib/HyArrayTests.scala create mode 100644 tests/pos/hylolib/Integers.scala create mode 100644 tests/pos/hylolib/IntegersTests.scala create mode 100644 tests/pos/hylolib/Range.scala create mode 100644 tests/pos/hylolib/Slice.scala create mode 100644 tests/pos/hylolib/StringConvertible.scala create mode 100644 tests/pos/hylolib/Test.scala create mode 100644 tests/pos/i10929-new-syntax.scala create mode 100644 tests/pos/ord-over-tracked.scala create mode 100644 tests/pos/parsercombinators-arrow.scala create mode 100644 tests/pos/parsercombinators-ctx-bounds.scala create mode 100644 tests/pos/parsercombinators-new-syntax.scala create mode 100644 tests/pos/parsercombinators-this.scala create mode 100644 tests/pos/sets-tc.scala create mode 100644 tests/pos/typeclasses-arrow.scala create mode 100644 tests/pos/typeclasses-this.scala create mode 100644 tests/run/for-desugar-strawman.scala create mode 100644 tests/run/given-disambiguation.scala create mode 100644 tests/run/i15840.scala diff --git a/compiler/src/dotty/tools/dotc/core/StdNames.scala 
b/compiler/src/dotty/tools/dotc/core/StdNames.scala index ab7e4eea0b46..b935488695e0 100644 --- a/compiler/src/dotty/tools/dotc/core/StdNames.scala +++ b/compiler/src/dotty/tools/dotc/core/StdNames.scala @@ -388,6 +388,7 @@ object StdNames { val RootPackage: N = "RootPackage" val RootClass: N = "RootClass" val Select: N = "Select" + val Self: N = "Self" val Shape: N = "Shape" val StringContext: N = "StringContext" val This: N = "This" diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 37da51157e91..6ac41ed619b6 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2366,9 +2366,13 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val tparam = untpd.Ident(tree.paramName).withSpan(tree.span) if tycon.tpe.typeParams.nonEmpty then typed(untpd.AppliedTypeTree(tyconSplice, tparam :: Nil)) + else if Feature.enabled(modularity) && tycon.tpe.member(tpnme.Self).symbol.isAbstractType then + val tparamSplice = untpd.TypedSplice(typedExpr(tparam)) + typed(untpd.RefinedTypeTree(tyconSplice, List(untpd.TypeDef(tpnme.Self, tparamSplice)))) else errorTree(tree, - em"""Illegal context bound: ${tycon.tpe} does not take type parameters.""") + em"""Illegal context bound: ${tycon.tpe} does not take type parameters and + |does not have an abstract type member named `Self` either.""") def typedSingletonTypeTree(tree: untpd.SingletonTypeTree)(using Context): SingletonTypeTree = { val ref1 = typedExpr(tree.ref, SingletonTypeProto) diff --git a/compiler/test/dotc/pos-test-pickling.blacklist b/compiler/test/dotc/pos-test-pickling.blacklist index e58277bdc0e5..d6f962176ecc 100644 --- a/compiler/test/dotc/pos-test-pickling.blacklist +++ b/compiler/test/dotc/pos-test-pickling.blacklist @@ -127,10 +127,11 @@ i20053b.scala # alias types at different levels of dereferencing parsercombinators-givens.scala parsercombinators-givens-2.scala +parsercombinators-ctx-bounds.scala +parsercombinators-this.scala parsercombinators-arrow.scala +parsercombinators-new-syntax.scala hylolib-deferred-given hylolib-cb - - - +hylolib diff --git a/library/src/scala/runtime/stdLibPatches/Predef.scala b/library/src/scala/runtime/stdLibPatches/Predef.scala index 7abd92e408f8..a68a628623bf 100644 --- a/library/src/scala/runtime/stdLibPatches/Predef.scala +++ b/library/src/scala/runtime/stdLibPatches/Predef.scala @@ -66,4 +66,17 @@ object Predef: extension (opt: Option.type) @experimental inline def fromNullable[T](t: T | Null): Option[T] = Option(t).asInstanceOf[Option[T]] + + /** A type supporting Self-based type classes. + * + * A is TC + * + * expands to + * + * TC { type Self = A } + * + * which is what is needed for a context bound `[A: TC]`. 
+ */ + infix type is[A <: AnyKind, B <: {type Self <: AnyKind}] = B { type Self = A } + end Predef diff --git a/tests/pos/FromString.scala b/tests/pos/FromString.scala new file mode 100644 index 000000000000..333a4c002989 --- /dev/null +++ b/tests/pos/FromString.scala @@ -0,0 +1,15 @@ +//> using options -language:experimental.modularity -source future + +trait FromString: + type Self + def fromString(s: String): Self + +given Int is FromString = _.toInt + +given Double is FromString = _.toDouble + +def add[N: {FromString, Numeric as num}](a: String, b: String): N = + N.plus( + num.plus(N.fromString(a), N.fromString(b)), + N.fromString(a) + ) \ No newline at end of file diff --git a/tests/pos/deferred-givens.scala b/tests/pos/deferred-givens.scala index 51fa43866d1e..b9018c97e151 100644 --- a/tests/pos/deferred-givens.scala +++ b/tests/pos/deferred-givens.scala @@ -1,9 +1,19 @@ //> using options -language:experimental.modularity -source future import compiletime.* class Ord[Elem] - given Ord[Double] +trait A: + type Elem : Ord + def foo = summon[Ord[Elem]] + +class AC extends A: + type Elem = Double + override given Ord[Elem] = ??? + +class AD extends A: + type Elem = Double + trait B: type Elem given Ord[Elem] = deferred diff --git a/tests/pos/deferredSummon.scala b/tests/pos/deferredSummon.scala index 31a9697eda6b..f8252576d81a 100644 --- a/tests/pos/deferredSummon.scala +++ b/tests/pos/deferredSummon.scala @@ -1,20 +1,21 @@ //> using options -language:experimental.modularity -source future import compiletime.deferred -trait Ord[Self]: +trait Ord: + type Self def less(x: Self, y: Self): Boolean trait A: type Elem - given Ord[Elem] = deferred - def foo = summon[Ord[Elem]] + given Elem is Ord = deferred + def foo = summon[Elem is Ord] trait B: type Elem: Ord - def foo = summon[Ord[Elem]] + def foo = summon[Elem is Ord] object Inst: - given Ord[Int]: + given Int is Ord: def less(x: Int, y: Int) = x < y object Test1: diff --git a/tests/pos/dep-context-bounds.scala b/tests/pos/dep-context-bounds.scala index 434805762622..c724d92e9809 100644 --- a/tests/pos/dep-context-bounds.scala +++ b/tests/pos/dep-context-bounds.scala @@ -1,6 +1,13 @@ //> using options -language:experimental.modularity -source future -trait A[X]: - type Self = X +trait A: + type Self + +object Test1: + def foo[X: A](x: X.Self) = ??? + + def bar[X: A](a: Int) = ??? + + def baz[X: A](a: Int)(using String) = ??? object Test2: def foo[X: A as x](a: x.Self) = ??? diff --git a/tests/pos/hylolib-extract.scala b/tests/pos/hylolib-extract.scala new file mode 100644 index 000000000000..846e52f30df6 --- /dev/null +++ b/tests/pos/hylolib-extract.scala @@ -0,0 +1,29 @@ +//> using options -language:experimental.modularity -source future +package hylotest + +trait Value: + type Self + extension (self: Self) def eq(other: Self): Boolean + +/** A collection of elements accessible by their position. */ +trait Collection: + type Self + + /** The type of the elements in the collection. */ + type Element: Value + +class BitArray + +given Boolean is Value: + extension (self: Self) def eq(other: Self): Boolean = + self == other + +given BitArray is Collection: + type Element = Boolean + +extension [Self: Value](self: Self) + def neq(other: Self): Boolean = !self.eq(other) + +extension [Self: Collection](self: Self) + def elementsEqual[T: Collection { type Element = Self.Element } ](other: T): Boolean = + ??? 
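As a reading aid for the `Predef.is` addition above: `A is TC` is by definition `TC { type Self = A }`, so the refinement form and the infix form name the same type, and a context bound on a `Self`-based trait can be declared, summoned, or returned under either spelling. A minimal sketch, with illustrative names (`Ord`, `intOrd`, `sameEvidence`, `isSorted`) that are not part of the patch:

```scala
//> using options -language:experimental.modularity -source future

trait Ord:
  type Self
  def less(x: Self, y: Self): Boolean

// Evidence for Int; `Int is Ord` dealiases to `Ord { type Self = Int }`.
given intOrd: (Ord { type Self = Int }) = new Ord {
  type Self = Int
  def less(x: Int, y: Int): Boolean = x < y
}

// The two spellings are interchangeable: the refined type can be returned
// where the infix alias is expected, and vice versa.
def sameEvidence(using ev: Int is Ord): Ord { type Self = Int } = ev

// A context bound on a `Self`-based trait expands to `Ord { type Self = T }`,
// i.e. `T is Ord`, so the summon below is satisfied by the bound's witness.
def isSorted[T: Ord](xs: List[T]): Boolean =
  xs.zip(xs.drop(1)).forall((a, b) => !summon[T is Ord].less(b, a))
```

The test files in this commit (for example `deferredSummon.scala` and the `hylolib` sources) lean on the same equivalence when they declare instances with the `given A is TC:` syntax.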
diff --git a/tests/pos/hylolib/AnyCollection.scala b/tests/pos/hylolib/AnyCollection.scala new file mode 100644 index 000000000000..6c2b835852e6 --- /dev/null +++ b/tests/pos/hylolib/AnyCollection.scala @@ -0,0 +1,51 @@ +//> using options -language:experimental.modularity -source future +package hylo + +/** A type-erased collection. + * + * A `AnyCollection` forwards its operations to a wrapped value, hiding its implementation. + */ +final class AnyCollection[Element] private ( + val _start: () => AnyValue, + val _end: () => AnyValue, + val _after: (AnyValue) => AnyValue, + val _at: (AnyValue) => Element +) + +object AnyCollection { + + /** Creates an instance forwarding its operations to `base`. */ + def apply[Base: Collection](base: Base): AnyCollection[Base.Element] = + + def start(): AnyValue = + AnyValue(base.startPosition) + + def end(): AnyValue = + AnyValue(base.endPosition) + + def after(p: AnyValue): AnyValue = + AnyValue(base.positionAfter(p.unsafelyUnwrappedAs[Base.Position])) + + def at(p: AnyValue): Base.Element = + base.at(p.unsafelyUnwrappedAs[Base.Position]) + + new AnyCollection[Base.Element]( + _start = start, + _end = end, + _after = after, + _at = at + ) + +} + +given [T: Value] => AnyCollection[T] is Collection: + + type Element = T + type Position = AnyValue + + extension (self: AnyCollection[T]) + def startPosition = self._start() + def endPosition = self._end() + def positionAfter(p: Position) = self._after(p) + def at(p: Position) = self._at(p) + diff --git a/tests/pos/hylolib/AnyValue.scala b/tests/pos/hylolib/AnyValue.scala new file mode 100644 index 000000000000..6844135b646b --- /dev/null +++ b/tests/pos/hylolib/AnyValue.scala @@ -0,0 +1,67 @@ +package hylo + +/** A wrapper around an object providing a reference API. */ +private final class Ref[T](val value: T) { + + override def toString: String = + s"Ref($value)" + +} + +/** A type-erased value. + * + * An `AnyValue` forwards its operations to a wrapped value, hiding its implementation. + */ +final class AnyValue private ( + private val wrapped: AnyRef, + private val _copy: (AnyRef) => AnyValue, + private val _eq: (AnyRef, AnyRef) => Boolean, + private val _hashInto: (AnyRef, Hasher) => Hasher +) { + + /** Returns a copy of `this`. */ + def copy(): AnyValue = + _copy(this.wrapped) + + /** Returns `true` iff `this` and `other` have an equivalent value. */ + def eq(other: AnyValue): Boolean = + _eq(this.wrapped, other.wrapped) + + /** Hashes the salient parts of `this` into `hasher`. */ + def hashInto(hasher: Hasher): Hasher = + _hashInto(this.wrapped, hasher) + + /** Returns the value wrapped in `this` as an instance of `T`. */ + def unsafelyUnwrappedAs[T]: T = + wrapped.asInstanceOf[Ref[T]].value + + /** Returns a textual description of `this`. */ + override def toString: String = + wrapped.toString + +} + +object AnyValue { + + /** Creates an instance wrapping `wrapped`. 
*/ + def apply[T: Value](wrapped: T): AnyValue = + def copy(a: AnyRef): AnyValue = + AnyValue(a.asInstanceOf[Ref[T]].value.copy()) + + def eq(a: AnyRef, b: AnyRef): Boolean = + a.asInstanceOf[Ref[T]].value `eq` b.asInstanceOf[Ref[T]].value + + def hashInto(a: AnyRef, hasher: Hasher): Hasher = + a.asInstanceOf[Ref[T]].value.hashInto(hasher) + + new AnyValue(Ref(wrapped), copy, eq, hashInto) + +} + +given AnyValue is Value: + + extension (self: AnyValue) + def copy(): AnyValue = self.copy() + def eq(other: AnyValue): Boolean = self `eq` other + def hashInto(hasher: Hasher): Hasher = self.hashInto(hasher) + diff --git a/tests/pos/hylolib/AnyValueTests.scala b/tests/pos/hylolib/AnyValueTests.scala new file mode 100644 index 000000000000..96d3563f4f53 --- /dev/null +++ b/tests/pos/hylolib/AnyValueTests.scala @@ -0,0 +1,15 @@ +//> using options -language:experimental.modularity -source future +import hylo.* +import hylo.given + +class AnyValueTests extends munit.FunSuite: + + test("eq"): + val a = AnyValue(1) + assert(a `eq` a) + assert(!(a `neq` a)) + + val b = AnyValue(2) + assert(!(a `eq` b)) + assert(a `neq` b) + diff --git a/tests/pos/hylolib/BitArray.scala b/tests/pos/hylolib/BitArray.scala new file mode 100644 index 000000000000..6ef406e5ad83 --- /dev/null +++ b/tests/pos/hylolib/BitArray.scala @@ -0,0 +1,362 @@ +package hylo + +import scala.collection.mutable + +/** An array of bit values represented as Booleans, where `true` indicates that the bit is on. */ +final class BitArray private ( + private var _bits: HyArray[Int], + private var _count: Int +) { + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + _count == 0 + + /** Returns the number of elements in `this`. */ + def count: Int = + _count + + /** The number of bits that the array can contain before allocating new storage. */ + def capacity: Int = + _bits.capacity << 5 + + /** Reserves enough storage to store `n` elements in `this`. */ + def reserveCapacity(n: Int, assumeUniqueness: Boolean = false): BitArray = + if (n == 0) { + this + } else { + val k = 1 + ((n - 1) >> 5) + if (assumeUniqueness) { + _bits = _bits.reserveCapacity(k, assumeUniqueness) + this + } else { + new BitArray(_bits.reserveCapacity(k), _count) + } + } + + /** Adds a new element at the end of the array. */ + def append(bit: Boolean, assumeUniqueness: Boolean = false): BitArray = + val result = if assumeUniqueness && (count < capacity) then this else copy(count + 1) + val p = BitArray.Position(count) + if (p.bucket >= _bits.count) { + result._bits = _bits.append(if bit then 1 else 0) + } else { + result.setValue(bit, p) + } + result._count += 1 + result + + /** Removes and returns the last element, or returns `None` if the array is empty. */ + def popLast(assumeUniqueness: Boolean = false): (BitArray, Option[Boolean]) = + if (isEmpty) { + (this, None) + } else { + val result = if assumeUniqueness then this else copy() + val bit = result.at(BitArray.Position(count)) + result._count -= 1 + (result, Some(bit)) + } + + /** Removes all elements in the array, keeping allocated storage iff `keepStorage` is true. */ + def removeAll( + keepStorage: Boolean = false, + assumeUniqueness: Boolean = false + ): BitArray = + if (isEmpty) { + this + } else if (keepStorage) { + val result = if assumeUniqueness then this else copy() + result._bits.removeAll(keepStorage, assumeUniqueness = true) + result._count = 0 + result + } else { + BitArray() + } + + /** Returns `true` iff all elements in `this` are `false`. 
*/ + def allFalse: Boolean = + if (isEmpty) { + true + } else { + val k = (count - 1) >> 5 + def loop(i: Int): Boolean = + if (i == k) { + val m = (1 << (count & 31)) - 1 + (_bits.at(k) & m) == 0 + } else if (_bits.at(i) != 0) { + false + } else { + loop(i + 1) + } + loop(0) + } + + /** Returns `true` iff all elements in `this` are `true`. */ + def allTrue: Boolean = + if (isEmpty) { + true + } else { + val k = (count - 1) >> 5 + def loop(i: Int): Boolean = + if (i == k) { + val m = (1 << (count & 31)) - 1 + (_bits.at(k) & m) == m + } else if (_bits.at(i) != ~0) { + false + } else { + loop(i + 1) + } + loop(0) + } + + /** Returns the bitwise OR of `this` and `other`. */ + def | (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ | _, assumeUniqueness = true) + + /** Returns the bitwise AND of `this` and `other`. */ + def & (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ & _, assumeUniqueness = true) + + /** Returns the bitwise XOR of `this` and `other`. */ + def ^ (other: BitArray): BitArray = + val result = copy() + result.applyBitwise(other, _ ^ _, assumeUniqueness = true) + + /** Assigns each bits in `this` to the result of `operation` applied on those bits and their + * corresponding bits in `other`. + * + * @requires + * `self.count == other.count`. + */ + private def applyBitwise( + other: BitArray, + operation: (Int, Int) => Int, + assumeUniqueness: Boolean = false + ): BitArray = + require(this.count == other.count) + if (isEmpty) { + this + } else { + val result = if assumeUniqueness then this else copy() + var u = assumeUniqueness + val k = (count - 1) >> 5 + + for (i <- 0 until k) { + result._bits = result._bits.modifyAt( + i, (n) => operation(n, other._bits.at(n)), + assumeUniqueness = u + ) + u = true + } + val m = (1 << (count & 31)) - 1 + result._bits = result._bits.modifyAt( + k, (n) => operation(n & m, other._bits.at(k) & m), + assumeUniqueness = u + ) + + result + } + + /** Returns the position of `this`'s first element', or `endPosition` if `this` is empty. + * + * @complexity + * O(1). + */ + def startPosition: BitArray.Position = + BitArray.Position(0) + + /** Returns the "past the end" position in `this`, that is, the position immediately after the + * last element in `this`. + * + * @complexity + * O(1). + */ + def endPosition: BitArray.Position = + BitArray.Position(count) + + /** Returns the position immediately after `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def positionAfter(p: BitArray.Position): BitArray.Position = + if (p.offsetInBucket == 63) { + BitArray.Position(p.bucket + 1, 0) + } else { + BitArray.Position(p.bucket, p.offsetInBucket + 1) + } + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: BitArray.Position): Boolean = + val m = 1 << p.offsetInBucket + val b: Int = _bits.at(p.bucket) + (b & m) == m + + /** Accesses the `i`-th element of `this`. + * + * @requires + * `i` is greater than or equal to 0, and less than `count`. + * @complexity + * O(1). + */ + def atIndex(i: Int): Boolean = + at(BitArray.Position(i)) + + /** Calls `transform` on the element at `p` to update its value. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). 
+ */ + def modifyAt( + p: BitArray.Position, + transform: (Boolean) => Boolean, + assumeUniqueness: Boolean = false + ): BitArray = + val result = if assumeUniqueness then this else copy() + result.setValue(transform(result.at(p)), p) + result + + /** Calls `transform` on `i`-th element of `this` to update its value. + * + * @requires + * `i` is greater than or equal to 0, and less than `count`. + * @complexity + * O(1). + */ + def modifyAtIndex( + i: Int, + transform: (Boolean) => Boolean, + assumeUniqueness: Boolean = false + ): BitArray = + modifyAt(BitArray.Position(i), transform, assumeUniqueness) + + /** Returns an independent copy of `this`. */ + def copy(minimumCapacity: Int = 0): BitArray = + if (minimumCapacity > capacity) { + // If the requested capacity on the copy is greater than what we have, `reserveCapacity` will + // create an independent value. + reserveCapacity(minimumCapacity) + } else { + val k = 1 + ((minimumCapacity - 1) >> 5) + val newBits = _bits.copy(k) + new BitArray(newBits, _count) + } + + /** Returns a textual description of `this`. */ + override def toString: String = + _bits.toString + + /** Sets the value `b` for the bit at position `p`. + * + * @requires + * `this` is uniquely referenced and `p` is a valid position in `this`. + */ + private def setValue(b: Boolean, p: BitArray.Position): Unit = + val m = 1 << p.offsetInBucket + _bits = _bits.modifyAt( + p.bucket, + (e) => if b then e | m else e & ~m, + assumeUniqueness = true + ) + +} + +object BitArray { + + /** A position in a `BitArray`. + * + * @param bucket + * The bucket containing `this`. + * @param offsetInBucket + * The offset of `this` in its containing bucket. + */ + final class Position( + private[BitArray] val bucket: Int, + private[BitArray] val offsetInBucket: Int + ) { + + /** Creates a position from an index. */ + private[BitArray] def this(index: Int) = + this(index >> 5, index & 31) + + /** Returns the index corresponding to this position. */ + private def index: Int = + (bucket >> 5) + offsetInBucket + + /** Returns a copy of `this`. */ + def copy(): Position = + new Position(bucket, offsetInBucket) + + /** Returns `true` iff `this` and `other` have an equivalent value. */ + def eq(other: Position): Boolean = + (this.bucket == other.bucket) && (this.offsetInBucket == other.offsetInBucket) + + /** Hashes the salient parts of `self` into `hasher`. */ + def hashInto(hasher: Hasher): Hasher = + hasher.combine(bucket) + hasher.combine(offsetInBucket) + + } + + /** Creates an array with the given `bits`. 
*/ + def apply[T](bits: Boolean*): BitArray = + var result = new BitArray(HyArray[Int](), 0) + for (b <- bits) result = result.append(b, assumeUniqueness = true) + result + +} + +given BitArray.Position is Value: + + extension (self: BitArray.Position) + + def copy(): BitArray.Position = + self.copy() + + def eq(other: BitArray.Position): Boolean = + self.eq(other) + + def hashInto(hasher: Hasher): Hasher = + self.hashInto(hasher) + +given BitArray is Collection: + + type Element = Boolean + type Position = BitArray.Position + + extension (self: BitArray) + + override def count: Int = + self.count + + def startPosition: BitArray.Position = + self.startPosition + + def endPosition: BitArray.Position = + self.endPosition + + def positionAfter(p: BitArray.Position): BitArray.Position = + self.positionAfter(p) + + def at(p: BitArray.Position): Boolean = + self.at(p) + +given BitArray is StringConvertible: + extension (self: BitArray) + override def description: String = + var contents = mutable.StringBuilder() + self.forEach((e) => { contents += (if e then '1' else '0'); true }) + contents.mkString + diff --git a/tests/pos/hylolib/Collection.scala b/tests/pos/hylolib/Collection.scala new file mode 100644 index 000000000000..bef86a967e6e --- /dev/null +++ b/tests/pos/hylolib/Collection.scala @@ -0,0 +1,267 @@ +//> using options -language:experimental.modularity -source future +package hylo + +/** A collection of elements accessible by their position. */ +trait Collection: + type Self + + /** The type of the elements in the collection. */ + type Element: Value + + /** The type of a position in the collection. */ + type Position: Value + + extension (self: Self) + + /** Returns `true` iff `self` is empty. */ + def isEmpty: Boolean = + startPosition `eq` endPosition + + /** Returns the number of elements in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def count: Int = + val e = endPosition + def loop(p: Position, n: Int): Int = + if p `eq` e then n else loop(self.positionAfter(p), n + 1) + loop(startPosition, 0) + + /** Returns the position of `self`'s first element', or `endPosition` if `self` is empty. + * + * @complexity + * O(1) + */ + def startPosition: Position + + /** Returns the "past the end" position in `self`, that is, the position immediately after the + * last element in `self`. + * + * @complexity + * O(1). + */ + def endPosition: Position + + /** Returns the position immediately after `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def positionAfter(p: Position): Position + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: Position): Element + + /** Returns `true` iff `i` precedes `j`. + * + * @requires + * `i` and j` are valid positions in `self`. + * @complexity + * O(n) where n is the number of elements in `self`. 
+ */ + def isBefore(i: Position, j: Position): Boolean = + val e = self.endPosition + if i `eq` e then false + else if j `eq` e then true + else + def recur(n: Position): Boolean = + if n `eq` j then true + else if n `eq` e then false + else recur(self.positionAfter(n)) + recur(self.positionAfter(i)) + + class Slice2(val base: Self, val bounds: Range[Position]): + + def isEmpty: Boolean = + bounds.lowerBound.eq(bounds.upperBound) + + def startPosition: Position = + bounds.lowerBound + + def endPosition: Position = + bounds.upperBound + + def at(p: Position): Element = + base.at(p) + end Slice2 + +end Collection + +extension [Self: Collection](self: Self) + + /** Returns the first element of `self` along with a slice containing the suffix after this + * element, or `None` if `self` is empty. + * + * @complexity + * O(1) + */ + def headAndTail: Option[(Self.Element, Slice[Self])] = + if self.isEmpty then + None + else + val p = self.startPosition + val q = self.positionAfter(p) + val t = Slice(self, Range(q, self.endPosition, (a, b) => (a `eq` b) || self.isBefore(a, b))) + Some((self.at(p), t)) + + def headAndTail2: Option[(Self.Element, Self.Slice2)] = + if self.isEmpty then + None + else + val p = self.startPosition + val q = self.positionAfter(p) + val t = Self.Slice2(self, Range(q, self.endPosition, (a, b) => (a `eq` b) || self.isBefore(a, b))) + Some((self.at(p), t)) + + /** Applies `combine` on `partialResult` and each element of `self`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def reduce[T](partialResult: T)(combine: (T, Self.Element) => T): T = + val e = self.endPosition + def loop(p: Self.Position, r: T): T = + if p `eq` e then r + else loop(self.positionAfter(p), combine(r, self.at(p))) + loop(self.startPosition, partialResult) + + /** Applies `action` on each element of `self`, in order, until `action` returns `false`, and + * returns `false` iff `action` did. + * + * You can return `false` from `action` to emulate a `continue` statement as found in traditional + * imperative languages (e.g., C). + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def forEach(action: Self.Element => Boolean): Boolean = + val e = self.endPosition + def loop(p: Self.Position): Boolean = + if p `eq` e then true + else if !action(self.at(p)) then false + else loop(self.positionAfter(p)) + loop(self.startPosition) + + /** Returns a collection with the elements of `self` transformed by `transform`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def map[T: Value](transform: Self.Element => T): HyArray[T] = + self.reduce(HyArray[T]()): (r, e) => + r.append(transform(e), assumeUniqueness = true) + + /** Returns a collection with the elements of `self` satisfying `isInclude`, in order. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def filter(isIncluded: Self.Element => Boolean): HyArray[Self.Element] = + self.reduce(HyArray[Self.Element]()): (r, e) => + if isIncluded(e) then r.append(e, assumeUniqueness = true) else r + + /** Returns `true` if `self` contains an element satisfying `predicate`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def containsWhere(predicate: Self.Element => Boolean): Boolean = + self.firstPositionWhere(predicate) != None + + /** Returns `true` if all elements in `self` satisfy `predicate`. + * + * @complexity + * O(n) where n is the number of elements in `self`. 
+ */ + def allSatisfy(predicate: Self.Element => Boolean): Boolean = + self.firstPositionWhere(predicate) == None + + /** Returns the position of the first element of `self` satisfying `predicate`, or `None` if no + * such element exists. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def firstPositionWhere(predicate: Self.Element => Boolean): Option[Self.Position] = + val e = self.endPosition + def loop(p: Self.Position): Option[Self.Position] = + if p `eq` e then None + else if predicate(self.at(p)) then Some(p) + else loop(self.positionAfter(p)) + loop(self.startPosition) + + /** Returns the minimum element in `self`, using `isLessThan` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def minElement(isLessThan: (Self.Element, Self.Element) => Boolean): Option[Self.Element] = + self.leastElement(isLessThan) + + // NOTE: I can't find a reasonable way to call this method. + /** Returns the minimum element in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def minElement()(using Self.Element is Comparable): Option[Self.Element] = + self.minElement(isLessThan = _ `lt` _) + + /** Returns the maximum element in `self`, using `isGreaterThan` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def maxElement(isGreaterThan: (Self.Element, Self.Element) => Boolean): Option[Self.Element] = + self.leastElement(isGreaterThan) + + /** Returns the maximum element in `self`. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def maxElement()(using Self.Element is Comparable): Option[Self.Element] = + self.maxElement(isGreaterThan = _ `gt` _) + + /** Returns the maximum element in `self`, using `isOrderedBefore` to compare elements. + * + * @complexity + * O(n) where n is the number of elements in `self`. + */ + def leastElement(isOrderedBefore: (Self.Element, Self.Element) => Boolean): Option[Self.Element] = + if self.isEmpty then + None + else + val e = self.endPosition + def loop(p: Self.Position, least: Self.Element): Self.Element = + if p `eq` e then + least + else + val x = self.at(p) + val y = if isOrderedBefore(x, least) then x else least + loop(self.positionAfter(p), y) + val b = self.startPosition + Some(loop(self.positionAfter(b), self.at(b))) + + /** Returns `true` if `self` contains the same elements as `other`, in the same order. 
*/ + def elementsEqual[T: Collection { type Element = Self.Element } ](other: T): Boolean = + def loop(i: Self.Position, j: T.Position): Boolean = + if i `eq` self.endPosition then + j `eq` other.endPosition + else if j `eq` other.endPosition then + false + else if self.at(i) `neq` other.at(j)then + false + else + loop(self.positionAfter(i), other.positionAfter(j)) + loop(self.startPosition, other.startPosition) +end extension diff --git a/tests/pos/hylolib/CollectionTests.scala b/tests/pos/hylolib/CollectionTests.scala new file mode 100644 index 000000000000..d884790f64d7 --- /dev/null +++ b/tests/pos/hylolib/CollectionTests.scala @@ -0,0 +1,67 @@ +//> using options -language:experimental.modularity -source future +import hylo.* +import hylo.given + +class CollectionTests extends munit.FunSuite: + + test("isEmpty"): + val empty = AnyCollection(HyArray[Int]()) + assert(empty.isEmpty) + + val nonEmpty = AnyCollection(HyArray[Int](1, 2)) + assert(!nonEmpty.isEmpty) + + test("count"): + val a = AnyCollection(HyArray[Int](1, 2)) + assertEquals(a.count, 2) + + test("isBefore"): + val empty = AnyCollection(HyArray[Int]()) + assert(!empty.isBefore(empty.startPosition, empty.endPosition)) + + val nonEmpty = AnyCollection(HyArray[Int](1, 2)) + val p0 = nonEmpty.startPosition + val p1 = nonEmpty.positionAfter(p0) + val p2 = nonEmpty.positionAfter(p1) + assert(nonEmpty.isBefore(p0, nonEmpty.endPosition)) + assert(nonEmpty.isBefore(p1, nonEmpty.endPosition)) + assert(!nonEmpty.isBefore(p2, nonEmpty.endPosition)) + + test("headAndTail"): + val empty = AnyCollection(HyArray[Int]()) + assertEquals(empty.headAndTail, None) + + val one = AnyCollection(HyArray[Int](1)) + val Some((h0, t0)) = one.headAndTail: @unchecked + assert(h0 eq 1) + assert(t0.isEmpty) + + val two = AnyCollection(HyArray[Int](1, 2)) + val Some((h1, t1)) = two.headAndTail: @unchecked + assertEquals(h1, 1) + assertEquals(t1.count, 1) + + test("reduce"): + val empty = AnyCollection(HyArray[Int]()) + assertEquals(empty.reduce(0)((s, x) => s + x), 0) + + val nonEmpty = AnyCollection(HyArray[Int](1, 2, 3)) + assertEquals(nonEmpty.reduce(0)((s, x) => s + x), 6) + + test("forEach"): + val empty = AnyCollection(HyArray[Int]()) + assert(empty.forEach((e) => false)) + + val nonEmpty = AnyCollection(HyArray[Int](1, 2, 3)) + var s = 0 + assert(nonEmpty.forEach((e) => { s += e; true })) + assertEquals(s, 6) + + s = 0 + assert(!nonEmpty.forEach((e) => { s += e; false })) + assertEquals(s, 1) + + test("elementsEqual"): + val a = HyArray(1, 2) + assert(a.elementsEqual(a)) +end CollectionTests diff --git a/tests/pos/hylolib/CoreTraits.scala b/tests/pos/hylolib/CoreTraits.scala new file mode 100644 index 000000000000..f4b3699b430e --- /dev/null +++ b/tests/pos/hylolib/CoreTraits.scala @@ -0,0 +1,56 @@ +package hylo + +/** A type whose instance can be treated as independent values. + * + * The data structure of and algorithms of Hylo's standard library operate "notional values" rather + * than arbitrary references. This trait defines the basis operations of all values. + */ +trait Value: + type Self + + extension (self: Self) { + + /** Returns a copy of `self`. */ + def copy(): Self + + /** Returns `true` iff `self` and `other` have an equivalent value. */ + def eq(other: Self): Boolean + + def neq(other: Self): Boolean = !self.eq(other) + + /** Hashes the salient parts of `self` into `hasher`. 
*/ + def hashInto(hasher: Hasher): Hasher + + } + +// ---------------------------------------------------------------------------- +// Comparable +// ---------------------------------------------------------------------------- + +trait Comparable extends Value { + + extension (self: Self) { + + /** Returns `true` iff `self` is ordered before `other`. */ + def lt(other: Self): Boolean + + /** Returns `true` iff `self` is ordered after `other`. */ + def gt(other: Self): Boolean = other.lt(self) + + /** Returns `true` iff `self` is equal to or ordered before `other`. */ + def le(other: Self): Boolean = !other.lt(self) + + /** Returns `true` iff `self` is equal to or ordered after `other`. */ + def ge(other: Self): Boolean = !self.lt(other) + + } + +} + +/** Returns the lesser of `x` and `y`. */ +def min[T: Comparable](x: T, y: T): T = + if y.lt(x) then y else x + +/** Returns the greater of `x` and `y`. */ +def max[T: Comparable](x: T, y: T): T = + if x.lt(y) then y else x diff --git a/tests/pos/hylolib/Hasher.scala b/tests/pos/hylolib/Hasher.scala new file mode 100644 index 000000000000..ca45550ed002 --- /dev/null +++ b/tests/pos/hylolib/Hasher.scala @@ -0,0 +1,39 @@ +//> using options -language:experimental.modularity -source future +package hylo + +import scala.util.Random + +/** A universal hash function. */ +final class Hasher private (private val hash: Int = Hasher.offsetBasis) { + + /** Returns the computed hash value. */ + def finalizeHash(): Int = + hash + + /** Adds `n` to the computed hash value. */ + def combine(n: Int): Hasher = + var h = hash + h = h ^ n + h = h * Hasher.prime + new Hasher(h) +} + +object Hasher { + + private val offsetBasis = 0x811c9dc5 + private val prime = 0x01000193 + + /** A random seed ensuring different hashes across multiple runs. */ + private lazy val seed = scala.util.Random.nextInt() + + /** Creates an instance with the given `seed`. */ + def apply(): Hasher = + val h = new Hasher() + h.combine(seed) + h + + /** Returns the hash of `v`. */ + def hash[T: Value](v: T): Int = + v.hashInto(Hasher()).finalizeHash() + +} diff --git a/tests/pos/hylolib/HyArray.scala b/tests/pos/hylolib/HyArray.scala new file mode 100644 index 000000000000..de5e83d3b1a3 --- /dev/null +++ b/tests/pos/hylolib/HyArray.scala @@ -0,0 +1,202 @@ +//> using options -language:experimental.modularity -source future +package hylo + +import java.util.Arrays +import scala.collection.mutable + +/** An ordered, random-access collection. */ +final class HyArray[Element: Value as elementIsCValue]( + private var _storage: scala.Array[AnyRef | Null] | Null, + private var _count: Int // NOTE: where do I document private fields +) { + + // NOTE: The fact that we need Array[AnyRef] is diappointing and difficult to discover + // The compiler error sent me on a wild goose chase with ClassTag. + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + _count == 0 + + /** Returns the number of elements in `this`. */ + def count: Int = + _count + + /** Returns the number of elements that `this` can contain before allocating new storage. */ + def capacity: Int = + if _storage == null then 0 else _storage.length + + /** Reserves enough storage to store `n` elements in `this`. 
*/ + def reserveCapacity(n: Int, assumeUniqueness: Boolean = false): HyArray[Element] = + if (n <= capacity) { + this + } else { + var newCapacity = max(1, capacity) + while (newCapacity < n) { newCapacity = newCapacity << 1 } + + val newStorage = new scala.Array[AnyRef | Null](newCapacity) + val s = _storage.asInstanceOf[scala.Array[AnyRef | Null]] + var i = 0 + while (i < count) { + newStorage(i) = _storage(i).asInstanceOf[Element].copy().asInstanceOf[AnyRef] + i += 1 + } + + if (assumeUniqueness) { + _storage = newStorage + this + } else { + new HyArray(newStorage, count) + } + } + + /** Adds a new element at the end of the array. */ + def append(source: Element, assumeUniqueness: Boolean = false): HyArray[Element] = + val result = if assumeUniqueness && (count < capacity) then this else copy(count + 1) + result._storage(count) = source.asInstanceOf[AnyRef] + result._count += 1 + result + + /** Adds the contents of `source` at the end of the array. */ + def appendContents[C: Collection { type Element = HyArray.this.Element }]( + source: C, assumeUniqueness: Boolean = false + ): HyArray[Element] = + val result = if (assumeUniqueness) { this } else { copy(count + source.count) } + source.reduce(result): (r, e) => + r.append(e, assumeUniqueness = true) + + /** Removes and returns the last element, or returns `None` if the array is empty. */ + def popLast(assumeUniqueness: Boolean = false): (HyArray[Element], Option[Element]) = + if (isEmpty) { + (this, None) + } else { + val result = if assumeUniqueness then this else copy() + result._count -= 1 + (result, Some(result._storage(result._count).asInstanceOf[Element])) + } + + /** Removes all elements in the array, keeping allocated storage iff `keepStorage` is true. */ + def removeAll( + keepStorage: Boolean = false, + assumeUniqueness: Boolean = false + ): HyArray[Element] = + if (isEmpty) { + this + } else if (keepStorage) { + val result = if assumeUniqueness then this else copy() + Arrays.fill(result._storage, null) + result._count = 0 + result + } else { + HyArray() + } + + /** Accesses the element at `p`. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def at(p: Int): Element = + _storage(p).asInstanceOf[Element] + + /** Calls `transform` on the element at `p` to update its value. + * + * @requires + * `p` is a valid position in `self` different from `endPosition`. + * @complexity + * O(1). + */ + def modifyAt( + p: Int, + transform: (Element) => Element, + assumeUniqueness: Boolean = false + ): HyArray[Element] = + val result = if assumeUniqueness then this else copy() + result._storage(p) = transform(at(p)).asInstanceOf[AnyRef] + result + + /** Returns a textual description of `this`. */ + override def toString: String = + var s = "[" + var i = 0 + while (i < count) { + if (i > 0) { s += ", " } + s += s"${at(i)}" + i += 1 + } + s + "]" + + /** Returns an independent copy of `this`, capable of storing `minimumCapacity` elements before + * allocating new storage. + */ + def copy(minimumCapacity: Int = 0): HyArray[Element] = + if (minimumCapacity > capacity) { + // If the requested capacity on the copy is greater than what we have, `reserveCapacity` will + // create an independent value. 
+ reserveCapacity(minimumCapacity) + } else { + val clone = HyArray[Element]().reserveCapacity(max(minimumCapacity, count)) + var i = 0 + while (i < count) { + clone._storage(i) = _storage(i).asInstanceOf[Element].copy().asInstanceOf[AnyRef] + i += 1 + } + clone._count = count + clone + } + +} + +object HyArray { + + /** Creates an array with the given `elements`. */ + def apply[T: Value](elements: T*): HyArray[T] = + var a = new HyArray[T](null, 0) + for (e <- elements) a = a.append(e, assumeUniqueness = true) + a + +} + +given [T: Value] => HyArray[T] is Value: + + extension (self: HyArray[T]) + + def copy(): HyArray[T] = + self.copy() + + def eq(other: HyArray[T]): Boolean = + self.elementsEqual(other) + + def hashInto(hasher: Hasher): Hasher = + self.reduce(hasher)((h, e) => e.hashInto(h)) + +given [T: Value] => HyArray[T] is Collection: + + type Element = T + type Position = Int + + extension (self: HyArray[T]) + + // NOTE: Having to explicitly override means that primary declaration can't automatically + // specialize trait requirements. + override def isEmpty: Boolean = self.isEmpty + + override def count: Int = self.count + + def startPosition = 0 + + def endPosition = self.count + + def positionAfter(p: Int) = p + 1 + + def at(p: Int) = self.at(p) + +given [T: {Value, StringConvertible}] => HyArray[T] is StringConvertible: + extension (self: HyArray[T]) + override def description: String = + val contents = mutable.StringBuilder() + self.forEach: e => + contents ++= e.description + true + s"[${contents.mkString(", ")}]" diff --git a/tests/pos/hylolib/HyArrayTests.scala b/tests/pos/hylolib/HyArrayTests.scala new file mode 100644 index 000000000000..0de65603d0c7 --- /dev/null +++ b/tests/pos/hylolib/HyArrayTests.scala @@ -0,0 +1,17 @@ +import hylo.* +import hylo.given + +class HyArrayTests extends munit.FunSuite: + + test("reserveCapacity"): + var a = HyArray[Int]() + a = a.append(1) + a = a.append(2) + + a = a.reserveCapacity(10) + assert(a.capacity >= 10) + assertEquals(a.count, 2) + assertEquals(a.at(0), 1) + assertEquals(a.at(1), 2) + +end HyArrayTests diff --git a/tests/pos/hylolib/Integers.scala b/tests/pos/hylolib/Integers.scala new file mode 100644 index 000000000000..f7334ae40786 --- /dev/null +++ b/tests/pos/hylolib/Integers.scala @@ -0,0 +1,46 @@ +package hylo + +given Boolean is Value: + + extension (self: Boolean) + + def copy(): Boolean = + // Note: Scala's `Boolean` has value semantics already. + self + + def eq(other: Boolean): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(if self then 1 else 0) + +given Int is Value: + + extension (self: Int) + + def copy(): Int = + // Note: Scala's `Int` has value semantics already. 
+ self + + def eq(other: Int): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(self) + +given Int is Comparable: + + extension (self: Int) + + def copy(): Int = + self + + def eq(other: Int): Boolean = + self == other + + def hashInto(hasher: Hasher): Hasher = + hasher.combine(self) + + def lt(other: Int): Boolean = self < other + +given Int is StringConvertible diff --git a/tests/pos/hylolib/IntegersTests.scala b/tests/pos/hylolib/IntegersTests.scala new file mode 100644 index 000000000000..74dedf30d83e --- /dev/null +++ b/tests/pos/hylolib/IntegersTests.scala @@ -0,0 +1,14 @@ +//> using options -language:experimental.modularity -source future +import hylo.* +import hylo.given + +class IntegersTests extends munit.FunSuite: + + test("Int.hashInto"): + val x = Hasher.hash(42) + val y = Hasher.hash(42) + assertEquals(x, y) + + val z = Hasher.hash(1337) + assertNotEquals(x, z) + diff --git a/tests/pos/hylolib/Range.scala b/tests/pos/hylolib/Range.scala new file mode 100644 index 000000000000..b0f50dd55c8c --- /dev/null +++ b/tests/pos/hylolib/Range.scala @@ -0,0 +1,37 @@ +package hylo + +/** A half-open interval from a lower bound up to, but not including, an uppor bound. */ +final class Range[Bound] private (val lowerBound: Bound, val upperBound: Bound) { + + /** Returns a textual description of `this`. */ + override def toString: String = + s"[${lowerBound}, ${upperBound})" + +} + +object Range { + + /** Creates a half-open interval [`lowerBound`, `upperBound`), using `isLessThanOrEqual` to ensure + * that the bounds are well-formed. + * + * @requires + * `lowerBound` is lesser than or equal to `upperBound`. + */ + def apply[Bound]( + lowerBound: Bound, + upperBound: Bound, + isLessThanOrEqual: (Bound, Bound) => Boolean + ) = + require(isLessThanOrEqual(lowerBound, upperBound)) + new Range(lowerBound, upperBound) + + /** Creates a half-open interval [`lowerBound`, `upperBound`). + * + * @requires + * `lowerBound` is lesser than or equal to `upperBound`. + */ + def apply[Bound: Comparable](lowerBound: Bound, upperBound: Bound) = + require(lowerBound `le` upperBound) + new Range(lowerBound, upperBound) + +} diff --git a/tests/pos/hylolib/Slice.scala b/tests/pos/hylolib/Slice.scala new file mode 100644 index 000000000000..d54f855b1041 --- /dev/null +++ b/tests/pos/hylolib/Slice.scala @@ -0,0 +1,63 @@ +package hylo + +/** A view into a collection. */ +final class Slice[Base: Collection]( + val base: Base, + val bounds: Range[Base.Position] +) { + + /** Returns `true` iff `this` is empty. */ + def isEmpty: Boolean = + bounds.lowerBound.eq(bounds.upperBound) + + def startPosition: Base.Position = + bounds.lowerBound + + def endPosition: Base.Position = + bounds.upperBound + + def positionAfter(p: Base.Position): Base.Position = + base.positionAfter(p) + + def at(p: Base.Position): Base.Element = + base.at(p) + +} + +given [C: Collection] => Slice[C] is Collection: + + type Element = C.Element + type Position = C.Position + + extension (self: Slice[C]) + + def startPosition = self.bounds.lowerBound.asInstanceOf[Position] + // This is actually unsafe. We have: + // self.bounds: Range(Slice[C].Base.Position) + // But the _value_ of Slice[C].Base is not necssarily this given, even + // though it is true that `type Slice[C].Base = C`. There might be multiple + // implementations of `Slice[C] is Collection` that define different `Position` + // types. So we cannot conclude that `Slice[C].Base.Position = this.Position`. 
+ // To make this safe, we'd need some form of coherence, where we ensure that + // there is only one way to implement `Slice is Collection`. + // + // As an alternativem we can make Slice dependent on the original Collection + // _instance_ instead of the original Collection _type_. This design is + // realized by the Slice2 definitions. It works without casts. + + def endPosition = self.bounds.upperBound.asInstanceOf[Position] + + def positionAfter(p: Position) = self.base.positionAfter(p) + + def at(p: Position) = self.base.at(p) + +given [C: Collection] => C.Slice2 is Collection: + type Element = C.Element + type Position = C.Position + + extension (self: C.Slice2) + + def startPosition = self.bounds.lowerBound + def endPosition = self.bounds.upperBound + def positionAfter(p: Position) = self.base.positionAfter(p) + def at(p: Position) = self.base.at(p) diff --git a/tests/pos/hylolib/StringConvertible.scala b/tests/pos/hylolib/StringConvertible.scala new file mode 100644 index 000000000000..cf901d9a3313 --- /dev/null +++ b/tests/pos/hylolib/StringConvertible.scala @@ -0,0 +1,9 @@ +package hylo + +/** A type whose instances can be described by a character string. */ +trait StringConvertible: + type Self + + /** Returns a textual description of `self`. */ + extension (self: Self) + def description: String = self.toString diff --git a/tests/pos/hylolib/Test.scala b/tests/pos/hylolib/Test.scala new file mode 100644 index 000000000000..9e8d6181affd --- /dev/null +++ b/tests/pos/hylolib/Test.scala @@ -0,0 +1,16 @@ +//> using options -language:experimental.modularity -source future +import hylo.* +import hylo.given + +object munit: + open class FunSuite: + def test(name: String)(op: => Unit): Unit = op + def assertEquals[T](x: T, y: T) = assert(x == y) + def assertNotEquals[T](x: T, y: T) = assert(x != y) + +@main def Test = + CollectionTests() + AnyValueTests() + HyArrayTests() + IntegersTests() + println("done") diff --git a/tests/pos/i10929-new-syntax.scala b/tests/pos/i10929-new-syntax.scala new file mode 100644 index 000000000000..11c5e9313d4c --- /dev/null +++ b/tests/pos/i10929-new-syntax.scala @@ -0,0 +1,22 @@ +//> using options -language:experimental.modularity -source future +trait TupleOf[+A]: + type Self + type Mapped[+A] <: Tuple + def map[B](x: Self)(f: A => B): Mapped[B] + +object TupleOf: + + given EmptyTuple is TupleOf[Nothing]: + type Mapped[+A] = EmptyTuple + def map[B](x: EmptyTuple)(f: Nothing => B): Mapped[B] = x + + given [A, Rest <: Tuple : TupleOf[A]] => A *: Rest is TupleOf[A]: + type Mapped[+A] = A *: Rest.Mapped[A] + def map[B](x: A *: Rest)(f: A => B): Mapped[B] = + (f(x.head) *: Rest.map(x.tail)(f)) + +def foo[T: TupleOf[Int]](xs: T): T.Mapped[Int] = T.map(xs)(_ + 1) + +@main def test = + foo(EmptyTuple): EmptyTuple // ok + foo(1 *: EmptyTuple): Int *: EmptyTuple // now also ok diff --git a/tests/pos/ord-over-tracked.scala b/tests/pos/ord-over-tracked.scala new file mode 100644 index 000000000000..a9b4aba556e1 --- /dev/null +++ b/tests/pos/ord-over-tracked.scala @@ -0,0 +1,15 @@ +import language.experimental.modularity + +trait Ord[T]: + def lt(x: T, y: T): Boolean + +given Ord[Int] = ??? + +case class D(tracked val x: Int) +given [T <: D]: Ord[T] = (a, b) => a.x < b.x + +def mySort[T: Ord](x: Array[T]): Array[T] = ??? 
+ +def test = + val arr = Array(D(1)) + val arr1 = mySort(arr) // error: no given instance of type Ord[D{val x: (1 : Int)}] \ No newline at end of file diff --git a/tests/pos/parsercombinators-arrow.scala b/tests/pos/parsercombinators-arrow.scala new file mode 100644 index 000000000000..f8bec02067e5 --- /dev/null +++ b/tests/pos/parsercombinators-arrow.scala @@ -0,0 +1,48 @@ +//> using options -language:experimental.modularity -source future +import collection.mutable + +/// A parser combinator. +trait Combinator: + + type Self + + /// The context from which elements are being parsed, typically a stream of tokens. + type Context + /// The element being parsed. + type Element + + extension (self: Self) + /// Parses and returns an element from `context`. + def parse(context: Context): Option[Element] +end Combinator + +final case class Apply[C, E](action: C => Option[E]) +final case class Combine[A, B](first: A, second: B) + +given [C, E] => Apply[C, E] is Combinator: + type Context = C + type Element = E + extension(self: Apply[C, E]) + def parse(context: C): Option[E] = self.action(context) + +given [A: Combinator, B: Combinator { type Context = A.Context }] + => Combine[A, B] is Combinator: + type Context = A.Context + type Element = (A.Element, B.Element) + extension(self: Combine[A, B]) + def parse(context: Context): Option[Element] = ??? + +extension [A] (buf: mutable.ListBuffer[A]) def popFirst() = + if buf.isEmpty then None + else try Some(buf.head) finally buf.remove(0) + +@main def hello: Unit = + val source = (0 to 10).toList + val stream = source.to(mutable.ListBuffer) + + val n = Apply[mutable.ListBuffer[Int], Int](s => s.popFirst()) + val m = Combine(n, n) + + val r = m.parse(stream) // error: type mismatch, found `mutable.ListBuffer[Int]`, required `?1.Context` + val rc: Option[(Int, Int)] = r + // it would be great if this worked diff --git a/tests/pos/parsercombinators-ctx-bounds.scala b/tests/pos/parsercombinators-ctx-bounds.scala new file mode 100644 index 000000000000..d77abea5e539 --- /dev/null +++ b/tests/pos/parsercombinators-ctx-bounds.scala @@ -0,0 +1,49 @@ +//> using options -language:experimental.modularity -source future +import collection.mutable + +/// A parser combinator. +trait Combinator[T]: + + /// The context from which elements are being parsed, typically a stream of tokens. + type Context + /// The element being parsed. + type Element + + extension (self: T) + /// Parses and returns an element from `context`. + def parse(context: Context): Option[Element] +end Combinator + +final case class Apply[C, E](action: C => Option[E]) +final case class Combine[A, B](first: A, second: B) + +given apply[C, E]: Combinator[Apply[C, E]] with { + type Context = C + type Element = E + extension(self: Apply[C, E]) { + def parse(context: C): Option[E] = self.action(context) + } +} + +given combine[A: Combinator, B: [X] =>> Combinator[X] { type Context = A.Context }] + : Combinator[Combine[A, B]] with + type Context = A.Context + type Element = (A.Element, B.Element) + extension(self: Combine[A, B]) + def parse(context: Context): Option[Element] = ??? 
+ +extension [A] (buf: mutable.ListBuffer[A]) def popFirst() = + if buf.isEmpty then None + else try Some(buf.head) finally buf.remove(0) + +@main def hello: Unit = { + val source = (0 to 10).toList + val stream = source.to(mutable.ListBuffer) + + val n = Apply[mutable.ListBuffer[Int], Int](s => s.popFirst()) + val m = Combine(n, n) + + val r = m.parse(stream) // error: type mismatch, found `mutable.ListBuffer[Int]`, required `?1.Context` + val rc: Option[(Int, Int)] = r + // it would be great if this worked +} diff --git a/tests/pos/parsercombinators-new-syntax.scala b/tests/pos/parsercombinators-new-syntax.scala new file mode 100644 index 000000000000..f984972b915d --- /dev/null +++ b/tests/pos/parsercombinators-new-syntax.scala @@ -0,0 +1,45 @@ +//> using options -language:experimental.modularity -source future +import collection.mutable + +/// A parser combinator. +trait Combinator: + type Self + type Input + type Result + + extension (self: Self) + /// Parses and returns an element from input `in`. + def parse(in: Input): Option[Result] +end Combinator + +case class Apply[I, R](action: I => Option[R]) +case class Combine[A, B](first: A, second: B) + +given [I, R] => Apply[I, R] is Combinator: + type Input = I + type Result = R + extension (self: Apply[I, R]) + def parse(in: I): Option[R] = self.action(in) + +given [A: Combinator, B: Combinator { type Input = A.Input }] + => Combine[A, B] is Combinator: + type Input = A.Input + type Result = (A.Result, B.Result) + extension (self: Combine[A, B]) + def parse(in: Input): Option[Result] = + for x <- self.first.parse(in); y <- self.second.parse(in) yield (x, y) + +extension [A] (buf: mutable.ListBuffer[A]) def popFirst() = + if buf.isEmpty then None + else try Some(buf.head) finally buf.remove(0) + +@main def hello: Unit = + val source = (0 to 10).toList + val stream = source.to(mutable.ListBuffer) + + val n = Apply[mutable.ListBuffer[Int], Int](s => s.popFirst()) + val m = Combine(n, n) + + val r = m.parse(stream) // was error: type mismatch, found `mutable.ListBuffer[Int]`, required `?1.Input` + val rc: Option[(Int, Int)] = r + diff --git a/tests/pos/parsercombinators-this.scala b/tests/pos/parsercombinators-this.scala new file mode 100644 index 000000000000..70b423985400 --- /dev/null +++ b/tests/pos/parsercombinators-this.scala @@ -0,0 +1,53 @@ +//> using options -language:experimental.modularity -source future +import collection.mutable + +/// A parser combinator. +trait Combinator: + + type Self + + /// The context from which elements are being parsed, typically a stream of tokens. + type Context + /// The element being parsed. + type Element + + extension (self: Self) + /// Parses and returns an element from `context`. + def parse(context: Context): Option[Element] +end Combinator + +final case class Apply[C, E](action: C => Option[E]) +final case class Combine[A, B](first: A, second: B) + +given apply[C, E]: Combinator with { + type Self = Apply[C, E] + type Context = C + type Element = E + extension(self: Apply[C, E]) { + def parse(context: C): Option[E] = self.action(context) + } +} + +given combine[A: Combinator, B: Combinator { type Context = A.Context }] + : Combinator with + type Self = Combine[A, B] + type Context = A.Context + type Element = (A.Element, B.Element) + extension(self: Combine[A, B]) + def parse(context: Context): Option[Element] = ??? 
+ +extension [A] (buf: mutable.ListBuffer[A]) def popFirst() = + if buf.isEmpty then None + else try Some(buf.head) finally buf.remove(0) + +@main def hello: Unit = { + val source = (0 to 10).toList + val stream = source.to(mutable.ListBuffer) + + val n = Apply[mutable.ListBuffer[Int], Int](s => s.popFirst()) + val m = Combine(n, n) + + val r = m.parse(stream) // error: type mismatch, found `mutable.ListBuffer[Int]`, required `?1.Context` + val rc: Option[(Int, Int)] = r + // it would be great if this worked +} diff --git a/tests/pos/sets-tc.scala b/tests/pos/sets-tc.scala new file mode 100644 index 000000000000..86349bf6a405 --- /dev/null +++ b/tests/pos/sets-tc.scala @@ -0,0 +1,46 @@ +import language.experimental.modularity + +// First version: higher-kinded self type +object v1: + trait Set: + type Self[A] + def empty[A]: Self[A] + def union[A](self: Self[A], other: Self[A]): Self[A] + + case class ListSet[A](elems: List[A]) + + given ListSet is Set: + def empty[A]: ListSet[A] = ListSet(Nil) + + def union[A](self: ListSet[A], other: ListSet[A]): ListSet[A] = + ListSet(self.elems ++ other.elems) + + def listUnion[A, S[_]: Set](xs: List[S[A]]): S[A] = + xs.foldLeft(S.empty)(S.union) + + val xs = ListSet(List(1, 2, 3)) + val ys = ListSet(List(4, 5)) + val zs = listUnion(List(xs, ys)) + + // Second version: parameterized type class +object v2: + trait Set[A]: + type Self + def empty: Self + extension (s: Self) def union (other: Self): Self + + case class ListSet[A](elems: List[A]) + + given [A] => ListSet[A] is Set[A]: + def empty: ListSet[A] = ListSet(Nil) + + extension (self: ListSet[A]) def union(other: ListSet[A]): ListSet[A] = + ListSet(self.elems ++ other.elems) + + def listUnion[A, S: Set[A]](xs: List[S]): S = + xs.foldLeft(S.empty)(_ `union` _) + + val xs = ListSet(List(1, 2, 3)) + val ys = ListSet(List(4, 5)) + val zs = listUnion(List(xs, ys)) + diff --git a/tests/pos/typeclass-aggregates.scala b/tests/pos/typeclass-aggregates.scala index 9bb576603b7b..5e4551b226b7 100644 --- a/tests/pos/typeclass-aggregates.scala +++ b/tests/pos/typeclass-aggregates.scala @@ -1,47 +1,47 @@ //> using options -source future -language:experimental.modularity trait Ord: - type This - extension (x: This) - def compareTo(y: This): Int - def < (y: This): Boolean = compareTo(y) < 0 - def > (y: This): Boolean = compareTo(y) > 0 + type Self + extension (x: Self) + def compareTo(y: Self): Int + def < (y: Self): Boolean = compareTo(y) < 0 + def > (y: Self): Boolean = compareTo(y) > 0 trait OrdProxy extends Ord: export Ord.this.* trait SemiGroup: - type This - extension (x: This) def combine(y: This): This + type Self + extension (x: Self) def combine(y: Self): Self trait SemiGroupProxy extends SemiGroup: export SemiGroup.this.* trait Monoid extends SemiGroup: - def unit: This + def unit: Self trait MonoidProxy extends Monoid: export Monoid.this.* -def ordWithMonoid(ord: Ord, monoid: Monoid{ type This = ord.This }): Ord & Monoid = +def ordWithMonoid(ord: Ord, monoid: Monoid{ type Self = ord.Self }): Ord & Monoid = new ord.OrdProxy with monoid.MonoidProxy {} trait OrdWithMonoid extends Ord, Monoid -def ordWithMonoid2(ord: Ord, monoid: Monoid{ type This = ord.This }) = //: OrdWithMonoid { type This = ord.This} = +def ordWithMonoid2(ord: Ord, monoid: Monoid{ type Self = ord.Self }) = //: OrdWithMonoid { type Self = ord.Self} = new OrdWithMonoid with ord.OrdProxy with monoid.MonoidProxy {} -given intOrd: (Ord { type This = Int }) = ??? -given intMonoid: (Monoid { type This = Int }) = ??? 
+given intOrd: (Ord { type Self = Int }) = ??? +given intMonoid: (Monoid { type Self = Int }) = ??? -//given (using ord: Ord, monoid: Monoid{ type This = ord.This }): (Ord & Monoid { type This = ord.This}) = +//given (using ord: Ord, monoid: Monoid{ type Self = ord.Self }): (Ord & Monoid { type Self = ord.Self}) = // ordWithMonoid2(ord, monoid) -val x = summon[Ord & Monoid { type This = Int}] -val y: Int = ??? : x.This +val x = summon[Ord & Monoid { type Self = Int}] +val y: Int = ??? : x.Self // given [A, B](using ord: A is Ord, monoid: A is Monoid) => A is Ord & Monoid = // new ord.OrdProxy with monoid.MonoidProxy {} -given [A](using ord: Ord { type This = A }, monoid: Monoid { type This = A}): ((Ord & Monoid) { type This = A}) = +given [A](using ord: Ord { type Self = A }, monoid: Monoid { type Self = A}): ((Ord & Monoid) { type Self = A}) = new ord.OrdProxy with monoid.MonoidProxy {} diff --git a/tests/pos/typeclasses-arrow.scala b/tests/pos/typeclasses-arrow.scala new file mode 100644 index 000000000000..379365ffa1c5 --- /dev/null +++ b/tests/pos/typeclasses-arrow.scala @@ -0,0 +1,140 @@ +//> using options -language:experimental.modularity -source future + +class Common: + + trait Ord: + type Self + extension (x: Self) + def compareTo(y: Self): Int + def < (y: Self): Boolean = compareTo(y) < 0 + def > (y: Self): Boolean = compareTo(y) > 0 + def <= (y: Self): Boolean = compareTo(y) <= 0 + def >= (y: Self): Boolean = compareTo(y) >= 0 + def max(y: Self): Self = if x < y then y else x + + trait Show: + type Self + extension (x: Self) def show: String + + trait SemiGroup: + type Self + extension (x: Self) def combine(y: Self): Self + + trait Monoid extends SemiGroup: + def unit: Self + + trait Functor: + type Self[A] + extension [A](x: Self[A]) def map[B](f: A => B): Self[B] + + trait Monad extends Functor: + def pure[A](x: A): Self[A] + extension [A](x: Self[A]) + def flatMap[B](f: A => Self[B]): Self[B] + def map[B](f: A => B) = x.flatMap(f `andThen` pure) +end Common + +object Instances extends Common: + + given Int is Ord as intOrd: + extension (x: Int) + def compareTo(y: Int) = + if x < y then -1 + else if x > y then +1 + else 0 + + given [T: Ord] => List[T] is Ord: + extension (xs: List[T]) def compareTo(ys: List[T]): Int = (xs, ys) match + case (Nil, Nil) => 0 + case (Nil, _) => -1 + case (_, Nil) => +1 + case (x :: xs1, y :: ys1) => + val fst = x.compareTo(y) + if (fst != 0) fst else xs1.compareTo(ys1) + + given List is Monad as listMonad: + extension [A](xs: List[A]) def flatMap[B](f: A => List[B]): List[B] = + xs.flatMap(f) + def pure[A](x: A): List[A] = + List(x) + + type Reader[Ctx] = [X] =>> Ctx => X + + given [Ctx] => Reader[Ctx] is Monad as readerMonad: + extension [A](r: Ctx => A) def flatMap[B](f: A => Ctx => B): Ctx => B = + ctx => f(r(ctx))(ctx) + def pure[A](x: A): Ctx => A = + ctx => x + + extension (xs: Seq[String]) + def longestStrings: Seq[String] = + val maxLength = xs.map(_.length).max + xs.filter(_.length == maxLength) + + extension [T](xs: List[T]) + def second = xs.tail.head + def third = xs.tail.tail.head + + extension [M[_]: Monad, A](xss: M[M[A]]) + def flatten: M[A] = + xss.flatMap(identity) + + def maximum[T: Ord](xs: List[T]): T = + xs.reduce(_ `max` _) + + given [T: Ord] => T is Ord as descending: + extension (x: T) def compareTo(y: T) = T.compareTo(y)(x) + + def minimum[T: Ord](xs: List[T]) = + maximum(xs)(using descending) + + def test(): Unit = + val xs = List(1, 2, 3) + println(maximum(xs)) + println(maximum(xs)(using descending)) + 
println(maximum(xs)(using descending(using intOrd))) + println(minimum(xs)) + +// Adapted from the Rust by Example book: https://doc.rust-lang.org/rust-by-example/trait.html +// +// lines words chars +// wc Scala: 28 105 793 +// wc Rust : 57 193 1466 + +trait Animal: + type Self + // Associated function signature; `Self` refers to the implementor type. + def apply(name: String): Self + + // Method signatures; these will return a string. + extension (self: Self) + def name: String + def noise: String + def talk(): Unit = println(s"$name, $noise") +end Animal + +class Sheep(val name: String): + var isNaked = false + def shear() = + if isNaked then + println(s"$name is already naked...") + else + println(s"$name gets a haircut!") + isNaked = true + +given Sheep is Animal: + def apply(name: String) = Sheep(name) + extension (self: Self) + def name: String = self.name + def noise: String = if self.isNaked then "baaaaah?" else "baaaaah!" + override def talk(): Unit = + println(s"$name pauses briefly... $noise") + +/* + + - In a type pattern, A <: T, A >: T, A: T, A: _ are all allowed and mean + T is a fresh type variable (T can start with a capital letter). + - instance definitions + - `as m` syntax in context bounds and instance definitions + +*/ diff --git a/tests/pos/typeclasses-this.scala b/tests/pos/typeclasses-this.scala new file mode 100644 index 000000000000..20ce78678b22 --- /dev/null +++ b/tests/pos/typeclasses-this.scala @@ -0,0 +1,141 @@ +//> using options -language:experimental.modularity -source future + +class Common: + + trait Ord: + type Self + extension (x: Self) + def compareTo(y: Self): Int + def < (y: Self): Boolean = compareTo(y) < 0 + def > (y: Self): Boolean = compareTo(y) > 0 + def <= (y: Self): Boolean = compareTo(y) <= 0 + def >= (y: Self): Boolean = compareTo(y) >= 0 + def max(y: Self): Self = if x < y then y else x + + trait Show: + type Self + extension (x: Self) def show: String + + trait SemiGroup: + type Self + extension (x: Self) def combine(y: Self): Self + + trait Monoid extends SemiGroup: + def unit: Self + + trait Functor: + type Self[A] + extension [A](x: Self[A]) def map[B](f: A => B): Self[B] + + trait Monad extends Functor: + def pure[A](x: A): Self[A] + extension [A](x: Self[A]) + def flatMap[B](f: A => Self[B]): Self[B] + def map[B](f: A => B) = x.flatMap(f `andThen` pure) +end Common + +object Instances extends Common: + + given intOrd: Int is Ord with + extension (x: Int) + def compareTo(y: Int) = + if x < y then -1 + else if x > y then +1 + else 0 + +// given [T](using tracked val ev: Ord { type Self = T}): Ord { type Self = List[T] } with + given [T: Ord]: List[T] is Ord with + extension (xs: List[T]) def compareTo(ys: List[T]): Int = (xs, ys) match + case (Nil, Nil) => 0 + case (Nil, _) => -1 + case (_, Nil) => +1 + case (x :: xs1, y :: ys1) => + val fst = x.compareTo(y) + if (fst != 0) fst else xs1.compareTo(ys1) + + given listMonad: List is Monad with + extension [A](xs: List[A]) def flatMap[B](f: A => List[B]): List[B] = + xs.flatMap(f) + def pure[A](x: A): List[A] = + List(x) + + type Reader[Ctx] = [X] =>> Ctx => X + + given readerMonad[Ctx]: Reader[Ctx] is Monad with + extension [A](r: Ctx => A) def flatMap[B](f: A => Ctx => B): Ctx => B = + ctx => f(r(ctx))(ctx) + def pure[A](x: A): Ctx => A = + ctx => x + + extension (xs: Seq[String]) + def longestStrings: Seq[String] = + val maxLength = xs.map(_.length).max + xs.filter(_.length == maxLength) + + extension [T](xs: List[T]) + def second = xs.tail.head + def third = xs.tail.tail.head + + 
extension [M[_]: Monad, A](xss: M[M[A]]) + def flatten: M[A] = + xss.flatMap(identity) + + def maximum[T: Ord](xs: List[T]): T = + xs.reduce(_ `max` _) + + given descending[T: Ord]: T is Ord with + extension (x: T) def compareTo(y: T) = T.compareTo(y)(x) + + def minimum[T: Ord](xs: List[T]) = + maximum(xs)(using descending) + + def test(): Unit = + val xs = List(1, 2, 3) + println(maximum(xs)) + println(maximum(xs)(using descending)) + println(maximum(xs)(using descending(using intOrd))) + println(minimum(xs)) + +// Adapted from the Rust by Example book: https://doc.rust-lang.org/rust-by-example/trait.html +// +// lines words chars +// wc Scala: 28 105 793 +// wc Rust : 57 193 1466 + +trait Animal: + type Self + // Associated function signature; `Self` refers to the implementor type. + def apply(name: String): Self + + // Method signatures; these will return a string. + extension (self: Self) + def name: String + def noise: String + def talk(): Unit = println(s"$name, $noise") +end Animal + +class Sheep(val name: String): + var isNaked = false + def shear() = + if isNaked then + println(s"$name is already naked...") + else + println(s"$name gets a haircut!") + isNaked = true + +given Sheep is Animal with + def apply(name: String) = Sheep(name) + extension (self: Self) + def name: String = self.name + def noise: String = if self.isNaked then "baaaaah?" else "baaaaah!" + override def talk(): Unit = + println(s"$name pauses briefly... $noise") + +/* + + - In a type pattern, A <: T, A >: T, A: T, A: _ are all allowed and mean + T is a fresh type variable (T can start with a capital letter). + - instance definitions + - `as m` syntax in context bounds and instance definitions + +*/ diff --git a/tests/pos/typeclasses.scala b/tests/pos/typeclasses.scala index 2bf7f76f0804..d0315a318310 100644 --- a/tests/pos/typeclasses.scala +++ b/tests/pos/typeclasses.scala @@ -3,38 +3,36 @@ class Common: trait Ord: - type This - extension (x: This) - def compareTo(y: This): Int - def < (y: This): Boolean = compareTo(y) < 0 - def > (y: This): Boolean = compareTo(y) > 0 + type Self + extension (x: Self) + def compareTo(y: Self): Int + def < (y: Self): Boolean = compareTo(y) < 0 + def > (y: Self): Boolean = compareTo(y) > 0 trait SemiGroup: - type This - extension (x: This) def combine(y: This): This + type Self + extension (x: Self) def combine(y: Self): Self trait Monoid extends SemiGroup: - def unit: This + def unit: Self trait Functor: - type This[A] - extension [A](x: This[A]) def map[B](f: A => B): This[B] + type Self[A] + extension [A](x: Self[A]) def map[B](f: A => B): Self[B] trait Monad extends Functor: - def pure[A](x: A): This[A] - extension [A](x: This[A]) - def flatMap[B](f: A => This[B]): This[B] + def pure[A](x: A): Self[A] + extension [A](x: Self[A]) + def flatMap[B](f: A => Self[B]): Self[B] def map[B](f: A => B) = x.flatMap(f `andThen` pure) - infix type is[A <: AnyKind, B <: {type This <: AnyKind}] = B { type This = A } - end Common object Instances extends Common: given intOrd: (Int is Ord) with - type This = Int + type Self = Int extension (x: Int) def compareTo(y: Int) = if x < y then -1 @@ -77,8 +75,8 @@ object Instances extends Common: def second = xs.tail.head def third = xs.tail.tail.head - extension [M, A](using m: Monad)(xss: m.This[m.This[A]]) - def flatten: m.This[A] = + extension [M, A](using m: Monad)(xss: m.Self[m.Self[A]]) + def flatten: m.Self[A] = xss.flatMap(identity) def maximum[T](xs: List[T])(using T is Ord): T = @@ -103,12 +101,12 @@ object Instances extends Common: // wc 
Scala: 30 115 853 // wc Rust : 57 193 1466 trait Animal: - type This - // Associated function signature; `This` refers to the implementor type. - def apply(name: String): This + type Self + // Associated function signature; `Self` refers to the implementor type. + def apply(name: String): Self // Method signatures; these will return a string. - extension (self: This) + extension (self: Self) def name: String def noise: String def talk(): Unit = println(s"$name, $noise") @@ -126,18 +124,17 @@ class Sheep(val name: String): /* instance Sheep: Animal with def apply(name: String) = Sheep(name) - extension (self: This) + extension (self: Self) def name: String = self.name def noise: String = if self.isNaked then "baaaaah?" else "baaaaah!" override def talk(): Unit = println(s"$name pauses briefly... $noise") */ -import Instances.is // Implement the `Animal` trait for `Sheep`. given (Sheep is Animal) with def apply(name: String) = Sheep(name) - extension (self: This) + extension (self: Self) def name: String = self.name def noise: String = if self.isNaked then "baaaaah?" else "baaaaah!" override def talk(): Unit = diff --git a/tests/run/for-desugar-strawman.scala b/tests/run/for-desugar-strawman.scala new file mode 100644 index 000000000000..a92b19b9150a --- /dev/null +++ b/tests/run/for-desugar-strawman.scala @@ -0,0 +1,96 @@ + +@main def Test = + println: + for + x <- List(1, 2, 3) + y = x + x + if x >= 2 + i <- List.range(0, y) + z = i * i + if z % 2 == 0 + yield + i * x + + println: + val xs = List(1, 2, 3) + xs.flatMapDefined: x => + val y = x + x + xs.applyFilter(x >= 2): + val is = List.range(0, y) + is.mapDefined: i => + val z = i * i + is.applyFilter(z % 2 == 0): + i * x + +extension [A](as: List[A]) + + def applyFilter[B](p: => Boolean)(b: => B) = + if p then Some(b) else None + + def flatMapDefined[B](f: A => Option[IterableOnce[B]]): List[B] = + as.flatMap: x => + f(x).getOrElse(Nil) + + def mapDefined[B](f: A => Option[B]): List[B] = + as.flatMap(f) + +object UNDEFINED + +extension [A](as: Vector[A]) + + def applyFilter[B](p: => Boolean)(b: => B) = + if p then b else UNDEFINED + + def flatMapDefined[B](f: A => IterableOnce[B] | UNDEFINED.type): Vector[B] = + as.flatMap: x => + f(x) match + case UNDEFINED => Nil + case y: IterableOnce[B] => y + + def mapDefined[B](f: A => B | UNDEFINED.type): Vector[B] = + as.flatMap: x => + f(x) match + case UNDEFINED => Nil + case y: B => y :: Nil + +/* +F ::= val x = E; F + x <- E; G +G ::= [] + val x = E; G + if E; G + x <- E; G + +Translation scheme: + +{ for F yield E }c where c = undefined +{ for G yield E }c where c is a reference to the generator preceding the G sequence + +{ for [] yield E }c = E +{ for p = Ep; G yield E }c = val p = Ep; { for G yield E }c +{ for if Ep; G yield E}c = c.applyFilter(Ep)({ for G yield E }c) +{ for p <- Ep; G yield E }c = val c1 = Ep; c1.BIND{ case p => { for G yield E }c1 } (c1 fresh) + + where BIND = flatMapDefined if isGen(G), isFilter(G) + = mapDefined if !isGen(G), isFilter(G) + = flatMap if isGen(G), !isFilter(G) + = map if !isGen(G), !isFilter(G) + +{ for case p <- Ep; G yield E }c = { for $x <- Ep; if $x match case p => true case _ => false; p = $x@RuntimeChecked; G yield E }c +{ for case p = Ep; G yield E }c = { for $x = Ep; if $x match case p => true case _ => false; p = $x@RuntimeChecked; G yield E}c + +isFilter(if E; S) +isFilter(val x = E; S) if isFilter(S) + +isGen(x <- E; S) +isGen(val x = E; S) if isGen(S) +isGen(if E; S) if isGen(S) + +*/ + +val foo = 1 + +def main2 = + foo + ??? + ??? 
match { case _ => 0 } \ No newline at end of file diff --git a/tests/run/given-disambiguation.scala b/tests/run/given-disambiguation.scala new file mode 100644 index 000000000000..637c02a5621f --- /dev/null +++ b/tests/run/given-disambiguation.scala @@ -0,0 +1,58 @@ +import language.experimental.modularity +import language.future + +trait M: + type Self + extension (x: Self) def combine (y: Self): String + def unit: Self + +trait Num: + type Self + def zero: Self + +trait A extends M +trait B extends M + +def f[X: {M, A, B}](x: X) = + summon[X is M] + x.combine(x) + +trait AA: + type XX: {M, A, B} + val x = XX.unit + val A: String = "hello" + +trait AAA: + type X: M +trait BBB: + type X: Num +class CCC[X1: {M, Num}] extends AAA, BBB: + type X = X1 + X.zero + X.unit + +@main def Test = + class C + + given C is M: + extension (x: Self) def combine (y: Self) = "M" + def unit = C() + + given C is A: + extension (x: Self) def combine (y: Self) = "A" + def unit = C() + + given C is B: + extension (x: Self) def combine (y: Self) = "B" + def unit = C() + + assert(f(C()) == "M") + + class CC extends AA: + type XX = C + assert(A.length == 5) + assert(A.toString == "hello") + + CC() + + diff --git a/tests/run/i15840.scala b/tests/run/i15840.scala new file mode 100644 index 000000000000..0f238e2e7148 --- /dev/null +++ b/tests/run/i15840.scala @@ -0,0 +1,27 @@ +//> using options -language:experimental.modularity -source future + +trait Nat: + type N <: Nat + +class _0 extends Nat: + type N = _0 + +class NatOps[N <: Nat](tracked val n: N): + def toInt(using toIntN: ToInt[n.N]): Int = toIntN() + +// works +def toInt[N <: Nat](n: N)(using toIntN: ToInt[n.N]) = toIntN() + +sealed abstract class ToInt[N <: Nat]: + def apply(): Int + +object ToInt: + given ToInt[_0] { + def apply() = 0 + } + +@main def Test() = + assert(toInt(new _0) == 0) + assert(NatOps[_0](new _0).toInt == 0) + assert: + NatOps(new _0).toInt == 0 // did not work From 9d299d6f778a039dcca41b6da40b39753d08442e Mon Sep 17 00:00:00 2001 From: odersky Date: Wed, 3 Apr 2024 10:06:39 +0200 Subject: [PATCH 225/371] Add a doc page [Cherry-picked f444b4605c39ff38c8e41c61fdc93efec3bd02d8] --- .../reference/experimental/typeclasses.md | 776 ++++++++++++++++++ docs/sidebar.yml | 1 + .../runtime/stdLibPatches/language.scala | 1 + 3 files changed, 778 insertions(+) create mode 100644 docs/_docs/reference/experimental/typeclasses.md diff --git a/docs/_docs/reference/experimental/typeclasses.md b/docs/_docs/reference/experimental/typeclasses.md new file mode 100644 index 000000000000..5ac81061e42d --- /dev/null +++ b/docs/_docs/reference/experimental/typeclasses.md @@ -0,0 +1,776 @@ + +--- +layout: doc-page +title: "Type Classes" +nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/typeclasses.html +--- + +# Some Proposed Changes for Better Support of Type Classes + +Martin Odersky, 8.1.2024 + +A type class in Scala is a pattern where we define + + - a trait with one type parameter (the _type class_) + - given instances at specific instantiations of that trait, + - using clauses or context bounds abstracting over that trait. + +Type classes as a pattern work overall OK, but if we compare them to native implementations in Haskell, or protocols in Swift, or traits in Rust, then there are some idiosyncrasies and rough corners which in the end make them +a bit cumbersome and limiting for standard generic programming patterns. 
Much has improved since Scala 2's implicits, but there is still some gap to bridge to get to parity with these languages. + +This note shows that with some fairly small and reasonable tweaks to Scala's syntax and typing rules we can obtain a much better scheme for working with type classes, or do generic programming in general. + +The bulk of the suggested improvements has been implemented and is available +under source version `future` if the additional experimental language import `modularity` is present. For instance, using the following command: + +``` + scala compile -source:future -language:experimental.modularity +``` + +## Generalizing Context Bounds + + The only place in Scala's syntax where the type class pattern is relevant is + in context bounds. A context bound such as + +```scala + def min[A: Ordering](x: List[A]): A +``` +requires that `Ordering` is a trait or class with a single type parameter (which makes it a type class) and expands to a `using` clause that instantiates that parameter. Here is the expansion of `min`: +```scala + def min[A](x: List[A])(using Ordering[A]): A +``` + +**Proposal** Allow type classes to define an abstract type member named `Self` instead of a type parameter. + +**Example** + +```scala + trait Ord: + type Self + + trait SemiGroup: + type Self + extension (x: Self) def combine(y: Self): Self + + trait Monoid extends SemiGroup: + def unit: Self + + trait Functor: + type Self[A] + extension [A](x: Self[A]) def map[B](f: A => B): Self[B] + + trait Monad extends Functor: + def pure[A](x: A): Self[A] + extension [A](x: Self[A]) + def flatMap[B](f: A => Self[B]): Self[B] + def map[B](f: A => B) = x.flatMap(f `andThen` pure) + + def reduce[A: Monoid](xs: List[A]): A = + xs.foldLeft(Monoid.unit)(_ `combine` _) + + trait ParserCombinator: + type Self + type Input + type Result + extension (self: Self) + def parse(input: Input): Option[Result] = ... + + def combine[A: ParserCombinator, B: ParserCombinator { type Input = A.Input }] = ... +``` + +**Advantages** + + - Avoids repetitive type parameters, concentrates on what's essential, namely the type class hierarchy. + - Gives a clear indication of traits intended as type classes. A trait is a type class + if it has type `Self` as a member + - Allows to create aggregate type classes that combine givens via intersection types. + - Allows to use refinements in context bounds (the `combine` example above would be very awkward to express using the old way of context bounds expanding to type constructors). + +`Self`-based context bounds are a better fit for a dependently typed language like Scala than parameter-based ones. The main reason is that we are dealing with proper types, not type constructors. Proper types can be parameterized, intersected, or refined. This makes `Self`-based designs inherently more compositional than parameterized ones. + + + +**Details** + +When a trait has both a type parameter and an abstract `Self` type, we + resolve a context bound to the `Self` type. This allows type classes + that carry type parameters, as in + +```scala +trait Sequential[E]: + type Self +``` + +Here, +```scala +[S: Sequential[Int]] +``` +should resolve to: +```scala +[S](using Sequential[Int] { type Self = S }) +``` +and not to: +```scala +[S](using Sequential[S]) +``` + +**Discussion** + + Why not use `This` for the self type? The name `This` suggests that it is the type of `this`. But this is not true for type class traits. 
`Self` is the name of the type implementing a distinguished _member type_ of the trait in a `given` definition. `Self` is an established term in both Rust and Swift with the meaning used here.
+
+ One possible objection to the `Self` based design is that it does not cover "multi-parameter" type classes. But neither do context bounds! "Multi-parameter" type classes in Scala are simply givens that can be synthesized with the standard mechanisms. Type classes in the strict sense abstract only over a single type, namely the implementation type of a trait.
+
+
+## Auxiliary Type Alias `is`
+
+We introduce a standard type alias `is` in the Scala package or in `Predef`, defined like this:
+
+```scala
+  infix type is[A <: AnyKind, B <: {type Self <: AnyKind}] = B { type Self = A }
+```
+
+This makes writing instance definitions quite pleasant. Examples:
+
+```scala
+  given Int is Ord ...
+  given Int is Monoid ...
+
+  type Reader = [X] =>> Env => X
+  given Reader is Monad ...
+```
+
+(more examples will follow below)
+
+
+
+## Naming Context Bounds
+
+Context bounds are a convenient and legible abbreviation. A problem so far is that they are always anonymous,
+one cannot name the using parameter to which a context bound expands.
+
+For instance, consider a `reduce` method over `Monoid`s defined like this:
+
+```scala
+def reduce[A : Monoid](xs: List[A]): A = ???
+```
+Since we don't have a name for the `Monoid` instance of `A`, we need to resort to `summon` in the body of `reduce`:
+```scala
+def reduce[A : Monoid](xs: List[A]): A =
+  xs.foldLeft(summon[Monoid[A]].unit)(_ `combine` _)
+```
+That's generally considered too painful to write and read, hence people usually adopt one of two alternatives. Either, eschew context bounds and switch to using clauses:
+```scala
+def reduce[A](xs: List[A])(using m: Monoid[A]): A =
+  xs.foldLeft(m.unit)(_ `combine` _)
+```
+Or, plan ahead and define a "trampoline" method in `Monoid`'s companion object:
+```scala
+  trait Monoid[A] extends SemiGroup[A]:
+    def unit: A
+  object Monoid:
+    def unit[A](using m: Monoid[A]): A = m.unit
+  ...
+  def reduce[A : Monoid](xs: List[A]): A =
+    xs.foldLeft(Monoid.unit)(_ `combine` _)
+```
+This is all accidental complexity which can be avoided by the following proposal.
+
+**Proposal:** Allow to name a context bound, like this:
+```scala
+  def reduce[A : Monoid as m](xs: List[A]): A =
+    xs.foldLeft(m.unit)(_ `combine` _)
+```
+
+We use `as x` after the type to bind the instance to `x`. This is analogous to import renaming, which also introduces a new name for something that comes before.
+
+**Benefits:** The new syntax is simple and clear.
+It avoids the awkward choice between concise context bounds that can't be named and verbose using clauses that can.
+
+### New Syntax for Aggregate Context Bounds
+
+Aggregate context bounds like `A : X : Y` are not obvious to read, and it becomes worse when we add names, e.g. `A : X as x : Y as y`.
+
+**Proposal:** Allow to combine several context bounds inside `{...}`, analogous
+to import clauses. Example:
+
+```scala
+  trait A:
+    def showMax[X : {Ordering, Show}](x: X, y: X): String
+  class B extends A:
+    def showMax[X : {Ordering as ordering, Show as show}](x: X, y: X): String =
+      show.asString(ordering.max(x, y))
+```
+
+The old syntax with multiple `:` should be phased out over time.
+
+**Benefits:** The new syntax is much clearer than the old one, in particular for newcomers that don't know context bounds well.
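+
+For comparison, here is a sketch (not part of the examples above) of the same member written with the old multiple-`:` syntax, with the new braces syntax, and with its presumed expansion into a using clause, following the naming and expansion rules described in this note and assuming `Ordering` and `Show` are `Self`-based type classes:
+
+```scala
+  // old aggregate syntax, which gets hard to read once names are added
+  def showMax[X : Ordering as ordering : Show as show](x: X, y: X): String
+
+  // new aggregate syntax
+  def showMax[X : {Ordering as ordering, Show as show}](x: X, y: X): String
+
+  // presumed expansion of the context bounds into a trailing using clause
+  def showMax[X](x: X, y: X)
+    (using ordering: Ordering { type Self = X }, show: Show { type Self = X }): String
+```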
+ +### Better Default Names for Context Bounds + +So far, an unnamed context bound for a type parameter gets a synthesized fresh name. It would be much more useful if it got the name of the constrained type parameter instead, translated to be a term name. This means our `reduce` method over monoids would not even need an `as` binding. We could simply formulate it as follows: +``` + def reduce[A : Monoid](xs: List[A]) = + xs.foldLeft(A.unit)(_ `combine` _) +``` + +The use of a name like `A` above in two variants, both as a type name and as a term name is of course familiar to Scala programmers. We use the same convention for classes and companion objects. In retrospect, the idea of generalizing this to also cover type parameters is obvious. It is surprising that it was not brought up before. + +**Proposed Rules** + + 1. The generated evidence parameter for a context bound `A : C as a` has name `a` + 2. The generated evidence for a context bound `A : C` without an `as` binding has name `A` (seen as a term name). So, `A : C` is equivalent to `A : C as A`. + 3. If there are multiple context bounds for a type parameter, as in `A : {C_1, ..., C_n}`, the generated evidence parameter for every context bound `C_i` has a fresh synthesized name, unless the context bound carries an `as` clause, in which case rule (1) applies. + +The default naming convention reduces the need for named context bounds. But named context bounds are still essential, for at least two reasons: + + - They are needed to give names to multiple context bounds. + - They give an explanation what a single unnamed context bound expands to. + + +### Expansion of Context Bounds + +Context bounds are currently translated to implicit parameters in the last parameter list of a method or class. This is a problem if a context bound is mentioned in one of the preceding parameter types. For example, consider a type class of parsers with associated type members `Input` and `Result` describing the input type on which the parsers operate and the type of results they produce: +```scala +trait Parser[P]: + type Input + type Result +``` +Here is a method `run` that runs a parser on an input of the required type: + +```scala +def run[P : Parser](in: P.Input): P.Result +``` +Or, making clearer what happens by using an explicit name for the context bound: +```scala +def run[P : Parser as p](in: p.Input): p.Result +``` +With the current translation this does not work since it would be expanded to: +```scala + def run[P](x: p.Input)(using p: Parser[P]): p.Result +``` +Note that the `p` in `p.Input` refers to the `p` introduced in the using clause, which comes later. So this is ill-formed. + +This problem would be fixed by changing the translation of context bounds so that they expand to using clauses immediately after the type parameter. But such a change is infeasible, for two reasons: + + 1. It would be a binary-incompatible change. + 2. Putting using clauses earlier can impair type inference. A type in + a using clause can be constrained by term arguments coming before that + clause. Moving the using clause first would miss those constraints, which could cause ambiguities in implicit search. + +But there is an alternative which is feasible: + +**Proposal:** Map the context bounds of a method or class as follows: + + 1. If one of the bounds is referred to by its term name in a subsequent parameter clause, the context bounds are mapped to a using clause immediately preceding the first such parameter clause. + 2. 
Otherwise, if the last parameter clause is a using (or implicit) clause, merge all parameters arising from context bounds in front of that clause, creating a single using clause. + 3. Otherwise, let the parameters arising from context bounds form a new using clause at the end. + +Rules (2) and (3) are the status quo, and match Scala 2's rules. Rule (1) is new but since context bounds so far could not be referred to, it does not apply to legacy code. Therefore, binary compatibility is maintained. + +**Discussion** More refined rules could be envisaged where context bounds are spread over different using clauses so that each comes as late as possible. But it would make matters more complicated and the gain in expressiveness is not clear to me. + +Named (either explicitly, or by default) context bounds in givens that produce classes are mapped to tracked val's of these classes (see #18958). This allows +references to these parameters to be precise, so that information about dependent type members is preserved. + + +## Context Bounds for Type Members + +It's not very orthogonal to allow subtype bounds for both type parameters and abstract type members, but context bounds only for type parameters. What's more, we don't even have the fallback of an explicit using clause for type members. The only alternative is to also introduce a set of abstract givens that get implemented in each subclass. This is extremely heavyweight and opaque to newcomers. + +**Proposal**: Allow context bounds for type members. Example: + +```scala + class Collection: + type Element : Ord +``` + +The question is how these bounds are expanded. Context bounds on type parameters +are expanded into using clauses. But for type members this does not work, since we cannot refer to a member type of a class in a parameter type of that class. What we are after is an equivalent of using parameter clauses but represented as class members. + +**Proposal:** Introduce a new way to implement a given definition in a trait like this: +```scala +given T = deferred +``` +`deferred` is a new method in the `scala.compiletime` package, which can appear only as the right hand side of a given defined in a trait. Any class implementing that trait will provide an implementation of this given. If a definition is not provided explicitly, it will be synthesized by searching for a given of type `T` in the scope of the inheriting class. Specifically, the scope in which this given will be searched is the environment of that class augmented by its parameters but not containing its members (since that would lead to recursive resolutions). If an implementation _is_ provided explicitly, it counts as an override of a concrete definition and needs an `override` modifier. + +Deferred givens allow a clean implementation of context bounds in traits, +as in the following example: +```scala +trait Sorted: + type Element : Ord + +class SortedSet[A : Ord] extends Sorted: + type Element = A +``` +The compiler expands this to the following implementation: +```scala +trait Sorted: + type Element + given Ord[Element] = compiletime.deferred + +class SortedSet[A](using A: Ord[A]) extends Sorted: + type Element = A + override given Ord[Element] = A // i.e. the A defined by the using clause +``` + +The using clause in class `SortedSet` provides an implementation for the deferred given in trait `Sorted`. + +**Benefits:** + + - Better orthogonality, type parameters and abstract type members now accept the same kinds of bounds. 
+ - Better ergonomics, since deferred givens get naturally implemented in inheriting classes, no need for boilerplate to fill in definitions of abstract givens. + +**Alternative:** It was suggested that we use a modifier for a deferred given instead of a `= deferred`. Something like `deferred given C[T]`. But a modifier does not suggest the concept that a deferred given will be implemented automatically in subclasses unless an explicit definition is written. In a sense, we can see `= deferred` as the invocation of a magic macro that is provided by the compiler. So from a user's point of view a given with `deferred` right hand side is not abstract. +It is a concrete definition where the compiler will provide the correct implementation. + +## New Given Syntax + +A good language syntax is like a Bach fugue: A small set of motifs is combined in a multitude of harmonic ways. Dissonances and irregularities should be avoided. + +When designing Scala 3, I believe that, by and large, we achieved that goal, except in one area, which is the syntax of givens. There _are_ some glaring dissonances, as seen in this code for defining an ordering on lists: +```scala +given [A](using Ord[A]): Ord[List[A]] with + def compare(x: List[A], y: List[A]) = ... +``` +The `:` feels utterly foreign in this position. It's definitely not a type ascription, so what is its role? Just as bad is the trailing `with`. Everywhere else we use braces or trailing `:` to start a scope of nested definitions, so the need of `with` sticks out like a sore thumb. + +We arrived at that syntax not because of a flight of fancy but because even after trying for about a year to find other solutions it seemed like the least bad alternative. The awkwardness of the given syntax arose because we insisted that givens could be named or anonymous, with the default on anonymous, that we would not use underscore for an anonymous given, and that the name, if present, had to come first, and have the form `name [parameters] :`. In retrospect, that last requirement showed a lack of creativity on our part. + +Sometimes unconventional syntax grows on you and becomes natural after a while. But here it was unfortunately the opposite. The longer I used given definitions in this style the more awkward they felt, in particular since the rest of the language seemed so much better put together by comparison. And I believe many others agree with me on this. Since the current syntax is unnatural and esoteric, this means it's difficult to discover and very foreign even after that. This makes it much harder to learn and apply givens than it need be. + +Things become much simpler if we introduce the optional name instead with an `as name` clause at the end, just like we did for context bounds. We can then use a more intuitive syntax for givens like this: +```scala +given String is Ord: + def compare(x: String, y: String) = ... + +given [A : Ord] => List[A] is Ord: + def compare(x: List[A], y: List[A]) = ... + +given Int is Monoid: + extension (x: Int) def combine(y: Int) = x + y + def unit = 0 +``` +If explicit names are desired, we add them with `as` clauses: +```scala +given String is Ord as intOrd: + def compare(x: String, y: String) = ... + +given [A : Ord] => List[A] is Ord as listOrd: + def compare(x: List[A], y: List[A]) = ... 
+ +given Int is Monoid as intMonoid: + extension (x: Int) def combine(y: Int) = x + y + def unit = 0 +``` + +The underlying principles are: + + - A `given` clause consists of the following elements: + + - An optional _precondition_, which introduces type parameters and/or using clauses and which ends in `=>`, + - the implemented _type_, + - an optional name binding using `as`, + - an implementation which consists of either an `=` and an expression, + or a template body. + + - Since there is no longer a middle `:` separating name and parameters from the implemented type, we can use a `:` to start the class body without looking unnatural, as is done everywhere else. That eliminates the special case where `with` was used before. + +This will be a fairly significant change to the given syntax. I believe there's still a possibility to do this. Not so much code has migrated to new style givens yet, and code that was written can be changed fairly easily. Specifically, there are about a 900K definitions of `implicit def`s +in Scala code on Github and about 10K definitions of `given ... with`. So about 1% of all code uses the Scala 3 syntax, which would have to be changed again. + +Changing something introduced just recently in Scala 3 is not fun, +but I believe these adjustments are preferable to let bad syntax +sit there and fester. The cost of changing should be amortized by improved developer experience over time, and better syntax would also help in migrating Scala 2 style implicits to Scala 3. But we should do it quickly before a lot more code +starts migrating. + +Migration to the new syntax is straightforward, and can be supported by automatic rewrites. For a transition period we can support both the old and the new syntax. It would be a good idea to backport the new given syntax to the LTS version of Scala so that code written in this version can already use it. The current LTS would then support old and new-style givens indefinitely, whereas new Scala 3.x versions would phase out the old syntax over time. + + +### Abolish Abstract Givens + +Another simplification is possible. So far we have special syntax for abstract givens: +```scala +given x: T +``` +The problem is that this syntax clashes with the quite common case where we want to establish a given without any nested definitions. For instance +consider a given that constructs a type tag: +```scala +class Tag[T] +``` +Then this works: +```scala +given Tag[String]() +given Tag[String] with {} +``` +But the following more natural syntax fails: +```scala +given Tag[String] +``` +The last line gives a rather cryptic error: +``` +1 |given Tag[String] + | ^ + | anonymous given cannot be abstract +``` +The problem is that the compiler thinks that the last given is intended to be abstract, and complains since abstract givens need to be named. This is another annoying dissonance. Nowhere else in Scala's syntax does adding a +`()` argument to a class cause a drastic change in meaning. And it's also a violation of the principle that it should be possible to define all givens without providing names for them. + +Fortunately, abstract givens are no longer necessary since they are superseded by the new `deferred` scheme. So we can deprecate that syntax over time. Abstract givens are a highly specialized mechanism with a so far non-obvious syntax. We have seen that this syntax clashes with reasonable expectations of Scala programmers. My estimate is that maybe a dozen people world-wide have used abstract givens in anger so far. 
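+
+For illustration, here is a small sketch (using the `Tag` class from above and two hypothetical traits; not an excerpt from the proposal's examples) of how a legacy abstract given corresponds to a deferred given:
+
+```scala
+// legacy syntax: an abstract given, which must be named
+trait LegacyContext:
+  given tag: Tag[String]
+
+// proposed replacement: a deferred given, which may stay anonymous and is
+// implemented automatically in classes inheriting from the trait
+trait DeferredContext:
+  given Tag[String] = compiletime.deferred
+```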
+ +**Proposal** In the future, let the `= deferred` mechanism be the only way to deliver the functionality of abstract givens. + +This is less of a disruption than it might appear at first: + + - `given T` was illegal before since abstract givens could not be anonymous. + It now means a concrete given of class `T` with no member definitions. + - `given x: T` is legacy syntax for an abstract given. + - `given T as x = deferred` is the analogous new syntax, which is more powerful since + it allows for automatic instantiation. + - `given T = deferred` is the anonymous version in the new syntax, which was not expressible before. + +**Benefits:** + + - Simplification of the language since a feature is dropped + - Eliminate non-obvious and misleading syntax. + +## Summary of Syntax Changes + +Here is the complete context-free syntax for all proposed features. +Overall the syntax for givens becomes a lot simpler than what it was before. + +``` +TmplDef ::= 'given' GivenDef +GivenDef ::= [GivenConditional '=>'] GivenSig +GivenConditional ::= [DefTypeParamClause | UsingParamClause] {UsingParamClause} +GivenSig ::= GivenType ['as' id] ([‘=’ Expr] | TemplateBody) + | ConstrApps ['as' id] TemplateBody +GivenType ::= AnnotType {id [nl] AnnotType} + +TypeDef ::= id [TypeParamClause] TypeAndCtxBounds +TypeParamBounds ::= TypeAndCtxBounds +TypeAndCtxBounds ::= TypeBounds [‘:’ ContextBounds] +ContextBounds ::= ContextBound | '{' ContextBound {',' ContextBound} '}' +ContextBound ::= Type ['as' id] +``` + + + +## Examples + + +### Example 1 + +Here are some standard type classes, which were mostly already introduced at the start of this note, now with associated instance givens and some test code: + +```scala + // Type classes + + trait Ord: + type Self + extension (x: Self) + def compareTo(y: Self): Int + def < (y: Self): Boolean = compareTo(y) < 0 + def > (y: Self): Boolean = compareTo(y) > 0 + def <= (y: Self): Boolean = compareTo(y) <= 0 + def >= (y: Self): Boolean = compareTo(y) >= 0 + def max(y: Self): Self = if x < y then y else x + + trait Show: + type Self + extension (x: Self) def show: String + + trait SemiGroup: + type Self + extension (x: Self) def combine(y: Self): Self + + trait Monoid extends SemiGroup: + def unit: Self + + trait Functor: + type Self[A] // Here, Self is a type constructor with parameter A + extension [A](x: Self[A]) def map[B](f: A => B): Self[B] + + trait Monad extends Functor: + def pure[A](x: A): Self[A] + extension [A](x: Self[A]) + def flatMap[B](f: A => Self[B]): Self[B] + def map[B](f: A => B) = x.flatMap(f `andThen` pure) + + // Instances + + given Int is Ord: + extension (x: Int) + def compareTo(y: Int) = + if x < y then -1 + else if x > y then +1 + else 0 + + given [T: Ord] => List[T] is Ord: + extension (xs: List[T]) def compareTo(ys: List[T]): Int = + (xs, ys) match + case (Nil, Nil) => 0 + case (Nil, _) => -1 + case (_, Nil) => +1 + case (x :: xs1, y :: ys1) => + val fst = x.compareTo(y) + if (fst != 0) fst else xs1.compareTo(ys1) + + given List is Monad: + extension [A](xs: List[A]) + def flatMap[B](f: A => List[B]): List[B] = + xs.flatMap(f) + def pure[A](x: A): List[A] = + List(x) + + type Reader[Ctx] = [X] =>> Ctx => X + + given [Ctx] => Reader[Ctx] is Monad: + extension [A](r: Ctx => A) + def flatMap[B](f: A => Ctx => B): Ctx => B = + ctx => f(r(ctx))(ctx) + def pure[A](x: A): Ctx => A = + ctx => x + + // Usages + + extension (xs: Seq[String]) + def longestStrings: Seq[String] = + val maxLength = xs.map(_.length).max + xs.filter(_.length == maxLength) + + 
extension [M[_]: Monad, A](xss: M[M[A]]) + def flatten: M[A] = + xss.flatMap(identity) + + def maximum[T: Ord](xs: List[T]): T = + xs.reduce(_ `max` _) + + given [T: Ord] => T is Ord as descending: + extension (x: T) def compareTo(y: T) = T.compareTo(y)(x) + + def minimum[T: Ord](xs: List[T]) = + maximum(xs)(using descending) +``` + + +### Example 2 + +The following contributed code by @LPTK (issue #10929) did _not_ work at first since +references were not tracked correctly. The version below adds explicit tracked parameters which makes the code compile. +```scala +infix abstract class TupleOf[T, +A]: + type Mapped[+A] <: Tuple + def map[B](x: T)(f: A => B): Mapped[B] + +object TupleOf: + + given TupleOf[EmptyTuple, Nothing] with + type Mapped[+A] = EmptyTuple + def map[B](x: EmptyTuple)(f: Nothing => B): Mapped[B] = x + + given [A, Rest <: Tuple](using tracked val tup: Rest TupleOf A): TupleOf[A *: Rest, A] with + type Mapped[+A] = A *: tup.Mapped[A] + def map[B](x: A *: Rest)(f: A => B): Mapped[B] = + f(x.head) *: tup.map(x.tail)(f) +``` + +Note the quite convoluted syntax, which makes the code hard to understand. Here is the same example in the new type class syntax, which also compiles correctly: +```scala +//> using options -language:experimental.modularity -source future + +trait TupleOf[+A]: + type Self + type Mapped[+A] <: Tuple + def map[B](x: Self)(f: A => B): Mapped[B] + +object TupleOf: + + given EmptyTuple is TupleOf[Nothing]: + type Mapped[+A] = EmptyTuple + def map[B](x: EmptyTuple)(f: Nothing => B): Mapped[B] = x + + given [A, Rest <: Tuple : TupleOf[A]] => A *: Rest is TupleOf[A]: + type Mapped[+A] = A *: Rest.Mapped[A] + def map[B](x: A *: Rest)(f: A => B): Mapped[B] = + f(x.head) *: Rest.map(x.tail)(f) +``` +Note in particular the following points: + + - In the original code, it was not clear that `TupleOf` is a type class, + since it contained two type parameters, one of which played the role + of the instance type `Self`. The new version is much clearer: `TupleOf` is + a type class over `Self` with one additional parameter, the common type of all tuple elements. + - The two given definitions are obfuscated in the old code. Their version + in the new code makes it clear what kind of instances they define: + + - `EmptyTuple` is a tuple of `Nothing`. + - if `Rest` is a tuple of `A`, then `A *: Rest` is also a tuple of `A`. + + - There's no need to introduce names for parameter instances in using clauses; the default naming scheme for context bound evidences works fine, and is more concise. + - There's no need to manually declare implicit parameters as `tracked`, + context bounds provide that automatically. + - Everything in the new code feels like idiomatic Scala 3, whereas the original code exhibits the awkward corner case that requires a `with` in + front of given definitions. + +### Example 3 + +Dimi Racordon tried to [define parser combinators](https://users.scala-lang.org/t/create-an-instance-of-a-type-class-with-methods-depending-on-type-members/9613) in Scala that use dependent type members for inputs and results. It was intended as a basic example of type class constraints, but it did not work in current Scala. + +Here is the problem solved with the new syntax. Note how much clearer that syntax is compared to Dimi's original version, which did not work out in the end. 
+ +```scala +/** A parser combinator */ +trait Combinator: + type Self + + type Input + type Result + + extension (self: Self) + /** Parses and returns an element from input `in` */ + def parse(in: Input): Option[Result] +end Combinator + +case class Apply[I, R](action: I => Option[R]) +case class Combine[A, B](a: A, b: B) + +given [I, R] => Apply[I, R] is Combinator: + type Input = I + type Result = R + extension (self: Apply[I, R]) + def parse(in: I): Option[R] = self.action(in) + +given [A: Combinator, B: Combinator { type Input = A.Input }] + => Combine[A, B] is Combinator: + type Input = A.Input + type Result = (A.Result, B.Result) + extension (self: Combine[A, B]) + def parse(in: Input): Option[Result] = + for + x <- self.a.parse(in) + y <- self.b.parse(in) + yield (x, y) +``` +The example is now as expressed as straightforwardly as it should be: + + - `Combinator` is a type class with two associated types, `Input` and `Result`, and a `parse` method. + - `Apply` and `Combine` are two data constructors representing parser combinators. They are declared to be `Combinators` in the two subsequent `given` declarations. + - `Apply`'s parse method applies the `action` function to the input. + - `Combine[A, B]` is a parser combinator provided `A` and `B` are parser combinators + that process the same type of `Input`, which is also the input type of + `Combine[A, B]`. Its `Result` type is a pair of the `Result` types of `A` and `B`. + Results are produced by a simple for-expression. + +Compared to the original example, which required serious contortions, this is now all completely straightforward. + +_Note 1:_ One could also explore improvements, for instance making this purely functional. But that's not the point of the demonstration here, where I wanted +to take the original example and show how it can be made to work with the new constructs, and be expressed more clearly as well. + +_Note 2:_ One could improve the notation even further by adding equality constraints in the style of Swift, which in turn resemble the _sharing constraints_ of SML. A hypothetical syntax applied to the second given would be: +```scala +given [A: Combinator, B: Combinator with A.Input == B.Input] + => Combine[A, B] is Combinator: +``` +This variant is aesthetically pleasing since it makes the equality constraint symmetric. The original version had to use an asymmetric refinement on the second type parameter bound instead. For now, such constraints are neither implemented nor proposed. This is left as a possibility for future work. Note also the analogy with +the work of @mbovel and @Sporarum on refinement types, where similar `with` clauses can appear for term parameters. If that work goes ahead, we could possibly revisit the issue of `with` clauses also for type parameters. + +### Example 4 + +Dimi Racordon tried to [port some core elements](https://github.com/kyouko-taiga/scala-hylolib) of the type class based [Hylo standard library to Scala](https://github.com/hylo-lang/hylo/tree/main/StandardLibrary/Sources). It worked to some degree, but there were some things that could not be expressed, and more things that could be expressed only awkwardly. + +With the improvements proposed here, the library can now be expressed quite clearly and straightforwardly. See tests/pos/hylolib in this PR for details. + +## Suggested Improvements unrelated to Type Classes + +The following improvements elsewhere would make sense alongside the suggested changes to type classes. 
But they are currently not part of this proposal or implementation. + +### Fixing Singleton + +We know the current treatment of `Singleton` as a type bound is broken since +`x.type | y.type <: Singleton` holds by the subtyping rules for union types, even though `x.type | y.type` is clearly not a singleton. + +A better approach is to treat `Singleton` as a type class that is interpreted specially by the compiler. + +We can do this in a backwards-compatible way by defining `Singleton` like this: + +```scala +trait Singleton: + type Self +``` + +Then, instead of using an unsound upper bound we can use a context bound: + +```scala +def f[X: Singleton](x: X) = ... +``` + +The context bound would be treated specially by the compiler so that no using clause is generated at runtime. + +_Aside_: This can also lead to a solution how to express precise type variables. We can introduce another special type class `Precise` and use it like this: + +```scala +def f[X: Precise](x: X) = ... +``` +This would disable automatic widening of singleton types in inferred instances of type variable `X`. + +### Using `as` also in Patterns + +Since we have now more precedents of `as` as a postfix binder, I want to come back to the proposal to use it in patterns as well, in favor of `@`, which should be deprecated. + +Examples: + +```scala + xs match + case (Person(name, age) as p) :: rest => ... + + tp match + case Param(tl, _) :: _ as tparams => ... + + val x :: xs1 as xs = ys.checkedCast +``` + +These would replace the previous syntax using `@`: + +```scala + xs match + case p @ Person(name, age) :: rest => ... + + tp match + case tparams @ (Param(tl, _) :: _) => ... + + val xs @ (x :: xs1) = ys.checkedCast +``` +**Advantages:** No unpronounceable and non-standard symbol like `@`. More regularity. + +Generally, we want to use `as name` to attach a name for some entity that could also have been used stand-alone. + +**Proposed Syntax Change** + +``` +Pattern2 ::= InfixPattern ['as' id] +``` + +## Summary + +I have proposed some tweaks to Scala 3, which would greatly increase its usability for modular, type class based, generic programming. The proposed changes are: + + 1. Allow context bounds over classes that define a `Self` member type. + 1. Allow context bounds to be named with `as`. Use the bound parameter name as a default name for the generated context bound evidence. + 1. Add a new `{...}` syntax for multiple context bounds. + 1. Make context bounds also available for type members, which expand into a new form of deferred given. Phase out the previous abstract givens in favor of the new form. + 1. Add a predefined type alias `is`. + 1. Introduce a new cleaner syntax of given clauses. + +It's interesting that givens, which are a very general concept in Scala, were "almost there" when it comes to full support of concepts and generic programming. We only needed to add a few usability tweaks to context bounds, +alongside two syntactic changes that supersede the previous forms of `given .. with` clauses and abstract givens. Also interesting is that the superseded syntax constructs were the two areas where we collectively felt that the previous solutions were a bit awkward, but we could not think of better ones at the time. It's very nice that more satisfactory solutions are now emerging. + +## Conclusion + +Generic programming can be expressed in a number of languages. For instance, with +type classes in Haskell, or with traits in Rust, or with protocols in Swift, or with concepts in C++. 
Each of these is constructed from a fairly heavyweight set of new constructs, different from expressions and types. By contrast, equivalent solutions in Scala rely on regular types. Type classes are simply traits that define a `Self` type member. + +The proposed scheme has similar expressiveness to Protocols in Swift or Traits in Rust. Both of these were largely influenced by Jeremy Siek's PdD thesis "[A language for generic programming](https://scholarworks.iu.edu/dspace/handle/2022/7067)", which was first proposed as a way to implement concepts in C++. C++ did not follow Siek's approach, but Swift and Rust did. + +In Siek's thesis and in the formal treatments of Rust and Swift, + type class concepts are explained by mapping them to a lower level language of explicit dictionaries with representations for terms and types. Crucially, that lower level is not expressible without loss of granularity in the source language itself, since type representations are mapped to term dictionaries. By contrast, the current proposal expands type class concepts into other well-typed Scala constructs, which ultimately map into well-typed DOT programs. Type classes are simply a convenient notation for something that can already be expressed in Scala. In that sense, we stay true to the philosophy of a _scalable language_, where a small core can support a large range of advanced use cases. + diff --git a/docs/sidebar.yml b/docs/sidebar.yml index 160698f1f44b..efdab80595a6 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -156,6 +156,7 @@ subsection: - page: reference/experimental/tupled-function.md - page: reference/experimental/named-tuples.md - page: reference/experimental/modularity.md + - page: reference/experimental/typeclasses.md - page: reference/syntax.md - title: Language Versions index: reference/language-versions/language-versions.md diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index e9c480919902..a5cd683775f0 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -105,6 +105,7 @@ object language: * - ability to merge exported types in intersections * * @see [[https://dotty.epfl.ch/docs/reference/experimental/modularity]] + * @see [[https://dotty.epfl.ch/docs/reference/experimental/typeclasses]] */ @compileTimeOnly("`modularity` can only be used at compile time in import statements") object modularity From a6f918b8955997d98174f0e5d1e712392596801d Mon Sep 17 00:00:00 2001 From: odersky Date: Fri, 5 Apr 2024 20:21:30 +0200 Subject: [PATCH 226/371] Fix Singleton Allow to constrain type variables to be singletons by a context bound [X: Singleton] instead of an unsound supertype [X <: Singleton]. This fixes the soundness hole of singletons. 
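
A minimal sketch of the intended usage (illustrative only, not taken from
the added test files):

    def f[X: Singleton](x: X): X = x
    val a: 1 = f(1)  // X is inferred as the literal type 1, not widened to Int
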
[Cherry-picked f71365250688a6bc886b9900f8535e8babdd94be] --- .../tools/dotc/core/ConstraintHandling.scala | 18 +++----- .../dotty/tools/dotc/core/Definitions.scala | 12 ++--- .../dotty/tools/dotc/core/TypeComparer.scala | 8 ++-- .../src/dotty/tools/dotc/core/TypeOps.scala | 2 +- .../src/dotty/tools/dotc/core/Types.scala | 43 +++++++++++++++--- .../src/dotty/tools/dotc/typer/Namer.scala | 2 +- .../dotty/tools/dotc/typer/ProtoTypes.scala | 37 ++++++++++++---- .../dotty/tools/dotc/typer/Synthesizer.scala | 13 +++++- .../src/dotty/tools/dotc/typer/Typer.scala | 4 +- .../reference/experimental/typeclasses.md | 15 +++++-- .../scala/runtime/stdLibPatches/Predef.scala | 2 +- tests/neg/singleton-ctx-bound.scala | 20 +++++++++ tests/pos/singleton-ctx-bound.scala | 44 +++++++++++++++++++ 13 files changed, 175 insertions(+), 45 deletions(-) create mode 100644 tests/neg/singleton-ctx-bound.scala create mode 100644 tests/pos/singleton-ctx-bound.scala diff --git a/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala b/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala index 109929f0c6f5..06711ec97abf 100644 --- a/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala +++ b/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala @@ -647,9 +647,9 @@ trait ConstraintHandling { * At this point we also drop the @Repeated annotation to avoid inferring type arguments with it, * as those could leak the annotation to users (see run/inferred-repeated-result). */ - def widenInferred(inst: Type, bound: Type, widenUnions: Boolean)(using Context): Type = + def widenInferred(inst: Type, bound: Type, widen: Widen)(using Context): Type = def widenOr(tp: Type) = - if widenUnions then + if widen == Widen.Unions then val tpw = tp.widenUnion if tpw ne tp then if tpw.isTransparent() then @@ -667,14 +667,10 @@ trait ConstraintHandling { val tpw = tp.widenSingletons(skipSoftUnions) if (tpw ne tp) && (tpw <:< bound) then tpw else tp - def isSingleton(tp: Type): Boolean = tp match - case WildcardType(optBounds) => optBounds.exists && isSingleton(optBounds.bounds.hi) - case _ => isSubTypeWhenFrozen(tp, defn.SingletonType) - val wideInst = - if isSingleton(bound) then inst + if widen == Widen.None || bound.isSingletonBounded(frozen = true) then inst else - val widenedFromSingle = widenSingle(inst, skipSoftUnions = widenUnions) + val widenedFromSingle = widenSingle(inst, skipSoftUnions = widen == Widen.Unions) val widenedFromUnion = widenOr(widenedFromSingle) val widened = dropTransparentTraits(widenedFromUnion, bound) widenIrreducible(widened) @@ -713,10 +709,10 @@ trait ConstraintHandling { * The instance type is not allowed to contain references to types nested deeper * than `maxLevel`. */ - def instanceType(param: TypeParamRef, fromBelow: Boolean, widenUnions: Boolean, maxLevel: Int)(using Context): Type = { + def instanceType(param: TypeParamRef, fromBelow: Boolean, widen: Widen, maxLevel: Int)(using Context): Type = { val approx = approximation(param, fromBelow, maxLevel).simplified if fromBelow then - val widened = widenInferred(approx, param, widenUnions) + val widened = widenInferred(approx, param, widen) // Widening can add extra constraints, in particular the widened type might // be a type variable which is now instantiated to `param`, and therefore // cannot be used as an instantiation of `param` without creating a loop. 
@@ -724,7 +720,7 @@ trait ConstraintHandling { // (we do not check for non-toplevel occurrences: those should never occur // since `addOneBound` disallows recursive lower bounds). if constraint.occursAtToplevel(param, widened) then - instanceType(param, fromBelow, widenUnions, maxLevel) + instanceType(param, fromBelow, widen, maxLevel) else widened else diff --git a/compiler/src/dotty/tools/dotc/core/Definitions.scala b/compiler/src/dotty/tools/dotc/core/Definitions.scala index b408883009ab..6d3a4de7b026 100644 --- a/compiler/src/dotty/tools/dotc/core/Definitions.scala +++ b/compiler/src/dotty/tools/dotc/core/Definitions.scala @@ -59,10 +59,10 @@ class Definitions { private def enterCompleteClassSymbol(owner: Symbol, name: TypeName, flags: FlagSet, parents: List[TypeRef], decls: Scope) = newCompleteClassSymbol(owner, name, flags | Permanent | NoInits | Open, parents, decls).entered - private def enterTypeField(cls: ClassSymbol, name: TypeName, flags: FlagSet, scope: MutableScope) = + private def enterTypeField(cls: ClassSymbol, name: TypeName, flags: FlagSet, scope: MutableScope): TypeSymbol = scope.enter(newPermanentSymbol(cls, name, flags, TypeBounds.empty)) - private def enterTypeParam(cls: ClassSymbol, name: TypeName, flags: FlagSet, scope: MutableScope) = + private def enterTypeParam(cls: ClassSymbol, name: TypeName, flags: FlagSet, scope: MutableScope): TypeSymbol = enterTypeField(cls, name, flags | ClassTypeParamCreationFlags, scope) private def enterSyntheticTypeParam(cls: ClassSymbol, paramFlags: FlagSet, scope: MutableScope, suffix: String = "T0") = @@ -538,9 +538,11 @@ class Definitions { @tu lazy val SingletonClass: ClassSymbol = // needed as a synthetic class because Scala 2.x refers to it in classfiles // but does not define it as an explicit class. 
- enterCompleteClassSymbol( - ScalaPackageClass, tpnme.Singleton, PureInterfaceCreationFlags | Final, - List(AnyType), EmptyScope) + val cls = enterCompleteClassSymbol( + ScalaPackageClass, tpnme.Singleton, PureInterfaceCreationFlags | Final | Erased, + List(AnyType)) + enterTypeField(cls, tpnme.Self, Deferred, cls.info.decls.openForMutations) + cls @tu lazy val SingletonType: TypeRef = SingletonClass.typeRef @tu lazy val MaybeCapabilityAnnot: ClassSymbol = diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index 27dd4b7134a9..c2c502a984c4 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -3257,8 +3257,8 @@ object TypeComparer { def subtypeCheckInProgress(using Context): Boolean = comparing(_.subtypeCheckInProgress) - def instanceType(param: TypeParamRef, fromBelow: Boolean, widenUnions: Boolean, maxLevel: Int = Int.MaxValue)(using Context): Type = - comparing(_.instanceType(param, fromBelow, widenUnions, maxLevel)) + def instanceType(param: TypeParamRef, fromBelow: Boolean, widen: Widen, maxLevel: Int = Int.MaxValue)(using Context): Type = + comparing(_.instanceType(param, fromBelow, widen: Widen, maxLevel)) def approximation(param: TypeParamRef, fromBelow: Boolean, maxLevel: Int = Int.MaxValue)(using Context): Type = comparing(_.approximation(param, fromBelow, maxLevel)) @@ -3278,8 +3278,8 @@ object TypeComparer { def addToConstraint(tl: TypeLambda, tvars: List[TypeVar])(using Context): Boolean = comparing(_.addToConstraint(tl, tvars)) - def widenInferred(inst: Type, bound: Type, widenUnions: Boolean)(using Context): Type = - comparing(_.widenInferred(inst, bound, widenUnions)) + def widenInferred(inst: Type, bound: Type, widen: Widen)(using Context): Type = + comparing(_.widenInferred(inst, bound, widen: Widen)) def dropTransparentTraits(tp: Type, bound: Type)(using Context): Type = comparing(_.dropTransparentTraits(tp, bound)) diff --git a/compiler/src/dotty/tools/dotc/core/TypeOps.scala b/compiler/src/dotty/tools/dotc/core/TypeOps.scala index 8461c0f091fe..1282b77f013e 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeOps.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeOps.scala @@ -545,7 +545,7 @@ object TypeOps: val lo = TypeComparer.instanceType( tp.origin, fromBelow = variance > 0 || variance == 0 && tp.hasLowerBound, - widenUnions = tp.widenUnions)(using mapCtx) + tp.widenPolicy)(using mapCtx) val lo1 = apply(lo) if (lo1 ne lo) lo1 else tp case _ => diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index ac3aef2a59d2..27931bad0bc3 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -44,8 +44,6 @@ import CaptureSet.{CompareResult, IdempotentCaptRefMap, IdentityCaptRefMap} import scala.annotation.internal.sharable import scala.annotation.threadUnsafe - - object Types extends TypeUtils { @sharable private var nextId = 0 @@ -330,6 +328,21 @@ object Types extends TypeUtils { /** Is this type a (possibly aliased) singleton type? */ def isSingleton(using Context): Boolean = dealias.isInstanceOf[SingletonType] + /** Is this upper-bounded by a (possibly aliased) singleton type? 
+ * Overridden in TypeVar + */ + def isSingletonBounded(frozen: Boolean)(using Context): Boolean = this.dealias.normalized match + case tp: SingletonType => tp.isStable + case tp: TypeRef => + tp.name == tpnme.Singleton && tp.symbol == defn.SingletonClass + || tp.superType.isSingletonBounded(frozen) + case tp: TypeVar if !tp.isInstantiated => + if frozen then tp frozen_<:< defn.SingletonType else tp <:< defn.SingletonType + case tp: HKTypeLambda => false + case tp: TypeProxy => tp.superType.isSingletonBounded(frozen) + case AndType(tpL, tpR) => tpL.isSingletonBounded(frozen) || tpR.isSingletonBounded(frozen) + case _ => false + /** Is this type of kind `AnyKind`? */ def hasAnyKind(using Context): Boolean = { @tailrec def loop(tp: Type): Boolean = tp match { @@ -4924,7 +4937,11 @@ object Types extends TypeUtils { * @param creatorState the typer state in which the variable was created. * @param initNestingLevel the initial nesting level of the type variable. (c.f. nestingLevel) */ - final class TypeVar private(initOrigin: TypeParamRef, creatorState: TyperState | Null, val initNestingLevel: Int) extends CachedProxyType with ValueType { + final class TypeVar private( + initOrigin: TypeParamRef, + creatorState: TyperState | Null, + val initNestingLevel: Int, + precise: Boolean) extends CachedProxyType with ValueType { private var currentOrigin = initOrigin def origin: TypeParamRef = currentOrigin @@ -5012,7 +5029,7 @@ object Types extends TypeUtils { } def typeToInstantiateWith(fromBelow: Boolean)(using Context): Type = - TypeComparer.instanceType(origin, fromBelow, widenUnions, nestingLevel) + TypeComparer.instanceType(origin, fromBelow, widenPolicy, nestingLevel) /** Instantiate variable from the constraints over its `origin`. * If `fromBelow` is true, the variable is instantiated to the lub @@ -5029,7 +5046,10 @@ object Types extends TypeUtils { instantiateWith(tp) /** Widen unions when instantiating this variable in the current context? 
*/ - def widenUnions(using Context): Boolean = !ctx.typerState.constraint.isHard(this) + def widenPolicy(using Context): Widen = + if precise then Widen.None + else if ctx.typerState.constraint.isHard(this) then Widen.Singletons + else Widen.Unions /** For uninstantiated type variables: the entry in the constraint (either bounds or * provisional instance value) @@ -5070,8 +5090,17 @@ object Types extends TypeUtils { } } object TypeVar: - def apply(using Context)(initOrigin: TypeParamRef, creatorState: TyperState | Null, nestingLevel: Int = ctx.nestingLevel) = - new TypeVar(initOrigin, creatorState, nestingLevel) + def apply(using Context)( + initOrigin: TypeParamRef, + creatorState: TyperState | Null, + nestingLevel: Int = ctx.nestingLevel, + precise: Boolean = false) = + new TypeVar(initOrigin, creatorState, nestingLevel, precise) + + enum Widen: + case None // no widening + case Singletons // widen singletons but not unions + case Unions // widen singletons and unions type TypeVars = SimpleIdentitySet[TypeVar] diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 393b38c5ff57..b69d9f76852a 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -2087,7 +2087,7 @@ class Namer { typer: Typer => if defaultTp.exists then TypeOps.SimplifyKeepUnchecked() else null) match case ctp: ConstantType if sym.isInlineVal => ctp - case tp => TypeComparer.widenInferred(tp, pt, widenUnions = true) + case tp => TypeComparer.widenInferred(tp, pt, Widen.Unions) // Replace aliases to Unit by Unit itself. If we leave the alias in // it would be erased to BoxedUnit. diff --git a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala index 46c12b244fbb..7afdc836f656 100644 --- a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala +++ b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala @@ -701,6 +701,12 @@ object ProtoTypes { case FunProto((arg: untpd.TypedSplice) :: Nil, _) => arg.isExtensionReceiver case _ => false + object SingletonConstrained: + def unapply(tp: Type)(using Context): Option[Type] = tp.dealias match + case RefinedType(parent, tpnme.Self, TypeAlias(tp)) + if parent.typeSymbol == defn.SingletonClass => Some(tp) + case _ => None + /** Add all parameters of given type lambda `tl` to the constraint's domain. * If the constraint contains already some of these parameters in its domain, * make a copy of the type lambda and add the copy's type parameters instead. 
@@ -713,26 +719,41 @@ object ProtoTypes { tl: TypeLambda, owningTree: untpd.Tree, alwaysAddTypeVars: Boolean, nestingLevel: Int = ctx.nestingLevel - ): (TypeLambda, List[TypeVar]) = { + ): (TypeLambda, List[TypeVar]) = val state = ctx.typerState val addTypeVars = alwaysAddTypeVars || !owningTree.isEmpty if (tl.isInstanceOf[PolyType]) assert(!ctx.typerState.isCommittable || addTypeVars, s"inconsistent: no typevars were added to committable constraint ${state.constraint}") // hk type lambdas can be added to constraints without typevars during match reduction + val added = state.constraint.ensureFresh(tl) + + def singletonConstrainedRefs(tp: Type): Set[TypeParamRef] = tp match + case tp: MethodType if tp.isContextualMethod => + val ownBounds = + for case SingletonConstrained(ref: TypeParamRef) <- tp.paramInfos + yield ref + ownBounds.toSet ++ singletonConstrainedRefs(tp.resType) + case tp: LambdaType => + singletonConstrainedRefs(tp.resType) + case _ => + Set.empty + + val singletonRefs = singletonConstrainedRefs(added) + def isSingleton(ref: TypeParamRef) = singletonRefs.contains(ref) - def newTypeVars(tl: TypeLambda): List[TypeVar] = - for paramRef <- tl.paramRefs - yield - val tvar = TypeVar(paramRef, state, nestingLevel) + def newTypeVars: List[TypeVar] = + for paramRef <- added.paramRefs yield + val tvar = TypeVar(paramRef, state, nestingLevel, precise = isSingleton(paramRef)) state.ownedVars += tvar tvar - val added = state.constraint.ensureFresh(tl) - val tvars = if addTypeVars then newTypeVars(added) else Nil + val tvars = if addTypeVars then newTypeVars else Nil TypeComparer.addToConstraint(added, tvars) + for paramRef <- added.paramRefs do + if isSingleton(paramRef) then paramRef <:< defn.SingletonType (added, tvars) - } + end constrained def constrained(tl: TypeLambda, owningTree: untpd.Tree)(using Context): (TypeLambda, List[TypeVar]) = constrained(tl, owningTree, diff --git a/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala b/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala index 21d1151bcfd3..9fb091e3306c 100644 --- a/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala @@ -237,6 +237,16 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): EmptyTreeNoError end synthesizedValueOf + val synthesizedSingleton: SpecialHandler = (formal, span) => formal match + case SingletonConstrained(tp) => + if tp.isSingletonBounded(frozen = false) then + withNoErrors: + ref(defn.Compiletime_erasedValue).appliedToType(formal).withSpan(span) + else + withErrors(i"$tp is not a singleton") + case _ => + EmptyTreeNoError + /** Create an anonymous class `new Object { type MirroredMonoType = ... }` * and mark it with given attachment so that it is made into a mirror at PostTyper. 
*/ @@ -536,7 +546,7 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): val tparams = poly.paramRefs val variances = childClass.typeParams.map(_.paramVarianceSign) val instanceTypes = tparams.lazyZip(variances).map((tparam, variance) => - TypeComparer.instanceType(tparam, fromBelow = variance < 0, widenUnions = true) + TypeComparer.instanceType(tparam, fromBelow = variance < 0, Widen.Unions) ) val instanceType = resType.substParams(poly, instanceTypes) // this is broken in tests/run/i13332intersection.scala, @@ -738,6 +748,7 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): defn.MirrorClass -> synthesizedMirror, defn.ManifestClass -> synthesizedManifest, defn.OptManifestClass -> synthesizedOptManifest, + defn.SingletonClass -> synthesizedSingleton, ) def tryAll(formal: Type, span: Span)(using Context): TreeWithErrors = diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 6ac41ed619b6..d23f77143e14 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -3321,8 +3321,8 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val app1 = typed(app, if ctx.mode.is(Mode.Pattern) then pt else defn.TupleXXLClass.typeRef) if ctx.mode.is(Mode.Pattern) then app1 else - val elemTpes = elems.lazyZip(pts).map((elem, pt) => - TypeComparer.widenInferred(elem.tpe, pt, widenUnions = true)) + val elemTpes = elems.lazyZip(pts).map: (elem, pt) => + TypeComparer.widenInferred(elem.tpe, pt, Widen.Unions) val resTpe = TypeOps.nestedPairs(elemTpes) app1.cast(resTpe) diff --git a/docs/_docs/reference/experimental/typeclasses.md b/docs/_docs/reference/experimental/typeclasses.md index 5ac81061e42d..8c95152b8e46 100644 --- a/docs/_docs/reference/experimental/typeclasses.md +++ b/docs/_docs/reference/experimental/typeclasses.md @@ -7,7 +7,7 @@ nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/typeclasses # Some Proposed Changes for Better Support of Type Classes -Martin Odersky, 8.1.2024 +Martin Odersky, 8.1.2024, edited 5.4.2024 A type class in Scala is a pattern where we define @@ -27,6 +27,8 @@ under source version `future` if the additional experimental language import `mo scala compile -source:future -language:experimental.modularity ``` +It is intended to turn features described here into proposals under the Scala improvement process. A first installment is SIP 64, which covers some syntactic changes, names for context bounds, multiple context bounds and deferred givens. The order of exposition described in this note is different from the planned proposals of SIPs. This doc is not a guide on how to sequence details, but instead wants to present a vision of what is possible. For instance, we start here with a feature (Self types and `is` syntax) that has turned out to be controversial and that will probably be proposed only late in the sequence of SIPs. 
+ ## Generalizing Context Bounds The only place in Scala's syntax where the type class pattern is relevant is @@ -54,6 +56,8 @@ requires that `Ordering` is a trait or class with a single type parameter (which trait Monoid extends SemiGroup: def unit: Self + object Monoid: + def unit[M](using m: Monoid { type Self = M}): M trait Functor: type Self[A] @@ -129,7 +133,7 @@ We introduce a standard type alias `is` in the Scala package or in `Predef`, def infix type is[A <: AnyKind, B <: {type Self <: AnyKind}] = B { type Self = A } ``` -This makes writing instance definitions quite pleasant. Examples: +This makes writing instance definitions and using clauses quite pleasant. Examples: ```scala given Int is Ord ... @@ -137,6 +141,9 @@ This makes writing instance definitions quite pleasant. Examples: type Reader = [X] =>> Env => X given Reader is Monad ... + + object Monoid: + def unit[M](using m: M is Monoid): M ``` (more examples will follow below) @@ -682,7 +689,7 @@ With the improvements proposed here, the library can now be expressed quite clea ## Suggested Improvements unrelated to Type Classes -The following improvements elsewhere would make sense alongside the suggested changes to type classes. But they are currently not part of this proposal or implementation. +The following two improvements elsewhere would make sense alongside the suggested changes to type classes. But only the first (fixing singleton) forms a part of this proposal and is implemented. ### Fixing Singleton @@ -704,7 +711,7 @@ Then, instead of using an unsound upper bound we can use a context bound: def f[X: Singleton](x: X) = ... ``` -The context bound would be treated specially by the compiler so that no using clause is generated at runtime. +The context bound is treated specially by the compiler so that no using clause is generated at runtime (this is straightforward, using the erased definitions mechanism). _Aside_: This can also lead to a solution how to express precise type variables. We can introduce another special type class `Precise` and use it like this: diff --git a/library/src/scala/runtime/stdLibPatches/Predef.scala b/library/src/scala/runtime/stdLibPatches/Predef.scala index a68a628623bf..6c286f322ba7 100644 --- a/library/src/scala/runtime/stdLibPatches/Predef.scala +++ b/library/src/scala/runtime/stdLibPatches/Predef.scala @@ -77,6 +77,6 @@ object Predef: * * which is what is needed for a context bound `[A: TC]`. */ - infix type is[A <: AnyKind, B <: {type Self <: AnyKind}] = B { type Self = A } + infix type is[A <: AnyKind, B <: Any{type Self <: AnyKind}] = B { type Self = A } end Predef diff --git a/tests/neg/singleton-ctx-bound.scala b/tests/neg/singleton-ctx-bound.scala new file mode 100644 index 000000000000..64bb63a288b0 --- /dev/null +++ b/tests/neg/singleton-ctx-bound.scala @@ -0,0 +1,20 @@ +//> using options -language:experimental.modularity -source future +object Test: + + def someInt = 1 + + def f1[T <: Singleton](x: T): T = x + f1(someInt) // error + f1(if ??? then 1 else 2) // OK, but should be error + f1(3 * 2) // OK + + def f2[T](x: T)(using T is Singleton): T = x + f2(someInt) // error + f2(if ??? then 1 else 2) // error + f2(3 * 2) // OK + + def f3[T: Singleton](x: T): T = x + f3(someInt) // error + f3(if ??? 
then 1 else 2) // error + f3(3 * 2) // OK + f3(6) // OK diff --git a/tests/pos/singleton-ctx-bound.scala b/tests/pos/singleton-ctx-bound.scala new file mode 100644 index 000000000000..5d15cf53836e --- /dev/null +++ b/tests/pos/singleton-ctx-bound.scala @@ -0,0 +1,44 @@ +//> using options -language:experimental.modularity -source future +object Test: + + class Wrap[T](x: T) + + def f0[T](x: T): Wrap[T] = Wrap(x) + val x0 = f0(1) + val _: Wrap[Int] = x0 + + def f1[T <: Singleton](x: T): Wrap[T] = Wrap(x) + val x1 = f1(1) + val _: Wrap[1] = x1 + + def f2[T](x: T)(using Singleton { type Self = T}): Wrap[T] = Wrap(x) + val x2 = f2(1) + val _: Wrap[1] = x2 + + def f3[T: Singleton](x: T): Wrap[T] = Wrap(x) + val x3 = f3(1) + val _: Wrap[1] = x3 + + def f4[T](x: T)(using T is Singleton): Wrap[T] = Wrap(x) + val x4 = f4(1) + val _: Wrap[1] = x4 + + class C0[T](x: T): + def fld: T = x + val y0 = C0("hi") + val _: String = y0.fld + + class C1[T <: Singleton](x: T): + def fld: T = x + val y1 = C1("hi") + val _: "hi" = y1.fld + + class C2[T](x: T)(using T is Singleton): + def fld: T = x + val y2 = C2("hi") + val _: "hi" = y1.fld + + class C3[T: Singleton](x: T): + def fld: T = x + val y3 = C3("hi") + val _: "hi" = y1.fld \ No newline at end of file From 0c941e21bc34a27d418b9050630f13ba27ec1c62 Mon Sep 17 00:00:00 2001 From: odersky Date: Sat, 6 Apr 2024 15:13:07 +0200 Subject: [PATCH 227/371] Tweaks to doc pages [Cherry-picked 1f2e735565a7cb95b8b4ea3f71d330511da1f516] --- docs/_docs/reference/experimental/modularity.md | 2 +- docs/_docs/reference/experimental/typeclasses.md | 7 ++++++- 2 files changed, 7 insertions(+), 2 deletions(-) diff --git a/docs/_docs/reference/experimental/modularity.md b/docs/_docs/reference/experimental/modularity.md index 2062c4d5eda2..a989b71770af 100644 --- a/docs/_docs/reference/experimental/modularity.md +++ b/docs/_docs/reference/experimental/modularity.md @@ -138,7 +138,7 @@ when typechecking recursive class graphs. So an explicit `tracked` looks like th Since `tracked` parameters create refinements in constructor types, it is now possible that a class has a parent that is a refined type. -Previously such types were not permitted, since we were not quite sure how to handle them. But with tracked parameters it becomes pressing so +Previously such types were not permitted, since we were not quite sure how to handle them. But with tracked parameters it becomes pressing to admit such types. **Proposal** Allow refined types as parent types of classes. All refinements that are inherited in this way become synthetic members of the class. diff --git a/docs/_docs/reference/experimental/typeclasses.md b/docs/_docs/reference/experimental/typeclasses.md index 8c95152b8e46..dab612512579 100644 --- a/docs/_docs/reference/experimental/typeclasses.md +++ b/docs/_docs/reference/experimental/typeclasses.md @@ -220,7 +220,7 @@ So far, an unnamed context bound for a type parameter gets a synthesized fresh n xs.foldLeft(A.unit)(_ `combine` _) ``` -The use of a name like `A` above in two variants, both as a type name and as a term name is of course familiar to Scala programmers. We use the same convention for classes and companion objects. In retrospect, the idea of generalizing this to also cover type parameters is obvious. It is surprising that it was not brought up before. +In Scala we are already familiar with using one name for two related things where one version names a type and the other an associated value. For instance, we use that convention for classes and companion objects. 
In retrospect, the idea of generalizing this to also cover type parameters is obvious. It is surprising that it was not brought up before. **Proposed Rules** @@ -228,6 +228,8 @@ The use of a name like `A` above in two variants, both as a t 2. The generated evidence for a context bound `A : C` without an `as` binding has name `A` (seen as a term name). So, `A : C` is equivalent to `A : C as A`. 3. If there are multiple context bounds for a type parameter, as in `A : {C_1, ..., C_n}`, the generated evidence parameter for every context bound `C_i` has a fresh synthesized name, unless the context bound carries an `as` clause, in which case rule (1) applies. +TODO: Present context bound proxy concept. + The default naming convention reduces the need for named context bounds. But named context bounds are still essential, for at least two reasons: - They are needed to give names to multiple context bounds. @@ -357,6 +359,8 @@ given Int is Monoid: extension (x: Int) def combine(y: Int) = x + y def unit = 0 ``` +Here, the second given can be read as: if `A` is an `Ord`, then `List[A]` is also an `Ord`. Or: for all `A: Ord`, `List[A]` is `Ord`. The arrow can be seen as an implication; note also the analogy to pattern matching syntax. + If explicit names are desired, we add them with `as` clauses: ```scala given String is Ord as intOrd: @@ -558,6 +562,8 @@ Here are some standard type classes, which were mostly already introduced at the def minimum[T: Ord](xs: List[T]) = maximum(xs)(using descending) ``` +The `Reader` type is a bit hairy. It is a type class (written in the parameterized syntax) where we fix a context `Ctx` and then let `Reader` be the polymorphic function type over `X` that takes a context `Ctx` and returns an `X`. Type classes like this are commonly used in monadic effect systems.
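As a concrete illustration of the `Reader` paragraph above, here is a sketch of what such an instance can look like in the proposed syntax. This is not quoted from the document's example code: it assumes a `Monad` type class with a `pure` method and a `flatMap` extension method (the member names in the reference page's own `Monad` may differ), and it uses the proposed conditional `given ... =>` and `is` syntax.

```scala
// Sketch only: Monad is assumed to provide `pure` and `flatMap`.
type Reader[Ctx] = [X] =>> Ctx => X     // fix a context Ctx, abstract over the result X

given [Ctx] => Reader[Ctx] is Monad:
  def pure[A](x: A): Ctx => A =
    ctx => x                            // ignore the context
  extension [A](r: Ctx => A)
    def flatMap[B](f: A => Ctx => B): Ctx => B =
      ctx => f(r(ctx))(ctx)             // thread the same context through both steps
```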
### Example 2 From 09a6a26a818b9503a989ec33aabf1999021d300a Mon Sep 17 00:00:00 2001 From: odersky Date: Sat, 6 Apr 2024 15:13:46 +0200 Subject: [PATCH 228/371] Add Precise type class for precise type inference [Cherry-picked 94bc6fee3aa23e0d00fb5a044b3f99ea13a3cc37] --- .../dotty/tools/dotc/core/Definitions.scala | 2 + .../src/dotty/tools/dotc/core/Types.scala | 14 +++- .../dotty/tools/dotc/typer/ProtoTypes.scala | 71 +++++++++++++------ .../dotty/tools/dotc/typer/Synthesizer.scala | 10 ++- .../src/dotty/tools/dotc/typer/Typer.scala | 2 +- .../dotty/tools/repl/TabcompleteTests.scala | 4 +- .../reference/experimental/typeclasses.md | 65 +++++++++-------- library/src/scala/Precise.scala | 11 +++ tests/neg/singleton-ctx-bound.check | 34 +++++++++ tests/neg/singleton-ctx-bound.scala | 15 ++++ tests/pos/deferred-givens-singletons.scala | 13 ++++ tests/pos/precise-ctx-bound.scala | 51 +++++++++++++ tests/pos/precise-indexof.scala | 46 ++++++++++++ tests/pos/singleton-ctx-bound.scala | 7 +- .../stdlibExperimentalDefinitions.scala | 3 + 15 files changed, 287 insertions(+), 61 deletions(-) create mode 100644 library/src/scala/Precise.scala create mode 100644 tests/neg/singleton-ctx-bound.check create mode 100644 tests/pos/deferred-givens-singletons.scala create mode 100644 tests/pos/precise-ctx-bound.scala create mode 100644 tests/pos/precise-indexof.scala diff --git a/compiler/src/dotty/tools/dotc/core/Definitions.scala b/compiler/src/dotty/tools/dotc/core/Definitions.scala index 6d3a4de7b026..11a4a8473e79 100644 --- a/compiler/src/dotty/tools/dotc/core/Definitions.scala +++ b/compiler/src/dotty/tools/dotc/core/Definitions.scala @@ -535,6 +535,8 @@ class Definitions { def ConsType: TypeRef = ConsClass.typeRef @tu lazy val SeqFactoryClass: Symbol = requiredClass("scala.collection.SeqFactory") + @tu lazy val PreciseClass: ClassSymbol = requiredClass("scala.Precise") + @tu lazy val SingletonClass: ClassSymbol = // needed as a synthetic class because Scala 2.x refers to it in classfiles // but does not define it as an explicit class. diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index 27931bad0bc3..3c6d9ecbf204 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -4941,7 +4941,7 @@ object Types extends TypeUtils { initOrigin: TypeParamRef, creatorState: TyperState | Null, val initNestingLevel: Int, - precise: Boolean) extends CachedProxyType with ValueType { + val precise: Boolean) extends CachedProxyType with ValueType { private var currentOrigin = initOrigin def origin: TypeParamRef = currentOrigin @@ -5045,9 +5045,19 @@ object Types extends TypeUtils { else instantiateWith(tp) + def isPrecise(using Context) = + precise + || { + val constr = ctx.typerState.constraint + constr.upper(origin).exists: tparam => + constr.typeVarOfParam(tparam) match + case tvar: TypeVar => tvar.precise + case _ => false + } + /** Widen unions when instantiating this variable in the current context? 
*/ def widenPolicy(using Context): Widen = - if precise then Widen.None + if isPrecise then Widen.None else if ctx.typerState.constraint.isHard(this) then Widen.Singletons else Widen.Unions diff --git a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala index 7afdc836f656..bb1d5ac71269 100644 --- a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala +++ b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala @@ -11,6 +11,7 @@ import Constants.* import util.{Stats, SimpleIdentityMap, SimpleIdentitySet} import Decorators.* import Uniques.* +import Flags.Method import inlines.Inlines import config.Printers.typr import Inferencing.* @@ -26,7 +27,7 @@ object ProtoTypes { import tpd.* /** A trait defining an `isCompatible` method. */ - trait Compatibility { + trait Compatibility: /** Is there an implicit conversion from `tp` to `pt`? */ def viewExists(tp: Type, pt: Type)(using Context): Boolean @@ -106,19 +107,34 @@ object ProtoTypes { if !res then ctx.typerState.constraint = savedConstraint res - /** Constrain result with special case if `meth` is an inlineable method in an inlineable context. - * In that case, we should always succeed and not constrain type parameters in the expected type, - * because the actual return type can be a subtype of the currently known return type. - * However, we should constrain parameters of the declared return type. This distinction is - * achieved by replacing expected type parameters with wildcards. + /** Constrain result with two special cases: + * 1. If `meth` is an inlineable method in an inlineable context, + * we should always succeed and not constrain type parameters in the expected type, + * because the actual return type can be a subtype of the currently known return type. + * However, we should constrain parameters of the declared return type. This distinction is + * achieved by replacing expected type parameters with wildcards. + * 2. When constraining the result of a primitive value operation against + * a precise typevar, don't lower-bound the typevar with a non-singleton type. */ def constrainResult(meth: Symbol, mt: Type, pt: Type)(using Context): Boolean = - if (Inlines.isInlineable(meth)) { + + def constFoldException(pt: Type): Boolean = pt.dealias match + case tvar: TypeVar => + tvar.isPrecise + && meth.is(Method) && meth.owner.isPrimitiveValueClass + && mt.resultType.isPrimitiveValueType && !mt.resultType.isSingleton + case tparam: TypeParamRef => + constFoldException(ctx.typerState.constraint.typeVarOfParam(tparam)) + case _ => + false + + if Inlines.isInlineable(meth) then constrainResult(mt, wildApprox(pt)) true - } - else constrainResult(mt, pt) - } + else + constFoldException(pt) || constrainResult(mt, pt) + end constrainResult + end Compatibility object NoViewsAllowed extends Compatibility { override def viewExists(tp: Type, pt: Type)(using Context): Boolean = false @@ -701,10 +717,18 @@ object ProtoTypes { case FunProto((arg: untpd.TypedSplice) :: Nil, _) => arg.isExtensionReceiver case _ => false - object SingletonConstrained: - def unapply(tp: Type)(using Context): Option[Type] = tp.dealias match - case RefinedType(parent, tpnme.Self, TypeAlias(tp)) - if parent.typeSymbol == defn.SingletonClass => Some(tp) + /** An extractor for Singleton and Precise witness types. 
+ * + * Singleton { type Self = T } returns Some(T, true) + * Precise { type Self = T } returns Some(T, false) + */ + object PreciseConstrained: + def unapply(tp: Type)(using Context): Option[(Type, Boolean)] = tp.dealias match + case RefinedType(parent, tpnme.Self, TypeAlias(tp)) => + val tsym = parent.typeSymbol + if tsym == defn.SingletonClass then Some((tp, true)) + else if tsym == defn.PreciseClass then Some((tp, false)) + else None case _ => None /** Add all parameters of given type lambda `tl` to the constraint's domain. @@ -728,30 +752,31 @@ object ProtoTypes { // hk type lambdas can be added to constraints without typevars during match reduction val added = state.constraint.ensureFresh(tl) - def singletonConstrainedRefs(tp: Type): Set[TypeParamRef] = tp match + def preciseConstrainedRefs(tp: Type, singletonOnly: Boolean): Set[TypeParamRef] = tp match case tp: MethodType if tp.isContextualMethod => val ownBounds = - for case SingletonConstrained(ref: TypeParamRef) <- tp.paramInfos + for + case PreciseConstrained(ref: TypeParamRef, singleton) <- tp.paramInfos + if !singletonOnly || singleton yield ref - ownBounds.toSet ++ singletonConstrainedRefs(tp.resType) + ownBounds.toSet ++ preciseConstrainedRefs(tp.resType, singletonOnly) case tp: LambdaType => - singletonConstrainedRefs(tp.resType) + preciseConstrainedRefs(tp.resType, singletonOnly) case _ => Set.empty - val singletonRefs = singletonConstrainedRefs(added) - def isSingleton(ref: TypeParamRef) = singletonRefs.contains(ref) - def newTypeVars: List[TypeVar] = + val preciseRefs = preciseConstrainedRefs(added, singletonOnly = false) for paramRef <- added.paramRefs yield - val tvar = TypeVar(paramRef, state, nestingLevel, precise = isSingleton(paramRef)) + val tvar = TypeVar(paramRef, state, nestingLevel, precise = preciseRefs.contains(paramRef)) state.ownedVars += tvar tvar val tvars = if addTypeVars then newTypeVars else Nil TypeComparer.addToConstraint(added, tvars) + val singletonRefs = preciseConstrainedRefs(added, singletonOnly = true) for paramRef <- added.paramRefs do - if isSingleton(paramRef) then paramRef <:< defn.SingletonType + if singletonRefs.contains(paramRef) then paramRef <:< defn.SingletonType (added, tvars) end constrained diff --git a/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala b/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala index 9fb091e3306c..6b18540b6551 100644 --- a/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala @@ -238,7 +238,7 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): end synthesizedValueOf val synthesizedSingleton: SpecialHandler = (formal, span) => formal match - case SingletonConstrained(tp) => + case PreciseConstrained(tp, true) => if tp.isSingletonBounded(frozen = false) then withNoErrors: ref(defn.Compiletime_erasedValue).appliedToType(formal).withSpan(span) @@ -247,6 +247,13 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): case _ => EmptyTreeNoError + val synthesizedPrecise: SpecialHandler = (formal, span) => formal match + case PreciseConstrained(tp, false) => + withNoErrors: + ref(defn.Compiletime_erasedValue).appliedToType(formal).withSpan(span) + case _ => + EmptyTreeNoError + /** Create an anonymous class `new Object { type MirroredMonoType = ... }` * and mark it with given attachment so that it is made into a mirror at PostTyper. 
*/ @@ -749,6 +756,7 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): defn.ManifestClass -> synthesizedManifest, defn.OptManifestClass -> synthesizedOptManifest, defn.SingletonClass -> synthesizedSingleton, + defn.PreciseClass -> synthesizedPrecise, ) def tryAll(formal: Type, span: Span)(using Context): TreeWithErrors = diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index d23f77143e14..b1b21bd1eee5 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -3027,7 +3027,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer cpy.Select(id)(This(cls), id.name) case _ => super.transform(tree) - ValDef(impl, anchorParams.transform(rhs)) + ValDef(impl, anchorParams.transform(rhs)).withSpan(impl.span.endPos) end givenImpl val givenImpls = diff --git a/compiler/test/dotty/tools/repl/TabcompleteTests.scala b/compiler/test/dotty/tools/repl/TabcompleteTests.scala index e4c3a2557e7d..f719752be353 100644 --- a/compiler/test/dotty/tools/repl/TabcompleteTests.scala +++ b/compiler/test/dotty/tools/repl/TabcompleteTests.scala @@ -122,11 +122,11 @@ class TabcompleteTests extends ReplTest { } @Test def moduleCompletion = initially { - assertEquals(List("Predef"), tabComplete("object Foo { type T = Pre")) + assertEquals(List("Predef"), tabComplete("object Foo { type T = Pred")) } @Test def i6415 = initially { - assertEquals(List("Predef"), tabComplete("object Foo { opaque type T = Pre")) + assertEquals(List("Predef"), tabComplete("object Foo { opaque type T = Pred")) } @Test def i6361 = initially { diff --git a/docs/_docs/reference/experimental/typeclasses.md b/docs/_docs/reference/experimental/typeclasses.md index dab612512579..cf5f3220faa6 100644 --- a/docs/_docs/reference/experimental/typeclasses.md +++ b/docs/_docs/reference/experimental/typeclasses.md @@ -444,6 +444,39 @@ This is less of a disruption than it might appear at first: - Simplification of the language since a feature is dropped - Eliminate non-obvious and misleading syntax. + +### Bonus: Fixing Singleton + +We know the current treatment of `Singleton` as a type bound is broken since +`x.type | y.type <: Singleton` holds by the subtyping rules for union types, even though `x.type | y.type` is clearly not a singleton. + +A better approach is to treat `Singleton` as a type class that is interpreted specially by the compiler. + +We can do this in a backwards-compatible way by defining `Singleton` like this: + +```scala +trait Singleton: + type Self +``` + +Then, instead of using an unsound upper bound we can use a context bound: + +```scala +def f[X: Singleton](x: X) = ... +``` + +The context bound is treated specially by the compiler so that no using clause is generated at runtime (this is straightforward, using the erased definitions mechanism). + +### Bonus: Precise Typing + +This approach also presents a solution to the problem how to express precise type variables. We can introduce another special type class `Precise` and use it like this: + +```scala +def f[X: Precise](x: X) = ... +``` +Like a `Singleton` bound, a `Precise` bound disables automatic widening of singleton types or union types in inferred instances of type variable `X`. But there is no requirement that the type argument _must_ be a singleton. + + ## Summary of Syntax Changes Here is the complete context-free syntax for all proposed features. 
@@ -692,38 +725,10 @@ Dimi Racordon tried to [port some core elements](https://github.com/kyouko-taiga With the improvements proposed here, the library can now be expressed quite clearly and straightforwardly. See tests/pos/hylolib in this PR for details. -## Suggested Improvements unrelated to Type Classes - -The following two improvements elsewhere would make sense alongside the suggested changes to type classes. But only the first (fixing singleton) forms a part of this proposal and is implemented. - -### Fixing Singleton - -We know the current treatment of `Singleton` as a type bound is broken since -`x.type | y.type <: Singleton` holds by the subtyping rules for union types, even though `x.type | y.type` is clearly not a singleton. - -A better approach is to treat `Singleton` as a type class that is interpreted specially by the compiler. +## Suggested Improvement unrelated to Type Classes -We can do this in a backwards-compatible way by defining `Singleton` like this: +The following improvement would make sense alongside the suggested changes to type classes. But it does not form part of this proposal and is not yet implemented. -```scala -trait Singleton: - type Self -``` - -Then, instead of using an unsound upper bound we can use a context bound: - -```scala -def f[X: Singleton](x: X) = ... -``` - -The context bound is treated specially by the compiler so that no using clause is generated at runtime (this is straightforward, using the erased definitions mechanism). - -_Aside_: This can also lead to a solution how to express precise type variables. We can introduce another special type class `Precise` and use it like this: - -```scala -def f[X: Precise](x: X) = ... -``` -This would disable automatic widening of singleton types in inferred instances of type variable `X`. ### Using `as` also in Patterns diff --git a/library/src/scala/Precise.scala b/library/src/scala/Precise.scala new file mode 100644 index 000000000000..aad42ca8950f --- /dev/null +++ b/library/src/scala/Precise.scala @@ -0,0 +1,11 @@ +package scala +import annotation.experimental +import language.experimental.erasedDefinitions + +/** A type class-like trait intended as a context bound for type variables. + * If we have `[X: Precise]`, instances of the type variable `X` are inferred + * in precise mode. This means that singleton types and union types are not + * widened. + */ +@experimental erased trait Precise: + type Self diff --git a/tests/neg/singleton-ctx-bound.check b/tests/neg/singleton-ctx-bound.check new file mode 100644 index 000000000000..785123c0e680 --- /dev/null +++ b/tests/neg/singleton-ctx-bound.check @@ -0,0 +1,34 @@ +-- [E007] Type Mismatch Error: tests/neg/singleton-ctx-bound.scala:7:5 ------------------------------------------------- +7 | f1(someInt) // error + | ^^^^^^^ + | Found: Int + | Required: Singleton + | + | longer explanation available when compiling with `-explain` +-- [E007] Type Mismatch Error: tests/neg/singleton-ctx-bound.scala:12:5 ------------------------------------------------ +12 | f2(someInt) // error + | ^^^^^^^ + | Found: Int + | Required: Singleton + | + | longer explanation available when compiling with `-explain` +-- [E172] Type Error: tests/neg/singleton-ctx-bound.scala:13:26 -------------------------------------------------------- +13 | f2(if ??? then 1 else 2) // error + | ^ + |No given instance of type (1 : Int) | (2 : Int) is Singleton was found for parameter x$2 of method f2 in object Test. 
Failed to synthesize an instance of type (1 : Int) | (2 : Int) is Singleton: (1 : Int) | (2 : Int) is not a singleton +-- [E007] Type Mismatch Error: tests/neg/singleton-ctx-bound.scala:17:5 ------------------------------------------------ +17 | f3(someInt) // error + | ^^^^^^^ + | Found: Int + | Required: Singleton + | + | longer explanation available when compiling with `-explain` +-- [E172] Type Error: tests/neg/singleton-ctx-bound.scala:18:26 -------------------------------------------------------- +18 | f3(if ??? then 1 else 2) // error + | ^ + |No given instance of type Singleton{type Self = (1 : Int) | (2 : Int)} was found for a context parameter of method f3 in object Test. Failed to synthesize an instance of type Singleton{type Self = (1 : Int) | (2 : Int)}: (1 : Int) | (2 : Int) is not a singleton +-- [E172] Type Error: tests/neg/singleton-ctx-bound.scala:33:6 --------------------------------------------------------- +33 |class D extends A: // error + |^ + |No given instance of type Singleton{type Self = D.this.Elem} was found for inferring the implementation of the deferred given instance given_Singleton_Elem in trait A. Failed to synthesize an instance of type Singleton{type Self = D.this.Elem}: D.this.Elem is not a singleton +34 | type Elem = Int diff --git a/tests/neg/singleton-ctx-bound.scala b/tests/neg/singleton-ctx-bound.scala index 64bb63a288b0..e061ec54bb16 100644 --- a/tests/neg/singleton-ctx-bound.scala +++ b/tests/neg/singleton-ctx-bound.scala @@ -18,3 +18,18 @@ object Test: f3(if ??? then 1 else 2) // error f3(3 * 2) // OK f3(6) // OK + +import compiletime.* + +trait A: + type Elem: Singleton + +class B extends A: + type Elem = 1 // OK + +class C[X: Singleton] extends A: + type Elem = X // OK + +class D extends A: // error + type Elem = Int + diff --git a/tests/pos/deferred-givens-singletons.scala b/tests/pos/deferred-givens-singletons.scala new file mode 100644 index 000000000000..60a881340b75 --- /dev/null +++ b/tests/pos/deferred-givens-singletons.scala @@ -0,0 +1,13 @@ +//> using options -language:experimental.modularity -source future +import compiletime.* + +trait A: + type Elem: Singleton + +class B extends A: + type Elem = 1 + +class C[X: Singleton] extends A: + type Elem = X + + diff --git a/tests/pos/precise-ctx-bound.scala b/tests/pos/precise-ctx-bound.scala new file mode 100644 index 000000000000..3f17a5b4a54e --- /dev/null +++ b/tests/pos/precise-ctx-bound.scala @@ -0,0 +1,51 @@ +//> using options -language:experimental.modularity -source future +object Test: + + class Wrap[T](x: T) + + def f0[T](x: T): Wrap[T] = Wrap(x) + val x0 = f0(1) + val _: Wrap[Int] = x0 + + def f1[T: Precise](x: T): Wrap[T] = Wrap(x) + def l = "hello".length + val x1 = Wrap(l) + val _: Wrap[Int] = x1 + + def f2[T](x: T)(using Precise { type Self = T}): Wrap[T] = Wrap(x) + val x2 = f2(1) + val _: Wrap[1] = x2 + + def f3[T: Precise](x: T): Wrap[T] = Wrap(x) + val x3 = f3(identity(1)) + val _: Wrap[1] = x3 + val x3a = f3(1 + 2) + val _: Wrap[3] = x3a + + def f4[T](x: T)(using T is Precise): Wrap[T] = Wrap(x) + val x4 = f4(1) + val _: Wrap[1] = x4 + val x4a = f4(1 + 2) + val _: Wrap[3] = x4a + val y4 = f4(if ??? then 1 else 2) + val _: Wrap[1 | 2] = y4 + val z4 = f4(if ??? 
then B() else C()) + val _: Wrap[B | C] = z4 + trait A + class B extends A + class C extends A + + class C0[T](x: T): + def fld: T = x + val y0 = C0("hi") + val _: String = y0.fld + + class C2[T](x: T)(using T is Precise): + def fld: T = x + val y2 = C2(identity("hi")) + val _: "hi" = y2.fld + + class C3[T: Precise](x: T): + def fld: T = x + val y3 = C3("hi") + val _: "hi" = y3.fld diff --git a/tests/pos/precise-indexof.scala b/tests/pos/precise-indexof.scala new file mode 100644 index 000000000000..af1e6c5b504b --- /dev/null +++ b/tests/pos/precise-indexof.scala @@ -0,0 +1,46 @@ +//> using options -language:experimental.modularity -source future +import compiletime.* +import compiletime.ops.int.* + +/** The index of `Y` in tuple `X` as a literal constant Int, + * or `Size[X]` if `Y` does not occur in `X` + */ +type IndexOf[X <: Tuple, Y] <: Int = X match + case Y *: _ => 0 + case x *: xs => S[IndexOf[xs, Y]] + case EmptyTuple => 0 + +extension [X <: Tuple](inline x: X) + + /** The index (starting at 0) of the first element in the type `X` of `x` + * that matches type `Y`. + */ + inline def indexOfType[Y] = constValue[IndexOf[X, Y]] + + inline def indexOf[Y: Precise](y: Y) = constValue[IndexOf[X, Y]] + +// Note: without the Precise, the index calcularion would go wrong. For instance, +// (1, 2, "hello", true).indexOf(2) would be 0, the same as (1, 2, "hello", true).indexOTypef[Int] +// (1, 2, "hello", true).indexOf("foo") would be 2, the same as (1, 2, "hello", true).indexOTypef[String] +// But we could alternatively pick Singleton + +@main def Test = + val t: (1, 2, "hello", true) = (1, 2, "hello", true) + val x1: 0 = t.indexOfType[1] + val x2: 1 = t.indexOfType[2] + val x3: 2 = t.indexOfType["hello"] + val x4: 3 = t.indexOfType[true] + val x5: 4 = t.indexOfType[77] + val x6: 0 = t.indexOfType[Int] + val x7: 2 = t.indexOfType[String] + val x8: 4 = t.indexOfType[Double] + + val y1: 0 = t.indexOf(1) + val y2: 1 = t.indexOf(2) + val y3: 2 = t.indexOf("hello") + val y4: 3 = t.indexOf(true) + val y5: 4 = t.indexOf(identity(77)) + val y6: 0 = t.indexOf(identity(1)) + val y7: 4 = t.indexOf("foo") + + diff --git a/tests/pos/singleton-ctx-bound.scala b/tests/pos/singleton-ctx-bound.scala index 5d15cf53836e..c6b0d2fb823c 100644 --- a/tests/pos/singleton-ctx-bound.scala +++ b/tests/pos/singleton-ctx-bound.scala @@ -36,9 +36,12 @@ object Test: class C2[T](x: T)(using T is Singleton): def fld: T = x val y2 = C2("hi") - val _: "hi" = y1.fld + val _: "hi" = y2.fld class C3[T: Singleton](x: T): def fld: T = x val y3 = C3("hi") - val _: "hi" = y1.fld \ No newline at end of file + val _: "hi" = y3.fld + + + diff --git a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala index 48ff5407ac87..df35bed19360 100644 --- a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala +++ b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala @@ -79,6 +79,9 @@ val experimentalDefinitionInLibrary = Set( "scala.NamedTuple$", "scala.NamedTupleDecomposition", "scala.NamedTupleDecomposition$", + + // New feature: Precise trait + "scala.Precise", ) From e82cfbefac06aae6db231765dfde5219b1a379c3 Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 14 Apr 2024 15:26:30 +0200 Subject: [PATCH 229/371] Fix rebase breakage [Cherry-picked 887fbc4b4996d95360e5dd92492d8f3904cde27a] --- compiler/src/dotty/tools/dotc/typer/Applications.scala | 2 +- tests/neg/cb-companion-leaks.check | 6 +++--- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git 
a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index 63e86e3a321d..c3369ac58e31 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -1791,7 +1791,7 @@ trait Applications extends Compatibility { * a. always as good as a method or a polymorphic method. * b. as good as a member of any other type `tp2` if `asGoodValueType(tp1, tp2) = true` */ - def isAsGood(alt1: TermRef, tp1: Type, alt2: TermRef, tp2: Type): Boolean = trace(i"isAsSpecific $tp1 $tp2", overload) { + def isAsGood(alt1: TermRef, tp1: Type, alt2: TermRef, tp2: Type): Boolean = trace(i"isAsGood $tp1 $tp2", overload) { tp1 match case tp1: MethodType => // (1) tp1.paramInfos.isEmpty && tp2.isInstanceOf[LambdaType] diff --git a/tests/neg/cb-companion-leaks.check b/tests/neg/cb-companion-leaks.check index 156f8a7ab3ee..560561e0e261 100644 --- a/tests/neg/cb-companion-leaks.check +++ b/tests/neg/cb-companion-leaks.check @@ -1,4 +1,4 @@ --- [E194] Type Error: tests/neg/cb-companion-leaks.scala:9:23 ---------------------------------------------------------- +-- [E195] Type Error: tests/neg/cb-companion-leaks.scala:9:23 ---------------------------------------------------------- 9 | def foo[A: {C, D}] = A // error | ^ | context bound companion value A cannot be used as a value @@ -20,7 +20,7 @@ | companion value with the (term-)name `A`. However, these context bound companions | are not values themselves, they can only be referred to in selections. --------------------------------------------------------------------------------------------------------------------- --- [E194] Type Error: tests/neg/cb-companion-leaks.scala:13:10 --------------------------------------------------------- +-- [E195] Type Error: tests/neg/cb-companion-leaks.scala:13:10 --------------------------------------------------------- 13 | val x = A // error | ^ | context bound companion value A cannot be used as a value @@ -42,7 +42,7 @@ | companion value with the (term-)name `A`. However, these context bound companions | are not values themselves, they can only be referred to in selections. -------------------------------------------------------------------------------------------------------------------- --- [E194] Type Error: tests/neg/cb-companion-leaks.scala:15:9 ---------------------------------------------------------- +-- [E195] Type Error: tests/neg/cb-companion-leaks.scala:15:9 ---------------------------------------------------------- 15 | val y: A.type = ??? 
// error | ^ | context bound companion value A cannot be used as a value From 3b814bbaa402fd09a12914f71d3a0c65b82cc638 Mon Sep 17 00:00:00 2001 From: odersky Date: Wed, 17 Apr 2024 23:01:54 +0200 Subject: [PATCH 230/371] Delay roll-out of new prioritization scheme: Now: 3.5: old scheme but warn if there are changes in the future 3.6-migration: new scheme, warn if prioritization has changed 3.6: new scheme, no warning [Cherry-picked 1e72282418d93a968b36fa43415f1ea63125d982] --- compiler/src/dotty/tools/dotc/typer/Applications.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index c3369ac58e31..fd4c634801be 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -1880,7 +1880,7 @@ trait Applications extends Compatibility { val tp1p = prepare(tp1) val tp2p = prepare(tp2) - if Feature.sourceVersion.isAtMost(SourceVersion.`3.4`) + if Feature.sourceVersion.isAtMost(SourceVersion.`3.5`) || oldResolution || !alt1isGiven && !alt2isGiven then From a8f7585ba757faaf74854d20129a9111c5489051 Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 28 Apr 2024 13:12:43 +0200 Subject: [PATCH 231/371] Fix rebase breakage again [Cherry-picked 9d0ca20f949c4c390f4fa414f3c5ff4460013960] --- compiler/src/dotty/tools/dotc/typer/Typer.scala | 2 +- project/MiMaFilters.scala | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index b1b21bd1eee5..a2291d55bac8 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2366,7 +2366,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val tparam = untpd.Ident(tree.paramName).withSpan(tree.span) if tycon.tpe.typeParams.nonEmpty then typed(untpd.AppliedTypeTree(tyconSplice, tparam :: Nil)) - else if Feature.enabled(modularity) && tycon.tpe.member(tpnme.Self).symbol.isAbstractType then + else if Feature.enabled(modularity) && tycon.tpe.member(tpnme.Self).symbol.isAbstractOrParamType then val tparamSplice = untpd.TypedSplice(typedExpr(tparam)) typed(untpd.RefinedTypeTree(tyconSplice, List(untpd.TypeDef(tpnme.Self, tparamSplice)))) else diff --git a/project/MiMaFilters.scala b/project/MiMaFilters.scala index 6c3640eed12c..18d2e985f844 100644 --- a/project/MiMaFilters.scala +++ b/project/MiMaFilters.scala @@ -98,7 +98,7 @@ object MiMaFilters { val ForwardsBreakingChanges: Map[String, Seq[ProblemFilter]] = Map( // Additions that require a new minor version of tasty core Build.mimaPreviousDottyVersion -> Seq( - ProblemFilters.exclude[DirectMissingMethodProblem]("dotty.tools.tasty.TastyFormat.FLEXIBLEtype") + ProblemFilters.exclude[DirectMissingMethodProblem]("dotty.tools.tasty.TastyFormat.FLEXIBLEtype"), ProblemFilters.exclude[DirectMissingMethodProblem]("dotty.tools.tasty.TastyFormat.TRACKED"), ), From dd8061fd8897870b35093c284af44fed3016eadf Mon Sep 17 00:00:00 2001 From: odersky Date: Sun, 28 Apr 2024 14:13:26 +0200 Subject: [PATCH 232/371] Make best effort compilation work with context bound companions If they are illegally used as values, we need to return an error tree, not a tree with a symbol that can't be pickled. 
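For orientation, the kind of source code this commit is about looks roughly like the sketch below, reconstructed from the `cb-companion-leaks` check file shown earlier in this patch. The definitions of `C` and `D` are assumptions (any type classes with a `Self` member would do); only the marked lines matter.

```scala
//> using options -language:experimental.modularity -source future
trait C:
  type Self
trait D:
  type Self

def foo[A: {C, D}] = A      // error: context bound companion value A cannot be used as a value

def bar[A: C](a: A) =
  val x = A                 // error: same message
  val y: A.type = ???       // error: same message
  ()
```

With this change, such misuses are turned into error trees in `PostTyper`, so best-effort compilation no longer tries to pickle the context bound companion's symbol.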
[Cherry-picked fd072dc686bf0f0cc789ef0b7385d8189d64e374] --- .../tools/dotc/transform/PostTyper.scala | 19 +++++++++++-------- 1 file changed, 11 insertions(+), 8 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala index a110ec53abc0..22370e923a4b 100644 --- a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +++ b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala @@ -279,13 +279,15 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => } } - def checkUsableAsValue(tree: Tree)(using Context): Unit = + def checkUsableAsValue(tree: Tree)(using Context): Tree = def unusable(msg: Symbol => Message) = - report.error(msg(tree.symbol), tree.srcPos) + errorTree(tree, msg(tree.symbol)) if tree.symbol.is(ConstructorProxy) then unusable(ConstructorProxyNotValue(_)) - if tree.symbol.isContextBoundCompanion then + else if tree.symbol.isContextBoundCompanion then unusable(ContextBoundCompanionNotValue(_)) + else + tree def checkStableSelection(tree: Tree)(using Context): Unit = def check(qual: Tree) = @@ -330,11 +332,11 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => if tree.isType then checkNotPackage(tree) else - checkUsableAsValue(tree) registerNeedsInlining(tree) - tree.tpe match { + val tree1 = checkUsableAsValue(tree) + tree1.tpe match { case tpe: ThisType => This(tpe.cls).withSpan(tree.span) - case _ => tree + case _ => tree1 } case tree @ Select(qual, name) => registerNeedsInlining(tree) @@ -342,8 +344,9 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => Checking.checkRealizable(qual.tpe, qual.srcPos) withMode(Mode.Type)(super.transform(checkNotPackage(tree))) else - checkUsableAsValue(tree) - transformSelect(tree, Nil) + checkUsableAsValue(tree) match + case tree1: Select => transformSelect(tree1, Nil) + case tree1 => tree1 case tree: Apply => val methType = tree.fun.tpe.widen.asInstanceOf[MethodType] val app = From 04d402345763d31e9f61cd95cf494a4219061f51 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 30 Apr 2024 11:01:33 +0200 Subject: [PATCH 233/371] Tweaks after review [Cherry-picked 21f5e678e6a58380d47b8f68edf89317402595a9] --- .../src/dotty/tools/dotc/ast/Desugar.scala | 93 +++++++++++-------- .../src/dotty/tools/dotc/ast/TreeInfo.scala | 8 +- compiler/src/dotty/tools/dotc/ast/untpd.scala | 1 + .../src/dotty/tools/dotc/config/Config.scala | 3 +- compiler/src/dotty/tools/dotc/core/Mode.scala | 4 +- .../src/dotty/tools/dotc/core/NamerOps.scala | 8 +- .../tools/dotc/core/SymDenotations.scala | 2 +- .../src/dotty/tools/dotc/core/Types.scala | 11 ++- .../dotty/tools/dotc/parsing/Parsers.scala | 30 ++++-- .../tools/dotc/transform/PostTyper.scala | 6 +- .../src/dotty/tools/dotc/typer/Namer.scala | 18 ++-- .../dotty/tools/dotc/typer/ProtoTypes.scala | 1 + .../dotty/tools/dotc/typer/RefChecks.scala | 8 +- .../src/dotty/tools/dotc/typer/Typer.scala | 64 ++++++++----- .../annotation/internal/WitnessNames.scala | 3 +- library/src/scala/compiletime/package.scala | 3 +- .../scala/runtime/stdLibPatches/Predef.scala | 1 + tests/neg/i12348.check | 12 +-- tests/neg/i12348.scala | 2 +- tests/pos/typeclasses-this.scala | 10 +- .../stdlibExperimentalDefinitions.scala | 5 +- 21 files changed, 175 insertions(+), 118 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index 08953f1dec6b..0681492a4ba7 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala 
+++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -226,10 +226,21 @@ object desugar { private def defDef(meth: DefDef, isPrimaryConstructor: Boolean = false)(using Context): Tree = addDefaultGetters(elimContextBounds(meth, isPrimaryConstructor)) + /** Drop context bounds in given TypeDef, replacing them with evidence ValDefs that + * get added to a buffer. + * @param tdef The given TypeDef + * @param evidenceBuf The buffer to which evidence gets added. This buffer + * is shared between desugarings of different type parameters + * of the same method. + * @param evidenceFlags The flags to use for evidence definitions + * @param freshName A function to generate fresh names for evidence definitions + * @param allParams If `tdef` is a type paramter, all parameters of the owning method, + * otherwise the empty list. + */ private def desugarContextBounds( tdef: TypeDef, evidenceBuf: mutable.ListBuffer[ValDef], - flags: FlagSet, + evidenceFlags: FlagSet, freshName: untpd.Tree => TermName, allParamss: List[ParamClause])(using Context): TypeDef = @@ -237,18 +248,18 @@ object desugar { def desugarRhs(rhs: Tree): Tree = rhs match case ContextBounds(tbounds, cxbounds) => - val isMember = flags.isAllOf(DeferredGivenFlags) + val isMember = evidenceFlags.isAllOf(DeferredGivenFlags) for bound <- cxbounds do val evidenceName = bound match case ContextBoundTypeTree(_, _, ownName) if !ownName.isEmpty => - ownName + ownName // if there is an explicitly given name, use it. case _ if Config.nameSingleContextBounds && !isMember && cxbounds.tail.isEmpty && Feature.enabled(Feature.modularity) => tdef.name.toTermName case _ => freshName(bound) evidenceNames += evidenceName - val evidenceParam = ValDef(evidenceName, bound, EmptyTree).withFlags(flags) + val evidenceParam = ValDef(evidenceName, bound, EmptyTree).withFlags(evidenceFlags) evidenceParam.pushAttachment(ContextBoundParam, ()) evidenceBuf += evidenceParam tbounds @@ -258,9 +269,13 @@ object desugar { rhs val tdef1 = cpy.TypeDef(tdef)(rhs = desugarRhs(tdef.rhs)) + // Under x.modularity, if there was a context bound, and `tdef`s name as a term name is + // neither a name of an existing parameter nor a name of generated evidence for + // the same method, add a WitnessAnnotation with all generated evidence names to `tdef`. + // This means a context bound proxy will be created later. if Feature.enabled(Feature.modularity) && evidenceNames.nonEmpty - && !evidenceNames.contains(tdef.name.toTermName) + && !evidenceBuf.exists(_.name == tdef.name.toTermName) && !allParamss.nestedExists(_.name == tdef.name.toTermName) then tdef1.withAddedAnnotation: @@ -332,9 +347,9 @@ object desugar { def getterParamss(n: Int): List[ParamClause] = mapParamss(takeUpTo(paramssNoRHS, n)) { - tparam => dropContextBounds(toDefParam(tparam, KeepAnnotations.All)) + tparam => dropContextBounds(toMethParam(tparam, KeepAnnotations.All)) } { - vparam => toDefParam(vparam, KeepAnnotations.All, keepDefault = false) + vparam => toMethParam(vparam, KeepAnnotations.All, keepDefault = false) } def defaultGetters(paramss: List[ParamClause], n: Int): List[DefDef] = paramss match @@ -429,32 +444,30 @@ object desugar { * The position of the added parameters is determined as follows: * * - If there is an existing parameter list that refers to one of the added - * parameters in one of its parameter types, add the new parameters - * in front of the first such parameter list. 
- * - Otherwise, if the last parameter list consists implicit or using parameters, + * parameters or their future context bound proxies in one of its parameter + * types, add the new parameters in front of the first such parameter list. + * - Otherwise, if the last parameter list consists of implicit or using parameters, * join the new parameters in front of this parameter list, creating one - * parameter list (this is equilavent to Scala 2's scheme). + * parameter list (this is equivalent to Scala 2's scheme). * - Otherwise, add the new parameter list at the end as a separate parameter clause. */ private def addEvidenceParams(meth: DefDef, params: List[ValDef])(using Context): DefDef = if params.isEmpty then return meth - var boundNames = params.map(_.name).toSet + var boundNames = params.map(_.name).toSet // all evidence parameter + context bound proxy names for mparams <- meth.paramss; mparam <- mparams do mparam match case tparam: TypeDef if tparam.mods.annotations.exists(WitnessNamesAnnot.unapply(_).isDefined) => boundNames += tparam.name.toTermName case _ => - //println(i"add ev params ${meth.name}, ${boundNames.toList}") - - def references(vdef: ValDef): Boolean = + def referencesBoundName(vdef: ValDef): Boolean = vdef.tpt.existsSubTree: case Ident(name: TermName) => boundNames.contains(name) case _ => false def recur(mparamss: List[ParamClause]): List[ParamClause] = mparamss match - case ValDefs(mparams) :: _ if mparams.exists(references) => + case ValDefs(mparams) :: _ if mparams.exists(referencesBoundName) => params :: mparamss case ValDefs(mparams @ (mparam :: _)) :: Nil if mparam.mods.isOneOf(GivenOrImplicit) => (params ++ mparams) :: Nil @@ -468,12 +481,12 @@ object desugar { /** The parameters generated from the contextual bounds of `meth`, as generated by `desugar.defDef` */ private def evidenceParams(meth: DefDef)(using Context): List[ValDef] = - meth.paramss.reverse match { - case ValDefs(vparams @ (vparam :: _)) :: _ if vparam.mods.isOneOf(GivenOrImplicit) => - vparams.takeWhile(_.hasAttachment(ContextBoundParam)) - case _ => - Nil - } + for + case ValDefs(vparams @ (vparam :: _)) <- meth.paramss + if vparam.mods.isOneOf(GivenOrImplicit) + param <- vparams.takeWhile(_.hasAttachment(ContextBoundParam)) + yield + param @sharable private val synthetic = Modifiers(Synthetic) @@ -491,11 +504,13 @@ object desugar { case WitnessNamesAnnot(_) => true case _ => false - private def toDefParam(tparam: TypeDef, keep: KeepAnnotations)(using Context): TypeDef = + /** Map type parameter accessor to corresponding method (i.e. constructor) parameter */ + private def toMethParam(tparam: TypeDef, keep: KeepAnnotations)(using Context): TypeDef = val mods = filterAnnots(tparam.rawMods, keep) tparam.withMods(mods & EmptyFlags | Param) - private def toDefParam(vparam: ValDef, keep: KeepAnnotations, keepDefault: Boolean)(using Context): ValDef = { + /** Map term parameter accessor to corresponding method (i.e. 
constructor) parameter */ + private def toMethParam(vparam: ValDef, keep: KeepAnnotations, keepDefault: Boolean)(using Context): ValDef = { val mods = filterAnnots(vparam.rawMods, keep) val hasDefault = if keepDefault then HasDefault else EmptyFlags // Need to ensure that tree is duplicated since term parameters can be watched @@ -507,22 +522,16 @@ object desugar { .withMods(mods & (GivenOrImplicit | Erased | hasDefault | Tracked) | Param) } - def mkApply(fn: Tree, paramss: List[ParamClause])(using Context): Tree = - paramss.foldLeft(fn) { (fn, params) => params match - case TypeDefs(params) => - TypeApply(fn, params.map(refOfDef)) - case (vparam: ValDef) :: _ if vparam.mods.is(Given) => - Apply(fn, params.map(refOfDef)).setApplyKind(ApplyKind.Using) - case _ => - Apply(fn, params.map(refOfDef)) - } - + /** Desugar type def (not param): Under x.moduliity this can expand + * context bounds, which are expanded to evidence ValDefs. These will + * ultimately map to deferred givens. + */ def typeDef(tdef: TypeDef)(using Context): Tree = val evidenceBuf = new mutable.ListBuffer[ValDef] val result = desugarContextBounds( tdef, evidenceBuf, (tdef.mods.flags.toTermFlags & AccessFlags) | Lazy | DeferredGivenFlags, - inventGivenOrExtensionName, Nil) + inventGivenName, Nil) if evidenceBuf.isEmpty then result else Thicket(result :: evidenceBuf.toList) /** The expansion of a class definition. See inline comments for what is involved */ @@ -597,7 +606,7 @@ object desugar { // Annotations on class _type_ parameters are set on the derived parameters // but not on the constructor parameters. The reverse is true for // annotations on class _value_ parameters. - val constrTparams = impliedTparams.map(toDefParam(_, KeepAnnotations.WitnessOnly)) + val constrTparams = impliedTparams.map(toMethParam(_, KeepAnnotations.WitnessOnly)) val constrVparamss = if (originalVparamss.isEmpty) { // ensure parameter list is non-empty if (isCaseClass) @@ -608,7 +617,7 @@ object desugar { report.error(CaseClassMissingNonImplicitParamList(cdef), namePos) ListOfNil } - else originalVparamss.nestedMap(toDefParam(_, KeepAnnotations.All, keepDefault = true)) + else originalVparamss.nestedMap(toMethParam(_, KeepAnnotations.All, keepDefault = true)) val derivedTparams = constrTparams.zipWithConserve(impliedTparams)((tparam, impliedParam) => derivedTypeParam(tparam).withAnnotations(impliedParam.mods.annotations)) @@ -630,7 +639,7 @@ object desugar { defDef( addEvidenceParams( cpy.DefDef(ddef)(paramss = joinParams(constrTparams, ddef.paramss)), - evidenceParams(constr1).map(toDefParam(_, KeepAnnotations.None, keepDefault = false))))) + evidenceParams(constr1).map(toMethParam(_, KeepAnnotations.None, keepDefault = false))))) case stat => stat } @@ -1148,7 +1157,7 @@ object desugar { */ def normalizeName(mdef: MemberDef, impl: Tree)(using Context): Name = { var name = mdef.name - if (name.isEmpty) name = name.likeSpaced(inventGivenOrExtensionName(impl)) + if (name.isEmpty) name = name.likeSpaced(inventGivenName(impl)) def errPos = mdef.source.atSpan(mdef.nameSpan) if (ctx.owner == defn.ScalaPackageClass && defn.reservedScalaClassNames.contains(name.toTypeName)) { val kind = if (name.isTypeName) "class" else "object" @@ -1195,7 +1204,7 @@ object desugar { end makePolyFunctionType /** Invent a name for an anonympus given of type or template `impl`. 
*/ - def inventGivenOrExtensionName(impl: Tree)(using Context): SimpleName = + def inventGivenName(impl: Tree)(using Context): SimpleName = val str = impl match case impl: Template => if impl.parents.isEmpty then @@ -1207,6 +1216,10 @@ object desugar { "given_" ++ inventTypeName(impl) str.toTermName.asSimpleName + /** Extract a synthesized given name from a type tree. This is used for + * both anonymous givens and (under x.modularity) deferred givens. + * @param followArgs If true include argument types in the name + */ private class NameExtractor(followArgs: Boolean) extends UntypedTreeAccumulator[String] { private def extractArgs(args: List[Tree])(using Context): String = args.map(argNameExtractor.apply("", _)).mkString("_") diff --git a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala index 990fb37f4e60..11fb572b66c6 100644 --- a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala +++ b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala @@ -382,15 +382,15 @@ trait TreeInfo[T <: Untyped] { self: Trees.Instance[T] => case _ => tree.tpe.isInstanceOf[ThisType] } - - /** Extractor for annotation.internal.WitnessNames(name_1, ..., name_n)` + + /** Under x.modularity: Extractor for `annotation.internal.WitnessNames(name_1, ..., name_n)` * represented as an untyped or typed tree. */ object WitnessNamesAnnot: - def apply(names0: List[TermName])(using Context): untpd.Tree = + def apply(names: List[TermName])(using Context): untpd.Tree = untpd.TypedSplice(tpd.New( defn.WitnessNamesAnnot.typeRef, - tpd.SeqLiteral(names0.map(n => tpd.Literal(Constant(n.toString))), tpd.TypeTree(defn.StringType)) :: Nil + tpd.SeqLiteral(names.map(n => tpd.Literal(Constant(n.toString))), tpd.TypeTree(defn.StringType)) :: Nil )) def unapply(tree: Tree)(using Context): Option[List[TermName]] = diff --git a/compiler/src/dotty/tools/dotc/ast/untpd.scala b/compiler/src/dotty/tools/dotc/ast/untpd.scala index 0486e2e6d3d7..64f9fb4df95e 100644 --- a/compiler/src/dotty/tools/dotc/ast/untpd.scala +++ b/compiler/src/dotty/tools/dotc/ast/untpd.scala @@ -119,6 +119,7 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { case class PatDef(mods: Modifiers, pats: List[Tree], tpt: Tree, rhs: Tree)(implicit @constructorOnly src: SourceFile) extends DefTree case class ExtMethods(paramss: List[ParamClause], methods: List[Tree])(implicit @constructorOnly src: SourceFile) extends Tree case class ContextBoundTypeTree(tycon: Tree, paramName: TypeName, ownName: TermName)(implicit @constructorOnly src: SourceFile) extends Tree + // `paramName: tycon as ownName`, ownName != EmptyTermName only under x.modularity case class MacroTree(expr: Tree)(implicit @constructorOnly src: SourceFile) extends Tree case class ImportSelector(imported: Ident, renamed: Tree = EmptyTree, bound: Tree = EmptyTree)(implicit @constructorOnly src: SourceFile) extends Tree { diff --git a/compiler/src/dotty/tools/dotc/config/Config.scala b/compiler/src/dotty/tools/dotc/config/Config.scala index 293044c245ef..ee8ed4b215d7 100644 --- a/compiler/src/dotty/tools/dotc/config/Config.scala +++ b/compiler/src/dotty/tools/dotc/config/Config.scala @@ -236,7 +236,8 @@ object Config { inline val checkLevelsOnConstraints = false inline val checkLevelsOnInstantiation = true - /** If a type parameter `X` has a single context bound `X: C`, should the + /** Under x.modularity: + * If a type parameter `X` has a single context bound `X: C`, should the * witness parameter be named `X`? 
This would prevent the creation of a * context bound companion. */ diff --git a/compiler/src/dotty/tools/dotc/core/Mode.scala b/compiler/src/dotty/tools/dotc/core/Mode.scala index 5dab5631c62a..14d7827974c0 100644 --- a/compiler/src/dotty/tools/dotc/core/Mode.scala +++ b/compiler/src/dotty/tools/dotc/core/Mode.scala @@ -104,8 +104,8 @@ object Mode { val CheckBoundsOrSelfType: Mode = newMode(14, "CheckBoundsOrSelfType") /** Use previous Scheme for implicit resolution. Currently significant - * in 3.0-migration where we use Scala-2's scheme instead and in 3.5-migration - * where we use the previous scheme up to 3.4 instead. + * in 3.0-migration where we use Scala-2's scheme instead and in 3.5 and 3.6-migration + * where we use the previous scheme up to 3.4 for comparison with the new scheme. */ val OldImplicitResolution: Mode = newMode(15, "OldImplicitResolution") diff --git a/compiler/src/dotty/tools/dotc/core/NamerOps.scala b/compiler/src/dotty/tools/dotc/core/NamerOps.scala index 58b4ad681c6f..5e76b09bbde6 100644 --- a/compiler/src/dotty/tools/dotc/core/NamerOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NamerOps.scala @@ -24,9 +24,9 @@ object NamerOps: addParamRefinements(ctor.owner.typeRef, paramss) /** Given a method with tracked term-parameters `p1, ..., pn`, and result type `R`, add the - * refinements R { p1 = p1' } ... { pn = pn' }, where pi' is the term parameter ref + * refinements R { p1 = p1' } ... { pn = pn' }, where pi' is the TermParamRef * of the parameter and pi is its name. This matters only under experimental.modularity, - * since wothout it there are no tracked parameters. Parameter refinements are added for + * since without it there are no tracked parameters. Parameter refinements are added for * constructors and given companion methods. */ def addParamRefinements(resType: Type, paramss: List[List[Symbol]])(using Context): Type = @@ -261,7 +261,7 @@ object NamerOps: /** Create a context-bound companion for type symbol `tsym`, which has a context * bound that defines a set of witnesses with names `witnessNames`. * - * @param parans If `tsym` is a type parameter, a list of parameter symbols + * @param params If `tsym` is a type parameter, a list of parameter symbols * that include all witnesses, otherwise the empty list. * * The context-bound companion has as name the name of `tsym` translated to @@ -299,7 +299,7 @@ object NamerOps: * this class. This assumes that these types already have their * WitnessNames annotation set even before they are completed. This is * the case for unpickling but currently not for Namer. So the method - * is only called during unpickling, and is not part of NamerOps. + * is only called during unpickling. */ def addContextBoundCompanions(cls: ClassSymbol)(using Context): Unit = for sym <- cls.info.decls do diff --git a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala index 49c466f0bfd5..3904228756a0 100644 --- a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala +++ b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala @@ -1194,7 +1194,7 @@ object SymDenotations { || is(JavaDefinedVal, butNot = Method) || isConstructor || !owner.isExtensibleClass && !is(Deferred) - // Deferred symbols can arise through parent refinements. + // Deferred symbols can arise through parent refinements under x.modularity. // For them, the overriding relationship reverses anyway, so // being in a final class does not mean the symbol cannot be // implemented concretely in a superclass. 
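// Illustrative sketch (not from this patch) of the parameter refinements added by
// NamerOps.addParamRefinements above, assuming -language:experimental.modularity;
// the class below is invented for the example.
class Vec(tracked val size: Int)

val v = Vec(3)
// Because `size` is tracked, the constructor's result type is refined with the
// parameter, roughly Vec { val size: (3 : Int) }, so `v.size` keeps its precise type.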
diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index 3c6d9ecbf204..a92893678a17 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -1655,7 +1655,7 @@ object Types extends TypeUtils { * * P { ... type T = / += / -= U ... } # T * - * to just U. Analogously, `P { val x: S} # x` is reduced tp `S` is `S` + * to just U. Analogously, `P { val x: S} # x` is reduced tp `S` if `S` * is a singleton type. * * Does not perform the reduction if the resulting type would contain @@ -4936,6 +4936,7 @@ object Types extends TypeUtils { * @param origin the parameter that's tracked by the type variable. * @param creatorState the typer state in which the variable was created. * @param initNestingLevel the initial nesting level of the type variable. (c.f. nestingLevel) + * @param precise whether we should use instantiation without widening for this TypeVar. */ final class TypeVar private( initOrigin: TypeParamRef, @@ -5045,6 +5046,9 @@ object Types extends TypeUtils { else instantiateWith(tp) + /** Should we suppress widening? True if this TypeVar is precise + * or if it has as an upper bound a precise TypeVar. + */ def isPrecise(using Context) = precise || { @@ -5055,7 +5059,9 @@ object Types extends TypeUtils { case _ => false } - /** Widen unions when instantiating this variable in the current context? */ + /** The policy used for widening singletons or unions when instantiating + * this variable in the current context. + */ def widenPolicy(using Context): Widen = if isPrecise then Widen.None else if ctx.typerState.constraint.isHard(this) then Widen.Singletons @@ -5107,6 +5113,7 @@ object Types extends TypeUtils { precise: Boolean = false) = new TypeVar(initOrigin, creatorState, nestingLevel, precise) + /** The three possible widening policies */ enum Widen: case None // no widening case Singletons // widen singletons but not unions diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index f3d02dda5c48..fe23d97d58c3 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -3542,23 +3542,26 @@ object Parsers { paramMods() if paramOwner.takesOnlyUsingClauses && !impliedMods.is(Given) then syntaxError(em"`using` expected") - val (firstParamMod, isParams) = + val (firstParamMod, paramsAreNamed) = var mods = EmptyModifiers if in.lookahead.isColon then (mods, true) else if isErased then mods = addModifier(mods) - val isParams = + val paramsAreNamed = !impliedMods.is(Given) || startParamTokens.contains(in.token) || isIdent - && (in.name == nme.inline || in.name == nme.tracked || in.lookahead.isColon) - (mods, isParams) - (if isParams then commaSeparated(() => param()) - else contextTypes(paramOwner, numLeadParams, impliedMods)) match { + && (in.name == nme.inline + || in.name == nme.tracked && in.featureEnabled(Feature.modularity) + || in.lookahead.isColon) + (mods, paramsAreNamed) + val params = + if paramsAreNamed then commaSeparated(() => param()) + else contextTypes(paramOwner, numLeadParams, impliedMods) + params match case Nil => Nil case (h :: t) => h.withAddedFlags(firstParamMod.flags) :: t - } checkVarArgsRules(clause) clause } @@ -4156,7 +4159,10 @@ object Parsers { else // need to be careful with last `with` withConstrApps() - // TODO Change syntax description + // Adjust parameter modifiers so that they are now parameters of a method + // 
(originally, we created class parameters) + // TODO: syntax.md should be adjusted to reflect the difference that + // parameters of an alias given cannot be vals. def adjustDefParams(paramss: List[ParamClause]): List[ParamClause] = paramss.nestedMap: param => if !param.mods.isAllOf(PrivateLocal) then @@ -4173,7 +4179,8 @@ object Parsers { else Nil newLinesOpt() val noParams = tparams.isEmpty && vparamss.isEmpty - if !(name.isEmpty && noParams) then + val hasParamsOrId = !name.isEmpty || !noParams + if hasParamsOrId then if in.isColon then newSyntaxAllowed = false in.nextToken() @@ -4184,7 +4191,7 @@ object Parsers { rejectWildcardType(annotType()) :: Nil else constrApp() match case parent: Apply => parent :: moreConstrApps() - case parent if in.isIdent => + case parent if in.isIdent && newSyntaxAllowed => infixTypeRest(parent, _ => annotType1()) :: Nil case parent => parent :: moreConstrApps() if newSyntaxAllowed && in.isIdent(nme.as) then @@ -4193,6 +4200,7 @@ object Parsers { val parentsIsType = parents.length == 1 && parents.head.isType if in.token == EQUALS && parentsIsType then + // given alias accept(EQUALS) mods1 |= Final if noParams && !mods.is(Inline) then @@ -4201,10 +4209,12 @@ object Parsers { else DefDef(name, adjustDefParams(joinParams(tparams, vparamss)), parents.head, subExpr()) else if (isStatSep || isStatSeqEnd) && parentsIsType && !newSyntaxAllowed then + // old-style abstract given if name.isEmpty then syntaxError(em"anonymous given cannot be abstract") DefDef(name, adjustDefParams(joinParams(tparams, vparamss)), parents.head, EmptyTree) else + // structural instance val vparamss1 = vparamss.nestedMap: vparam => if vparam.mods.is(Private) then vparam.withMods(vparam.mods &~ PrivateLocal | Protected) diff --git a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala index 22370e923a4b..c6ad1bb860e8 100644 --- a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +++ b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala @@ -476,9 +476,9 @@ class PostTyper extends MacroTransform with InfoTransformer { thisPhase => val relativePath = util.SourceFile.relativePath(ctx.compilationUnit.source, reference) sym.addAnnotation(Annotation(defn.SourceFileAnnot, Literal(Constants.Constant(relativePath)), tree.span)) else - if !sym.is(Param) then - if !sym.owner.isOneOf(AbstractOrTrait) then - Checking.checkGoodBounds(tree.symbol) + if !sym.is(Param) && !sym.owner.isOneOf(AbstractOrTrait) then + Checking.checkGoodBounds(tree.symbol) + // Delete all context bound companions of this TypeDef if sym.owner.isClass && sym.hasAnnotation(defn.WitnessNamesAnnot) then val decls = sym.owner.info.decls for cbCompanion <- decls.lookupAll(sym.name.toTermName) do diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index b69d9f76852a..0588e27ea54f 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -296,13 +296,13 @@ class Namer { typer: Typer => createOrRefine[Symbol](tree, name, flags, ctx.owner, _ => info, (fs, _, pwithin) => newSymbol(ctx.owner, name, fs, info, pwithin, tree.nameSpan)) case tree: Import => - recordSym(newImportSym(tree), tree) + recordSym(importSymbol(tree), tree) case _ => NoSymbol } } - private def newImportSym(imp: Import)(using Context): Symbol = + private def importSymbol(imp: Import)(using Context): Symbol = newImportSymbol(ctx.owner, Completer(imp)(ctx), imp.span) /** If `sym` exists, 
enter it in effective scope. Check that @@ -719,7 +719,7 @@ class Namer { typer: Typer => */ def expandTopLevel(stats: List[Tree])(using Context): Unit = stats match case (imp @ Import(qual, _)) :: stats1 if untpd.languageImport(qual).isDefined => - expandTopLevel(stats1)(using ctx.importContext(imp, newImportSym(imp))) + expandTopLevel(stats1)(using ctx.importContext(imp, importSymbol(imp))) case stat :: stats1 => expand(stat) expandTopLevel(stats1) @@ -1624,7 +1624,8 @@ class Namer { typer: Typer => } /** Enter all parent refinements as public class members, unless a definition - * with the same name already exists in the class. + * with the same name already exists in the class. Remember the refining symbols + * as an attachment on the ClassDef tree. */ def enterParentRefinementSyms(refinements: List[(Name, Type)]) = val refinedSyms = mutable.ListBuffer[Symbol]() @@ -1852,19 +1853,20 @@ class Namer { typer: Typer => // Beware: ddef.name need not match sym.name if sym was freshened! val isConstructor = sym.name == nme.CONSTRUCTOR + // A map from context-bounded type parameters to associated evidence parameter names val witnessNamesOfParam = mutable.Map[TypeDef, List[TermName]]() if !ddef.name.is(DefaultGetterName) && !sym.is(Synthetic) then for params <- ddef.paramss; case tdef: TypeDef <- params do for case WitnessNamesAnnot(ws) <- tdef.mods.annotations do witnessNamesOfParam(tdef) = ws - /** Are all names in `wnames` defined by the longest prefix of all `params` + /** Is each name in `wnames` defined spmewhere in the longest prefix of all `params` * that have been typed ahead (i.e. that carry the TypedAhead attachment)? */ def allParamsSeen(wnames: List[TermName], params: List[MemberDef]) = (wnames.toSet[Name] -- params.takeWhile(_.hasAttachment(TypedAhead)).map(_.name)).isEmpty - /** Enter and typecheck parameter list, add context companions as. + /** Enter and typecheck parameter list. * Once all witness parameters for a context bound are seen, create a * context bound companion for it. 
*/ @@ -1909,7 +1911,9 @@ class Namer { typer: Typer => val paramSymss = normalizeIfConstructor(ddef.paramss.nestedMap(symbolOfTree), isConstructor) sym.setParamss(paramSymss) - /** We add `tracked` to context bound witnesses that have abstract type members */ + /** Under x.modularity, we add `tracked` to context bound witnesses + * that have abstract type members + */ def needsTracked(sym: Symbol, param: ValDef)(using Context) = !sym.is(Tracked) && param.hasAttachment(ContextBoundParam) diff --git a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala index bb1d5ac71269..ecf1da30cac1 100644 --- a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala +++ b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala @@ -776,6 +776,7 @@ object ProtoTypes { TypeComparer.addToConstraint(added, tvars) val singletonRefs = preciseConstrainedRefs(added, singletonOnly = true) for paramRef <- added.paramRefs do + // Constrain all type parameters [T: Singleton] to T <: Singleton if singletonRefs.contains(paramRef) then paramRef <:< defn.SingletonType (added, tvars) end constrained diff --git a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala index 266b69d029c1..cb1aea27c444 100644 --- a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala @@ -552,7 +552,11 @@ object RefChecks { overrideError("is an extension method, cannot override a normal method") else if (other.is(ExtensionMethod) && !member.is(ExtensionMethod)) // (1.3) overrideError("is a normal method, cannot override an extension method") - else if (!other.is(Deferred) || other.isAllOf(Given | HasDefault)) + else if (!other.is(Deferred) + || other.isAllOf(Given | HasDefault) + // deferred givens have flags Given, HasDefault and Deferred set. These + // need to be checked for overriding as if they were concrete members + ) && !member.is(Deferred) && !other.name.is(DefaultGetterName) && !member.isAnyOverride @@ -626,7 +630,7 @@ object RefChecks { else if intoOccurrences(memberTp(self)) != intoOccurrences(otherTp(self)) then overrideError("has different occurrences of `into` modifiers", compareTypes = true) else if other.is(ParamAccessor) && !isInheritedAccessor(member, other) - && !member.is(Tracked) + && !member.is(Tracked) // see remark on tracked members above then // (1.12) report.errorOrMigrationWarning( em"cannot override val parameter ${other.showLocated}", diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index a2291d55bac8..2eeccb6e477d 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -183,7 +183,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // Overridden in derived typers def newLikeThis(nestingLevel: Int): Typer = new Typer(nestingLevel) - // Overridden to do nothing in derived typers + /** Apply given migration. Overridden to use `disabled` instead in ReTypers. */ protected def migrate[T](migration: => T, disabled: => T = ()): T = migration /** Find the type of an identifier with given `name` in given context `ctx`. 
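// A sketch of the deferred-given situation the RefChecks change above accounts for,
// closely following the deferred-givens tests added later in this series
// (requires -language:experimental.modularity):
trait Ord:
  type Self

trait Sorted:
  type Element: Ord      // introduces a deferred given for `Element is Ord`

object Scope:
  given (Int is Ord)()
  class SortedInt extends Sorted:
    type Element = Int   // the deferred given is implemented from the given above

// The trait's deferred given carries Given | HasDefault | Deferred, and an implementing
// given in a subclass is override-checked against it as if it were a concrete member.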
@@ -869,7 +869,8 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer type Alts = List[(/*prev: */Tree, /*prevState: */TyperState, /*prevWitness: */TermRef)] /** Compare two alternative selections `alt1` and `alt2` from witness types - * `wit1`, `wit2` according to the 3 criteria in the enclosing doc comment. I.e. + * `wit1`, `wit2` according to the 3 criteria in Step 3 of the doc comment + * of annotation.internal.WitnessNames. I.e. * * alt1 = qual1.m, alt2 = qual2.m, qual1: wit1, qual2: wit2 * @@ -887,13 +888,16 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case (tp1: TermRef, tp2: TermRef) => if tp1.info.isSingleton && (tp1 frozen_=:= tp2) then 1 else compare(tp1, tp2, preferGeneral = false) - case (tp1: TermRef, _) => 1 + case (tp1: TermRef, _) => 1 // should not happen, but prefer TermRefs over othersver others case (_, tp2: TermRef) => -1 case _ => 0 - /** Find the set of maximally preferred alternative among `prev` and the - * remaining alternatives generated from `witnesses` with is a union type - * of witness references. + /** Find the set of maximally preferred alternatives among `prevs` and + * alternatives referred to by `witnesses`. + * @param prevs a list of (ref tree, typer state, term ref) tripls that + * represents previously identified alternatives + * @param witnesses a type of the form ref_1 | ... | ref_n containing references + * still to be considered. */ def tryAlts(prevs: Alts, witnesses: Type): Alts = witnesses match case OrType(wit1, wit2) => @@ -905,10 +909,10 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def current = (alt, altCtx.typerState, witness) if altCtx.reporter.hasErrors then prevs else - val cmps = prevs.map: (prevTree, prevState, prevWitness) => + val comparisons = prevs.map: (prevTree, prevState, prevWitness) => compareAlts(prevTree, alt, prevWitness, witness) - if cmps.exists(_ == 1) then prevs - else current :: prevs.zip(cmps).collect{ case (prev, cmp) if cmp != -1 => prev } + if comparisons.exists(_ == 1) then prevs + else current :: prevs.zip(comparisons).collect{ case (prev, cmp) if cmp != -1 => prev } qual.tpe.widen match case AppliedType(_, arg :: Nil) => @@ -2370,9 +2374,12 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val tparamSplice = untpd.TypedSplice(typedExpr(tparam)) typed(untpd.RefinedTypeTree(tyconSplice, List(untpd.TypeDef(tpnme.Self, tparamSplice)))) else + def selfNote = + if Feature.enabled(modularity) then + " and\ndoes not have an abstract type member named `Self` either" + else "" errorTree(tree, - em"""Illegal context bound: ${tycon.tpe} does not take type parameters and - |does not have an abstract type member named `Self` either.""") + em"Illegal context bound: ${tycon.tpe} does not take type parameters$selfNote.") def typedSingletonTypeTree(tree: untpd.SingletonTypeTree)(using Context): SingletonTypeTree = { val ref1 = typedExpr(tree.ref, SingletonTypeProto) @@ -2605,7 +2612,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer var name = tree.name if (name == nme.WILDCARD && tree.mods.is(Given)) { val Typed(_, tpt) = tree.body: @unchecked - name = desugar.inventGivenOrExtensionName(tpt) + name = desugar.inventGivenName(tpt) } if (name == nme.WILDCARD) body1 else { @@ -2725,6 +2732,19 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if filters == List(MessageFilter.None) then sup.markUsed() ctx.run.nn.suppressions.addSuppression(sup) + /** Run `typed` on `rhs` except if `rhs` is the right hand side of a 
deferred given, + * in which case the empty tree is returned. + */ + private inline def excludeDeferredGiven( + rhs: untpd.Tree, sym: Symbol)( + inline typed: untpd.Tree => Tree)(using Context): Tree = + rhs match + case rhs: RefTree + if rhs.name == nme.deferred && sym.isAllOf(DeferredGivenFlags, butNot = Param) => + EmptyTree + case _ => + typed(rhs) + def typedValDef(vdef: untpd.ValDef, sym: Symbol)(using Context): Tree = { val ValDef(name, tpt, _) = vdef checkNonRootName(vdef.name, vdef.nameSpan) @@ -2732,15 +2752,12 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if sym.is(Implicit) then checkImplicitConversionDefOK(sym) if sym.is(Module) then checkNoModuleClash(sym) val tpt1 = checkSimpleKinded(typedType(tpt)) - val rhs1 = vdef.rhs match { + val rhs1 = vdef.rhs match case rhs @ Ident(nme.WILDCARD) => rhs.withType(tpt1.tpe) - case rhs: RefTree - if rhs.name == nme.deferred && sym.isAllOf(DeferredGivenFlags, butNot = Param) => - EmptyTree case rhs => - typedExpr(rhs, tpt1.tpe.widenExpr) - } + excludeDeferredGiven(rhs, sym): + typedExpr(_, tpt1.tpe.widenExpr) val vdef1 = assignType(cpy.ValDef(vdef)(name, tpt1, rhs1), sym) postProcessInfo(vdef1, sym) vdef1.setDefTree @@ -2800,13 +2817,10 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if sym.isInlineMethod then rhsCtx.addMode(Mode.InlineableBody) if sym.is(ExtensionMethod) then rhsCtx.addMode(Mode.InExtensionMethod) - val rhs1 = ddef.rhs match - case Ident(nme.deferred) if sym.isAllOf(DeferredGivenFlags) => - EmptyTree - case rhs => - PrepareInlineable.dropInlineIfError(sym, - if sym.isScala2Macro then typedScala2MacroBody(ddef.rhs)(using rhsCtx) - else typedExpr(ddef.rhs, tpt1.tpe.widenExpr)(using rhsCtx)) + val rhs1 = excludeDeferredGiven(ddef.rhs, sym): rhs => + PrepareInlineable.dropInlineIfError(sym, + if sym.isScala2Macro then typedScala2MacroBody(rhs)(using rhsCtx) + else typedExpr(rhs, tpt1.tpe.widenExpr)(using rhsCtx)) if sym.isInlineMethod then if StagingLevel.level > 0 then diff --git a/library/src/scala/annotation/internal/WitnessNames.scala b/library/src/scala/annotation/internal/WitnessNames.scala index f859cda96d06..80b8fea4a84d 100644 --- a/library/src/scala/annotation/internal/WitnessNames.scala +++ b/library/src/scala/annotation/internal/WitnessNames.scala @@ -36,7 +36,7 @@ package internal * 2. The underlying type (under widen) of ref_i is a true supertype of the * underlying type of ref_j. * 3. ref_i.m is a term, the underlying type of ref_j is not a strict subtype - * of the underlying type of ref_j, and the underlying type ref_i.m is a + * of the underlying type of ref_i, and the underlying type ref_i.m is a * strict subtype of the underlying type of ref_j.m. * * If there is such a selection, map A.m to ref_i.m, otherwise report an error. @@ -48,6 +48,7 @@ package internal * * 4. At PostTyper, issue an error when encountering any reference to a CB companion. */ +@experimental class WitnessNames(names: String*) extends StaticAnnotation diff --git a/library/src/scala/compiletime/package.scala b/library/src/scala/compiletime/package.scala index be76941a680b..a3896a1eeb06 100644 --- a/library/src/scala/compiletime/package.scala +++ b/library/src/scala/compiletime/package.scala @@ -1,7 +1,7 @@ package scala package compiletime -import annotation.compileTimeOnly +import annotation.{compileTimeOnly, experimental} /** Use this method when you have a type, do not have a value for it but want to * pattern match on it. 
For example, given a type `Tup <: Tuple`, one can @@ -52,6 +52,7 @@ def uninitialized: Nothing = ??? * that implement the enclosing trait and that do not contain an explicit overriding * definition of that given. */ +@experimental @compileTimeOnly("`deferred` can only be used as the right hand side of a given definition in a trait") def deferred: Nothing = ??? diff --git a/library/src/scala/runtime/stdLibPatches/Predef.scala b/library/src/scala/runtime/stdLibPatches/Predef.scala index 6c286f322ba7..77b014b80466 100644 --- a/library/src/scala/runtime/stdLibPatches/Predef.scala +++ b/library/src/scala/runtime/stdLibPatches/Predef.scala @@ -77,6 +77,7 @@ object Predef: * * which is what is needed for a context bound `[A: TC]`. */ + @experimental infix type is[A <: AnyKind, B <: Any{type Self <: AnyKind}] = B { type Self = A } end Predef diff --git a/tests/neg/i12348.check b/tests/neg/i12348.check index eded51f70f31..55806fa5ca1b 100644 --- a/tests/neg/i12348.check +++ b/tests/neg/i12348.check @@ -1,8 +1,4 @@ --- [E040] Syntax Error: tests/neg/i12348.scala:2:16 -------------------------------------------------------------------- -2 | given inline x: Int = 0 // error // error - | ^ - | an identifier expected, but ':' found --- [E067] Syntax Error: tests/neg/i12348.scala:2:8 --------------------------------------------------------------------- -2 | given inline x: Int = 0 // error // error - | ^ - |Declaration of given instance given_x_inline_ not allowed here: only classes can have declared but undefined members +-- [E040] Syntax Error: tests/neg/i12348.scala:2:15 -------------------------------------------------------------------- +2 | given inline x: Int = 0 // error + | ^ + | 'with' expected, but identifier found diff --git a/tests/neg/i12348.scala b/tests/neg/i12348.scala index 43daf9a2801b..bd8bf63994e6 100644 --- a/tests/neg/i12348.scala +++ b/tests/neg/i12348.scala @@ -1,2 +1,2 @@ object A { - given inline x: Int = 0 // error // error + given inline x: Int = 0 // error diff --git a/tests/pos/typeclasses-this.scala b/tests/pos/typeclasses-this.scala index 20ce78678b22..33ccb8d9d653 100644 --- a/tests/pos/typeclasses-this.scala +++ b/tests/pos/typeclasses-this.scala @@ -36,7 +36,7 @@ end Common object Instances extends Common: - given intOrd: Int is Ord with + given intOrd: (Int is Ord) with extension (x: Int) def compareTo(y: Int) = if x < y then -1 @@ -44,7 +44,7 @@ object Instances extends Common: else 0 // given [T](using tracked val ev: Ord { type Self = T}): Ord { type Self = List[T] } with - given [T: Ord]: List[T] is Ord with + given [T: Ord]: (List[T] is Ord) with extension (xs: List[T]) def compareTo(ys: List[T]): Int = (xs, ys) match case (Nil, Nil) => 0 case (Nil, _) => -1 @@ -53,7 +53,7 @@ object Instances extends Common: val fst = x.compareTo(y) if (fst != 0) fst else xs1.compareTo(ys1) - given listMonad: List is Monad with + given listMonad: (List is Monad) with extension [A](xs: List[A]) def flatMap[B](f: A => List[B]): List[B] = xs.flatMap(f) def pure[A](x: A): List[A] = @@ -61,7 +61,7 @@ object Instances extends Common: type Reader[Ctx] = [X] =>> Ctx => X - given readerMonad[Ctx]: Reader[Ctx] is Monad with + given readerMonad[Ctx]: (Reader[Ctx] is Monad) with extension [A](r: Ctx => A) def flatMap[B](f: A => Ctx => B): Ctx => B = ctx => f(r(ctx))(ctx) def pure[A](x: A): Ctx => A = @@ -83,7 +83,7 @@ object Instances extends Common: def maximum[T: Ord](xs: List[T]): T = xs.reduce(_ `max` _) - given descending[T: Ord]: T is Ord with + given descending[T: Ord]: (T is Ord) 
with extension (x: T) def compareTo(y: T) = T.compareTo(y)(x) def minimum[T: Ord](xs: List[T]) = diff --git a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala index df35bed19360..9a01e711537b 100644 --- a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala +++ b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala @@ -80,8 +80,11 @@ val experimentalDefinitionInLibrary = Set( "scala.NamedTupleDecomposition", "scala.NamedTupleDecomposition$", - // New feature: Precise trait + // New feature: modularity "scala.Precise", + "scala.annotation.internal.WitnessNames", + "scala.compiletime.package$package$.deferred", + "scala.Predef$.is", ) From ff98f011d9b6172df58d5b4cc1345b1e8539aedd Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 6 May 2024 16:07:04 +0200 Subject: [PATCH 234/371] Fix rebase breakage [Cherry-picked d3e6a952d4e3914d8f7cfc1054f6ddbeab9b33c5] --- compiler/src/dotty/tools/dotc/typer/Applications.scala | 2 +- tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index fd4c634801be..c3369ac58e31 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -1880,7 +1880,7 @@ trait Applications extends Compatibility { val tp1p = prepare(tp1) val tp2p = prepare(tp2) - if Feature.sourceVersion.isAtMost(SourceVersion.`3.5`) + if Feature.sourceVersion.isAtMost(SourceVersion.`3.4`) || oldResolution || !alt1isGiven && !alt2isGiven then diff --git a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala index 9a01e711537b..7079c7320ba0 100644 --- a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala +++ b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala @@ -84,7 +84,7 @@ val experimentalDefinitionInLibrary = Set( "scala.Precise", "scala.annotation.internal.WitnessNames", "scala.compiletime.package$package$.deferred", - "scala.Predef$.is", + "scala.runtime.stdLibPatches.Predef$.is", ) From 0bcf69c8188e53b947e4d257d63a5b7c1463ce34 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 6 May 2024 16:07:33 +0200 Subject: [PATCH 235/371] Make Singleton an erased class only under modularity import [Cherry-picked b2f0791a0ac337474fdd223085f8da6ee03ac01e] --- compiler/src/dotty/tools/dotc/core/TypeUtils.scala | 9 +++++++-- 1 file changed, 7 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/core/TypeUtils.scala b/compiler/src/dotty/tools/dotc/core/TypeUtils.scala index dd881bb1adf6..afc2cc39f9cf 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeUtils.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeUtils.scala @@ -8,6 +8,7 @@ import Names.{Name, TermName} import Constants.Constant import Names.Name +import config.Feature class TypeUtils: /** A decorator that provides methods on types @@ -22,7 +23,11 @@ class TypeUtils: self.classSymbol.isPrimitiveValueClass def isErasedClass(using Context): Boolean = - self.underlyingClassRef(refinementOK = true).typeSymbol.is(Flags.Erased) + val cls = self.underlyingClassRef(refinementOK = true).typeSymbol + cls.is(Flags.Erased) + && (cls != defn.SingletonClass || Feature.enabled(Feature.modularity)) + // Singleton counts as an erased class only under x.modularity + /** Is this type a checked exception? 
This is the case if the type * derives from Exception but not from RuntimeException. According to @@ -179,7 +184,7 @@ class TypeUtils: def isThisTypeOf(cls: Symbol)(using Context) = self match case self: Types.ThisType => self.cls == cls case _ => false - + /** Strip all outer refinements off this type */ def stripRefinement: Type = self match case self: RefinedOrRecType => self.parent.stripRefinement From 42de3703a50043284d9963891dfa9db8bf545f64 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 6 May 2024 19:07:52 +0200 Subject: [PATCH 236/371] Address review comments [Cherry-picked 4f28d8a892b835a2598e10a7af48b05ed5a19e32] --- .../src/dotty/tools/dotc/ast/Desugar.scala | 28 ++++++++++--------- .../src/dotty/tools/dotc/ast/TreeInfo.scala | 24 ++++++---------- .../src/dotty/tools/dotc/core/Flags.scala | 2 +- .../src/dotty/tools/dotc/core/NamerOps.scala | 2 +- .../src/dotty/tools/dotc/core/Types.scala | 18 ++++++------ .../dotty/tools/dotc/parsing/Parsers.scala | 7 +++-- .../src/dotty/tools/dotc/typer/Namer.scala | 4 +-- .../src/dotty/tools/dotc/typer/Typer.scala | 4 +-- docs/_docs/internals/syntax.md | 2 +- .../annotation/internal/WitnessNames.scala | 4 +-- 10 files changed, 46 insertions(+), 49 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index 0681492a4ba7..b1b771bc7512 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -234,7 +234,7 @@ object desugar { * of the same method. * @param evidenceFlags The flags to use for evidence definitions * @param freshName A function to generate fresh names for evidence definitions - * @param allParams If `tdef` is a type paramter, all parameters of the owning method, + * @param allParamss If `tdef` is a type paramter, all parameters of the owning method, * otherwise the empty list. */ private def desugarContextBounds( @@ -246,29 +246,31 @@ object desugar { val evidenceNames = mutable.ListBuffer[TermName]() - def desugarRhs(rhs: Tree): Tree = rhs match - case ContextBounds(tbounds, cxbounds) => + def desugarRHS(rhs: Tree): Tree = rhs match + case ContextBounds(tbounds, ctxbounds) => val isMember = evidenceFlags.isAllOf(DeferredGivenFlags) - for bound <- cxbounds do + for bound <- ctxbounds do val evidenceName = bound match case ContextBoundTypeTree(_, _, ownName) if !ownName.isEmpty => ownName // if there is an explicitly given name, use it. - case _ if Config.nameSingleContextBounds && !isMember - && cxbounds.tail.isEmpty && Feature.enabled(Feature.modularity) => - tdef.name.toTermName case _ => - freshName(bound) + if Config.nameSingleContextBounds + && !isMember + && ctxbounds.tail.isEmpty + && Feature.enabled(Feature.modularity) + then tdef.name.toTermName + else freshName(bound) evidenceNames += evidenceName val evidenceParam = ValDef(evidenceName, bound, EmptyTree).withFlags(evidenceFlags) evidenceParam.pushAttachment(ContextBoundParam, ()) evidenceBuf += evidenceParam tbounds case LambdaTypeTree(tparams, body) => - cpy.LambdaTypeTree(rhs)(tparams, desugarRhs(body)) + cpy.LambdaTypeTree(rhs)(tparams, desugarRHS(body)) case _ => rhs - val tdef1 = cpy.TypeDef(tdef)(rhs = desugarRhs(tdef.rhs)) + val tdef1 = cpy.TypeDef(tdef)(rhs = desugarRHS(tdef.rhs)) // Under x.modularity, if there was a context bound, and `tdef`s name as a term name is // neither a name of an existing parameter nor a name of generated evidence for // the same method, add a WitnessAnnotation with all generated evidence names to `tdef`. 
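// A rough illustration (not from the patch) of the context-bound desugaring that
// desugarContextBounds above performs; Ord here is a plain, invented type class.
trait Ord[T]:
  def compare(x: T, y: T): Int

// written by the user; naming the evidence with `as` needs experimental.modularity
def maximum[T: Ord as ord](xs: List[T]): T =
  xs.reduce((a, b) => if ord.compare(a, b) >= 0 then a else b)

// approximately the desugared form: the bound becomes a `using` parameter whose name
// is the `as`-name if given, otherwise a fresh evidence$N name (or, for a single bound
// under x.modularity, the type parameter's own name, per Config.nameSingleContextBounds)
def maximum2[T](xs: List[T])(using ord: Ord[T]): T =
  xs.reduce((a, b) => if ord.compare(a, b) >= 0 then a else b)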
@@ -695,10 +697,10 @@ object desugar { case _ => false } - def isRepeated(tree: Tree): Boolean = stripByNameType(tree) match { + /** Is this a repeated argument x* (using a spread operator)? */ + def isRepeated(tree: Tree): Boolean = stripByNameType(tree) match case PostfixOp(_, Ident(tpnme.raw.STAR)) => true case _ => false - } def appliedRef(tycon: Tree, tparams: List[TypeDef] = constrTparams, widenHK: Boolean = false) = { val targs = for (tparam <- tparams) yield { @@ -1218,7 +1220,7 @@ object desugar { /** Extract a synthesized given name from a type tree. This is used for * both anonymous givens and (under x.modularity) deferred givens. - * @param followArgs If true include argument types in the name + * @param followArgs if true include argument types in the name */ private class NameExtractor(followArgs: Boolean) extends UntypedTreeAccumulator[String] { private def extractArgs(args: List[Tree])(using Context): String = diff --git a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala index 11fb572b66c6..97de434ba9d5 100644 --- a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala +++ b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala @@ -394,22 +394,16 @@ trait TreeInfo[T <: Untyped] { self: Trees.Instance[T] => )) def unapply(tree: Tree)(using Context): Option[List[TermName]] = - def isWitnessNames(tp: Type) = tp match - case tp: TypeRef => - tp.name == tpnme.WitnessNames && tp.symbol == defn.WitnessNamesAnnot - case _ => - false unsplice(tree) match - case Apply( - Select(New(tpt: tpd.TypeTree), nme.CONSTRUCTOR), - SeqLiteral(elems, _) :: Nil - ) if isWitnessNames(tpt.tpe) => - Some: - elems.map: - case Literal(Constant(str: String)) => - ContextBoundParamName.unmangle(str.toTermName.asSimpleName) - case _ => - None + case Apply(Select(New(tpt: tpd.TypeTree), nme.CONSTRUCTOR), SeqLiteral(elems, _) :: Nil) => + tpt.tpe match + case tp: TypeRef if tp.name == tpnme.WitnessNames && tp.symbol == defn.WitnessNamesAnnot => + Some: + elems.map: + case Literal(Constant(str: String)) => + ContextBoundParamName.unmangle(str.toTermName.asSimpleName) + case _ => None + case _ => None end WitnessNamesAnnot } diff --git a/compiler/src/dotty/tools/dotc/core/Flags.scala b/compiler/src/dotty/tools/dotc/core/Flags.scala index e17834d61fdc..b1bf7a266c91 100644 --- a/compiler/src/dotty/tools/dotc/core/Flags.scala +++ b/compiler/src/dotty/tools/dotc/core/Flags.scala @@ -573,7 +573,7 @@ object Flags { val DeferredOrLazyOrMethod: FlagSet = Deferred | Lazy | Method val DeferredOrTermParamOrAccessor: FlagSet = Deferred | ParamAccessor | TermParam // term symbols without right-hand sides val DeferredOrTypeParam: FlagSet = Deferred | TypeParam // type symbols without right-hand sides - val DeferredGivenFlags = Deferred | Given | HasDefault + val DeferredGivenFlags: FlagSet = Deferred | Given | HasDefault val EnumValue: FlagSet = Enum | StableRealizable // A Scala enum value val FinalOrInline: FlagSet = Final | Inline val FinalOrModuleClass: FlagSet = Final | ModuleClass // A module class or a final class diff --git a/compiler/src/dotty/tools/dotc/core/NamerOps.scala b/compiler/src/dotty/tools/dotc/core/NamerOps.scala index 5e76b09bbde6..07cb9292baa4 100644 --- a/compiler/src/dotty/tools/dotc/core/NamerOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NamerOps.scala @@ -262,7 +262,7 @@ object NamerOps: * bound that defines a set of witnesses with names `witnessNames`. 
* * @param params If `tsym` is a type parameter, a list of parameter symbols - * that include all witnesses, otherwise the empty list. + * that includes all witnesses, otherwise the empty list. * * The context-bound companion has as name the name of `tsym` translated to * a term name. We create a synthetic val of the form diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index a92893678a17..eeffc41d4159 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -1655,7 +1655,7 @@ object Types extends TypeUtils { * * P { ... type T = / += / -= U ... } # T * - * to just U. Analogously, `P { val x: S} # x` is reduced tp `S` if `S` + * to just U. Analogously, `P { val x: S} # x` is reduced to `S` if `S` * is a singleton type. * * Does not perform the reduction if the resulting type would contain @@ -5050,14 +5050,14 @@ object Types extends TypeUtils { * or if it has as an upper bound a precise TypeVar. */ def isPrecise(using Context) = - precise - || { - val constr = ctx.typerState.constraint - constr.upper(origin).exists: tparam => - constr.typeVarOfParam(tparam) match - case tvar: TypeVar => tvar.precise - case _ => false - } + precise || hasPreciseUpperBound + + private def hasPreciseUpperBound(using Context) = + val constr = ctx.typerState.constraint + constr.upper(origin).exists: tparam => + constr.typeVarOfParam(tparam) match + case tvar: TypeVar => tvar.precise + case _ => false /** The policy used for widening singletons or unions when instantiating * this variable in the current context. diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index fe23d97d58c3..e28ba5fd669e 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -3552,9 +3552,10 @@ object Parsers { !impliedMods.is(Given) || startParamTokens.contains(in.token) || isIdent - && (in.name == nme.inline - || in.name == nme.tracked && in.featureEnabled(Feature.modularity) - || in.lookahead.isColon) + && (in.name == nme.inline // inline starts a name binding + || in.name == nme.tracked // tracked starts a name binding under x.modularity + && in.featureEnabled(Feature.modularity) + || in.lookahead.isColon) // a following `:` starts a name binding (mods, paramsAreNamed) val params = if paramsAreNamed then commaSeparated(() => param()) diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index 0588e27ea54f..83964417a6f1 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -1860,7 +1860,7 @@ class Namer { typer: Typer => for case WitnessNamesAnnot(ws) <- tdef.mods.annotations do witnessNamesOfParam(tdef) = ws - /** Is each name in `wnames` defined spmewhere in the longest prefix of all `params` + /** Is each name in `wnames` defined somewhere in the longest prefix of all `params` * that have been typed ahead (i.e. that carry the TypedAhead attachment)? 
*/ def allParamsSeen(wnames: List[TermName], params: List[MemberDef]) = @@ -1919,7 +1919,7 @@ class Namer { typer: Typer => && param.hasAttachment(ContextBoundParam) && sym.info.memberNames(abstractTypeNameFilter).nonEmpty - /** Set every context bound evidence parameter of a class to be tracked, + /** Under x.modularity, set every context bound evidence parameter of a class to be tracked, * provided it has a type that has an abstract type member. Reset private and local flags * so that the parameter becomes a `val`. */ diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 2eeccb6e477d..2a69c948baae 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -888,7 +888,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case (tp1: TermRef, tp2: TermRef) => if tp1.info.isSingleton && (tp1 frozen_=:= tp2) then 1 else compare(tp1, tp2, preferGeneral = false) - case (tp1: TermRef, _) => 1 // should not happen, but prefer TermRefs over othersver others + case (tp1: TermRef, _) => 1 // should not happen, but prefer TermRefs over others case (_, tp2: TermRef) => -1 case _ => 0 @@ -4588,7 +4588,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer cpy.Ident(qual)(qual.symbol.name.sourceModuleName.toTypeName) case _ => errorTree(tree, em"cannot convert from $tree to an instance creation expression") - val tycon = ctorResultType.underlyingClassRef(refinementOK = true) + val tycon = ctorResultType.underlyingClassRef(refinementOK = Feature.enabled(modularity)) typed( untpd.Select( untpd.New(untpd.TypedSplice(tpt.withType(tycon))), diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index 05f89a344148..dd4a3af403ab 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -191,7 +191,7 @@ MatchType ::= InfixType `match` <<< TypeCaseClauses >>> InfixType ::= RefinedType {id [nl] RefinedType} InfixOp(t1, op, t2) RefinedType ::= AnnotType {[nl] Refinement} RefinedTypeTree(t, ds) AnnotType ::= SimpleType {Annotation} Annotated(t, annot) -AnnotType1 ::= SimpleType1 {Annotation} Annotated(t, annot) +AnnotType1 ::= SimpleType1 {Annotation} Annotated(t, annot) SimpleType ::= SimpleLiteral SingletonTypeTree(l) | ‘?’ TypeBounds diff --git a/library/src/scala/annotation/internal/WitnessNames.scala b/library/src/scala/annotation/internal/WitnessNames.scala index 80b8fea4a84d..3921c2083617 100644 --- a/library/src/scala/annotation/internal/WitnessNames.scala +++ b/library/src/scala/annotation/internal/WitnessNames.scala @@ -11,7 +11,7 @@ package internal * * 2. During Namer or Unpickling, when encountering a type declaration A with * a WitnessNames(n_1, ... , n_k) annotation, create a CB companion `val A` with - * rtype ``[ref_1 | ... | ref_k] where ref_i is a TermRef + * type ``[ref_1 | ... | ref_k] where ref_i is a TermRef * with the same prefix as A and name n_i. Except, don't do this if the type in * question is a type parameter and there is already a term parameter with name A * defined for the same method. @@ -20,7 +20,7 @@ package internal * * type ``[-Refs] * - * The context bound companion's variance is negative, so that unons in the + * The context bound companion's variance is negative, so that unions in the * arguments are joined when encountering multiple definfitions and forming a glb. * * 3. 
Add a special case for typing a selection A.m on a value A of type From 65218227371ea4e5eae43ae67ba4fb308bc46d1a Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 6 May 2024 20:33:48 +0200 Subject: [PATCH 237/371] Adress review comments with changed docs and new tests [Cherry-picked 0dddcb7fb9511acf8e8ca676c95768d8b445d7bd] --- .../reference/experimental/typeclasses.md | 5 +--- tests/neg/deferred-givens-2.check | 12 ++++++++++ tests/neg/deferred-givens-2.scala | 23 +++++++++++++++++++ tests/pending/pos/cbproxy-default.scala | 4 ++++ tests/pending/pos/singleton-infer.scala | 8 +++++++ tests/pos/cbproxy-expansion.scala | 16 +++++++++++++ 6 files changed, 64 insertions(+), 4 deletions(-) create mode 100644 tests/neg/deferred-givens-2.check create mode 100644 tests/neg/deferred-givens-2.scala create mode 100644 tests/pending/pos/cbproxy-default.scala create mode 100644 tests/pending/pos/singleton-infer.scala create mode 100644 tests/pos/cbproxy-expansion.scala diff --git a/docs/_docs/reference/experimental/typeclasses.md b/docs/_docs/reference/experimental/typeclasses.md index cf5f3220faa6..a78e764bbe7d 100644 --- a/docs/_docs/reference/experimental/typeclasses.md +++ b/docs/_docs/reference/experimental/typeclasses.md @@ -1,12 +1,9 @@ - --- layout: doc-page -title: "Type Classes" +title: "Better Support for Type Classes" nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/typeclasses.html --- -# Some Proposed Changes for Better Support of Type Classes - Martin Odersky, 8.1.2024, edited 5.4.2024 A type class in Scala is a pattern where we define diff --git a/tests/neg/deferred-givens-2.check b/tests/neg/deferred-givens-2.check new file mode 100644 index 000000000000..4a29141cc48b --- /dev/null +++ b/tests/neg/deferred-givens-2.check @@ -0,0 +1,12 @@ +-- [E172] Type Error: tests/neg/deferred-givens-2.scala:17:6 ----------------------------------------------------------- +17 |class SortedIntWrong1 extends Sorted: // error + |^ + |No given instance of type Ord{type Self = SortedIntWrong1.this.Element} was found for inferring the implementation of the deferred given instance given_Ord_Element in trait Sorted +18 | type Element = Int +19 | override given (Element is Ord)() +-- [E172] Type Error: tests/neg/deferred-givens-2.scala:21:6 ----------------------------------------------------------- +21 |class SortedIntWrong2 extends Sorted: // error + |^ + |No given instance of type Ord{type Self = SortedIntWrong2.this.Element} was found for inferring the implementation of the deferred given instance given_Ord_Element in trait Sorted +22 | type Element = Int +23 | override given (Int is Ord)() diff --git a/tests/neg/deferred-givens-2.scala b/tests/neg/deferred-givens-2.scala new file mode 100644 index 000000000000..4e75ceb08728 --- /dev/null +++ b/tests/neg/deferred-givens-2.scala @@ -0,0 +1,23 @@ +//> using options -language:experimental.modularity -source future +trait Ord: + type Self + +trait Sorted: + type Element: Ord + +object Scoped: + given (Int is Ord)() + class SortedIntCorrect extends Sorted: + type Element = Int + +class SortedIntCorrect2 extends Sorted: + type Element = Int + override given (Int is Ord)() as given_Ord_Element + +class SortedIntWrong1 extends Sorted: // error + type Element = Int + override given (Element is Ord)() + +class SortedIntWrong2 extends Sorted: // error + type Element = Int + override given (Int is Ord)() \ No newline at end of file diff --git a/tests/pending/pos/cbproxy-default.scala b/tests/pending/pos/cbproxy-default.scala new file mode 100644 
index 000000000000..e8f12ceeae75 --- /dev/null +++ b/tests/pending/pos/cbproxy-default.scala @@ -0,0 +1,4 @@ +def f[S: Monad]( + initial: S.Self = S.unit // error +) = + S.unit // works \ No newline at end of file diff --git a/tests/pending/pos/singleton-infer.scala b/tests/pending/pos/singleton-infer.scala new file mode 100644 index 000000000000..72e00baf3aab --- /dev/null +++ b/tests/pending/pos/singleton-infer.scala @@ -0,0 +1,8 @@ +//> using options -Xprint:typer -language:experimental.modularity -source future + +def f1[S, T <: S : Singleton](x: S) = () +def f2[S, T >: S : Singleton](x: S) = () + +def Test = + f1(42) // f1[Int, Singleton & Int] // should infer (42 : Int) or throw an error? + f2(42) // f2[(42 : Int), (42 : Int)] \ No newline at end of file diff --git a/tests/pos/cbproxy-expansion.scala b/tests/pos/cbproxy-expansion.scala new file mode 100644 index 000000000000..ee145b62d4ed --- /dev/null +++ b/tests/pos/cbproxy-expansion.scala @@ -0,0 +1,16 @@ +//> using options -language:experimental.modularity -source future +trait TC[T]: + type Self + +def f1[S, T: TC[S] as tc](x: S, y: tc.Self) = () +def f2[S, T: TC[S]](x: S, y: T.Self) = () +def f3[S, T: TC[S]](x: S, y: Int) = () + +given TC[String] with + type Self = Int + def unit = 42 + +def main = + f1("hello", 23) + f2("hello", 23) + f3("hello", 23) From 46c3eca537cafdcdc9c6e0eb3ccb0f03e2c4485d Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 7 May 2024 12:43:03 +0200 Subject: [PATCH 238/371] Update warn check files Error number changed [Cherry-picked 62e0244d0f77b4b9158da20d5a252e24e51e5db2] --- tests/warn/i16723.check | 2 +- tests/warn/i16723a.check | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/tests/warn/i16723.check b/tests/warn/i16723.check index ed8e55502a80..6d55fa0a89d2 100644 --- a/tests/warn/i16723.check +++ b/tests/warn/i16723.check @@ -1,4 +1,4 @@ --- [E195] Potential Issue Warning: tests/warn/i16723.scala:3:2 --------------------------------------------------------- +-- [E197] Potential Issue Warning: tests/warn/i16723.scala:3:2 --------------------------------------------------------- 3 | new Object {} // warn | ^ | New anonymous class definition will be duplicated at each inline site diff --git a/tests/warn/i16723a.check b/tests/warn/i16723a.check index ba4794fac23e..ace11c5af1f9 100644 --- a/tests/warn/i16723a.check +++ b/tests/warn/i16723a.check @@ -1,4 +1,4 @@ --- [E195] Potential Issue Warning: tests/warn/i16723a.scala:5:38 ------------------------------------------------------- +-- [E197] Potential Issue Warning: tests/warn/i16723a.scala:5:38 ------------------------------------------------------- 5 |inline given Converter[Int, String] = new Converter { // warn | ^ | New anonymous class definition will be duplicated at each inline site From 97afac079b522d7158eaa28505bff19c52d43fe3 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 7 May 2024 13:05:53 +0200 Subject: [PATCH 239/371] Update InlayHints [Cherry-picked 9959f28ab5008d4a8deeb78f3764cec641f439db] --- .../test/dotty/tools/pc/tests/inlayHints/InlayHintsSuite.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/presentation-compiler/test/dotty/tools/pc/tests/inlayHints/InlayHintsSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/inlayHints/InlayHintsSuite.scala index e470f492657c..8ce7cdce4382 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/inlayHints/InlayHintsSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/inlayHints/InlayHintsSuite.scala @@ -898,7 +898,7 @@ class 
InlayHintsSuite extends BaseInlayHintsSuite { | import quotes.reflect.* | Type.of[T] match | case '[f] => - | val fr/*: TypeRepr<>*/ = TypeRepr.of[T]/*(using evidence$1<<(3:21)>>)*/ + | val fr/*: TypeRepr<>*/ = TypeRepr.of[T]/*(using evidence$1<<(3:23)>>)*/ |""".stripMargin ) From 8482eb15b9c8faa6f15c2dedb9a54a5f8b731b2d Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 7 May 2024 15:15:39 +0200 Subject: [PATCH 240/371] Fix typo [Cherry-picked 3c78ada957b8f77f6055ea280e09693f40d0e845] --- library/src/scala/runtime/stdLibPatches/language.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index a5cd683775f0..1171c62602fb 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -96,7 +96,7 @@ object language: * @see [[https://dotty.epfl.ch/docs/reference/experimental/into-modifier]] */ @compileTimeOnly("`namedTuples` can only be used at compile time in import statements") - object namedTupleas + object namedTuples /** Experimental support for new features for better modularity, including * - better tracking of dependencies through classes From cb37a1fd1b77e01159b777fa3180c148d055eae8 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 22 Apr 2024 10:23:54 +0200 Subject: [PATCH 241/371] step 1: basic script that forwards to prebuilt launcher and overrides default scala-version [Cherry-picked 4d291fb3b68867d5a970faad52d1ab5a861b395a] --- dist/bin/cli-common | 26 ++++++++++++++++ dist/bin/scala | 56 +++++++++++---------------------- dist/bin/scala_legacy | 72 +++++++++++++++++++++++++++++++++++++++++++ 3 files changed, 116 insertions(+), 38 deletions(-) create mode 100644 dist/bin/cli-common create mode 100755 dist/bin/scala_legacy diff --git a/dist/bin/cli-common b/dist/bin/cli-common new file mode 100644 index 000000000000..780094f2e3d8 --- /dev/null +++ b/dist/bin/cli-common @@ -0,0 +1,26 @@ +#!/usr/bin/env bash + +SCALA_CLI_LAUNCHER="/Users/jamie/workspace/scala-cli/out/cli/3.3.3/launcher.dest/launcher" + +#/*-------------------------------------------------------------------------- +# * Credits: This script is based on the script generated by sbt-pack. +# *--------------------------------------------------------------------------*/ + +# save terminal settings +saved_stty=$(stty -g 2>/dev/null) +# clear on error so we don't later try to restore them +if [[ ! $? ]]; then + saved_stty="" +fi + +# restore stty settings (echo in particular) +function restoreSttySettings() { + stty $saved_stty + saved_stty="" +} + +scala_exit_status=127 +function onExit() { + [[ "$saved_stty" != "" ]] && restoreSttySettings + exit $scala_exit_status +} diff --git a/dist/bin/scala b/dist/bin/scala index bd69d40c2b97..04215fcfaa0b 100755 --- a/dist/bin/scala +++ b/dist/bin/scala @@ -26,47 +26,27 @@ if [ -z "${PROG_HOME-}" ] ; then cd "$saveddir" fi -source "$PROG_HOME/bin/common" - -while [[ $# -gt 0 ]]; do - case "$1" in - -D*) - # pass to scala as well: otherwise we lose it sometimes when we - # need it, e.g. communicating with a server compiler. - # respect user-supplied -Dscala.usejavacp - addJava "$1" - addScala "$1" - shift - ;; - -J*) - # as with -D, pass to scala even though it will almost - # never be used. - addJava "${1:2}" - addScala "$1" - shift - ;; - -classpath*) - if [ "$1" != "${1##* }" ]; then - # -classpath and its value have been supplied in a single string e.g. 
"-classpath 'lib/*'" - A=$1 ; shift # consume $1 before adding its substrings back - set -- $A "$@" # split $1 on whitespace and put it back - else - addScala "$1" - shift - fi - ;; - *) - addScala "$1" - shift - ;; - esac -done +source "$PROG_HOME/bin/cli-common" + +SCALA_VERSION="" +# iterate through lines in VERSION_SRC +while IFS= read -r line; do + # if line starts with "version:=" then extract the version + if [[ "$line" == version:=* ]]; then + SCALA_VERSION="${line#version:=}" + break + fi +done < "$PROG_HOME/VERSION" + +# assert that SCALA_VERSION is not empty +if [ -z "$SCALA_VERSION" ]; then + echo "Failed to extract Scala version from $PROG_HOME/VERSION" + exit 1 +fi # exec here would prevent onExit from being called, leaving terminal in unusable state -compilerJavaClasspathArgs [ -z "${ConEmuPID-}" -o -n "${cygwin-}" ] && export MSYSTEM= PWD= # workaround for #12405 -eval "\"$JAVACMD\"" "${java_args[@]}" "-Dscala.home=\"$PROG_HOME\"" "-classpath \"$jvm_cp_args\"" "dotty.tools.MainGenericRunner" "-classpath \"$jvm_cp_args\"" "${scala_args[@]}" +eval "\"$SCALA_CLI_LAUNCHER\"" "--cli-default-scala-version \"$SCALA_VERSION\"" "$@" scala_exit_status=$? - onExit diff --git a/dist/bin/scala_legacy b/dist/bin/scala_legacy new file mode 100755 index 000000000000..bd69d40c2b97 --- /dev/null +++ b/dist/bin/scala_legacy @@ -0,0 +1,72 @@ +#!/usr/bin/env bash + +# Try to autodetect real location of the script +if [ -z "${PROG_HOME-}" ] ; then + ## resolve links - $0 may be a link to PROG_HOME + PRG="$0" + + # need this for relative symlinks + while [ -h "$PRG" ] ; do + ls=`ls -ld "$PRG"` + link=`expr "$ls" : '.*-> \(.*\)$'` + if expr "$link" : '/.*' > /dev/null; then + PRG="$link" + else + PRG="`dirname "$PRG"`/$link" + fi + done + + saveddir=`pwd` + + PROG_HOME=`dirname "$PRG"`/.. + + # make it fully qualified + PROG_HOME=`cd "$PROG_HOME" && pwd` + + cd "$saveddir" +fi + +source "$PROG_HOME/bin/common" + +while [[ $# -gt 0 ]]; do + case "$1" in + -D*) + # pass to scala as well: otherwise we lose it sometimes when we + # need it, e.g. communicating with a server compiler. + # respect user-supplied -Dscala.usejavacp + addJava "$1" + addScala "$1" + shift + ;; + -J*) + # as with -D, pass to scala even though it will almost + # never be used. + addJava "${1:2}" + addScala "$1" + shift + ;; + -classpath*) + if [ "$1" != "${1##* }" ]; then + # -classpath and its value have been supplied in a single string e.g. "-classpath 'lib/*'" + A=$1 ; shift # consume $1 before adding its substrings back + set -- $A "$@" # split $1 on whitespace and put it back + else + addScala "$1" + shift + fi + ;; + *) + addScala "$1" + shift + ;; + esac +done + +# exec here would prevent onExit from being called, leaving terminal in unusable state +compilerJavaClasspathArgs +[ -z "${ConEmuPID-}" -o -n "${cygwin-}" ] && export MSYSTEM= PWD= # workaround for #12405 +eval "\"$JAVACMD\"" "${java_args[@]}" "-Dscala.home=\"$PROG_HOME\"" "-classpath \"$jvm_cp_args\"" "dotty.tools.MainGenericRunner" "-classpath \"$jvm_cp_args\"" "${scala_args[@]}" +scala_exit_status=$? + + +onExit From 1784e67234ff5e9f611a82b8574c804d6f02a57d Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Thu, 25 Apr 2024 09:33:31 +0200 Subject: [PATCH 242/371] Resolve artefacts to a local repo: assemble a map of artefacts in maven format to both the local artefacts and library dependencies. Write them to dist/target/local-repo/maven2. Copy the local-repo to dist/target/pack/local. 
TODO: - evaluate how to remove lib dir in pack, only resolve from repo [Cherry-picked 420e2450b5f64d96e02d57823d59dc9739c3c60c] --- dist/bin/cli-common | 6 ++ dist/bin/scala | 13 ++++- project/Build.scala | 135 ++++++++++++++++++++++++++++++++++++++++++-- 3 files changed, 148 insertions(+), 6 deletions(-) diff --git a/dist/bin/cli-common b/dist/bin/cli-common index 780094f2e3d8..67b8893223d3 100644 --- a/dist/bin/cli-common +++ b/dist/bin/cli-common @@ -24,3 +24,9 @@ function onExit() { [[ "$saved_stty" != "" ]] && restoreSttySettings exit $scala_exit_status } + +declare -a scala_args + +addScala () { + scala_args+=("'$1'") +} diff --git a/dist/bin/scala b/dist/bin/scala index 04215fcfaa0b..7d813d265e73 100755 --- a/dist/bin/scala +++ b/dist/bin/scala @@ -44,9 +44,20 @@ if [ -z "$SCALA_VERSION" ]; then exit 1 fi +MVN_REPOSITORY="file://$PROG_HOME/local/maven2" + +# escape all script arguments +while [[ $# -gt 0 ]]; do + addScala "$1" + shift +done + # exec here would prevent onExit from being called, leaving terminal in unusable state [ -z "${ConEmuPID-}" -o -n "${cygwin-}" ] && export MSYSTEM= PWD= # workaround for #12405 -eval "\"$SCALA_CLI_LAUNCHER\"" "--cli-default-scala-version \"$SCALA_VERSION\"" "$@" +eval "\"$SCALA_CLI_LAUNCHER\"" \ + "--cli-default-scala-version \"$SCALA_VERSION\"" \ + "-r \"$MVN_REPOSITORY\"" \ + "${scala_args[@]}" scala_exit_status=$? onExit diff --git a/project/Build.scala b/project/Build.scala index 350471cc3e12..6ebfa04f974f 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -26,6 +26,8 @@ import sbttastymima.TastyMiMaPlugin import sbttastymima.TastyMiMaPlugin.autoImport._ import scala.util.Properties.isJavaAtLeast +import scala.collection.mutable + import org.portablescala.sbtplatformdeps.PlatformDepsPlugin.autoImport._ import org.scalajs.linker.interface.{ModuleInitializer, StandardConfig} @@ -2110,17 +2112,128 @@ object Build { ) ) - lazy val commonDistSettings = Seq( + lazy val DistCacheConfig = config("DistCacheConfig") extend Compile + + val distModules = taskKey[Seq[(ModuleID, Map[Artifact, File])]]("fetch local artifacts for distribution.") + val distResolvedArtifacts = taskKey[Seq[ResolvedArtifacts]]("Resolve the dependencies for the distribution") + val distCaching = taskKey[File]("cache the dependencies for the distribution") + + def evalPublishSteps(dependencies: Seq[ProjectReference]): Def.Initialize[Task[Seq[(ModuleID, Map[Artifact, File])]]] = { + val publishAllLocalBin = dependencies.map({ d => ((d / publishLocalBin / packagedArtifacts)) }).join + val resolveId = dependencies.map({ d => ((d / projectID)) }).join + Def.task { + val s = streams.value + val log = s.log + val published = publishAllLocalBin.value + val ids = resolveId.value + + ids.zip(published) + } + } + + case class SimpleModuleId(org: String, name: String, revision: String) { + override def toString = s"$org:$name:$revision" + } + case class ResolvedArtifacts(id: SimpleModuleId, jar: File, pom: File) + + def commonDistSettings(dependencies: Seq[ClasspathDep[ProjectReference]]) = Seq( packMain := Map(), publishArtifact := false, packGenerateMakefile := false, - packExpandedClasspath := true, - packArchiveName := "scala3-" + dottyVersion + packArchiveName := "scala3-" + dottyVersion, + DistCacheConfig / distModules := { + evalPublishSteps(dependencies.map(_.project)).value + }, + DistCacheConfig / distResolvedArtifacts := { + val localArtifactIds = (DistCacheConfig / distModules).value + val report = (thisProjectRef / updateFull).value + + val found = 
mutable.Map.empty[SimpleModuleId, ResolvedArtifacts] + val evicted = mutable.Set.empty[SimpleModuleId] + + localArtifactIds.foreach({ case (id, as) => + val simpleId = { + val name0 = id.crossVersion match { + case _: CrossVersion.Binary => + // projectID does not add binary suffix + (id.name + "_3").ensuring(!id.name.endsWith("_3") && id.revision.startsWith("3.")) + case _ => id.name + } + SimpleModuleId(id.organization, name0, id.revision) + } + var jarOrNull: File = null + var pomOrNull: File = null + as.foreach({ case (a, f) => + if (a.`type` == "jar") { + jarOrNull = f + } else if (a.`type` == "pom") { + pomOrNull = f + } + }) + assert(jarOrNull != null, s"Could not find jar for ${id}") + assert(pomOrNull != null, s"Could not find pom for ${id}") + evicted += simpleId.copy(revision = simpleId.revision + "-nonbootstrapped") + found(simpleId) = ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) + }) + + report.allModuleReports.foreach { mr => + val simpleId = { + val id = mr.module + SimpleModuleId(id.organization, id.name, id.revision) + } + + if (!found.contains(simpleId) && !evicted(simpleId)) { + var jarOrNull: File = null + var pomOrNull: File = null + mr.artifacts.foreach({ case (a, f) => + if (a.`type` == "jar" || a.`type` == "bundle") { + jarOrNull = f + } else if (a.`type` == "pom") { + pomOrNull = f + } + }) + assert(jarOrNull != null, s"Could not find jar for ${simpleId}") + if (pomOrNull == null) { + val jarPath = jarOrNull.toPath + // we found the jar, so assume we can resolve a sibling pom file + val pomPath = jarPath.resolveSibling(jarPath.getFileName.toString.stripSuffix(".jar") + ".pom") + assert(Files.exists(pomPath), s"Could not find pom for ${simpleId}") + pomOrNull = pomPath.toFile + } + found(simpleId) = ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) + } + + } + found.values.toSeq + }, + DistCacheConfig / distCaching := { + val resolved = (DistCacheConfig / distResolvedArtifacts).value + val targetDir = target.value + val cacheDir = targetDir / "local-repo" + val mavenRepo = cacheDir / "maven2" + IO.createDirectory(mavenRepo) + resolved.foreach { ra => + val jar = ra.jar + val pom = ra.pom + + val pathElems = ra.id.org.split('.').toVector :+ ra.id.name :+ ra.id.revision + val artifactDir = pathElems.foldLeft(mavenRepo)(_ / _) + IO.createDirectory(artifactDir) + IO.copyFile(jar, artifactDir / jar.getName) + IO.copyFile(pom, artifactDir / pom.getName) + } + cacheDir + }, + Compile / pack := { + val localRepo = (DistCacheConfig / distCaching).value + (Compile / pack).value + } ) lazy val dist = project.asDist(Bootstrapped) .settings( packResourceDir += (baseDirectory.value / "bin" -> "bin"), + packResourceDir += (target.value / "local-repo" -> "local"), ) private def customMimaReportBinaryIssues(issueFilterLocation: String) = mimaReportBinaryIssues := { @@ -2251,12 +2364,24 @@ object Build { def asDist(implicit mode: Mode): Project = project. enablePlugins(PackPlugin). withCommonSettings. - dependsOn(`scala3-interfaces`, dottyCompiler, dottyLibrary, tastyCore, `scala3-staging`, `scala3-tasty-inspector`, scaladoc). - settings(commonDistSettings). + dependsOn( + `scala3-interfaces`, + dottyCompiler, + dottyLibrary, + tastyCore, + `scala3-staging`, + `scala3-tasty-inspector`, + scaladoc, + `scala3-sbt-bridge`, // for scala-cli + ). + withDepSettings(commonDistSettings). 
bootstrappedSettings( target := baseDirectory.value / "target" // override setting in commonBootstrappedSettings ) + def withDepSettings(f: Seq[ClasspathDep[ProjectReference]] => Seq[Setting[?]]): Project = + project.settings(f(project.dependencies)) + def withCommonSettings(implicit mode: Mode): Project = project.settings(mode match { case NonBootstrapped => commonNonBootstrappedSettings case Bootstrapped => commonBootstrappedSettings From 10bd87f5eb8dde7b853c58231dfd63e405e94697 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Fri, 26 Apr 2024 16:58:21 +0200 Subject: [PATCH 243/371] use scala-cli jar launcher (TODO: download automatically) [Cherry-picked 465da00d196c6ae4e27ab93fd22a07166bd49758] --- dist/bin/cli-common | 132 +++++++++++++++++++++++++++++++++++++++++++- dist/bin/scala | 3 +- 2 files changed, 132 insertions(+), 3 deletions(-) diff --git a/dist/bin/cli-common b/dist/bin/cli-common index 67b8893223d3..975df5abfee3 100644 --- a/dist/bin/cli-common +++ b/dist/bin/cli-common @@ -1,7 +1,5 @@ #!/usr/bin/env bash -SCALA_CLI_LAUNCHER="/Users/jamie/workspace/scala-cli/out/cli/3.3.3/launcher.dest/launcher" - #/*-------------------------------------------------------------------------- # * Credits: This script is based on the script generated by sbt-pack. # *--------------------------------------------------------------------------*/ @@ -25,6 +23,136 @@ function onExit() { exit $scala_exit_status } +#/*-------------------------------------------------------------------------- +# * SECTION FOR JAVA COMMAND +# *--------------------------------------------------------------------------*/ + +# to reenable echo if we are interrupted before completing. +trap onExit INT TERM EXIT + +unset cygwin mingw msys darwin conemu + +# COLUMNS is used together with command line option '-pageWidth'. +if command -v tput >/dev/null 2>&1; then + export COLUMNS="$(tput -Tdumb cols)" +fi + +case "`uname`" in + CYGWIN*) cygwin=true + ;; + MINGW*) mingw=true + ;; + MSYS*) msys=true + ;; + Darwin*) darwin=true + if [ -z "$JAVA_VERSION" ] ; then + JAVA_VERSION="CurrentJDK" + else + echo "Using Java version: $JAVA_VERSION" 1>&2 + fi + if [ -z "$JAVA_HOME" ] ; then + JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/${JAVA_VERSION}/Home + fi + JAVACMD="`which java`" + ;; +esac + +unset CYGPATHCMD +if [[ ${cygwin-} || ${mingw-} || ${msys-} ]]; then + # ConEmu terminal is incompatible with jna-5.*.jar + [[ (${CONEMUANSI-} || ${ConEmuANSI-}) ]] && conemu=true + # cygpath is used by various windows shells: cygwin, git-sdk, gitbash, msys, etc. + CYGPATHCMD=`which cygpath 2>/dev/null` + case "$TERM" in + rxvt* | xterm* | cygwin*) + stty -icanon min 1 -echo + JAVA_OPTS="$JAVA_OPTS -Djline.terminal=unix" + ;; + esac +fi + +# Resolve JAVA_HOME from javac command path +if [ -z "$JAVA_HOME" ]; then + javaExecutable="`which javac`" + if [ -n "$javaExecutable" -a -f "$javaExecutable" -a ! "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then + # readlink(1) is not available as standard on Solaris 10. + readLink=`which readlink` + if [ ! 
`expr "$readLink" : '\([^ ]*\)'` = "no" ]; then + javaExecutable="`readlink -f \"$javaExecutable\"`" + javaHome="`dirname \"$javaExecutable\"`" + javaHome=`expr "$javaHome" : '\(.*\)/bin'` + JAVA_HOME="$javaHome" + export JAVA_HOME + fi + fi +fi + +if [ -z "${JAVACMD-}" ] ; then + if [ -n "${JAVA_HOME-}" ] ; then + if [ -x "$JAVA_HOME/jre/sh/java" ] ; then + # IBM's JDK on AIX uses strange locations for the executables + JAVACMD="$JAVA_HOME/jre/sh/java" + else + JAVACMD="$JAVA_HOME/bin/java" + fi + else + JAVACMD="`which java`" + fi +fi + +if [ ! -x "$JAVACMD" ] ; then + echo "Error: JAVA_HOME is not defined correctly." + echo " We cannot execute $JAVACMD" + exit 1 +fi + +if [ -z "$JAVA_HOME" ] ; then + echo "Warning: JAVA_HOME environment variable is not set." +fi + +CLASSPATH_SUFFIX="" +# Path separator used in EXTRA_CLASSPATH +PSEP=":" + +# translate paths to Windows-mixed format before running java +if [ -n "${CYGPATHCMD-}" ]; then + [ -n "${PROG_HOME-}" ] && + PROG_HOME=`"$CYGPATHCMD" -am "$PROG_HOME"` + [ -n "$JAVA_HOME" ] && + JAVA_HOME=`"$CYGPATHCMD" -am "$JAVA_HOME"` + CLASSPATH_SUFFIX=";" + PSEP=";" +elif [[ ${mingw-} || ${msys-} ]]; then + # For Mingw / Msys, convert paths from UNIX format before anything is touched + [ -n "$PROG_HOME" ] && + PROG_HOME="`(cd "$PROG_HOME"; pwd -W | sed 's|/|\\\\|g')`" + [ -n "$JAVA_HOME" ] && + JAVA_HOME="`(cd "$JAVA_HOME"; pwd -W | sed 's|/|\\\\|g')`" + CLASSPATH_SUFFIX=";" + PSEP=";" +fi + +#/*-------------------------------------------------- +# * The code below is for Dotty +# *-------------------------------------------------*/ + +find_lib () { + for lib in "$PROG_HOME"/lib/$1 ; do + if [[ -f "$lib" ]]; then + if [ -n "$CYGPATHCMD" ]; then + "$CYGPATHCMD" -am "$lib" + elif [[ $mingw || $msys ]]; then + echo "$lib" | sed 's|/|\\\\|g' + else + echo "$lib" + fi + return + fi + done +} + +SCALA_CLI_JAR=$(find_lib "*scala-cli*") + declare -a scala_args addScala () { diff --git a/dist/bin/scala b/dist/bin/scala index 7d813d265e73..59c5508f6ba4 100755 --- a/dist/bin/scala +++ b/dist/bin/scala @@ -54,7 +54,8 @@ done # exec here would prevent onExit from being called, leaving terminal in unusable state [ -z "${ConEmuPID-}" -o -n "${cygwin-}" ] && export MSYSTEM= PWD= # workaround for #12405 -eval "\"$SCALA_CLI_LAUNCHER\"" \ +eval "\"$JAVACMD\"" \ + "-jar \"$SCALA_CLI_JAR\"" \ "--cli-default-scala-version \"$SCALA_VERSION\"" \ "-r \"$MVN_REPOSITORY\"" \ "${scala_args[@]}" From 237a592b410bab7363f9f24485fe823d3dea2ef5 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Fri, 26 Apr 2024 17:26:38 +0200 Subject: [PATCH 244/371] refactor republishing to a plugin [Cherry-picked b1cd484d7f977d110f647ed162a389a473774a48] --- project/Build.scala | 120 ++--------------------------- project/RepublishPlugin.scala | 137 ++++++++++++++++++++++++++++++++++ 2 files changed, 144 insertions(+), 113 deletions(-) create mode 100644 project/RepublishPlugin.scala diff --git a/project/Build.scala b/project/Build.scala index 6ebfa04f974f..c3ee90fe722d 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -12,6 +12,8 @@ import pl.project13.scala.sbt.JmhPlugin import pl.project13.scala.sbt.JmhPlugin.JmhKeys.Jmh import sbt.Package.ManifestAttributes import sbt.PublishBinPlugin.autoImport._ +import dotty.tools.sbtplugin.RepublishPlugin +import dotty.tools.sbtplugin.RepublishPlugin.autoImport._ import sbt.plugins.SbtPlugin import sbt.ScriptedPlugin.autoImport._ import xerial.sbt.pack.PackPlugin @@ -2112,120 +2114,14 @@ object Build { ) ) - lazy val DistCacheConfig = 
config("DistCacheConfig") extend Compile - - val distModules = taskKey[Seq[(ModuleID, Map[Artifact, File])]]("fetch local artifacts for distribution.") - val distResolvedArtifacts = taskKey[Seq[ResolvedArtifacts]]("Resolve the dependencies for the distribution") - val distCaching = taskKey[File]("cache the dependencies for the distribution") - - def evalPublishSteps(dependencies: Seq[ProjectReference]): Def.Initialize[Task[Seq[(ModuleID, Map[Artifact, File])]]] = { - val publishAllLocalBin = dependencies.map({ d => ((d / publishLocalBin / packagedArtifacts)) }).join - val resolveId = dependencies.map({ d => ((d / projectID)) }).join - Def.task { - val s = streams.value - val log = s.log - val published = publishAllLocalBin.value - val ids = resolveId.value - - ids.zip(published) - } - } - - case class SimpleModuleId(org: String, name: String, revision: String) { - override def toString = s"$org:$name:$revision" - } - case class ResolvedArtifacts(id: SimpleModuleId, jar: File, pom: File) - - def commonDistSettings(dependencies: Seq[ClasspathDep[ProjectReference]]) = Seq( + lazy val commonDistSettings = Seq( packMain := Map(), publishArtifact := false, packGenerateMakefile := false, packArchiveName := "scala3-" + dottyVersion, - DistCacheConfig / distModules := { - evalPublishSteps(dependencies.map(_.project)).value - }, - DistCacheConfig / distResolvedArtifacts := { - val localArtifactIds = (DistCacheConfig / distModules).value - val report = (thisProjectRef / updateFull).value - - val found = mutable.Map.empty[SimpleModuleId, ResolvedArtifacts] - val evicted = mutable.Set.empty[SimpleModuleId] - - localArtifactIds.foreach({ case (id, as) => - val simpleId = { - val name0 = id.crossVersion match { - case _: CrossVersion.Binary => - // projectID does not add binary suffix - (id.name + "_3").ensuring(!id.name.endsWith("_3") && id.revision.startsWith("3.")) - case _ => id.name - } - SimpleModuleId(id.organization, name0, id.revision) - } - var jarOrNull: File = null - var pomOrNull: File = null - as.foreach({ case (a, f) => - if (a.`type` == "jar") { - jarOrNull = f - } else if (a.`type` == "pom") { - pomOrNull = f - } - }) - assert(jarOrNull != null, s"Could not find jar for ${id}") - assert(pomOrNull != null, s"Could not find pom for ${id}") - evicted += simpleId.copy(revision = simpleId.revision + "-nonbootstrapped") - found(simpleId) = ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) - }) - - report.allModuleReports.foreach { mr => - val simpleId = { - val id = mr.module - SimpleModuleId(id.organization, id.name, id.revision) - } - - if (!found.contains(simpleId) && !evicted(simpleId)) { - var jarOrNull: File = null - var pomOrNull: File = null - mr.artifacts.foreach({ case (a, f) => - if (a.`type` == "jar" || a.`type` == "bundle") { - jarOrNull = f - } else if (a.`type` == "pom") { - pomOrNull = f - } - }) - assert(jarOrNull != null, s"Could not find jar for ${simpleId}") - if (pomOrNull == null) { - val jarPath = jarOrNull.toPath - // we found the jar, so assume we can resolve a sibling pom file - val pomPath = jarPath.resolveSibling(jarPath.getFileName.toString.stripSuffix(".jar") + ".pom") - assert(Files.exists(pomPath), s"Could not find pom for ${simpleId}") - pomOrNull = pomPath.toFile - } - found(simpleId) = ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) - } - - } - found.values.toSeq - }, - DistCacheConfig / distCaching := { - val resolved = (DistCacheConfig / distResolvedArtifacts).value - val targetDir = target.value - val cacheDir = targetDir / "local-repo" - val mavenRepo = 
cacheDir / "maven2" - IO.createDirectory(mavenRepo) - resolved.foreach { ra => - val jar = ra.jar - val pom = ra.pom - - val pathElems = ra.id.org.split('.').toVector :+ ra.id.name :+ ra.id.revision - val artifactDir = pathElems.foldLeft(mavenRepo)(_ / _) - IO.createDirectory(artifactDir) - IO.copyFile(jar, artifactDir / jar.getName) - IO.copyFile(pom, artifactDir / pom.getName) - } - cacheDir - }, + republishRepo := target.value / "local-repo", Compile / pack := { - val localRepo = (DistCacheConfig / distCaching).value + val localRepo = republishClasspath.value // republish all artifacts to local repo (Compile / pack).value } ) @@ -2363,7 +2259,9 @@ object Build { def asDist(implicit mode: Mode): Project = project. enablePlugins(PackPlugin). + enablePlugins(RepublishPlugin). withCommonSettings. + settings(commonDistSettings). dependsOn( `scala3-interfaces`, dottyCompiler, @@ -2374,14 +2272,10 @@ object Build { scaladoc, `scala3-sbt-bridge`, // for scala-cli ). - withDepSettings(commonDistSettings). bootstrappedSettings( target := baseDirectory.value / "target" // override setting in commonBootstrappedSettings ) - def withDepSettings(f: Seq[ClasspathDep[ProjectReference]] => Seq[Setting[?]]): Project = - project.settings(f(project.dependencies)) - def withCommonSettings(implicit mode: Mode): Project = project.settings(mode match { case NonBootstrapped => commonNonBootstrappedSettings case Bootstrapped => commonBootstrappedSettings diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala new file mode 100644 index 000000000000..314c39be7e8a --- /dev/null +++ b/project/RepublishPlugin.scala @@ -0,0 +1,137 @@ +package dotty.tools.sbtplugin + +import sbt._ +import xerial.sbt.pack.PackPlugin +import sbt.Keys._ +import sbt.AutoPlugin +import sbt.PublishBinPlugin +import sbt.PublishBinPlugin.autoImport._ + +import scala.collection.mutable +import java.nio.file.Files + +/** This local plugin provides ways of publishing a project classpath and library dependencies to + * .a local repository */ +object RepublishPlugin extends AutoPlugin { + override def trigger = allRequirements + override def requires = super.requires && PublishBinPlugin && PackPlugin + + object autoImport { + val republishProjectRefs = taskKey[Seq[ProjectRef]]("fetch the classpath deps from the project.") + val republishLocalResolved = taskKey[Seq[ResolvedArtifacts]]("resolve local artifacts for distribution.") + val republishAllResolved = taskKey[Seq[ResolvedArtifacts]]("Resolve the dependencies for the distribution") + val republishClasspath = taskKey[File]("cache the dependencies for the distribution") + val republishRepo = settingKey[File]("the location to store the republished artifacts.") + } + + import autoImport._ + + case class SimpleModuleId(org: String, name: String, revision: String) { + override def toString = s"$org:$name:$revision" + } + case class ResolvedArtifacts(id: SimpleModuleId, jar: File, pom: File) + + override val projectSettings: Seq[Def.Setting[_]] = Def.settings( + republishLocalResolved / republishProjectRefs := { + val proj = thisProjectRef.value + val deps = buildDependencies.value + + deps.classpathRefs(proj) + }, + republishLocalResolved := Def.taskDyn { + val deps = (republishLocalResolved / republishProjectRefs).value + val publishAllLocalBin = deps.map({ d => ((d / publishLocalBin / packagedArtifacts)) }).join + val resolveId = deps.map({ d => ((d / projectID)) }).join + Def.task { + val s = streams.value + val log = s.log + val published = publishAllLocalBin.value + val ids = 
resolveId.value + + ids.zip(published).map({ case (id, as) => + val simpleId = { + val name0 = id.crossVersion match { + case _: CrossVersion.Binary => + // projectID does not add binary suffix + (id.name + "_3").ensuring(!id.name.endsWith("_3") && id.revision.startsWith("3.")) + case _ => id.name + } + SimpleModuleId(id.organization, name0, id.revision) + } + var jarOrNull: File = null + var pomOrNull: File = null + as.foreach({ case (a, f) => + if (a.`type` == "jar") { + jarOrNull = f + } else if (a.`type` == "pom") { + pomOrNull = f + } + }) + assert(jarOrNull != null, s"Could not find jar for ${id}") + assert(pomOrNull != null, s"Could not find pom for ${id}") + ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) + }) + } + }.value, + republishAllResolved := { + val localResolved = republishLocalResolved.value + val report = (thisProjectRef / updateFull).value + + val found = mutable.Map.empty[SimpleModuleId, ResolvedArtifacts] + val evicted = mutable.Set.empty[SimpleModuleId] + + localResolved.foreach({ resolved => + val simpleId = resolved.id + evicted += simpleId.copy(revision = simpleId.revision + "-nonbootstrapped") + found(simpleId) = resolved + }) + + report.allModuleReports.foreach { mr => + val simpleId = { + val id = mr.module + SimpleModuleId(id.organization, id.name, id.revision) + } + + if (!found.contains(simpleId) && !evicted(simpleId)) { + var jarOrNull: File = null + var pomOrNull: File = null + mr.artifacts.foreach({ case (a, f) => + if (a.`type` == "jar" || a.`type` == "bundle") { + jarOrNull = f + } else if (a.`type` == "pom") { + pomOrNull = f + } + }) + assert(jarOrNull != null, s"Could not find jar for ${simpleId}") + if (pomOrNull == null) { + val jarPath = jarOrNull.toPath + // we found the jar, so assume we can resolve a sibling pom file + val pomPath = jarPath.resolveSibling(jarPath.getFileName.toString.stripSuffix(".jar") + ".pom") + assert(Files.exists(pomPath), s"Could not find pom for ${simpleId}") + pomOrNull = pomPath.toFile + } + found(simpleId) = ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) + } + + } + found.values.toSeq + }, + republishClasspath := { + val resolved = republishAllResolved.value + val cacheDir = republishRepo.value + val mavenRepo = cacheDir / "maven2" + IO.createDirectory(mavenRepo) + resolved.foreach { ra => + val jar = ra.jar + val pom = ra.pom + + val pathElems = ra.id.org.split('.').toVector :+ ra.id.name :+ ra.id.revision + val artifactDir = pathElems.foldLeft(mavenRepo)(_ / _) + IO.createDirectory(artifactDir) + IO.copyFile(jar, artifactDir / jar.getName) + IO.copyFile(pom, artifactDir / pom.getName) + } + cacheDir + } + ) +} From 34e8fc5fee8347477443dbef87f999a20798c5e8 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 29 Apr 2024 19:00:32 +0200 Subject: [PATCH 245/371] download and cache launcher, add scalajs library [Cherry-picked 13f5f04720f863a0fbc19c61e83e9d88ffd0de97] --- dist/bin/cli-common | 2 +- dist/bin/scala | 2 +- project/Build.scala | 23 ++++++++---- project/Modes.scala | 6 +++- project/RepublishPlugin.scala | 68 +++++++++++++++++++++++++++++++---- 5 files changed, 86 insertions(+), 15 deletions(-) diff --git a/dist/bin/cli-common b/dist/bin/cli-common index 975df5abfee3..d295d58916da 100644 --- a/dist/bin/cli-common +++ b/dist/bin/cli-common @@ -151,7 +151,7 @@ find_lib () { done } -SCALA_CLI_JAR=$(find_lib "*scala-cli*") +SCALA_CLI_JAR="$PROG_HOME/etc/scala-cli.jar" declare -a scala_args diff --git a/dist/bin/scala b/dist/bin/scala index 59c5508f6ba4..4d357918ae07 100755 --- a/dist/bin/scala +++ 
b/dist/bin/scala @@ -44,7 +44,7 @@ if [ -z "$SCALA_VERSION" ]; then exit 1 fi -MVN_REPOSITORY="file://$PROG_HOME/local/maven2" +MVN_REPOSITORY="file://$PROG_HOME/maven2" # escape all script arguments while [[ $# -gt 0 ]]; do diff --git a/project/Build.scala b/project/Build.scala index c3ee90fe722d..cbc35c3f2f92 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -118,6 +118,9 @@ object Build { */ val mimaPreviousLTSDottyVersion = "3.3.0" + /** Version of Scala CLI to download */ + val scalaCliLauncherVersion = "1.3.0" + object CompatMode { final val BinaryCompatible = 0 final val SourceAndBinaryCompatible = 1 @@ -2119,17 +2122,21 @@ object Build { publishArtifact := false, packGenerateMakefile := false, packArchiveName := "scala3-" + dottyVersion, - republishRepo := target.value / "local-repo", - Compile / pack := { - val localRepo = republishClasspath.value // republish all artifacts to local repo - (Compile / pack).value - } + republishRepo := target.value / "republish", + republishLaunchers := { + val cliV = scalaCliLauncherVersion + Seq( + ("scala-cli.jar", cliV, url(s"https://github.com/VirtusLab/scala-cli/releases/download/v$cliV/scala-cli.jar")) + ) + }, + Compile / pack := (Compile / pack).dependsOn(republish).value, ) lazy val dist = project.asDist(Bootstrapped) .settings( packResourceDir += (baseDirectory.value / "bin" -> "bin"), - packResourceDir += (target.value / "local-repo" -> "local"), + packResourceDir += (republishRepo.value / "maven2" -> "maven2"), + packResourceDir += (republishRepo.value / "etc" -> "etc"), ) private def customMimaReportBinaryIssues(issueFilterLocation: String) = mimaReportBinaryIssues := { @@ -2260,6 +2267,7 @@ object Build { def asDist(implicit mode: Mode): Project = project. enablePlugins(PackPlugin). enablePlugins(RepublishPlugin). + bootstrappedEnablePlugins(DottyJSPlugin). withCommonSettings. settings(commonDistSettings). dependsOn( @@ -2272,6 +2280,9 @@ object Build { scaladoc, `scala3-sbt-bridge`, // for scala-cli ). + bootstrappedDependsOn( + `scala3-library-bootstrappedJS` // for scala-cli + ). 
bootstrappedSettings( target := baseDirectory.value / "target" // override setting in commonBootstrappedSettings ) diff --git a/project/Modes.scala b/project/Modes.scala index eddb5a3f1a7b..fcc13dea8a89 100644 --- a/project/Modes.scala +++ b/project/Modes.scala @@ -1,4 +1,4 @@ -import sbt.{Project, ProjectReference, SettingsDefinition} +import sbt.{Project, ProjectReference, SettingsDefinition, Plugins} object Modes { @@ -25,5 +25,9 @@ object Modes { def bootstrappedDependsOn(s: sbt.ClasspathDep[ProjectReference]*)(implicit mode: Mode): Project = if (mode == NonBootstrapped) project else project.dependsOn(s: _*) + /** Plugins only if the mode is bootstrapped */ + def bootstrappedEnablePlugins(ns: Plugins*)(implicit mode: Mode): Project = + if (mode == NonBootstrapped) project else project.enablePlugins(ns: _*) + } } diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index 314c39be7e8a..0b71c9ecb6df 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -6,6 +6,8 @@ import sbt.Keys._ import sbt.AutoPlugin import sbt.PublishBinPlugin import sbt.PublishBinPlugin.autoImport._ +import sbt.io.Using +import sbt.util.CacheImplicits._ import scala.collection.mutable import java.nio.file.Files @@ -20,8 +22,11 @@ object RepublishPlugin extends AutoPlugin { val republishProjectRefs = taskKey[Seq[ProjectRef]]("fetch the classpath deps from the project.") val republishLocalResolved = taskKey[Seq[ResolvedArtifacts]]("resolve local artifacts for distribution.") val republishAllResolved = taskKey[Seq[ResolvedArtifacts]]("Resolve the dependencies for the distribution") - val republishClasspath = taskKey[File]("cache the dependencies for the distribution") + val republishClasspath = taskKey[Set[File]]("cache the dependencies for the distribution") + val republishFetchLaunchers = taskKey[Set[File]]("cache the launcher deps for the distribution") + val republish = taskKey[File]("cache the dependencies and download launchers for the distribution") val republishRepo = settingKey[File]("the location to store the republished artifacts.") + val republishLaunchers = settingKey[Seq[(String, String, URL)]]("launchers to download. 
Sequence of (name, version, URL).") } import autoImport._ @@ -43,17 +48,17 @@ object RepublishPlugin extends AutoPlugin { val publishAllLocalBin = deps.map({ d => ((d / publishLocalBin / packagedArtifacts)) }).join val resolveId = deps.map({ d => ((d / projectID)) }).join Def.task { - val s = streams.value - val log = s.log val published = publishAllLocalBin.value val ids = resolveId.value ids.zip(published).map({ case (id, as) => val simpleId = { + val disabled = CrossVersion.disabled val name0 = id.crossVersion match { - case _: CrossVersion.Binary => + case cv: CrossVersion.Binary => // projectID does not add binary suffix - (id.name + "_3").ensuring(!id.name.endsWith("_3") && id.revision.startsWith("3.")) + (s"${id.name}_${cv.prefix}${cv.suffix}3") + .ensuring(!id.name.endsWith("_3") && id.revision.startsWith("3.")) case _ => id.name } SimpleModuleId(id.organization, name0, id.revision) @@ -117,11 +122,15 @@ object RepublishPlugin extends AutoPlugin { found.values.toSeq }, republishClasspath := { + val s = streams.value val resolved = republishAllResolved.value val cacheDir = republishRepo.value + + val log = s.log val mavenRepo = cacheDir / "maven2" IO.createDirectory(mavenRepo) - resolved.foreach { ra => + resolved.map { ra => + log.info(s"[republish] publishing ${ra.id} to $mavenRepo...") val jar = ra.jar val pom = ra.pom @@ -130,7 +139,54 @@ object RepublishPlugin extends AutoPlugin { IO.createDirectory(artifactDir) IO.copyFile(jar, artifactDir / jar.getName) IO.copyFile(pom, artifactDir / pom.getName) + artifactDir + }.toSet + }, + republishFetchLaunchers := { + val s = streams.value + val log = s.log + val repoDir = republishRepo.value + val launcherVersions = republishLaunchers.value + + val etc = repoDir / "etc" + + val store = s.cacheStoreFactory / "versions" + + def work(dest: File, launcher: URL) = { + IO.delete(dest) + Using.urlInputStream(launcher) { in => + IO.createDirectory(etc) + log.info(s"[republish] Downloading $launcher to $dest...") + IO.transfer(in, dest) + log.info(s"[republish] Downloaded $launcher to $dest...") + } + dest + } + + val allLaunchers = { + for ((name, version, launcher) <- launcherVersions) yield { + val dest = etc / name + + val id = name.replaceAll("[^a-zA-Z0-9]", "_") + + val fetchAction = Tracked.inputChanged[String, File](store.make(id)) { (inChanged, version) => + if (inChanged || !Files.exists(dest.toPath)) { + work(dest, launcher) + } else { + log.info(s"[republish] Using cached $launcher at $dest...") + dest + } + } + + fetchAction(version) + } } + allLaunchers.toSet + }, + republish := { + val cacheDir = republishRepo.value + val artifacts = republishClasspath.value + val launchers = republishFetchLaunchers.value cacheDir } ) From 4682b52803313474ca1514336bb8a70043b9916b Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 30 Apr 2024 17:42:00 +0200 Subject: [PATCH 246/371] fix project/scripts/bootstrappedOnlyCmdTests for new scala launcher [Cherry-picked 008b5eedfecdbc0dd90761e49597d4d70172a528] --- project/scripts/bootstrappedOnlyCmdTests | 31 +++++++++++++----------- 1 file changed, 17 insertions(+), 14 deletions(-) diff --git a/project/scripts/bootstrappedOnlyCmdTests b/project/scripts/bootstrappedOnlyCmdTests index 4e18e3a1d4a4..74a0f5b59a8f 100755 --- a/project/scripts/bootstrappedOnlyCmdTests +++ b/project/scripts/bootstrappedOnlyCmdTests @@ -18,28 +18,37 @@ grep -qe "val a: scala.Int = 3" "$tmp" # setup for `scalac`/`scala` script tests "$SBT" dist/pack +echo "capturing scala version from dist/target/pack/VERSION" +IFS=':=' 
read -ra versionProps < "$ROOT/dist/target/pack/VERSION" # temporarily set IFS to ':=' to split versionProps +[ ${#versionProps[@]} -eq 3 ] && \ + [ ${versionProps[0]} = "version" ] && \ + [ -n ${versionProps[2]} ] || die "Expected non-empty 'version' property in $ROOT/dist/target/pack/VERSION" +scala_version=${versionProps[2]} + # check that `scalac` compiles and `scala` runs it echo "testing ./bin/scalac and ./bin/scala" clear_out "$OUT" ./bin/scalac "$SOURCE" -d "$OUT" -./bin/scala -classpath "$OUT" "$MAIN" > "$tmp" +./bin/scala -classpath "$OUT" -M "$MAIN" > "$tmp" test "$EXPECTED_OUTPUT" = "$(cat "$tmp")" # Test scaladoc based on compiled classes ./bin/scaladoc -project Staging -d "$OUT1" "$OUT" clear_out "$OUT1" -# check that `scalac` and `scala` works for staging +# check that `scalac` and `scala` works for staging. +# TODO: scala3-staging should be automatically added by Scala CLI +# - see: https://github.com/VirtusLab/scala-cli/issues/2879 clear_out "$OUT" ./bin/scalac tests/run-staging/i4044f.scala -d "$OUT" -./bin/scala -with-compiler -classpath "$OUT" Test > "$tmp" +./bin/scala -with-compiler -classpath "$OUT" --dep "org.scala-lang::scala3-staging:$scala_version" -M Test > "$tmp" # check that `scalac -from-tasty` compiles and `scala` runs it echo "testing ./bin/scalac -from-tasty and scala -classpath" clear_out "$OUT1" ./bin/scalac "$SOURCE" -d "$OUT" ./bin/scalac -from-tasty -d "$OUT1" "$OUT/$TASTY" -./bin/scala -classpath "$OUT1" "$MAIN" > "$tmp" +./bin/scala -classpath "$OUT1" -M "$MAIN" > "$tmp" test "$EXPECTED_OUTPUT" = "$(cat "$tmp")" # check that `sbt scalac -decompile` runs @@ -90,10 +99,12 @@ clear_out "$OUT" ./bin/scalac -help > "$tmp" 2>&1 grep -qe "Usage: scalac " "$tmp" +# TODO: JAVA launcher should be able to override "scala-cli" program name +# - see: https://github.com/VirtusLab/scala-cli/issues/2838#issuecomment-2085130815 ./bin/scala -help > "$tmp" 2>&1 -grep -qe "Usage: scala " "$tmp" +grep -qe "See 'scala-cli --help' to read about a specific subcommand." "$tmp" -./bin/scala -d hello.jar tests/run/hello.scala +./bin/scala -d hello.jar tests/run/hello.scala --server=false ls hello.jar echo "testing i12973" @@ -102,14 +113,6 @@ clear_out "$OUT" echo "Bug12973().check" | TERM=dumb ./bin/scala -cp "$OUT/out.jar" > "$tmp" 2>&1 grep -qe "Bug12973 is fixed" "$tmp" -echo "capturing scala version from dist/target/pack/VERSION" -cwd=$(pwd) -IFS=':=' read -ra versionProps < "$cwd/dist/target/pack/VERSION" # temporarily set IFS to ':=' to split versionProps -[ ${#versionProps[@]} -eq 3 ] && \ - [ ${versionProps[0]} = "version" ] && \ - [ -n ${versionProps[2]} ] || die "Expected non-empty 'version' property in $cwd/dist/target/pack/VERSION" -scala_version=${versionProps[2]} - echo "testing -sourcepath with incremental compile: inlining changed inline def into a def" # Here we will test that a changed inline method symbol loaded from the sourcepath (-sourcepath compiler option) # will have its `defTree` correctly set when its method body is required for inlining. 
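Taken together, the launcher patches above make `dist/bin/scala` forward to a bundled Scala CLI jar resolving against the republished local Maven repository, instead of invoking MainGenericRunner directly (the old script is kept as scala_legacy). What follows is a minimal sketch of the resulting end-to-end flow, assuming `sbt dist/pack` has produced dist/target/pack with the layout set up by these patches; the source file and main-class name are illustrative placeholders, and the flags simply mirror those used by the bin/scala wrapper and bootstrappedOnlyCmdTests rather than a definitive CLI reference:

    # read the packaged Scala version, as dist/bin/scala and the CI script do
    IFS=':=' read -ra versionProps < dist/target/pack/VERSION
    scala_version=${versionProps[2]}

    # compile with the packaged scalac, then run through the Scala CLI-based wrapper
    # (tests/run/hello.scala and the main class Test are illustrative placeholders)
    dist/target/pack/bin/scalac tests/run/hello.scala -d out
    dist/target/pack/bin/scala -classpath out -M Test --offline --server=false

    # internally the wrapper forwards to the bundled launcher, roughly:
    #   java -jar "$PROG_HOME/etc/scala-cli.jar" \
    #     --cli-default-scala-version "$scala_version" \
    #     -r "file://$PROG_HOME/maven2" <forwarded args>
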
From c6cc0a3d314007f54daa5b71c9f449c7af8d38b1 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Fri, 3 May 2024 20:10:00 +0200 Subject: [PATCH 247/371] fix bash script tests [Cherry-picked fbe8323fae1445429f15a3a5ea64b8c809fc28b5] --- bin/scala | 2 +- .../scripting/classpathReport.sc | 4 +- .../test-resources/scripting/envtestNu.sc | 2 + .../test-resources/scripting/scriptPathNu.sc | 13 ++++++ compiler/test-resources/scripting/showArgs.sc | 2 +- .../test-resources/scripting/showArgsNu.sc | 6 +++ .../test-resources/scripting/sqlDateError.sc | 2 +- .../scripting/sqlDateErrorNu.sc | 6 +++ .../scripting/unglobClasspath.sc | 8 ++-- .../tools/scripting/BashExitCodeTests.scala | 16 ++++--- .../tools/scripting/BashScriptsTests.scala | 42 ++++++++++++------- .../tools/scripting/ClasspathTests.scala | 16 ++++++- .../tools/scripting/ExpressionTest.scala | 2 +- .../dotty/tools/scripting/ScriptTestEnv.scala | 15 ++++++- .../tools/scripting/ScriptingTests.scala | 10 ++++- 15 files changed, 108 insertions(+), 38 deletions(-) create mode 100755 compiler/test-resources/scripting/envtestNu.sc create mode 100755 compiler/test-resources/scripting/scriptPathNu.sc create mode 100755 compiler/test-resources/scripting/showArgsNu.sc create mode 100755 compiler/test-resources/scripting/sqlDateErrorNu.sc diff --git a/bin/scala b/bin/scala index 66ec9a5774c7..6506e3b38ab1 100755 --- a/bin/scala +++ b/bin/scala @@ -2,4 +2,4 @@ ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" >& /dev/null && pwd)/.." -"$ROOT/bin/common" "$ROOT/dist/target/pack/bin/scala" "$@" +"$ROOT/bin/common" "$ROOT/dist/target/pack/bin/scala" "$@" "--offline" "--server=false" diff --git a/compiler/test-resources/scripting/classpathReport.sc b/compiler/test-resources/scripting/classpathReport.sc index a9eacbbba1f7..cc68c4b1d52e 100755 --- a/compiler/test-resources/scripting/classpathReport.sc +++ b/compiler/test-resources/scripting/classpathReport.sc @@ -1,8 +1,8 @@ -#!bin/scala -classpath 'dist/target/pack/lib/*' +#!/usr/bin/env bin/scala import java.nio.file.Paths -def main(args: Array[String]): Unit = +// def main(args: Array[String]): Unit = // MIGRATION: Scala CLI expects `*.sc` files to be straight-line code val cwd = Paths.get(".").toAbsolutePath.normalize.toString.norm printf("cwd: %s\n", cwd) printf("classpath: %s\n", sys.props("java.class.path").norm) diff --git a/compiler/test-resources/scripting/envtestNu.sc b/compiler/test-resources/scripting/envtestNu.sc new file mode 100755 index 000000000000..fe4cd7851b0a --- /dev/null +++ b/compiler/test-resources/scripting/envtestNu.sc @@ -0,0 +1,2 @@ +// MIGRATION: Scala CLI expects `*.sc` files to be straight-line code + println("Hello " + util.Properties.propOrNull("key")) diff --git a/compiler/test-resources/scripting/scriptPathNu.sc b/compiler/test-resources/scripting/scriptPathNu.sc new file mode 100755 index 000000000000..bb3e459654b9 --- /dev/null +++ b/compiler/test-resources/scripting/scriptPathNu.sc @@ -0,0 +1,13 @@ +#!/usr/bin/env bin/scala + +// THIS FILE IS RAN WITH SCALA CLI, which wraps scripts exposing scriptPath and args variables + +args.zipWithIndex.foreach { case (arg,i) => printf("arg %d: [%s]\n",i,arg) } + +if !scriptPath.endsWith("scriptPathNu.sc") then + printf( s"incorrect script.path defined as [$scriptPath]") +else + printf("scriptPath: %s\n", scriptPath) // report the value + +extension(s: String) + def norm: String = s.replace('\\', '/') diff --git a/compiler/test-resources/scripting/showArgs.sc b/compiler/test-resources/scripting/showArgs.sc index 
28f16a9022b3..8ef08f8962b0 100755 --- a/compiler/test-resources/scripting/showArgs.sc +++ b/compiler/test-resources/scripting/showArgs.sc @@ -1,4 +1,4 @@ -#!/usr/bin/env scala +#!/usr/bin/env bin/scala // precise output format expected by BashScriptsTests.scala def main(args: Array[String]): Unit = diff --git a/compiler/test-resources/scripting/showArgsNu.sc b/compiler/test-resources/scripting/showArgsNu.sc new file mode 100755 index 000000000000..f4c1aa6af257 --- /dev/null +++ b/compiler/test-resources/scripting/showArgsNu.sc @@ -0,0 +1,6 @@ +#!/usr/bin/env bin/scala + +// precise output format expected by BashScriptsTests.scala +// MIGRATION: Scala CLI expects `*.sc` files to be straight-line code +for (a,i) <- args.zipWithIndex do + printf(s"arg %2d:[%s]\n",i,a) diff --git a/compiler/test-resources/scripting/sqlDateError.sc b/compiler/test-resources/scripting/sqlDateError.sc index ceff98f40cad..35160fd6fcd5 100755 --- a/compiler/test-resources/scripting/sqlDateError.sc +++ b/compiler/test-resources/scripting/sqlDateError.sc @@ -1,4 +1,4 @@ -#!bin/scala +#!/usr/bin/env bin/scala def main(args: Array[String]): Unit = { println(new java.sql.Date(100L)) diff --git a/compiler/test-resources/scripting/sqlDateErrorNu.sc b/compiler/test-resources/scripting/sqlDateErrorNu.sc new file mode 100755 index 000000000000..a6f1bd50297d --- /dev/null +++ b/compiler/test-resources/scripting/sqlDateErrorNu.sc @@ -0,0 +1,6 @@ +#!/usr/bin/env bin/scala + +// def main(args: Array[String]): Unit = { MIGRATION: Scala CLI expects `*.sc` files to be straight-line code + println(new java.sql.Date(100L)) + System.err.println("SCALA_OPTS="+Option(System.getenv("SCALA_OPTS")).getOrElse("")) +// } diff --git a/compiler/test-resources/scripting/unglobClasspath.sc b/compiler/test-resources/scripting/unglobClasspath.sc index 796697cdedf2..deab2b8982ac 100755 --- a/compiler/test-resources/scripting/unglobClasspath.sc +++ b/compiler/test-resources/scripting/unglobClasspath.sc @@ -1,8 +1,6 @@ -#!bin/scala -classpath 'dist/target/pack/lib/*' +// won't compile unless classpath is set correctly +import dotty.tools.tasty.TastyFormat -// won't compile unless the hashbang line sets classpath -import org.jline.terminal.Terminal - -def main(args: Array[String]) = +// def main(args: Array[String]) = // MIGRATION: Scala CLI expects `*.sc` files to be straight-line code val cp = sys.props("java.class.path") printf("unglobbed classpath: %s\n", cp) diff --git a/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala b/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala index cc53447cd64b..2fdc1eccaeb7 100644 --- a/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala +++ b/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala @@ -29,7 +29,7 @@ class BashExitCodeTests: }, expectedExitCode, exitCode) // Helpers for running scala, scalac, and scalac without the the output directory ("raw") - def scala(args: String*) = verifyExit(scalaPath, args*) + def scala(args: String*) = verifyExit(scalaPath, ("--offline" +: "--server=false" +: args)*) def scalacRaw(args: String*) = verifyExit(scalacPath, args*) def scalac(args: String*) = scalacRaw(("-d" +: tmpDir +: args)*) @@ -38,12 +38,16 @@ class BashExitCodeTests: Files.write(Files.createTempFile(tmpDir.toPath, getClass.getSimpleName, suffix), body.getBytes(UTF_8)).absPath @Test def neg = scalac(f("@main def Test = prin"))(1) - @Test def run = scalac(f("@main def Test = ???"))(0) & scala("-classpath", tmpDir, "Test")(1) - @Test def pos = scalac(f("@main def Test = ()"))(0) 
& scala("-classpath", tmpDir, "Test")(0) + @Test def run = scalac(f("@main def Test = ???"))(0) & scala("-classpath", tmpDir, "-M", "Test")(1) + @Test def pos = scalac(f("@main def Test = ()"))(0) & scala("-classpath", tmpDir, "-M", "Test")(0) - @Test def runNeg = scala(f("@main def Test = prin", ".sc"))(1) - @Test def runRun = scala(f("@main def Test = ???", ".sc"))(1) - @Test def runPos = scala(f("@main def Test = ()", ".sc"))(0) + @Test def runNeg_script = scala(f("prin", ".sc"))(1) + @Test def runRun_script = scala(f("???", ".sc"))(1) + @Test def runPos_script = scala(f("()", ".sc"))(0) + + @Test def runNeg = scala(f("@main def Test = prin", ".scala"))(1) + @Test def runRun = scala(f("@main def Test = ???", ".scala"))(1) + @Test def runPos = scala(f("@main def Test = ()", ".scala"))(0) @Test def scNeg = scalac("-script", f("@main def Test = prin", ".sc"))(1) @Test def scRun = scalac("-script", f("@main def Test = ???", ".sc"))(1) diff --git a/compiler/test/dotty/tools/scripting/BashScriptsTests.scala b/compiler/test/dotty/tools/scripting/BashScriptsTests.scala index f3f364754e20..69fb861a0516 100644 --- a/compiler/test/dotty/tools/scripting/BashScriptsTests.scala +++ b/compiler/test/dotty/tools/scripting/BashScriptsTests.scala @@ -5,7 +5,7 @@ package scripting import scala.language.unsafeNulls import java.nio.file.Paths -import org.junit.{Test, AfterClass} +import org.junit.{Test, Ignore, AfterClass} import org.junit.Assert.assertEquals import org.junit.Assume.assumeFalse import org.junit.experimental.categories.Category @@ -50,7 +50,9 @@ object BashScriptsTests: val testScriptArgs = Seq( "a", "b", "c", "-repl", "-run", "-script", "-debug" ) - val showArgsScript = testFiles.find(_.getName == "showArgs.sc").get.absPath + val Seq(showArgsScript, showArgsScalaCli) = Seq("showArgs.sc", "showArgsNu.sc").map { name => + testFiles.find(_.getName == name).get.absPath + } def testFile(name: String): String = val file = testFiles.find(_.getName == name) match { @@ -64,13 +66,13 @@ object BashScriptsTests: } file - val Seq(envtestSc, envtestScala) = Seq("envtest.sc", "envtest.scala").map { testFile(_) } + val Seq(envtestNuSc, envtestScala) = Seq("envtestNu.sc", "envtest.scala").map { testFile(_) } // create command line with given options, execute specified script, return stdout def callScript(tag: String, script: String, keyPre: String): String = val keyArg = s"$keyPre=$tag" printf("pass tag [%s] via [%s] to script [%s]\n", tag, keyArg, script) - val cmd: String = Seq("SCALA_OPTS= ", scalaPath, keyArg, script).mkString(" ") + val cmd: String = Seq("SCALA_OPTS= ", scalaPath, "run", keyArg, "--offline", "--server=false", script).mkString(" ") printf("cmd: [%s]\n", cmd) val (validTest, exitCode, stdout, stderr) = bashCommand(cmd) stderr.filter { !_.contains("Inappropriate ioctl") }.foreach { System.err.printf("stderr [%s]\n", _) } @@ -84,13 +86,15 @@ class BashScriptsTests: ////////////////////////// begin tests ////////////////////// /* verify that `dist/bin/scala` correctly passes args to the jvm via -J-D for script envtest.sc */ + @Ignore // SCALA CLI does not support `-J` to pass java properties, only things like -Xmx5g @Test def verifyScJProperty = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) val tag = "World1" - val stdout = callScript(tag, envtestSc, s"-J-Dkey") + val stdout = callScript(tag, envtestNuSc, s"-J-Dkey") assertEquals( s"Hello $tag", stdout) /* verify that `dist/bin/scala` correctly passes args to the jvm via -J-D for script 
envtest.scala */ + @Ignore // SCALA CLI does not support `-J` to pass java properties, only things like -Xmx5g @Test def verifyScalaJProperty = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) val tag = "World2" @@ -101,7 +105,7 @@ class BashScriptsTests: @Test def verifyScDProperty = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) val tag = "World3" - val stdout = callScript(tag, envtestSc, s"-Dkey") + val stdout = callScript(tag, envtestNuSc, s"-Dkey") assertEquals(s"Hello $tag", stdout) /* verify that `dist/bin/scala` can set system properties via -D for envtest.scala */ @@ -114,7 +118,9 @@ class BashScriptsTests: /* verify that `dist/bin/scala` can set system properties via -D when executing compiled script via -jar envtest.jar */ @Test def saveAndRunWithDProperty = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - val commandline = Seq("SCALA_OPTS= ", scalaPath.relpath, "-save", envtestScala.relpath).mkString(" ") + val libOut = envtestScala.relpath.stripSuffix(".scala") + ".jar" + val commandline = Seq( + "SCALA_OPTS= ", scalaPath.relpath, "--power", "package", envtestScala.relpath, "-o", libOut, "--library", "--offline", "--server=false").mkString(" ") val (_, _, _, _) = bashCommand(commandline) // compile jar, discard output val testJar = testFile("envtest.jar") // jar is created by the previous bashCommand() if (testJar.isFile){ @@ -124,7 +130,8 @@ class BashScriptsTests: } val tag = "World5" - val commandline2 = Seq("SCALA_OPTS= ", scalaPath.relpath, s"-Dkey=$tag", testJar.relpath) + val commandline2 = Seq( + "SCALA_OPTS= ", scalaPath.relpath, "run", s"-Dkey=$tag", "-classpath", testJar.relpath, "--offline", "--server=false") printf("cmd[%s]\n", commandline2.mkString(" ")) val (validTest, exitCode, stdout, stderr) = bashCommand(commandline2.mkString(" ")) assertEquals(s"Hello $tag", stdout.mkString("/n")) @@ -148,7 +155,11 @@ class BashScriptsTests: /* verify `dist/bin/scala` non-interference with command line args following script name */ @Test def verifyScalaArgs = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - val commandline = (Seq("SCALA_OPTS= ", scalaPath, showArgsScript) ++ testScriptArgs).mkString(" ") + val commandline = ( + Seq("SCALA_OPTS= ", scalaPath, showArgsScalaCli) + ++ Seq("--offline", "--server=false") + ++ ("--" +: testScriptArgs) + ).mkString(" ") val (validTest, exitCode, stdout, stderr) = bashCommand(commandline) if verifyValid(validTest) then var fail = false @@ -162,13 +173,13 @@ class BashScriptsTests: assert(stdout == expectedOutput) /* - * verify that scriptPath.sc sees a valid script.path property, - * and that it's value is the path to "scriptPath.sc". + * verify that scriptPathNu.sc sees a valid script.path property, + * and that it's value is the path to "scriptPathNu.sc". 
*/ @Category(Array(classOf[BootstrappedOnlyTests])) @Test def verifyScriptPathProperty = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - val scriptFile = testFiles.find(_.getName == "scriptPath.sc").get + val scriptFile = testFiles.find(_.getName == "scriptPathNu.sc").get val expected = s"${scriptFile.getName}" printf("===> verify valid system property script.path is reported by script [%s]\n", scriptFile.getName) printf("calling scriptFile: %s\n", scriptFile) @@ -177,8 +188,8 @@ class BashScriptsTests: stdout.foreach { printf("stdout: [%s]\n", _) } stderr.foreach { printf("stderr: [%s]\n", _) } val valid = stdout.exists { _.endsWith(expected) } - if valid then printf("# valid script.path reported by [%s]\n", scriptFile.getName) - assert(valid, s"script ${scriptFile.absPath} did not report valid script.path value") + if valid then printf("# valid scriptPath reported by [%s]\n", scriptFile.getName) + assert(valid, s"script ${scriptFile.absPath} did not report valid scriptPath value") /* * verify SCALA_OPTS can specify an @argsfile when launching a scala script in `dist/bin/scala`. @@ -208,7 +219,7 @@ class BashScriptsTests: */ @Test def sqlDateTest = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - val scriptBase = "sqlDateError" + val scriptBase = "sqlDateErrorNu" val scriptFile = testFiles.find(_.getName == s"$scriptBase.sc").get val testJar = testFile(s"$scriptBase.jar") // jar should not be created when scriptFile runs val tj = Paths.get(testJar).toFile @@ -236,7 +247,6 @@ class BashScriptsTests: printf("===> verify -e is properly handled by `dist/bin/scala`\n") val expected = "9" val expression = s"println(3*3)" - val cmd = s"bin/scala -e $expression" val (validTest, exitCode, stdout, stderr) = bashCommand(s"""bin/scala -e '$expression'""") val result = stdout.filter(_.nonEmpty).mkString("") printf("stdout: %s\n", result) diff --git a/compiler/test/dotty/tools/scripting/ClasspathTests.scala b/compiler/test/dotty/tools/scripting/ClasspathTests.scala index 4fd1211698f6..40c16b7e962d 100755 --- a/compiler/test/dotty/tools/scripting/ClasspathTests.scala +++ b/compiler/test/dotty/tools/scripting/ClasspathTests.scala @@ -51,7 +51,7 @@ class ClasspathTests: // convert scriptCp to a list of files val hashbangJars: List[File] = scriptCp.split(psep).map { _.toFile }.toList val hashbangClasspathJars = hashbangJars.map { _.name }.sorted.distinct // get jar basenames, remove duplicates - val packlibDir = s"$scriptCwd/$packLibDir" // classpathReport.sc specifies a wildcard classpath in this directory + val packlibDir: String = ??? /* ??? 
was s"$scriptCwd/$packLibDir" */ // classpathReport.sc specifies a wildcard classpath in this directory val packlibJars: List[File] = listJars(packlibDir) // classpath entries expected to have been reported by the script printf("%d jar files in dist/target/pack/lib\n", packlibJars.size) @@ -84,11 +84,23 @@ class ClasspathTests: case Some(file) => file val relpath = testScript.toPath.relpath.norm + val scalaCommand = scalaPath.relpath.norm printf("===> unglobClasspathVerifyTest for script [%s]\n", relpath) printf("bash is [%s]\n", bashExe) if packBinScalaExists then - val bashCmdline = s"set +x ; SCALA_OPTS= $relpath" + val sv = packScalaVersion + val tastyDirGlob = s"$packMavenDir/org/scala-lang/tasty-core_3/$sv/*" + // ^^^^^^^^^^^^^ + // the classpath is a glob pattern that should be unglobbed by scala command, + // otherwise the script could not compile because it references a class + // from tasty-core + + val bashCmdline = Seq( + "set +x ;", + "SCALA_OPTS=", + scalaCommand, "run", "--classpath", s"'$tastyDirGlob'", "--offline", "--server=false", relpath + ).mkString(" ") val cmd = Array(bashExe, "-c", bashCmdline) cmd.foreach { printf("[%s]\n", _) } diff --git a/compiler/test/dotty/tools/scripting/ExpressionTest.scala b/compiler/test/dotty/tools/scripting/ExpressionTest.scala index 6b5248e67f08..1430ab38ebec 100755 --- a/compiler/test/dotty/tools/scripting/ExpressionTest.scala +++ b/compiler/test/dotty/tools/scripting/ExpressionTest.scala @@ -44,7 +44,7 @@ class ExpressionTest: assert(success) def getResult(expression: String): String = - val (_, _, stdout, stderr) = bashCommand(s"$scalaPath -e '$expression'") + val (_, _, stdout, stderr) = bashCommand(s"$scalaPath -e '$expression' --offline --server=false") printf("stdout: %s\n", stdout.mkString("|")) printf("stderr: %s\n", stderr.mkString("\n", "\n", "")) stdout.filter(_.nonEmpty).mkString("") diff --git a/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala b/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala index 1db92d5415b4..a52014f14704 100644 --- a/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala +++ b/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala @@ -125,9 +125,22 @@ object ScriptTestEnv { def packBinDir = "dist/target/pack/bin" - def packLibDir = "dist/target/pack/lib" + // def packLibDir = "dist/target/pack/lib" // replaced by packMavenDir + def packMavenDir = "dist/target/pack/maven2" + def packVersionFile = "dist/target/pack/VERSION" def packBinScalaExists: Boolean = Files.exists(Paths.get(s"$packBinDir/scala")) + def packScalaVersion: String = { + val versionFile = Paths.get(packVersionFile) + if Files.exists(versionFile) then + val lines = Files.readAllLines(versionFile).asScala + lines.find { _.startsWith("version:=") } match + case Some(line) => line.drop(9) + case None => sys.error(s"no version:= found in $packVersionFile") + else + sys.error(s"no $packVersionFile found") + } + def listJars(dir: String): List[File] = val packlibDir = Paths.get(dir).toFile if packlibDir.isDirectory then diff --git a/compiler/test/dotty/tools/scripting/ScriptingTests.scala b/compiler/test/dotty/tools/scripting/ScriptingTests.scala index 5ec417090504..713695b62f4a 100644 --- a/compiler/test/dotty/tools/scripting/ScriptingTests.scala +++ b/compiler/test/dotty/tools/scripting/ScriptingTests.scala @@ -47,7 +47,10 @@ class ScriptingTests: */ @Test def scriptingMainTests = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - for (scriptFile, scriptArgs) <- 
scalaFilesWithArgs(".sc") do + for + (scriptFile, scriptArgs) <- scalaFilesWithArgs(".sc") + if !scriptFile.getName().endsWith("Nu.sc") + do showScriptUnderTest(scriptFile) val unexpectedJar = script2jar(scriptFile) unexpectedJar.delete @@ -66,7 +69,10 @@ class ScriptingTests: */ @Test def scriptingJarTest = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - for (scriptFile, scriptArgs) <- scalaFilesWithArgs(".sc") do + for + (scriptFile, scriptArgs) <- scalaFilesWithArgs(".sc") + if !scriptFile.getName().endsWith("Nu.sc") + do showScriptUnderTest(scriptFile) val expectedJar = script2jar(scriptFile) expectedJar.delete From eee90b41180882d0b49c66436935cc224f20fed5 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 6 May 2024 15:36:35 +0200 Subject: [PATCH 248/371] escape % in java executable path in batch commands. [Cherry-picked 26f918d4cdcb8e402983e7f1865c0fd0757ab7db] --- dist/bin/scala.bat | 3 +++ dist/bin/scalac.bat | 3 +++ dist/bin/scaladoc.bat | 4 ++++ 3 files changed, 10 insertions(+) diff --git a/dist/bin/scala.bat b/dist/bin/scala.bat index ca908fd340be..6c48794ddd40 100644 --- a/dist/bin/scala.bat +++ b/dist/bin/scala.bat @@ -21,6 +21,9 @@ call :args %* call :compilerJavaClasspathArgs +@rem we need to escape % in the java command path, for some reason this doesnt work in common.bat +set "_JAVACMD=!_JAVACMD:%%=%%%%!" + call "%_JAVACMD%" %_JAVA_ARGS% "-Dscala.home=%_PROG_HOME%" -classpath "%_JVM_CP_ARGS%" dotty.tools.MainGenericRunner -classpath "%_JVM_CP_ARGS%" %_SCALA_ARGS% if not %ERRORLEVEL%==0 ( set _EXITCODE=1& goto end ) diff --git a/dist/bin/scalac.bat b/dist/bin/scalac.bat index cb1a76471f70..c8cd0babe60b 100644 --- a/dist/bin/scalac.bat +++ b/dist/bin/scalac.bat @@ -21,6 +21,9 @@ call :args %* call :compilerJavaClasspathArgs +@rem we need to escape % in the java command path, for some reason this doesnt work in common.bat +set "_JAVACMD=!_JAVACMD:%%=%%%%!" + call "%_JAVACMD%" %_JAVA_ARGS% -classpath "%_JVM_CP_ARGS%" "-Dscala.usejavacp=true" "-Dscala.home=%_PROG_HOME%" dotty.tools.MainGenericCompiler %_SCALA_ARGS% if not %ERRORLEVEL%==0 ( set _EXITCODE=1 diff --git a/dist/bin/scaladoc.bat b/dist/bin/scaladoc.bat index bcc0d71788a3..c30a4689244c 100644 --- a/dist/bin/scaladoc.bat +++ b/dist/bin/scaladoc.bat @@ -26,6 +26,10 @@ call :classpathArgs if defined JAVA_OPTS ( set _JAVA_OPTS=%JAVA_OPTS% ) else ( set _JAVA_OPTS=%_DEFAULT_JAVA_OPTS% ) + +@rem we need to escape % in the java command path, for some reason this doesnt work in common.bat +set "_JAVACMD=!_JAVACMD:%%=%%%%!" + call "%_JAVACMD%" %_JAVA_OPTS% %_JAVA_DEBUG% %_JAVA_ARGS% ^ -classpath "%_CLASS_PATH%" ^ -Dscala.usejavacp=true ^ From eb3083b3dcc0ee6fc4bbb4bc3f6ce1fd5e104b8d Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 6 May 2024 18:30:28 +0200 Subject: [PATCH 249/371] Add a warning message when launching from scala. 
[Cherry-picked 035c1d551c1be93bcba09103b32c5f76f413208f] --- compiler/src/dotty/tools/MainGenericRunner.scala | 16 ++++++++++++++++ .../tools/coursier/CoursierScalaTests.scala | 2 +- 2 files changed, 17 insertions(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/MainGenericRunner.scala b/compiler/src/dotty/tools/MainGenericRunner.scala index 1540cc86d7a6..5b238693a135 100644 --- a/compiler/src/dotty/tools/MainGenericRunner.scala +++ b/compiler/src/dotty/tools/MainGenericRunner.scala @@ -266,6 +266,22 @@ object MainGenericRunner { run(settings.withExecuteMode(ExecuteMode.Run)) else run(settings.withExecuteMode(ExecuteMode.Repl)) + end run + + val ranByCoursierBootstrap = + sys.props.isDefinedAt("coursier.mainJar") + || sys.props.get("bootstrap.mainClass").filter(_ == "dotty.tools.MainGenericRunner").isDefined + + val silenced = sys.props.get("scala.use_legacy_launcher") == Some("true") + + if !silenced then + Console.err.println(s"[warning] MainGenericRunner class is deprecated since Scala 3.5.0, and Scala CLI features will not work.") + Console.err.println(s"[warning] Please be sure to update to the Scala CLI launcher to use the new features.") + if ranByCoursierBootstrap then + Console.err.println(s"[warning] It appears that your Coursier-based Scala installation is misconfigured.") + Console.err.println(s"[warning] To update to the new Scala CLI runner, please update (coursier, cs) commands first before re-installing scala.") + Console.err.println(s"[warning] Check the Scala 3.5.0 release notes to troubleshoot your installation.") + run(settings) match case Some(ex: (StringDriverException | ScriptingException)) => errorFn(ex.getMessage) diff --git a/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala b/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala index b8dfa833c437..115803d79dc1 100644 --- a/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala +++ b/compiler/test-coursier/dotty/tools/coursier/CoursierScalaTests.scala @@ -166,7 +166,7 @@ object CoursierScalaTests: case Nil => args case _ => "--" +: args val newJOpts = jOpts.map(s => s"--java-opt ${s.stripPrefix("-J")}").mkString(" ") - execCmd("./cs", (s"""launch "org.scala-lang:scala3-compiler_3:${sys.env("DOTTY_BOOTSTRAPPED_VERSION")}" $newJOpts --main-class "$entry" --property "scala.usejavacp=true"""" +: newOptions)*)._2 + execCmd("./cs", (s"""launch "org.scala-lang:scala3-compiler_3:${sys.env("DOTTY_BOOTSTRAPPED_VERSION")}" $newJOpts --main-class "$entry" --property "scala.usejavacp=true" --property "scala.use_legacy_launcher=true"""" +: newOptions)*)._2 /** Get coursier script */ @BeforeClass def setup(): Unit = From 8daca0cce4e98cd600bb659e678261cd5bd4a010 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 6 May 2024 22:25:05 +0200 Subject: [PATCH 250/371] Windows - extract scala version from VERSION file [Cherry-picked 673ae702cc5ce05e449823798457c4e843fff88f] --- dist/bin/scala.bat | 19 ++++++++++++++++++- 1 file changed, 18 insertions(+), 1 deletion(-) diff --git a/dist/bin/scala.bat b/dist/bin/scala.bat index 6c48794ddd40..ad622c87d1ed 100644 --- a/dist/bin/scala.bat +++ b/dist/bin/scala.bat @@ -21,10 +21,12 @@ call :args %* call :compilerJavaClasspathArgs +call :setScalaVersion + @rem we need to escape % in the java command path, for some reason this doesnt work in common.bat set "_JAVACMD=!_JAVACMD:%%=%%%%!" 
-call "%_JAVACMD%" %_JAVA_ARGS% "-Dscala.home=%_PROG_HOME%" -classpath "%_JVM_CP_ARGS%" dotty.tools.MainGenericRunner -classpath "%_JVM_CP_ARGS%" %_SCALA_ARGS% +call "%_JAVACMD%" %_JAVA_ARGS% "-Dscala.releaseversion=%_SCALA_VERSION%" "-Dscala.home=%_PROG_HOME%" -classpath "%_JVM_CP_ARGS%" dotty.tools.MainGenericRunner -classpath "%_JVM_CP_ARGS%" %_SCALA_ARGS% if not %ERRORLEVEL%==0 ( set _EXITCODE=1& goto end ) goto end @@ -36,6 +38,7 @@ goto end set _JAVA_ARGS= set _SCALA_ARGS= set _SCALA_CPATH= +set "_SCALA_VERSION=" :args_loop if "%~1"=="" goto args_done @@ -90,6 +93,20 @@ if defined _SCALA_CPATH ( ) goto :eof +:setScalaVersion + +@rem read for version:=_SCALA_VERSION in VERSION_FILE +FOR /F "usebackq delims=" %%G IN ("%_PROG_HOME%\VERSION") DO ( + SET "line=%%G" + IF "!line:~0,9!"=="version:=" ( + SET "_SCALA_VERSION=!line:~9!" + GOTO :foundVersion + ) +) + +:foundVersion +goto :eof + @rem ######################################################################### @rem ## Cleanups From b3d9aeee42bd58800dbb9dd72f2bedbc9339d46c Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 6 May 2024 23:42:34 +0200 Subject: [PATCH 251/371] Windows - forward to scala-cli jar launcher [Cherry-picked ebbe3948da9077a48afaef4cdbd3f47f2a6b29a8] --- dist/bin/scala.bat | 71 ++++------------------------------------------ 1 file changed, 6 insertions(+), 65 deletions(-) diff --git a/dist/bin/scala.bat b/dist/bin/scala.bat index ad622c87d1ed..76617fb6057e 100644 --- a/dist/bin/scala.bat +++ b/dist/bin/scala.bat @@ -14,19 +14,15 @@ for %%f in ("%~dp0.") do ( call "%_PROG_HOME%\bin\common.bat" if not %_EXITCODE%==0 goto end -call :args %* - @rem ######################################################################### @rem ## Main -call :compilerJavaClasspathArgs - -call :setScalaVersion +call :setScalaOpts @rem we need to escape % in the java command path, for some reason this doesnt work in common.bat set "_JAVACMD=!_JAVACMD:%%=%%%%!" -call "%_JAVACMD%" %_JAVA_ARGS% "-Dscala.releaseversion=%_SCALA_VERSION%" "-Dscala.home=%_PROG_HOME%" -classpath "%_JVM_CP_ARGS%" dotty.tools.MainGenericRunner -classpath "%_JVM_CP_ARGS%" %_SCALA_ARGS% +call "%_JAVACMD%" "-jar" "%SCALA_CLI_JAR%" "--cli-default-scala-version" "%_SCALA_VERSION%" "-r" "%MVN_REPOSITORY%" %* if not %ERRORLEVEL%==0 ( set _EXITCODE=1& goto end ) goto end @@ -34,66 +30,11 @@ goto end @rem ######################################################################### @rem ## Subroutines -:args -set _JAVA_ARGS= -set _SCALA_ARGS= -set _SCALA_CPATH= -set "_SCALA_VERSION=" - -:args_loop -if "%~1"=="" goto args_done -set "__ARG=%~1" -if "%__ARG:~0,2%"=="-D" ( - @rem pass to scala as well: otherwise we lose it sometimes when we - @rem need it, e.g. communicating with a server compiler. - set _JAVA_ARGS=!_JAVA_ARGS! "%__ARG%" - set _SCALA_ARGS=!_SCALA_ARGS! "%__ARG%" -) else if "%__ARG:~0,2%"=="-J" ( - @rem as with -D, pass to scala even though it will almost - @rem never be used. - set _JAVA_ARGS=!_JAVA_ARGS! %__ARG:~2% - set _SCALA_ARGS=!_SCALA_ARGS! "%__ARG%" -) else if "%__ARG%"=="-classpath" ( - set "_SCALA_CPATH=%~2" - shift -) else if "%__ARG%"=="-cp" ( - set "_SCALA_CPATH=%~2" - shift -) else ( - set _SCALA_ARGS=!_SCALA_ARGS! 
"%__ARG%" -) -shift -goto args_loop -:args_done -goto :eof - -@rem output parameter: _JVM_CP_ARGS -:compilerJavaClasspathArgs -set __TOOLCHAIN= -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA_LIB%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_LIB%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA_ASM%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SBT_INTF%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_INTF%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_COMP%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_TASTY_CORE%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_STAGING%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_TASTY_INSPECTOR%%_PSEP%" - -@rem # jline -set "__TOOLCHAIN=%__TOOLCHAIN%%_JLINE_READER%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_JLINE_TERMINAL%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_JLINE_TERMINAL_JNA%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_JNA%%_PSEP%" - -if defined _SCALA_CPATH ( - set "_JVM_CP_ARGS=%__TOOLCHAIN%%_SCALA_CPATH%" -) else ( - set "_JVM_CP_ARGS=%__TOOLCHAIN%" -) -goto :eof +:setScalaOpts -:setScalaVersion +set "_SCALA_VERSION=" +set "MVN_REPOSITORY=file://%_PROG_HOME:\=/%/maven2" +set "SCALA_CLI_JAR=%_PROG_HOME%\etc\scala-cli.jar" @rem read for version:=_SCALA_VERSION in VERSION_FILE FOR /F "usebackq delims=" %%G IN ("%_PROG_HOME%\VERSION") DO ( From 230a0785a9498096bb6cf9e265ecd32820c1fd3a Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 7 May 2024 00:06:32 +0200 Subject: [PATCH 252/371] properly convert path of repo to uri [Cherry-picked f31d7500bed7e4cbf869cc1e68556443b2836589] --- dist/bin/scala.bat | 22 +++++++++++++++++++++- 1 file changed, 21 insertions(+), 1 deletion(-) diff --git a/dist/bin/scala.bat b/dist/bin/scala.bat index 76617fb6057e..c6a515ba617a 100644 --- a/dist/bin/scala.bat +++ b/dist/bin/scala.bat @@ -32,8 +32,28 @@ goto end :setScalaOpts +@REM sfind the index of the first colon in _PROG_HOME +set "index=0" +set "char=!_PROG_HOME:~%index%,1!" +:findColon +if not "%char%"==":" ( + set /a "index+=1" + set "char=!_PROG_HOME:~%index%,1!" + goto :findColon +) + +@REM set _PROG_HOME to the substring from the first colon to the end +set "_PROG_HOME_SUB=!_PROG_HOME:~%index%!" +@REM strip initial character +set "_PROG_HOME_SUB=!_PROG_HOME_SUB:~1!" + +@REM set drive to substring from 0 to the first colon +set "_PROG_HOME_DRIVE=!_PROG_HOME:~0,%index%!" 
+ + + set "_SCALA_VERSION=" -set "MVN_REPOSITORY=file://%_PROG_HOME:\=/%/maven2" +set "MVN_REPOSITORY=file://%_PROG_HOME_DRIVE%\%_PROG_HOME_SUB:\=/%/maven2" set "SCALA_CLI_JAR=%_PROG_HOME%\etc\scala-cli.jar" @rem read for version:=_SCALA_VERSION in VERSION_FILE From 20009dbdcbbe22ef2026f6bca3e35c9bdee818eb Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 7 May 2024 11:16:17 +0200 Subject: [PATCH 253/371] fix windows command tests [Cherry-picked b53d7b27ffdf209fa8fade0c252d9192bd011a63] --- project/scripts/winCmdTests | 4 ++-- project/scripts/winCmdTests.bat | 4 ++-- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/project/scripts/winCmdTests b/project/scripts/winCmdTests index d287b60992b2..2dffff5b196a 100644 --- a/project/scripts/winCmdTests +++ b/project/scripts/winCmdTests @@ -5,6 +5,6 @@ PREFIX="dist/target/pack" SOURCE="tests/pos/HelloWorld.scala" $PREFIX/bin/scalac @project/scripts/options "$SOURCE" $PREFIX/bin/scalac -d out "$SOURCE" -$PREFIX/bin/scala -classpath out HelloWorld -$PREFIX/bin/scala -classpath out -J-Xmx512m HelloWorld +$PREFIX/bin/scala --power -classpath out -M HelloWorld --offline '--server=false' +$PREFIX/bin/scala --power -classpath out -J -Xmx512m -M HelloWorld --offline '--server=false' mkdir -p _site && $PREFIX/bin/scaladoc -d _site -project Hello "$SOURCE" diff --git a/project/scripts/winCmdTests.bat b/project/scripts/winCmdTests.bat index ee9b8237c694..d9b594d560ab 100644 --- a/project/scripts/winCmdTests.bat +++ b/project/scripts/winCmdTests.bat @@ -14,10 +14,10 @@ if not %ERRORLEVEL%==0 endlocal& exit /b 1 call "%_PREFIX%\bin\scalac.bat" -d "%_OUT_DIR%" "%_SOURCE%" if not %ERRORLEVEL%==0 endlocal& exit /b 1 -call "%_PREFIX%\bin\scala.bat" -classpath "%_OUT_DIR%" HelloWorld +call "%_PREFIX%\bin\scala.bat" --power -classpath "%_OUT_DIR%" -M HelloWorld --offline --server=false if not %ERRORLEVEL%==0 endlocal& exit /b 1 -call "%_PREFIX%\bin\scala.bat" -classpath "%_OUT_DIR%" -J-Xmx512m HelloWorld +call "%_PREFIX%\bin\scala.bat" --power -classpath "%_OUT_DIR%" -J -Xmx512m -M HelloWorld --offline --server=false if not %ERRORLEVEL%==0 endlocal& exit /b 1 if not exist "%_SITE_DIR%" mkdir "%_SITE_DIR%" From cfbee3836036cd9308fe6208e46bfcfd66d75fc2 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 7 May 2024 11:59:15 +0200 Subject: [PATCH 254/371] adjust to new launcher scala cli 1.3.1 [Cherry-picked 040793ca11f860e6c4881fef0a34a09bba6e37e0] --- bin/scala | 2 +- bin/test/TestScripts.scala | 2 +- .../dotty/tools/scripting/BashExitCodeTests.scala | 2 +- .../test/dotty/tools/scripting/BashScriptsTests.scala | 6 +++--- .../test/dotty/tools/scripting/ClasspathTests.scala | 2 +- .../test/dotty/tools/scripting/ExpressionTest.scala | 2 +- dist/bin/scala | 1 + dist/bin/scala.bat | 2 +- project/Build.scala | 2 +- project/scripts/bootstrappedOnlyCmdTests | 11 +++-------- 10 files changed, 14 insertions(+), 18 deletions(-) diff --git a/bin/scala b/bin/scala index 6506e3b38ab1..85c1ac91d08f 100755 --- a/bin/scala +++ b/bin/scala @@ -2,4 +2,4 @@ ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" >& /dev/null && pwd)/.." -"$ROOT/bin/common" "$ROOT/dist/target/pack/bin/scala" "$@" "--offline" "--server=false" +"$ROOT/bin/common" "$ROOT/dist/target/pack/bin/scala" "--power" "$@" "--offline" "--server=false" diff --git a/bin/test/TestScripts.scala b/bin/test/TestScripts.scala index bada140580fc..4a2fd9a05c83 100644 --- a/bin/test/TestScripts.scala +++ b/bin/test/TestScripts.scala @@ -57,7 +57,7 @@ class TestScripts { s"bin/scalac script did not run properly. 
Output:$lineSep$dotcOutput" ) - val (retDotr, dotrOutput) = executeScript("./bin/scala HelloWorld") + val (retDotr, dotrOutput) = executeScript("./bin/scala -M HelloWorld") assert( retDotr == 0 && dotrOutput == "hello world\n", s"Running hello world exited with status: $retDotr and output: $dotrOutput" diff --git a/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala b/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala index 2fdc1eccaeb7..90a8d80330b4 100644 --- a/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala +++ b/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala @@ -29,7 +29,7 @@ class BashExitCodeTests: }, expectedExitCode, exitCode) // Helpers for running scala, scalac, and scalac without the the output directory ("raw") - def scala(args: String*) = verifyExit(scalaPath, ("--offline" +: "--server=false" +: args)*) + def scala(args: String*) = verifyExit(scalaPath, ("--power" +: "--offline" +: "--server=false" +: args)*) def scalacRaw(args: String*) = verifyExit(scalacPath, args*) def scalac(args: String*) = scalacRaw(("-d" +: tmpDir +: args)*) diff --git a/compiler/test/dotty/tools/scripting/BashScriptsTests.scala b/compiler/test/dotty/tools/scripting/BashScriptsTests.scala index 69fb861a0516..25bc54e2dcbe 100644 --- a/compiler/test/dotty/tools/scripting/BashScriptsTests.scala +++ b/compiler/test/dotty/tools/scripting/BashScriptsTests.scala @@ -72,7 +72,7 @@ object BashScriptsTests: def callScript(tag: String, script: String, keyPre: String): String = val keyArg = s"$keyPre=$tag" printf("pass tag [%s] via [%s] to script [%s]\n", tag, keyArg, script) - val cmd: String = Seq("SCALA_OPTS= ", scalaPath, "run", keyArg, "--offline", "--server=false", script).mkString(" ") + val cmd: String = Seq("SCALA_OPTS= ", scalaPath, "run", keyArg, "--power", "--offline", "--server=false", script).mkString(" ") printf("cmd: [%s]\n", cmd) val (validTest, exitCode, stdout, stderr) = bashCommand(cmd) stderr.filter { !_.contains("Inappropriate ioctl") }.foreach { System.err.printf("stderr [%s]\n", _) } @@ -131,7 +131,7 @@ class BashScriptsTests: val tag = "World5" val commandline2 = Seq( - "SCALA_OPTS= ", scalaPath.relpath, "run", s"-Dkey=$tag", "-classpath", testJar.relpath, "--offline", "--server=false") + "SCALA_OPTS= ", scalaPath.relpath, "run", s"-Dkey=$tag", "-classpath", testJar.relpath, "--power", "--offline", "--server=false") printf("cmd[%s]\n", commandline2.mkString(" ")) val (validTest, exitCode, stdout, stderr) = bashCommand(commandline2.mkString(" ")) assertEquals(s"Hello $tag", stdout.mkString("/n")) @@ -157,7 +157,7 @@ class BashScriptsTests: assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) val commandline = ( Seq("SCALA_OPTS= ", scalaPath, showArgsScalaCli) - ++ Seq("--offline", "--server=false") + ++ Seq("--power", "--offline", "--server=false") ++ ("--" +: testScriptArgs) ).mkString(" ") val (validTest, exitCode, stdout, stderr) = bashCommand(commandline) diff --git a/compiler/test/dotty/tools/scripting/ClasspathTests.scala b/compiler/test/dotty/tools/scripting/ClasspathTests.scala index 40c16b7e962d..5107af5eee43 100755 --- a/compiler/test/dotty/tools/scripting/ClasspathTests.scala +++ b/compiler/test/dotty/tools/scripting/ClasspathTests.scala @@ -99,7 +99,7 @@ class ClasspathTests: val bashCmdline = Seq( "set +x ;", "SCALA_OPTS=", - scalaCommand, "run", "--classpath", s"'$tastyDirGlob'", "--offline", "--server=false", relpath + scalaCommand, "run", "--classpath", s"'$tastyDirGlob'", "--power", 
"--offline", "--server=false", relpath ).mkString(" ") val cmd = Array(bashExe, "-c", bashCmdline) diff --git a/compiler/test/dotty/tools/scripting/ExpressionTest.scala b/compiler/test/dotty/tools/scripting/ExpressionTest.scala index 1430ab38ebec..02963f50ee52 100755 --- a/compiler/test/dotty/tools/scripting/ExpressionTest.scala +++ b/compiler/test/dotty/tools/scripting/ExpressionTest.scala @@ -44,7 +44,7 @@ class ExpressionTest: assert(success) def getResult(expression: String): String = - val (_, _, stdout, stderr) = bashCommand(s"$scalaPath -e '$expression' --offline --server=false") + val (_, _, stdout, stderr) = bashCommand(s"$scalaPath -e '$expression' --power --offline --server=false") printf("stdout: %s\n", stdout.mkString("|")) printf("stderr: %s\n", stderr.mkString("\n", "\n", "")) stdout.filter(_.nonEmpty).mkString("") diff --git a/dist/bin/scala b/dist/bin/scala index 4d357918ae07..3040c5a9a0f3 100755 --- a/dist/bin/scala +++ b/dist/bin/scala @@ -56,6 +56,7 @@ done [ -z "${ConEmuPID-}" -o -n "${cygwin-}" ] && export MSYSTEM= PWD= # workaround for #12405 eval "\"$JAVACMD\"" \ "-jar \"$SCALA_CLI_JAR\"" \ + "--prog-name scala" \ "--cli-default-scala-version \"$SCALA_VERSION\"" \ "-r \"$MVN_REPOSITORY\"" \ "${scala_args[@]}" diff --git a/dist/bin/scala.bat b/dist/bin/scala.bat index c6a515ba617a..78336272055b 100644 --- a/dist/bin/scala.bat +++ b/dist/bin/scala.bat @@ -22,7 +22,7 @@ call :setScalaOpts @rem we need to escape % in the java command path, for some reason this doesnt work in common.bat set "_JAVACMD=!_JAVACMD:%%=%%%%!" -call "%_JAVACMD%" "-jar" "%SCALA_CLI_JAR%" "--cli-default-scala-version" "%_SCALA_VERSION%" "-r" "%MVN_REPOSITORY%" %* +call "%_JAVACMD%" "-jar" "%SCALA_CLI_JAR%" "--prog-name" "scala" "--cli-default-scala-version" "%_SCALA_VERSION%" "-r" "%MVN_REPOSITORY%" %* if not %ERRORLEVEL%==0 ( set _EXITCODE=1& goto end ) goto end diff --git a/project/Build.scala b/project/Build.scala index cbc35c3f2f92..11ed959b2c29 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -119,7 +119,7 @@ object Build { val mimaPreviousLTSDottyVersion = "3.3.0" /** Version of Scala CLI to download */ - val scalaCliLauncherVersion = "1.3.0" + val scalaCliLauncherVersion = "1.3.1" object CompatMode { final val BinaryCompatible = 0 diff --git a/project/scripts/bootstrappedOnlyCmdTests b/project/scripts/bootstrappedOnlyCmdTests index 74a0f5b59a8f..f3d730f8f494 100755 --- a/project/scripts/bootstrappedOnlyCmdTests +++ b/project/scripts/bootstrappedOnlyCmdTests @@ -14,7 +14,6 @@ echo "testing scala.quoted.Expr.run from sbt scala" "$SBT" ";scala3-compiler-bootstrapped/scalac -with-compiler tests/run-staging/quote-run.scala; scala3-compiler-bootstrapped/scala -with-compiler Test" > "$tmp" grep -qe "val a: scala.Int = 3" "$tmp" - # setup for `scalac`/`scala` script tests "$SBT" dist/pack @@ -37,11 +36,9 @@ test "$EXPECTED_OUTPUT" = "$(cat "$tmp")" clear_out "$OUT1" # check that `scalac` and `scala` works for staging. 
-# TODO: scala3-staging should be automatically added by Scala CLI -# - see: https://github.com/VirtusLab/scala-cli/issues/2879 clear_out "$OUT" ./bin/scalac tests/run-staging/i4044f.scala -d "$OUT" -./bin/scala -with-compiler -classpath "$OUT" --dep "org.scala-lang::scala3-staging:$scala_version" -M Test > "$tmp" +./bin/scala -with-compiler -classpath "$OUT" -M Test > "$tmp" # check that `scalac -from-tasty` compiles and `scala` runs it echo "testing ./bin/scalac -from-tasty and scala -classpath" @@ -99,12 +96,10 @@ clear_out "$OUT" ./bin/scalac -help > "$tmp" 2>&1 grep -qe "Usage: scalac " "$tmp" -# TODO: JAVA launcher should be able to override "scala-cli" program name -# - see: https://github.com/VirtusLab/scala-cli/issues/2838#issuecomment-2085130815 ./bin/scala -help > "$tmp" 2>&1 -grep -qe "See 'scala-cli --help' to read about a specific subcommand." "$tmp" +grep -qe "See 'scala --help' to read about a specific subcommand." "$tmp" -./bin/scala -d hello.jar tests/run/hello.scala --server=false +./bin/scala -d hello.jar tests/run/hello.scala ls hello.jar echo "testing i12973" From 205d0456611c22dc8ac6bc2b0efaa4e41a010bae Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 7 May 2024 14:11:58 +0200 Subject: [PATCH 255/371] remove scala-js from local caching [Cherry-picked acbd46755bea5a81be175c183f1aa73234b467fd] --- project/Build.scala | 4 ---- 1 file changed, 4 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 11ed959b2c29..7656cb545413 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -2267,7 +2267,6 @@ object Build { def asDist(implicit mode: Mode): Project = project. enablePlugins(PackPlugin). enablePlugins(RepublishPlugin). - bootstrappedEnablePlugins(DottyJSPlugin). withCommonSettings. settings(commonDistSettings). dependsOn( @@ -2280,9 +2279,6 @@ object Build { scaladoc, `scala3-sbt-bridge`, // for scala-cli ). - bootstrappedDependsOn( - `scala3-library-bootstrappedJS` // for scala-cli - ). 
bootstrappedSettings( target := baseDirectory.value / "target" // override setting in commonBootstrappedSettings ) From fd672eb8ee5926909c99d9357db50e3a17c96330 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 7 May 2024 14:33:23 +0200 Subject: [PATCH 256/371] escape error message in test [Cherry-picked bcdf5e762c624258b8d48bc193576d5c2130e7af] --- tests/run-with-compiler/i14541.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/tests/run-with-compiler/i14541.scala b/tests/run-with-compiler/i14541.scala index 0fdfb89674d5..2b942007c5b6 100644 --- a/tests/run-with-compiler/i14541.scala +++ b/tests/run-with-compiler/i14541.scala @@ -6,6 +6,7 @@ object Test: def main(args: Array[String]): Unit = getClass.getClassLoader.run("echo", List("hello", "raw", "world")) // caution: uses "SCALA_OPTS" + sys.props("scala.use_legacy_launcher") = "true" dotty.tools.MainGenericRunner.main(Array("--class-path", classpath, "echo", "hello", "run", "world")) @main def echo(args: String*): Unit = println { From 95e53df0b360849efc49f724125094869eaf98b3 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 10 May 2024 11:06:03 +0200 Subject: [PATCH 257/371] Disable windows tests for RC1 --- .github/workflows/ci.yaml | 19 ++----------------- 1 file changed, 2 insertions(+), 17 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index 12e90eb9d653..b606e6ae1732 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -204,16 +204,7 @@ jobs: test_windows_fast: runs-on: [self-hosted, Windows] - if: "( - github.event_name == 'push' - && github.ref != 'refs/heads/main' - ) - || github.event_name == 'merge_group' - || ( - github.event_name == 'pull_request' - && !contains(github.event.pull_request.body, '[skip ci]') - && !contains(github.event.pull_request.body, '[skip test_windows_fast]') - )" + if: false steps: - name: Reset existing repo @@ -251,13 +242,7 @@ jobs: test_windows_full: runs-on: [self-hosted, Windows] - if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' - || github.event_name == 'push' - || ( - github.event_name == 'pull_request' - && !contains(github.event.pull_request.body, '[skip ci]') - && contains(github.event.pull_request.body, '[test_windows_full]') - )" + if: false steps: - name: Reset existing repo From d08d71bc3f5e0563a491a34621fe25ebd8a89a32 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Fri, 10 May 2024 13:40:50 +0200 Subject: [PATCH 258/371] Filter out the dot directories form tests --- compiler/test/dotty/tools/vulpix/FileFilter.scala | 4 ++++ compiler/test/dotty/tools/vulpix/ParallelTesting.scala | 2 +- 2 files changed, 5 insertions(+), 1 deletion(-) diff --git a/compiler/test/dotty/tools/vulpix/FileFilter.scala b/compiler/test/dotty/tools/vulpix/FileFilter.scala index b2aef8af038e..9f62a7db2fb6 100644 --- a/compiler/test/dotty/tools/vulpix/FileFilter.scala +++ b/compiler/test/dotty/tools/vulpix/FileFilter.scala @@ -23,4 +23,8 @@ object FileFilter { object NoFilter extends FileFilter { def accept(file: String) = true } + + object ExcludeDotFiles extends FileFilter { + def accept(file: String) = !file.startsWith(".") + } } diff --git a/compiler/test/dotty/tools/vulpix/ParallelTesting.scala b/compiler/test/dotty/tools/vulpix/ParallelTesting.scala index e7e5936a4b29..09d3614b64a5 100644 --- a/compiler/test/dotty/tools/vulpix/ParallelTesting.scala +++ b/compiler/test/dotty/tools/vulpix/ParallelTesting.scala @@ -1411,7 +1411,7 @@ trait ParallelTesting extends 
RunnerOrchestration { self => private def compilationTargets(sourceDir: JFile, fileFilter: FileFilter = FileFilter.NoFilter): (List[JFile], List[JFile]) = sourceDir.listFiles.foldLeft((List.empty[JFile], List.empty[JFile])) { case ((dirs, files), f) => if (!fileFilter.accept(f.getName)) (dirs, files) - else if (f.isDirectory) (f :: dirs, files) + else if (f.isDirectory && FileFilter.ExcludeDotFiles.accept(f.getName)) (f :: dirs, files) else if (isSourceFile(f)) (dirs, f :: files) else (dirs, files) } From 638d15a24b4e88ac892ae22a4ae81a73dfa5fa3e Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 13 May 2024 14:42:59 +0200 Subject: [PATCH 259/371] Add changelog for 3.4.2 --- changelogs/3.4.2.md | 209 ++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 209 insertions(+) create mode 100644 changelogs/3.4.2.md diff --git a/changelogs/3.4.2.md b/changelogs/3.4.2.md new file mode 100644 index 000000000000..bb6fcc40c952 --- /dev/null +++ b/changelogs/3.4.2.md @@ -0,0 +1,209 @@ +# Highlights of the release + +- Bump JLine 3.19.0 -> 3.24.1 & sbt 1.9.7 -> 1.9.9 [#19744](https://github.com/lampepfl/dotty/pull/19744) +- Refactor settings & improve dx [#19766](https://github.com/lampepfl/dotty/pull/19766) +- Publish `scala2-library-tasty-experimental` [#19588](https://github.com/lampepfl/dotty/pull/19588) +- Repl - method signatures in autocomplete [#19917](https://github.com/lampepfl/dotty/pull/19917) + +# Other changes and fixes + +## Annotations + +- Attempt implicit search for old style `implicit` parameters in Application matchArgs [#19737](https://github.com/lampepfl/dotty/pull/19737) + +## Backend + +- Fix(#17255): cannot find Scala companion module from Java [#19773](https://github.com/lampepfl/dotty/pull/19773) +- Change isStatic to isStaticOwner in hasLocalInstantiation [#19803](https://github.com/lampepfl/dotty/pull/19803) + +## Coverage + +- Port coverage filter options for packages and files [#19727](https://github.com/lampepfl/dotty/pull/19727) + +## Default parameters + +- Lift all non trivial prefixes for default parameters [#19739](https://github.com/lampepfl/dotty/pull/19739) + +## Doctool + +- Prevent HTML/XSS Injection in Scala Search [#19980](https://github.com/lampepfl/dotty/pull/19980) +- Parse search query param in Scaladoc [#19669](https://github.com/lampepfl/dotty/pull/19669) + +## Experimental: Capture Checking + +- Disallow covariant `cap`s in the lower bound of type members [#19624](https://github.com/lampepfl/dotty/pull/19624) +- Ignore orphan parameters inside a retains annotation during Ycheck [#19684](https://github.com/lampepfl/dotty/pull/19684) +- Fix the pickling of `This` inside capture sets [#19797](https://github.com/lampepfl/dotty/pull/19797) +- Add updated to SeqViewOps [#19798](https://github.com/lampepfl/dotty/pull/19798) +- Fix Function tree copier [#19822](https://github.com/lampepfl/dotty/pull/19822) +- Drop FreeSeqFactory from stdlib-cc [#19849](https://github.com/lampepfl/dotty/pull/19849) +- Fix i19859 [#19860](https://github.com/lampepfl/dotty/pull/19860) +- Various fixes to stdlib-cc [#19873](https://github.com/lampepfl/dotty/pull/19873) +- Add more methods in `SeqViewOps` [#19993](https://github.com/lampepfl/dotty/pull/19993) +- Check `This` references in `refersToParamOf` [#20005](https://github.com/lampepfl/dotty/pull/20005) + +## Exports + +- Fix the tparam bounds of exported inherited classes 
[#18647](https://github.com/lampepfl/dotty/pull/18647) + +## Implicits + +- Prefer extensions over conversions for member selection [#19717](https://github.com/lampepfl/dotty/pull/19717) +- Don't allow implicit conversions on prefixes of type selections [#19934](https://github.com/lampepfl/dotty/pull/19934) +- Make sure typeParams returns a stable result even in the presence of completions [#19974](https://github.com/lampepfl/dotty/pull/19974) + +## Incremental Compilation + +- Fix undercompilation upon ctor change [#19911](https://github.com/lampepfl/dotty/pull/19911) +- Load but not enter case accessors fields in Scala2Unpickler [#19926](https://github.com/lampepfl/dotty/pull/19926) + +## Initialization + +- Add supports for type cast and filtering type for field and method owner in global initialization checker [#19612](https://github.com/lampepfl/dotty/pull/19612) +- Added a second trace for global init checker showing creation of mutable fields [#19996](https://github.com/lampepfl/dotty/pull/19996) +- Suppressing repetitive warnings in the global initialization checker [#19898](https://github.com/lampepfl/dotty/pull/19898) + +## Inline + +- Specialized retained inline FunctionN apply methods [#19801](https://github.com/lampepfl/dotty/pull/19801) +- Avoid crash after StopMacroExpansion [#19883](https://github.com/lampepfl/dotty/pull/19883) +- Check deprecation of inline methods [#19914](https://github.com/lampepfl/dotty/pull/19914) +- Inline transparent implicit parameters when typing Unapply trees [#19646](https://github.com/lampepfl/dotty/pull/19646) +- Restore pre-3.3.2 behavior of `inline implicit def` [#19877](https://github.com/lampepfl/dotty/pull/19877) + +## Match Types + +- Cover patterns using `reflect.TypeTest` in isMatchTypeShaped [#19923](https://github.com/lampepfl/dotty/pull/19923) +- Rework MatchType recursion in collectParts [#19867](https://github.com/lampepfl/dotty/pull/19867) + +## Nullability + +- Fix #19808: Don't force to compute the owner of a symbol when there is no denotation [#19813](https://github.com/lampepfl/dotty/pull/19813) + +## Parser + +- Add support for JEP-409 (sealed classes) + Add javacOpt directive [#19080](https://github.com/lampepfl/dotty/pull/19080) +- Fix(#16458): regression in xml syntax parsing [#19522](https://github.com/lampepfl/dotty/pull/19522) +- Fix parsing of conditional expressions in parentheses [#19985](https://github.com/lampepfl/dotty/pull/19985) + +## Presentation Compiler + +- Allow range selection on function parameter to select a parameter list [#19777](https://github.com/lampepfl/dotty/pull/19777) + +## Quotes + +- Disallow ill-staged references to local classes [#19869](https://github.com/lampepfl/dotty/pull/19869) +- Add regression test for #19909 [#19915](https://github.com/lampepfl/dotty/pull/19915) +- Detect non `Expr[..]` splice patterns [#19944](https://github.com/lampepfl/dotty/pull/19944) +- Avoid spurious `val` binding in quote pattern [#19948](https://github.com/lampepfl/dotty/pull/19948) +- Add regression test and imporve -Xprint-suspension message [#19688](https://github.com/lampepfl/dotty/pull/19688) + +## REPL + +- Repl truncation copes with null [#17336](https://github.com/lampepfl/dotty/pull/17336) +- Catch stackoverflow errors in the highlighter [#19836](https://github.com/lampepfl/dotty/pull/19836) +- Fix a REPL bad symbolic reference 
[#19786](https://github.com/lampepfl/dotty/pull/19786) + +## Reflection + +- Fix `TypeTreeTypeTest` to not match `TypeBoundsTree`s [#19485](https://github.com/lampepfl/dotty/pull/19485) +- Improve message when tree cannot be shown as source [#19906](https://github.com/lampepfl/dotty/pull/19906) +- Fix #19732: quotes.reflect.Ref incorrectly casting `This` to `RefTree` [#19930](https://github.com/lampepfl/dotty/pull/19930) +- Add check for parents in Quotes (#19842) [#19870](https://github.com/lampepfl/dotty/pull/19870) + +## Reporting + +- Improve error reporting for missing members [#19800](https://github.com/lampepfl/dotty/pull/19800) +- Avoid repetitions in name hints [#19975](https://github.com/lampepfl/dotty/pull/19975) +- Improve error message when using experimental definitions [#19782](https://github.com/lampepfl/dotty/pull/19782) +- Make -Xprompt work as desired under -Werror [#19765](https://github.com/lampepfl/dotty/pull/19765) +- Fix #19402: emit proper error in absence of using in given definitions [#19714](https://github.com/lampepfl/dotty/pull/19714) +- Bugfix: Choose correct signature is signatureHelp for overloaded methods [#19707](https://github.com/lampepfl/dotty/pull/19707) +- Unify completion pos usage, fix presentation compiler crash in interpolation [#19614](https://github.com/lampepfl/dotty/pull/19614) + +## Scaladoc + +- Fix(#16610): warn ignored Scaladoc on multiple enum cases [#19555](https://github.com/lampepfl/dotty/pull/19555) + +## TASTy format + +- Add patch for undefined behavior with `object $` [#19705](https://github.com/lampepfl/dotty/pull/19705) +- Fix(#19806): wrong tasty of scala module class reference [#19827](https://github.com/lampepfl/dotty/pull/19827) +- Used derived types to type arguments of dependent function type [#19838](https://github.com/lampepfl/dotty/pull/19838) + +## Tooling + +- Java TASTy: use new threadsafe writer implementation [#19690](https://github.com/lampepfl/dotty/pull/19690) +- Remove `-Yforce-inline-while-typing` [#19889](https://github.com/lampepfl/dotty/pull/19889) +- Cleanup unnecessary language flag [#19865](https://github.com/lampepfl/dotty/pull/19865) +- Bugfix: Auto imports in worksheets in Scala 3 [#19793](https://github.com/lampepfl/dotty/pull/19793) +- Refine behavior of `-Yno-experimental` [#19741](https://github.com/lampepfl/dotty/pull/19741) + +## Transform + +- Short-circuit isCheckable with classSymbol [#19634](https://github.com/lampepfl/dotty/pull/19634) +- Avoid eta-reduction of `(..., f: T => R, ...) => f.apply(..)` into `f` [#19966](https://github.com/lampepfl/dotty/pull/19966) +- Tweak parameter accessor scheme [#19719](https://github.com/lampepfl/dotty/pull/19719) + +## Typer + +- Update phrasing for NotClassType explain error message [#19635](https://github.com/lampepfl/dotty/pull/19635) +- Fix java typer problems with inner class references and raw types [#19747](https://github.com/lampepfl/dotty/pull/19747) +- Approximate MatchTypes with lub of case bodies, if non-recursive [#19761](https://github.com/lampepfl/dotty/pull/19761) +- Revert broken changes with transparent inline [#19922](https://github.com/lampepfl/dotty/pull/19922) +- Delay hard argument comparisons [#20007](https://github.com/lampepfl/dotty/pull/20007) +- Fix #19607: Allow to instantiate *wildcard* type captures to TypeBounds. 
[#19627](https://github.com/lampepfl/dotty/pull/19627) +- Fix #19907: Skip soft unions in widenSingle of widenInferred [#19995](https://github.com/lampepfl/dotty/pull/19995) +- Fix untupling of functions in for comprehensions [#19620](https://github.com/lampepfl/dotty/pull/19620) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.1..3.4.2` these are: + +``` + 46 Nicolas Stucki + 33 Martin Odersky + 25 Dale Wijnand + 22 Hamza REMMAL + 18 Yichen Xu + 17 Jamie Thompson + 15 Szymon Rodziewicz + 11 EnzeXing + 11 i10416 + 9 Paweł Marks + 6 Kacper Korban + 4 Dan13llljws + 4 Katarzyna Marek + 4 Matt Bovel + 4 Som Snytt + 4 noti0na1 + 3 110416 + 3 Eugene Flesselle + 3 Sébastien Doeraene + 3 dependabot[bot] + 2 Bersier + 2 Hamza Remmal + 2 Jakub Ciesluk + 2 João Costa + 2 Jędrzej Rochala + 2 Natsu Kagami + 2 Stephane Bersier + 2 Taro L. Saito + 2 aherlihy + 1 Aleksander Boruch-Gruszecki + 1 Aviv Keller + 1 Eugene Yokota + 1 Guillaume Martres + 1 Jan Chyb + 1 Lukas Rytz + 1 Mikołaj Fornal + 1 Olga Mazhara + 1 Ondřej Lhoták + 1 Robert Stoll + 1 Seth Tisue + 1 Valentin Schneeberger + 1 Yilin Wei + 1 willerf +``` From 0f7f990b3bc20ad87b163b73ea4c858bff30a77e Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 13 May 2024 14:43:17 +0200 Subject: [PATCH 260/371] Release 3.4.2 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index a5569c0d8888..b75bf1778b3f 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -85,7 +85,7 @@ object Build { val referenceVersion = "3.4.1" - val baseVersion = "3.4.2-RC1" + val baseVersion = "3.4.2" // LTS or Next val versionLine = "Next" From 4ebe8f429e2259e0eab7543ef592d8e5ec431add Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Tue, 14 May 2024 18:50:06 +0200 Subject: [PATCH 261/371] Take into account the version when releasing in the CI --- project/RepublishPlugin.scala | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index 0b71c9ecb6df..bd1190dfec88 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -36,6 +36,8 @@ object RepublishPlugin extends AutoPlugin { } case class ResolvedArtifacts(id: SimpleModuleId, jar: File, pom: File) + val isRelease = sys.env.get("RELEASEBUILD") == Some("yes") + override val projectSettings: Seq[Def.Setting[_]] = Def.settings( republishLocalResolved / republishProjectRefs := { val proj = thisProjectRef.value @@ -87,7 +89,10 @@ object RepublishPlugin extends AutoPlugin { localResolved.foreach({ resolved => val simpleId = resolved.id - evicted += simpleId.copy(revision = simpleId.revision + "-nonbootstrapped") + if(isRelease) + evicted += simpleId.copy(revision = simpleId.revision + "-bin-nonbootstrapped") + else + evicted += simpleId.copy(revision = simpleId.revision + "-nonbootstrapped") found(simpleId) = resolved }) From 782d1f64529215e888718d294c761b697994e52d Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 14 May 2024 14:36:24 +0200 Subject: [PATCH 262/371] Add changelog for 3.5.0-RC1 --- changelogs/3.5.0-RC1.md | 254 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 254 insertions(+) create mode 100644 changelogs/3.5.0-RC1.md diff --git a/changelogs/3.5.0-RC1.md b/changelogs/3.5.0-RC1.md new file mode 100644 index 000000000000..4cbc2aa1d668 --- /dev/null +++ 
b/changelogs/3.5.0-RC1.md @@ -0,0 +1,254 @@ +# Highlights of the release + +- Bundle scala-cli in scala command (For RC1 requires JVM 17, further RCs will use native launchers) +- Introduce Best Effort compilation options [#17582](https://github.com/lampepfl/dotty/pull/17582) +- Add support for Pipelined builds [#18880](https://github.com/lampepfl/dotty/pull/18880) +- Add support for `var` in refinements [#19982](https://github.com/lampepfl/dotty/pull/19982) +- Implement SIP-42 - Support for binary integer literals [#19405](https://github.com/lampepfl/dotty/pull/19405) + +# Other changes and fixes + +## Backend + +- Fix Closure span assignment in makeClosure [#15841](https://github.com/lampepfl/dotty/pull/15841) + +## Default parameters + +- Fix default args lookup for given classes [#20256](https://github.com/lampepfl/dotty/pull/20256) +- Fix implicit search failure reporting [#20261](https://github.com/lampepfl/dotty/pull/20261) + +## Derivation + +- Fix infinite loop in Mirror synthesis of unreducible match type [#20133](https://github.com/lampepfl/dotty/pull/20133) + +## Desugaring + +- Add explanation to checkCaseClassInheritanceInvariant error msg [#20141](https://github.com/lampepfl/dotty/pull/20141) + +## Exports + +- Add annotations in parameters for exports [#20140](https://github.com/lampepfl/dotty/pull/20140) +- Fix isAliasType [#20195](https://github.com/lampepfl/dotty/pull/20195) + +## Implicits + +- Fix implicitNotFound message for type aliases [#19343](https://github.com/lampepfl/dotty/pull/19343) +- Normalize types before collecting parts determining implicit scope [#20077](https://github.com/lampepfl/dotty/pull/20077) +- Better error diagnostics under -explain-cyclic [#20251](https://github.com/lampepfl/dotty/pull/20251) +- Update unreducible match types error reporting [#19954](https://github.com/lampepfl/dotty/pull/19954) +- Improve ConstraintHandling of SkolemTypes [#20175](https://github.com/lampepfl/dotty/pull/20175) + +## Incremental Compilation + +- Retain default parameters with `export` [#20167](https://github.com/lampepfl/dotty/pull/20167) + +## Inline + +- Fix by-name parameter in beta-reduction [#20096](https://github.com/lampepfl/dotty/pull/20096) +- Add warning for anonymous inline classes (#16723) [#20291](https://github.com/lampepfl/dotty/pull/20291) +- Avoid conversion of `Unit` type into `()` term [#20295](https://github.com/lampepfl/dotty/pull/20295) +- Type desugared `transparent inline def unapply` call in the correct mode [#20108](https://github.com/lampepfl/dotty/pull/20108) +- Regression: fix compilation performance on Windows [#20193](https://github.com/lampepfl/dotty/pull/20193) +- Fix inline match on blocks with multiple statements [#20125](https://github.com/lampepfl/dotty/pull/20125) +- Inline `unapply`s in the inlining phase [#19382](https://github.com/lampepfl/dotty/pull/19382) +- Fix outerSelect in Inliner [#20313](https://github.com/lampepfl/dotty/pull/20313) + +## Linting + +- Fix #20146: attach the original name if there is an import selection for an indent [#20163](https://github.com/lampepfl/dotty/pull/20163) +- Add regression test for issue 18632 [#20308](https://github.com/lampepfl/dotty/pull/20308) + +## Match Types + +- Make aliases of `MatchAlias`es normal `TypeAlias`es [#19871](https://github.com/lampepfl/dotty/pull/19871) +- Fix #19746: Do not follow param term refs in `isConcrete`. 
[#20015](https://github.com/lampepfl/dotty/pull/20015) +- Do match type reduction atPhaseNoLater than ElimOpaque [#20017](https://github.com/lampepfl/dotty/pull/20017) +- Do not flag match types as `Deferred` and amend #20077 [#20147](https://github.com/lampepfl/dotty/pull/20147) +- Always use baseType when constraining patternTp with scrutineeTp [#20032](https://github.com/lampepfl/dotty/pull/20032) +- Use `MirrorSource.reduce` result for `companionPath` [#20207](https://github.com/lampepfl/dotty/pull/20207) +- Regression: Fix match type extraction of a MatchAlias [#20111](https://github.com/lampepfl/dotty/pull/20111) + +## Polyfunctions + +- Discard poly-functions when trying to resolve overloading [#20181](https://github.com/lampepfl/dotty/pull/20181) + +## Presentation Compiler + +- Stabilise returned completions by improving deduplication + extra completions for constructors [#19976](https://github.com/lampepfl/dotty/pull/19976) +- Fix active param index for empty param lists [#20142](https://github.com/lampepfl/dotty/pull/20142) +- Delias type members in hover [#20173](https://github.com/lampepfl/dotty/pull/20173) +- Interactive: handle context bounds in extension construct workaround [#20201](https://github.com/lampepfl/dotty/pull/20201) +- Fix: prefer non-export definition locations [#20252](https://github.com/lampepfl/dotty/pull/20252) +- Don't show enum completions in new keyword context [#20304](https://github.com/lampepfl/dotty/pull/20304) +- Chore: Backport changes for presentation compiler [#20345](https://github.com/lampepfl/dotty/pull/20345) +- Add custom matchers for completions (fuzzy search for presentation compiler) [#19850](https://github.com/lampepfl/dotty/pull/19850) + +## Quotes + +- Fix TermRef prefixes not having their type healed [#20102](https://github.com/lampepfl/dotty/pull/20102) +- Improve reporting in staging about the possible use of an incorrect class loader [#20137](https://github.com/lampepfl/dotty/pull/20137) +- Introduce MethodTypeKind to quotes reflection API [#20249](https://github.com/lampepfl/dotty/pull/20249) +- Add quote ASTs to TASTy [#20165](https://github.com/lampepfl/dotty/pull/20165) + +## Reflection + +- Allow to beta reduce curried function applications in quotes reflect [#18121](https://github.com/lampepfl/dotty/pull/18121) +- Set the inlining phase in the Context used for checking macro trees [#20087](https://github.com/lampepfl/dotty/pull/20087) +- Add Symbol.isSuperAccessor to reflection API [#13388](https://github.com/lampepfl/dotty/pull/13388) +- Stabilize reflect `SymbolMethods.isSuperAccessor` [#20198](https://github.com/lampepfl/dotty/pull/20198) + +## Repl + +- Fix validity period of derived SingleDenotations [#19983](https://github.com/lampepfl/dotty/pull/19983) +- Fix #18383: Never consider top-level `import`s as unused in the repl. 
[#20310](https://github.com/lampepfl/dotty/pull/20310) + +## Reporting + +- Warn if extension receiver already has member [#17543](https://github.com/lampepfl/dotty/pull/17543) +- Deprecation of case class elements [#17911](https://github.com/lampepfl/dotty/pull/17911) +- Support src filter in -WConf (Closes #17635) [#18783](https://github.com/lampepfl/dotty/pull/18783) +- Add note about type mismatch in automatically inserted apply argument [#20023](https://github.com/lampepfl/dotty/pull/20023) +- Make error reporting resilient to exception thrown while reporting [#20158](https://github.com/lampepfl/dotty/pull/20158) +- Remove duplicate comma from Matchable selector warning [#20159](https://github.com/lampepfl/dotty/pull/20159) +- Generalize warnings for top-level calls to Any or AnyRef methods [#20312](https://github.com/lampepfl/dotty/pull/20312) +- Make CheckUnused not slow. [#20321](https://github.com/lampepfl/dotty/pull/20321) + +## Rewrites + +- Patch indentation when removing braces (and other bug fixes in `-indent -rewrite`) [#17522](https://github.com/lampepfl/dotty/pull/17522) +- Extra check to avoid converting block expressions on the rhs of an in… [#20043](https://github.com/lampepfl/dotty/pull/20043) + +## Scaladoc + +- Fix scaladoc crash on Windows - illegal path character [#20311](https://github.com/lampepfl/dotty/pull/20311) +- Scaladoc: improve refined function types rendering [#20333](https://github.com/lampepfl/dotty/pull/20333) +- Relax font-weight reset [#20348](https://github.com/lampepfl/dotty/pull/20348) + +## Scala JS + +- Optimize main.js [#20093](https://github.com/lampepfl/dotty/pull/20093) + +## Settings + +- Lift Scala Settings from experimental to stabilized [#20199](https://github.com/lampepfl/dotty/pull/20199) + +## Tooling + +- Detect macro dependencies that are missing from the classloader [#20139](https://github.com/lampepfl/dotty/pull/20139) +- Write pipelined tasty in parallel. 
[#20153](https://github.com/lampepfl/dotty/pull/20153)
+- ConsoleReporter sends INFO to stdout [#20328](https://github.com/lampepfl/dotty/pull/20328)
+
+## Transform
+
+- Fix overloaded default methods test in RefChecks [#20218](https://github.com/lampepfl/dotty/pull/20218)
+- Fix handling of AppliedType aliases in outerPrefix [#20190](https://github.com/lampepfl/dotty/pull/20190)
+- Elide unit binding when beta-reducing [#20085](https://github.com/lampepfl/dotty/pull/20085)
+
+## Typer
+
+- Reduce projections of type aliases with class type prefixes [#19931](https://github.com/lampepfl/dotty/pull/19931)
+- Re-lub also hard union types in simplify [#20027](https://github.com/lampepfl/dotty/pull/20027)
+- Fix #19789: Merge same TypeParamRef in orDominator [#20090](https://github.com/lampepfl/dotty/pull/20090)
+- Allow SAM types to contain match alias refinements [#20092](https://github.com/lampepfl/dotty/pull/20092)
+- Don't dealias when deciding which arguments to defer [#20116](https://github.com/lampepfl/dotty/pull/20116)
+- Avoid the TypeVar.inst trap [#20160](https://github.com/lampepfl/dotty/pull/20160)
+- Avoid crash when superType does not exist after erasure [#20188](https://github.com/lampepfl/dotty/pull/20188)
+- Refine overloading and implicit disambiguation [#20084](https://github.com/lampepfl/dotty/pull/20084)
+- Refactor constant folding of applications [#20099](https://github.com/lampepfl/dotty/pull/20099)
+- Rollback constraints if `isSameType` failed second direction [#20109](https://github.com/lampepfl/dotty/pull/20109)
+- Suppress "extension method will never be selected" for overrides [#20164](https://github.com/lampepfl/dotty/pull/20164)
+- Allow SAM types to contain multiple refinements [#20172](https://github.com/lampepfl/dotty/pull/20172)
+- Normalize when verifying if TypeTestCasts are unchecked [#20258](https://github.com/lampepfl/dotty/pull/20258)
+
+# Experimental Changes
+
+- Named tuples second implementation [#19174](https://github.com/lampepfl/dotty/pull/19174)
+- Change rules for given prioritization [#19300](https://github.com/lampepfl/dotty/pull/19300)
+- Enable experimental mode when experimental feature is imported [#19807](https://github.com/lampepfl/dotty/pull/19807)
+- Add message parameter to `@experimental` annotation [#19935](https://github.com/lampepfl/dotty/pull/19935)
+- Implement match type amendment: extractors follow aliases and singletons [#20161](https://github.com/lampepfl/dotty/pull/20161)
+
+## Capture Checking
+
+- Carry and check universal capability from parents correctly [#20004](https://github.com/lampepfl/dotty/pull/20004)
+- Make parameter types of context functions inferred type trees [#20155](https://github.com/lampepfl/dotty/pull/20155)
+- Handle reach capabilities correctly in dependent functions [#20203](https://github.com/lampepfl/dotty/pull/20203)
+- Fix the visibility check in `markFree` [#20221](https://github.com/lampepfl/dotty/pull/20221)
+- Make inline proxy vals have inferred types [#20241](https://github.com/lampepfl/dotty/pull/20241)
+- CC: Give more info when context function parameters leak [#20244](https://github.com/lampepfl/dotty/pull/20244)
+- Plug soundness hole for reach capabilities [#20051](https://github.com/lampepfl/dotty/pull/20051)
+- Tighten the screws a bit more to seal the soundness hole for reach capabilities 
[#20056](https://github.com/lampepfl/dotty/pull/20056) +- Drop retains annotations in inferred type trees [#20057](https://github.com/lampepfl/dotty/pull/20057) +- Allow @retains arguments to be context functions [#20232](https://github.com/lampepfl/dotty/pull/20232) +- Fix conversion of this.fld capture refs under separate compilation [#20238](https://github.com/lampepfl/dotty/pull/20238) + +## Erased definitions + +- Fix "Compiler crash when using CanThrow" [#20210](https://github.com/lampepfl/dotty/pull/20210) +- Only allow erased parameters in erased definitions [#19686](https://github.com/lampepfl/dotty/pull/19686) + +## Initialization + +- Deprecate `StandardPlugin.init` in favor of `initialize` method taking implicit Context [#20330](https://github.com/lampepfl/dotty/pull/20330) +- Fix missing changesParents in PostTyper [#20062](https://github.com/lampepfl/dotty/pull/20062) +- Special case for next field of colon colon in global init checker [#20281](https://github.com/lampepfl/dotty/pull/20281) +- Extend whitelist in global initialization checker [#20290](https://github.com/lampepfl/dotty/pull/20290) + +## Macro Annotations + +- Allow macro annotation to transform companion [#19677](https://github.com/lampepfl/dotty/pull/19677) +- Remove experimental `MainAnnotation`/`newMain` (replaced with `MacroAnnotation`) [#19937](https://github.com/lampepfl/dotty/pull/19937) + +## Nullability + +- Add flexible types to deal with Java-defined signatures under -Yexplicit-nulls [#18112](https://github.com/lampepfl/dotty/pull/18112) +- Fix #20287: Add flexible types to Quotes library [#20293](https://github.com/lampepfl/dotty/pull/20293) +- Add fromNullable to Predef for explicit nulls [#20222](https://github.com/lampepfl/dotty/pull/20222) + + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.2..3.5.0-RC1` these are: + +``` + 137 Martin Odersky + 51 Eugene Flesselle + 32 Jamie Thompson + 25 Nicolas Stucki + 22 Sébastien Doeraene + 18 noti0na1 + 16 Matt Bovel + 12 Guillaume Martres + 9 Paweł Marks + 9 Yichen Xu + 8 Jan Chyb + 7 Hamza REMMAL + 6 Jędrzej Rochala + 6 Som Snytt + 5 Fengyun Liu + 5 dependabot[bot] + 3 Mikołaj Fornal + 2 Aviv Keller + 2 EnzeXing + 2 Wojciech Mazur + 1 Chris Pado + 1 Filip Zybała + 1 Georgi Krastev + 1 Hamza Remmal + 1 Jisoo Park + 1 Katarzyna Marek + 1 Lucas Nouguier + 1 Lucy Martin + 1 Ola Flisbäck + 1 Pascal Weisenburger + 1 Quentin Bernet + 1 Raphael Jolly + 1 Stephane Bersier + 1 Tomasz Godzik + 1 Yoonjae Jeon + 1 aherlihy + 1 rochala + 1 willerf +``` From a15fc7d5f3d793cf6aad50565aab73f309b0b859 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Tue, 14 May 2024 14:38:52 +0200 Subject: [PATCH 263/371] Release 3.5.0-RC1 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 7656cb545413..0876353a6a2f 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -87,7 +87,7 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.4.2-RC1" + val referenceVersion = "3.4.2" val baseVersion = "3.5.0-RC1" From 4992e3740bc2cd1ddd07673060844f3b90f0e866 Mon Sep 17 00:00:00 2001 From: Guillaume Martres Date: Tue, 14 May 2024 22:23:10 +0200 Subject: [PATCH 264/371] Backport: Avoid forcing whole package when using `-experimental` This backports 
https://github.com/scala/scala3/pull/20409 which fixes a regression introduced in 3.5.0-RC1 causing compiler crashes when enabling `-experimental`. --- compiler/src/dotty/tools/dotc/typer/Checking.scala | 3 ++- sbt-test/java-compat/moduleInfo/A.scala | 2 ++ sbt-test/java-compat/moduleInfo/build.sbt | 5 +++++ sbt-test/java-compat/moduleInfo/test | 1 + 4 files changed, 10 insertions(+), 1 deletion(-) create mode 100644 sbt-test/java-compat/moduleInfo/A.scala create mode 100644 sbt-test/java-compat/moduleInfo/build.sbt create mode 100644 sbt-test/java-compat/moduleInfo/test diff --git a/compiler/src/dotty/tools/dotc/typer/Checking.scala b/compiler/src/dotty/tools/dotc/typer/Checking.scala index 073055ba5b58..1f82b9ddc084 100644 --- a/compiler/src/dotty/tools/dotc/typer/Checking.scala +++ b/compiler/src/dotty/tools/dotc/typer/Checking.scala @@ -806,10 +806,11 @@ object Checking { def checkAndAdaptExperimentalImports(trees: List[Tree])(using Context): Unit = def nonExperimentalTopLevelDefs(pack: Symbol): Iterator[Symbol] = def isNonExperimentalTopLevelDefinition(sym: Symbol) = - !sym.isExperimental + sym.isDefinedInCurrentRun && sym.source == ctx.compilationUnit.source && !sym.isConstructor // not constructor of package object && !sym.is(Package) && !sym.name.isPackageObjectName + && !sym.isExperimental pack.info.decls.toList.iterator.flatMap: sym => if sym.isClass && (sym.is(Package) || sym.isPackageObject) then diff --git a/sbt-test/java-compat/moduleInfo/A.scala b/sbt-test/java-compat/moduleInfo/A.scala new file mode 100644 index 000000000000..4b46ae7047d6 --- /dev/null +++ b/sbt-test/java-compat/moduleInfo/A.scala @@ -0,0 +1,2 @@ +// Previously, we crashed trying to parse module-info.class in the empty package. +class A diff --git a/sbt-test/java-compat/moduleInfo/build.sbt b/sbt-test/java-compat/moduleInfo/build.sbt new file mode 100644 index 000000000000..a0308b6cb83a --- /dev/null +++ b/sbt-test/java-compat/moduleInfo/build.sbt @@ -0,0 +1,5 @@ +scalaVersion := sys.props("plugin.scalaVersion") + +scalacOptions ++= Seq( + "-experimental" +) diff --git a/sbt-test/java-compat/moduleInfo/test b/sbt-test/java-compat/moduleInfo/test new file mode 100644 index 000000000000..5df2af1f3956 --- /dev/null +++ b/sbt-test/java-compat/moduleInfo/test @@ -0,0 +1 @@ +> compile From 7885c247391c19a3b0c00c3aeda2df477b411fad Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 17 Jun 2024 10:54:07 +0200 Subject: [PATCH 265/371] This reverts one part of #20261. When we fail with both an ambiguity on one implicit argument and another error on another argument we prefer the other error. I added a comment why this is needed. --- .../src/dotty/tools/dotc/typer/Typer.scala | 9 +++++- tests/pos/i20344.scala | 28 +++++++++++++++++++ 2 files changed, 36 insertions(+), 1 deletion(-) create mode 100644 tests/pos/i20344.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 2a69c948baae..ae50d626cb1f 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -4113,7 +4113,14 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer * `SearchFailureType`. */ def issueErrors(fun: Tree, args: List[Tree]): Tree = - def firstFailure = args.tpes.find(_.isInstanceOf[SearchFailureType]).getOrElse(NoType) + // Prefer other errors over ambiguities. If nested in outer searches a missing + // implicit can be healed by simply dropping this alternative and trying something + // else. 
But an ambiguity is sticky and propagates outwards. If we have both + // a missing implicit on one argument and an ambiguity on another the whole + // branch should be classified as a missing implicit. + val firstNonAmbiguous = args.tpes.find(tp => tp.isError && !tp.isInstanceOf[AmbiguousImplicits]) + def firstError = args.tpes.find(_.isInstanceOf[SearchFailureType]).getOrElse(NoType) + def firstFailure = firstNonAmbiguous.getOrElse(firstError) val errorType = firstFailure match case tp: AmbiguousImplicits => diff --git a/tests/pos/i20344.scala b/tests/pos/i20344.scala new file mode 100644 index 000000000000..d3b2a060d6e2 --- /dev/null +++ b/tests/pos/i20344.scala @@ -0,0 +1,28 @@ +trait Monad[F[_]] extends Invariant[F] + +trait Invariant[F[_]] +object Invariant: + implicit def catsInstancesForList: Monad[List] = ??? + implicit def catsInstancesForVector: Monad[Vector] = ??? + +trait Shrink[T] +object Shrink extends ShrinkLowPriorityImplicits: + trait Buildable[T,C] + implicit def shrinkContainer[C[_],T](implicit v: C[T] => Traversable[T], s: Shrink[T], b: Buildable[T,C[T]]): Shrink[C[T]] = ??? +trait ShrinkLowPriorityImplicits: + implicit def shrinkAny[T]: Shrink[T] = ??? + +trait Distribution[F[_], -P, X] extends (P => F[X]) +type GenBeta[A, B, X] = [F[_]] =>> Distribution[F, Beta.Params[A, B], X] +type Beta[R] = [F[_]] =>> GenBeta[R, R, R][F] + +object Beta: + trait Params[+A, +B] +trait BetaInstances: + given schrodingerRandomBetaForDouble[F[_]: Monad]: Beta[Double][F] = ??? + +object all extends BetaInstances + +@main def Test = + import all.given + summon[Shrink[Beta.Params[Double, Double]]] \ No newline at end of file From f913d89129259459d7c1d29901c11fbb7f2d092f Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 17 Jun 2024 11:54:25 +0200 Subject: [PATCH 266/371] Treat 3.5-migration the same as 3.5 for a warning about implicit priority change Fixes #20420 --- .../dotty/tools/dotc/typer/Implicits.scala | 4 +-- tests/warn/i20420.scala | 27 +++++++++++++++++++ 2 files changed, 29 insertions(+), 2 deletions(-) create mode 100644 tests/warn/i20420.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index fd22f0ec5529..54821444aed6 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1311,14 +1311,14 @@ trait Implicits: else var cmp = comp(using searchContext()) val sv = Feature.sourceVersion - if sv == SourceVersion.`3.5` || sv == SourceVersion.`3.6-migration` then + if sv.stable == SourceVersion.`3.5` || sv == SourceVersion.`3.6-migration` then val prev = comp(using searchContext().addMode(Mode.OldImplicitResolution)) if cmp != prev then def choice(c: Int) = c match case -1 => "the second alternative" case 1 => "the first alternative" case _ => "none - it's ambiguous" - if sv == SourceVersion.`3.5` then + if sv.stable == SourceVersion.`3.5` then report.warning( em"""Given search preference for $pt between alternatives ${alt1.ref} and ${alt2.ref} will change |Current choice : ${choice(prev)} diff --git a/tests/warn/i20420.scala b/tests/warn/i20420.scala new file mode 100644 index 000000000000..d28270509f91 --- /dev/null +++ b/tests/warn/i20420.scala @@ -0,0 +1,27 @@ +//> using options -source 3.5-migration + +final class StrictEqual[V] +final class Less[V] +type LessEqual[V] = Less[V] | StrictEqual[V] + +object TapirCodecIron: + trait ValidatorForPredicate[Value, Predicate] + trait PrimitiveValidatorForPredicate[Value, Predicate] + 
extends ValidatorForPredicate[Value, Predicate] + + given validatorForLessEqual[N: Numeric, NM <: N](using + ValueOf[NM] + ): PrimitiveValidatorForPredicate[N, LessEqual[NM]] = ??? + given validatorForDescribedOr[N, P](using + IsDescription[P] + ): ValidatorForPredicate[N, P] = ??? + + trait IsDescription[A] + object IsDescription: + given derived[A]: IsDescription[A] = ??? + +@main def Test = { + import TapirCodecIron.{*, given} + type IntConstraint = LessEqual[3] + summon[ValidatorForPredicate[Int, IntConstraint]] // warn +} \ No newline at end of file From 0626b972a961910d6654b95835923fa1d560d6f6 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 17 Jun 2024 12:54:32 +0200 Subject: [PATCH 267/371] Set default source version to 3.5 --- community-build/community-projects/Lucre | 2 +- community-build/community-projects/Monocle | 2 +- community-build/community-projects/akka | 2 +- community-build/community-projects/cask | 2 +- .../community-projects/endpoints4s | 2 +- .../community-projects/izumi-reflect | 2 +- community-build/community-projects/os-lib | 2 +- community-build/community-projects/scalatest | 2 +- community-build/community-projects/scalaz | 2 +- community-build/community-projects/scas | 2 +- community-build/community-projects/spire | 2 +- community-build/community-projects/upickle | 2 +- community-build/community-projects/utest | 2 +- .../tools/dotc/config/SourceVersion.scala | 2 +- library/src/scala/quoted/ToExpr.scala | 2 +- .../src/scala/Array.scala | 690 +++++++ .../src/scala/collection/ArrayOps.scala | 1664 +++++++++++++++++ .../src/scala/collection/Factory.scala | 784 ++++++++ .../src/scala/collection/Iterable.scala | 1043 +++++++++++ .../src/scala/collection/SortedMap.scala | 220 +++ .../StrictOptimizedSortedMapOps.scala | 46 + .../generic/DefaultSerializationProxy.scala | 87 + .../scala/collection/mutable/ArraySeq.scala | 354 ++++ .../mutable/CollisionProofHashMap.scala | 888 +++++++++ scala2-library-cc/src/scala/Array.scala | 690 +++++++ .../src/scala/collection/ArrayOps.scala | 4 +- .../src/scala/collection/Factory.scala | 20 +- .../src/scala/collection/Iterable.scala | 12 +- .../src/scala/collection/SortedMap.scala | 6 +- .../StrictOptimizedSortedMapOps.scala | 2 +- .../generic/DefaultSerializationProxy.scala | 4 +- .../scala/collection/mutable/ArraySeq.scala | 6 +- .../mutable/CollisionProofHashMap.scala | 2 +- tests/neg/given-loop-prevention.check | 14 + tests/neg/given-loop-prevention.scala | 12 + tests/neg/i6716.check | 6 +- tests/neg/i6716.scala | 4 +- tests/neg/i7294-a.check | 27 - tests/neg/i7294-a.scala | 14 - tests/neg/i7294-b.scala | 12 - tests/neg/i7294.check | 25 + tests/neg/i7294.scala | 10 + tests/neg/looping-givens.check | 48 + tests/neg/looping-givens.scala | 11 + .../CollectionStrawMan6.scala | 4 +- tests/pos/extmethods.scala | 2 +- tests/pos/given-loop-prevention.scala | 14 - tests/pos/i17245.scala | 2 +- tests/pos/i9967.scala | 2 +- tests/pos/t5643.scala | 2 +- .../run/colltest6/CollectionStrawMan6_1.scala | 4 +- tests/run/i502.scala | 6 +- tests/run/t2029.scala | 2 +- tests/run/t3326.scala | 8 +- .../expect/InventedNames.expect.scala | 2 +- tests/semanticdb/expect/InventedNames.scala | 2 +- tests/semanticdb/metac.expect | 12 +- tests/warn/context-bounds-migration.scala | 9 - tests/warn/i15474.scala | 2 +- tests/warn/looping-givens.check | 45 + tests/warn/looping-givens.scala | 2 + 61 files changed, 6702 insertions(+), 153 deletions(-) create mode 100644 scala2-library-bootstrapped/src/scala/Array.scala create mode 100644 
scala2-library-bootstrapped/src/scala/collection/ArrayOps.scala create mode 100644 scala2-library-bootstrapped/src/scala/collection/Factory.scala create mode 100644 scala2-library-bootstrapped/src/scala/collection/Iterable.scala create mode 100644 scala2-library-bootstrapped/src/scala/collection/SortedMap.scala create mode 100644 scala2-library-bootstrapped/src/scala/collection/StrictOptimizedSortedMapOps.scala create mode 100644 scala2-library-bootstrapped/src/scala/collection/generic/DefaultSerializationProxy.scala create mode 100644 scala2-library-bootstrapped/src/scala/collection/mutable/ArraySeq.scala create mode 100644 scala2-library-bootstrapped/src/scala/collection/mutable/CollisionProofHashMap.scala create mode 100644 scala2-library-cc/src/scala/Array.scala create mode 100644 tests/neg/given-loop-prevention.check create mode 100644 tests/neg/given-loop-prevention.scala delete mode 100644 tests/neg/i7294-a.check delete mode 100644 tests/neg/i7294-a.scala delete mode 100644 tests/neg/i7294-b.scala create mode 100644 tests/neg/i7294.check create mode 100644 tests/neg/i7294.scala create mode 100644 tests/neg/looping-givens.check create mode 100644 tests/neg/looping-givens.scala delete mode 100644 tests/pos/given-loop-prevention.scala delete mode 100644 tests/warn/context-bounds-migration.scala create mode 100644 tests/warn/looping-givens.check diff --git a/community-build/community-projects/Lucre b/community-build/community-projects/Lucre index 1008f0b7f513..21a27a294ac7 160000 --- a/community-build/community-projects/Lucre +++ b/community-build/community-projects/Lucre @@ -1 +1 @@ -Subproject commit 1008f0b7f51374ddbc947e677c505fa97677b7d4 +Subproject commit 21a27a294ac7c413f80839d96a02942b2c6d021c diff --git a/community-build/community-projects/Monocle b/community-build/community-projects/Monocle index a0e70744e9b3..b303aa3b98d9 160000 --- a/community-build/community-projects/Monocle +++ b/community-build/community-projects/Monocle @@ -1 +1 @@ -Subproject commit a0e70744e9b3bfb0f12e4ea292151c49c3302cd1 +Subproject commit b303aa3b98d9a10c3f77a56765ca5be2f3cc51f7 diff --git a/community-build/community-projects/akka b/community-build/community-projects/akka index 79b294048f89..ee0ac854f36f 160000 --- a/community-build/community-projects/akka +++ b/community-build/community-projects/akka @@ -1 +1 @@ -Subproject commit 79b294048f893d9d6b9332618f7aebedce9a5340 +Subproject commit ee0ac854f36f537bf3062fd4e9d9f2ff5c1de4c9 diff --git a/community-build/community-projects/cask b/community-build/community-projects/cask index d5fa6d47da5e..2db6020a2d11 160000 --- a/community-build/community-projects/cask +++ b/community-build/community-projects/cask @@ -1 +1 @@ -Subproject commit d5fa6d47da5ea99d94887fafd555696ba07aa205 +Subproject commit 2db6020a2d11566d504ae9af4de28c7a6e20b7ed diff --git a/community-build/community-projects/endpoints4s b/community-build/community-projects/endpoints4s index 3a667a3608ff..b004d1388872 160000 --- a/community-build/community-projects/endpoints4s +++ b/community-build/community-projects/endpoints4s @@ -1 +1 @@ -Subproject commit 3a667a3608ff9950c24e9b2b5038c71c1690a21d +Subproject commit b004d13888723de9f6a86f560137fc31e22edcb6 diff --git a/community-build/community-projects/izumi-reflect b/community-build/community-projects/izumi-reflect index c0756faa7311..2c7e4a69c386 160000 --- a/community-build/community-projects/izumi-reflect +++ b/community-build/community-projects/izumi-reflect @@ -1 +1 @@ -Subproject commit c0756faa7311f70c6da6af29b8cb25506634bf09 
+Subproject commit 2c7e4a69c386201e479584333a84ce018fef1795 diff --git a/community-build/community-projects/os-lib b/community-build/community-projects/os-lib index a4400deb3bec..4c8c82b23d76 160000 --- a/community-build/community-projects/os-lib +++ b/community-build/community-projects/os-lib @@ -1 +1 @@ -Subproject commit a4400deb3bec415fd82d331fc1f8b749f3d64e60 +Subproject commit 4c8c82b23d767bc927290829514b8de7148052d9 diff --git a/community-build/community-projects/scalatest b/community-build/community-projects/scalatest index d430625d9621..d6eeedbfc1e0 160000 --- a/community-build/community-projects/scalatest +++ b/community-build/community-projects/scalatest @@ -1 +1 @@ -Subproject commit d430625d96218c9031b1434cc0c2110f3740fa1c +Subproject commit d6eeedbfc1e04f2eff55506f07f93f448cc21407 diff --git a/community-build/community-projects/scalaz b/community-build/community-projects/scalaz index 97cccf3b3fcb..868749fdb951 160000 --- a/community-build/community-projects/scalaz +++ b/community-build/community-projects/scalaz @@ -1 +1 @@ -Subproject commit 97cccf3b3fcb71885a32b2e567171c0f70b06104 +Subproject commit 868749fdb951909bb04bd6dd7ad2cd89295fd439 diff --git a/community-build/community-projects/scas b/community-build/community-projects/scas index fbccb263207b..acaad1055738 160000 --- a/community-build/community-projects/scas +++ b/community-build/community-projects/scas @@ -1 +1 @@ -Subproject commit fbccb263207b3a7b735b8a9dc312acf7368a0816 +Subproject commit acaad1055738dbbcae7b18e6c6c2fc95f06eb7d6 diff --git a/community-build/community-projects/spire b/community-build/community-projects/spire index bc524eeea735..d60fe2c38848 160000 --- a/community-build/community-projects/spire +++ b/community-build/community-projects/spire @@ -1 +1 @@ -Subproject commit bc524eeea735a3cf4d5108039f95950b024a14e4 +Subproject commit d60fe2c38848ef193031c18eab3a14d3306b3761 diff --git a/community-build/community-projects/upickle b/community-build/community-projects/upickle index aa3bc0e43ec7..0c09bbcabc66 160000 --- a/community-build/community-projects/upickle +++ b/community-build/community-projects/upickle @@ -1 +1 @@ -Subproject commit aa3bc0e43ec7b618eb087753878f3d845e58277a +Subproject commit 0c09bbcabc664abf98462022fc9036a366135e70 diff --git a/community-build/community-projects/utest b/community-build/community-projects/utest index eae17c7a4d0d..f4a9789e2750 160000 --- a/community-build/community-projects/utest +++ b/community-build/community-projects/utest @@ -1 +1 @@ -Subproject commit eae17c7a4d0d63bab1406ca75791d3cb6394233d +Subproject commit f4a9789e2750523feee4a3477efb42eb15424fc7 diff --git a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala index 3a44021af2df..935b95003729 100644 --- a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala @@ -28,7 +28,7 @@ enum SourceVersion: def isAtMost(v: SourceVersion) = stable.ordinal <= v.ordinal object SourceVersion extends Property.Key[SourceVersion]: - def defaultSourceVersion = `3.4` + def defaultSourceVersion = `3.5` /** language versions that may appear in a language import, are deprecated, but not removed from the standard library. 
*/ val illegalSourceVersionNames = List("3.1-migration").map(_.toTermName) diff --git a/library/src/scala/quoted/ToExpr.scala b/library/src/scala/quoted/ToExpr.scala index 042c8ff37a52..6c167c353d87 100644 --- a/library/src/scala/quoted/ToExpr.scala +++ b/library/src/scala/quoted/ToExpr.scala @@ -97,7 +97,7 @@ object ToExpr { /** Default implementation of `ToExpr[Array[T]]` */ given ArrayToExpr[T: Type: ToExpr: ClassTag]: ToExpr[Array[T]] with { def apply(arr: Array[T])(using Quotes): Expr[Array[T]] = - '{ Array[T](${Expr(arr.toSeq)}*)(${Expr(summon[ClassTag[T]])}) } + '{ Array[T](${Expr(arr.toSeq)}*)(using ${Expr(summon[ClassTag[T]])}) } } /** Default implementation of `ToExpr[Array[Boolean]]` */ diff --git a/scala2-library-bootstrapped/src/scala/Array.scala b/scala2-library-bootstrapped/src/scala/Array.scala new file mode 100644 index 000000000000..d2098a76f32f --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/Array.scala @@ -0,0 +1,690 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala + +//import scala.collection.generic._ +import scala.collection.{Factory, immutable, mutable} +import mutable.ArrayBuilder +import immutable.ArraySeq +import scala.language.implicitConversions +import scala.reflect.{ClassTag, classTag} +import scala.runtime.BoxedUnit +import scala.runtime.ScalaRunTime +import scala.runtime.ScalaRunTime.{array_apply, array_update} + +/** Utility methods for operating on arrays. + * For example: + * {{{ + * val a = Array(1, 2) + * val b = Array.ofDim[Int](2) + * val c = Array.concat(a, b) + * }}} + * where the array objects `a`, `b` and `c` have respectively the values + * `Array(1, 2)`, `Array(0, 0)` and `Array(1, 2, 0, 0)`. + */ +object Array { + val emptyBooleanArray = new Array[Boolean](0) + val emptyByteArray = new Array[Byte](0) + val emptyCharArray = new Array[Char](0) + val emptyDoubleArray = new Array[Double](0) + val emptyFloatArray = new Array[Float](0) + val emptyIntArray = new Array[Int](0) + val emptyLongArray = new Array[Long](0) + val emptyShortArray = new Array[Short](0) + val emptyObjectArray = new Array[Object](0) + + /** Provides an implicit conversion from the Array object to a collection Factory */ + implicit def toFactory[A : ClassTag](dummy: Array.type): Factory[A, Array[A]] = new ArrayFactory(dummy) + @SerialVersionUID(3L) + private class ArrayFactory[A : ClassTag](dummy: Array.type) extends Factory[A, Array[A]] with Serializable { + def fromSpecific(it: IterableOnce[A]): Array[A] = Array.from[A](it) + def newBuilder: mutable.Builder[A, Array[A]] = Array.newBuilder[A] + } + + /** + * Returns a new [[scala.collection.mutable.ArrayBuilder]]. + */ + def newBuilder[T](implicit t: ClassTag[T]): ArrayBuilder[T] = ArrayBuilder.make[T](using t) + + /** Build an array from the iterable collection. 
+ * + * {{{ + * scala> val a = Array.from(Seq(1, 5)) + * val a: Array[Int] = Array(1, 5) + * + * scala> val b = Array.from(Range(1, 5)) + * val b: Array[Int] = Array(1, 2, 3, 4) + * }}} + * + * @param it the iterable collection + * @return an array consisting of elements of the iterable collection + */ + def from[A : ClassTag](it: IterableOnce[A]): Array[A] = it match { + case it: Iterable[A] => it.toArray[A] + case _ => it.iterator.toArray[A] + } + + private def slowcopy(src : AnyRef, + srcPos : Int, + dest : AnyRef, + destPos : Int, + length : Int): Unit = { + var i = srcPos + var j = destPos + val srcUntil = srcPos + length + while (i < srcUntil) { + array_update(dest, j, array_apply(src, i)) + i += 1 + j += 1 + } + } + + /** Copy one array to another. + * Equivalent to Java's + * `System.arraycopy(src, srcPos, dest, destPos, length)`, + * except that this also works for polymorphic and boxed arrays. + * + * Note that the passed-in `dest` array will be modified by this call. + * + * @param src the source array. + * @param srcPos starting position in the source array. + * @param dest destination array. + * @param destPos starting position in the destination array. + * @param length the number of array elements to be copied. + * + * @see `java.lang.System#arraycopy` + */ + def copy(src: AnyRef, srcPos: Int, dest: AnyRef, destPos: Int, length: Int): Unit = { + val srcClass = src.getClass + if (srcClass.isArray && dest.getClass.isAssignableFrom(srcClass)) + java.lang.System.arraycopy(src, srcPos, dest, destPos, length) + else + slowcopy(src, srcPos, dest, destPos, length) + } + + /** Copy one array to another, truncating or padding with default values (if + * necessary) so the copy has the specified length. + * + * Equivalent to Java's + * `java.util.Arrays.copyOf(original, newLength)`, + * except that this works for primitive and object arrays in a single method. + * + * @see `java.util.Arrays#copyOf` + */ + def copyOf[A](original: Array[A], newLength: Int): Array[A] = ((original: @unchecked) match { + case x: Array[BoxedUnit] => newUnitArray(newLength).asInstanceOf[Array[A]] + case x: Array[AnyRef] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Int] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Double] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Long] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Float] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Char] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Byte] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Short] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Boolean] => java.util.Arrays.copyOf(x, newLength) + }).asInstanceOf[Array[A]] + + /** Copy one array to another, truncating or padding with default values (if + * necessary) so the copy has the specified length. The new array can have + * a different type than the original one as long as the values are + * assignment-compatible. When copying between primitive and object arrays, + * boxing and unboxing are supported. + * + * Equivalent to Java's + * `java.util.Arrays.copyOf(original, newLength, newType)`, + * except that this works for all combinations of primitive and object arrays + * in a single method. 
+ * + * @see `java.util.Arrays#copyOf` + */ + def copyAs[A](original: Array[_], newLength: Int)(implicit ct: ClassTag[A]): Array[A] = { + val runtimeClass = ct.runtimeClass + if (runtimeClass == Void.TYPE) newUnitArray(newLength).asInstanceOf[Array[A]] + else { + val destClass = runtimeClass.asInstanceOf[Class[A]] + if (destClass.isAssignableFrom(original.getClass.getComponentType)) { + if (destClass.isPrimitive) copyOf[A](original.asInstanceOf[Array[A]], newLength) + else { + val destArrayClass = java.lang.reflect.Array.newInstance(destClass, 0).getClass.asInstanceOf[Class[Array[AnyRef]]] + java.util.Arrays.copyOf(original.asInstanceOf[Array[AnyRef]], newLength, destArrayClass).asInstanceOf[Array[A]] + } + } else { + val dest = new Array[A](newLength) + Array.copy(original, 0, dest, 0, original.length) + dest + } + } + } + + private def newUnitArray(len: Int): Array[Unit] = { + val result = new Array[Unit](len) + java.util.Arrays.fill(result.asInstanceOf[Array[AnyRef]], ()) + result + } + + /** Returns an array of length 0 */ + def empty[T: ClassTag]: Array[T] = new Array[T](0) + + /** Creates an array with given elements. + * + * @param xs the elements to put in the array + * @return an array containing all elements from xs. + */ + // Subject to a compiler optimization in Cleanup. + // Array(e0, ..., en) is translated to { val a = new Array(3); a(i) = ei; a } + def apply[T: ClassTag](xs: T*): Array[T] = { + val len = xs.length + xs match { + case wa: immutable.ArraySeq[_] if wa.unsafeArray.getClass.getComponentType == classTag[T].runtimeClass => + // We get here in test/files/run/sd760a.scala, `Array[T](t)` for + // a specialized type parameter `T`. While we still pay for two + // copies of the array it is better than before when we also boxed + // each element when populating the result. + ScalaRunTime.array_clone(wa.unsafeArray).asInstanceOf[Array[T]] + case _ => + val array = new Array[T](len) + val iterator = xs.iterator + var i = 0 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + } + + /** Creates an array of `Boolean` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Boolean, xs: Boolean*): Array[Boolean] = { + val array = new Array[Boolean](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Byte` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Byte, xs: Byte*): Array[Byte] = { + val array = new Array[Byte](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Short` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Short, xs: Short*): Array[Short] = { + val array = new Array[Short](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Char` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Char, xs: Char*): Array[Char] = { + val array = new Array[Char](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Int` objects */ + // Subject to a compiler optimization in Cleanup, see above. 
+ def apply(x: Int, xs: Int*): Array[Int] = { + val array = new Array[Int](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Long` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Long, xs: Long*): Array[Long] = { + val array = new Array[Long](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Float` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Float, xs: Float*): Array[Float] = { + val array = new Array[Float](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Double` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Double, xs: Double*): Array[Double] = { + val array = new Array[Double](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Unit` objects */ + def apply(x: Unit, xs: Unit*): Array[Unit] = { + val array = new Array[Unit](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates array with given dimensions */ + def ofDim[T: ClassTag](n1: Int): Array[T] = + new Array[T](n1) + /** Creates a 2-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int): Array[Array[T]] = { + val arr: Array[Array[T]] = (new Array[Array[T]](n1): Array[Array[T]]) + for (i <- 0 until n1) arr(i) = new Array[T](n2) + arr + // tabulate(n1)(_ => ofDim[T](n2)) + } + /** Creates a 3-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int, n3: Int): Array[Array[Array[T]]] = + tabulate(n1)(_ => ofDim[T](n2, n3)) + /** Creates a 4-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int): Array[Array[Array[Array[T]]]] = + tabulate(n1)(_ => ofDim[T](n2, n3, n4)) + /** Creates a 5-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int): Array[Array[Array[Array[Array[T]]]]] = + tabulate(n1)(_ => ofDim[T](n2, n3, n4, n5)) + + /** Concatenates all arrays into a single array. + * + * @param xss the given arrays + * @return the array created from concatenating `xss` + */ + def concat[T: ClassTag](xss: Array[T]*): Array[T] = { + val b = newBuilder[T] + b.sizeHint(xss.map(_.length).sum) + for (xs <- xss) b ++= xs + b.result() + } + + /** Returns an array that contains the results of some element computation a number + * of times. + * + * Note that this means that `elem` is computed a total of n times: + * {{{ + * scala> Array.fill(3){ math.random } + * res3: Array[Double] = Array(0.365461167592537, 1.550395944913685E-4, 0.7907242137333306) + * }}} + * + * @param n the number of elements desired + * @param elem the element computation + * @return an Array of size n, where each element contains the result of computing + * `elem`. 
+ */ + def fill[T: ClassTag](n: Int)(elem: => T): Array[T] = { + if (n <= 0) { + empty[T] + } else { + val array = new Array[T](n) + var i = 0 + while (i < n) { + array(i) = elem + i += 1 + } + array + } + } + + /** Returns a two-dimensional array that contains the results of some element + * computation a number of times. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int)(elem: => T): Array[Array[T]] = + tabulate(n1)(_ => fill(n2)(elem)) + + /** Returns a three-dimensional array that contains the results of some element + * computation a number of times. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int, n3: Int)(elem: => T): Array[Array[Array[T]]] = + tabulate(n1)(_ => fill(n2, n3)(elem)) + + /** Returns a four-dimensional array that contains the results of some element + * computation a number of times. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int)(elem: => T): Array[Array[Array[Array[T]]]] = + tabulate(n1)(_ => fill(n2, n3, n4)(elem)) + + /** Returns a five-dimensional array that contains the results of some element + * computation a number of times. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(elem: => T): Array[Array[Array[Array[Array[T]]]]] = + tabulate(n1)(_ => fill(n2, n3, n4, n5)(elem)) + + /** Returns an array containing values of a given function over a range of integer + * values starting from 0. + * + * @param n The number of elements in the array + * @param f The function computing element values + * @return An `Array` consisting of elements `f(0),f(1), ..., f(n - 1)` + */ + def tabulate[T: ClassTag](n: Int)(f: Int => T): Array[T] = { + if (n <= 0) { + empty[T] + } else { + val array = new Array[T](n) + var i = 0 + while (i < n) { + array(i) = f(i) + i += 1 + } + array + } + } + + /** Returns a two-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int)(f: (Int, Int) => T): Array[Array[T]] = + tabulate(n1)(i1 => tabulate(n2)(f(i1, _))) + + /** Returns a three-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. 
+ * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int, n3: Int)(f: (Int, Int, Int) => T): Array[Array[Array[T]]] = + tabulate(n1)(i1 => tabulate(n2, n3)(f(i1, _, _))) + + /** Returns a four-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int)(f: (Int, Int, Int, Int) => T): Array[Array[Array[Array[T]]]] = + tabulate(n1)(i1 => tabulate(n2, n3, n4)(f(i1, _, _, _))) + + /** Returns a five-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(f: (Int, Int, Int, Int, Int) => T): Array[Array[Array[Array[Array[T]]]]] = + tabulate(n1)(i1 => tabulate(n2, n3, n4, n5)(f(i1, _, _, _, _))) + + /** Returns an array containing a sequence of increasing integers in a range. + * + * @param start the start value of the array + * @param end the end value of the array, exclusive (in other words, this is the first value '''not''' returned) + * @return the array with values in range `start, start + 1, ..., end - 1` + * up to, but excluding, `end`. + */ + def range(start: Int, end: Int): Array[Int] = range(start, end, 1) + + /** Returns an array containing equally spaced values in some integer interval. + * + * @param start the start value of the array + * @param end the end value of the array, exclusive (in other words, this is the first value '''not''' returned) + * @param step the increment value of the array (may not be zero) + * @return the array with values in `start, start + step, ...` up to, but excluding `end` + */ + def range(start: Int, end: Int, step: Int): Array[Int] = { + if (step == 0) throw new IllegalArgumentException("zero step") + val array = new Array[Int](immutable.Range.count(start, end, step, isInclusive = false)) + + var n = 0 + var i = start + while (if (step < 0) end < i else i < end) { + array(n) = i + i += step + n += 1 + } + array + } + + /** Returns an array containing repeated applications of a function to a start value. + * + * @param start the start value of the array + * @param len the number of elements returned by the array + * @param f the function that is repeatedly applied + * @return the array returning `len` values in the sequence `start, f(start), f(f(start)), ...` + */ + def iterate[T: ClassTag](start: T, len: Int)(f: T => T): Array[T] = { + if (len > 0) { + val array = new Array[T](len) + var acc = start + var i = 1 + array(0) = acc + + while (i < len) { + acc = f(acc) + array(i) = acc + i += 1 + } + array + } else { + empty[T] + } + } + + /** Compare two arrays per element. 
+ * + * A more efficient version of `xs.sameElements(ys)`. + * + * Note that arrays are invariant in Scala, but it may + * be sound to cast an array of arbitrary reference type + * to `Array[AnyRef]`. Arrays on the JVM are covariant + * in their element type. + * + * `Array.equals(xs.asInstanceOf[Array[AnyRef]], ys.asInstanceOf[Array[AnyRef]])` + * + * @param xs an array of AnyRef + * @param ys an array of AnyRef + * @return true if corresponding elements are equal + */ + def equals(xs: Array[AnyRef], ys: Array[AnyRef]): Boolean = + (xs eq ys) || + (xs.length == ys.length) && { + var i = 0 + while (i < xs.length && xs(i) == ys(i)) i += 1 + i >= xs.length + } + + /** Called in a pattern match like `{ case Array(x,y,z) => println('3 elements')}`. + * + * @param x the selector value + * @return sequence wrapped in a [[scala.Some]], if `x` is an Array, otherwise `None` + */ + def unapplySeq[T](x: Array[T]): UnapplySeqWrapper[T] = new UnapplySeqWrapper(x) + + final class UnapplySeqWrapper[T](private val a: Array[T]) extends AnyVal { + def isEmpty: false = false + def get: UnapplySeqWrapper[T] = this + def lengthCompare(len: Int): Int = a.lengthCompare(len) + def apply(i: Int): T = a(i) + def drop(n: Int): scala.Seq[T] = ArraySeq.unsafeWrapArray(a.drop(n)) // clones the array, also if n == 0 + def toSeq: scala.Seq[T] = a.toSeq // clones the array + } +} + +/** Arrays are mutable, indexed collections of values. `Array[T]` is Scala's representation + * for Java's `T[]`. + * + * {{{ + * val numbers = Array(1, 2, 3, 4) + * val first = numbers(0) // read the first element + * numbers(3) = 100 // replace the 4th array element with 100 + * val biggerNumbers = numbers.map(_ * 2) // multiply all numbers by two + * }}} + * + * Arrays make use of two common pieces of Scala syntactic sugar, shown on lines 2 and 3 of the above + * example code. + * Line 2 is translated into a call to `apply(Int)`, while line 3 is translated into a call to + * `update(Int, T)`. + * + * Two implicit conversions exist in [[scala.Predef]] that are frequently applied to arrays: a conversion + * to [[scala.collection.ArrayOps]] (shown on line 4 of the example above) and a conversion + * to [[scala.collection.mutable.ArraySeq]] (a subtype of [[scala.collection.Seq]]). + * Both types make available many of the standard operations found in the Scala collections API. + * The conversion to `ArrayOps` is temporary, as all operations defined on `ArrayOps` return an `Array`, + * while the conversion to `ArraySeq` is permanent as all operations return a `ArraySeq`. + * + * The conversion to `ArrayOps` takes priority over the conversion to `ArraySeq`. For instance, + * consider the following code: + * + * {{{ + * val arr = Array(1, 2, 3) + * val arrReversed = arr.reverse + * val seqReversed : collection.Seq[Int] = arr.reverse + * }}} + * + * Value `arrReversed` will be of type `Array[Int]`, with an implicit conversion to `ArrayOps` occurring + * to perform the `reverse` operation. The value of `seqReversed`, on the other hand, will be computed + * by converting to `ArraySeq` first and invoking the variant of `reverse` that returns another + * `ArraySeq`. + * + * @see [[https://www.scala-lang.org/files/archive/spec/2.13/ Scala Language Specification]], for in-depth information on the transformations the Scala compiler makes on Arrays (Sections 6.6 and 6.15 respectively.) + * @see [[https://docs.scala-lang.org/sips/scala-2-8-arrays.html "Scala 2.8 Arrays"]] the Scala Improvement Document detailing arrays since Scala 2.8. 
+ * @see [[https://docs.scala-lang.org/overviews/collections-2.13/arrays.html "The Scala 2.8 Collections' API"]] section on `Array` by Martin Odersky for more information. + * @hideImplicitConversion scala.Predef.booleanArrayOps + * @hideImplicitConversion scala.Predef.byteArrayOps + * @hideImplicitConversion scala.Predef.charArrayOps + * @hideImplicitConversion scala.Predef.doubleArrayOps + * @hideImplicitConversion scala.Predef.floatArrayOps + * @hideImplicitConversion scala.Predef.intArrayOps + * @hideImplicitConversion scala.Predef.longArrayOps + * @hideImplicitConversion scala.Predef.refArrayOps + * @hideImplicitConversion scala.Predef.shortArrayOps + * @hideImplicitConversion scala.Predef.unitArrayOps + * @hideImplicitConversion scala.LowPriorityImplicits.wrapRefArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapIntArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapDoubleArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapLongArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapFloatArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapCharArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapByteArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapShortArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapBooleanArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapUnitArray + * @hideImplicitConversion scala.LowPriorityImplicits.genericWrapArray + * @define coll array + * @define Coll `Array` + * @define orderDependent + * @define orderDependentFold + * @define mayNotTerminateInf + * @define willNotTerminateInf + * @define collectExample + * @define undefinedorder + */ +final class Array[T](_length: Int) extends java.io.Serializable with java.lang.Cloneable { + + /** The length of the array */ + def length: Int = throw new Error() + + /** The element at given index. + * + * Indices start at `0`; `xs.apply(0)` is the first element of array `xs`. + * Note the indexing syntax `xs(i)` is a shorthand for `xs.apply(i)`. + * + * @param i the index + * @return the element at the given index + * @throws ArrayIndexOutOfBoundsException if `i < 0` or `length <= i` + */ + def apply(i: Int): T = throw new Error() + + /** Update the element at given index. + * + * Indices start at `0`; `xs.update(i, x)` replaces the i^th^ element in the array. + * Note the syntax `xs(i) = x` is a shorthand for `xs.update(i, x)`. + * + * @param i the index + * @param x the value to be written at index `i` + * @throws ArrayIndexOutOfBoundsException if `i < 0` or `length <= i` + */ + def update(i: Int, x: T): Unit = { throw new Error() } + + /** Clone the Array. + * + * @return A clone of the Array. + */ + override def clone(): Array[T] = throw new Error() +} diff --git a/scala2-library-bootstrapped/src/scala/collection/ArrayOps.scala b/scala2-library-bootstrapped/src/scala/collection/ArrayOps.scala new file mode 100644 index 000000000000..d4659bbb0dba --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/ArrayOps.scala @@ -0,0 +1,1664 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. 
+ */ + +package scala +package collection + +import java.lang.Math.{max, min} +import java.util.Arrays + +import scala.Predef.{ // unimport all array-related implicit conversions to avoid triggering them accidentally + genericArrayOps => _, + booleanArrayOps => _, + byteArrayOps => _, + charArrayOps => _, + doubleArrayOps => _, + floatArrayOps => _, + intArrayOps => _, + longArrayOps => _, + refArrayOps => _, + shortArrayOps => _, + unitArrayOps => _, + genericWrapArray => _, + wrapRefArray => _, + wrapIntArray => _, + wrapDoubleArray => _, + wrapLongArray => _, + wrapFloatArray => _, + wrapCharArray => _, + wrapByteArray => _, + wrapShortArray => _, + wrapBooleanArray => _, + wrapUnitArray => _, + wrapString => _, + copyArrayToImmutableIndexedSeq => _, + _ +} +import scala.collection.Stepper.EfficientSplit +import scala.collection.immutable.Range +import scala.collection.mutable.ArrayBuilder +import scala.math.Ordering +import scala.reflect.ClassTag +import scala.util.Sorting + +object ArrayOps { + + @SerialVersionUID(3L) + private class ArrayView[A](xs: Array[A]) extends AbstractIndexedSeqView[A] { + def length = xs.length + def apply(n: Int) = xs(n) + override def toString: String = immutable.ArraySeq.unsafeWrapArray(xs).mkString("ArrayView(", ", ", ")") + } + + /** A lazy filtered array. No filtering is applied until one of `foreach`, `map` or `flatMap` is called. */ + class WithFilter[A](p: A => Boolean, xs: Array[A]) { + + /** Apply `f` to each element for its side effects. + * Note: [U] parameter needed to help scalac's type inference. + */ + def foreach[U](f: A => U): Unit = { + val len = xs.length + var i = 0 + while(i < len) { + val x = xs(i) + if(p(x)) f(x) + i += 1 + } + } + + /** Builds a new array by applying a function to all elements of this array. + * + * @param f the function to apply to each element. + * @tparam B the element type of the returned array. + * @return a new array resulting from applying the given function + * `f` to each element of this array and collecting the results. + */ + def map[B: ClassTag](f: A => B): Array[B] = { + val b = ArrayBuilder.make[B] + var i = 0 + while (i < xs.length) { + val x = xs(i) + if(p(x)) b += f(x) + i = i + 1 + } + b.result() + } + + /** Builds a new array by applying a function to all elements of this array + * and using the elements of the resulting collections. + * + * @param f the function to apply to each element. + * @tparam B the element type of the returned array. + * @return a new array resulting from applying the given collection-valued function + * `f` to each element of this array and concatenating the results. + */ + def flatMap[B: ClassTag](f: A => IterableOnce[B]): Array[B] = { + val b = ArrayBuilder.make[B] + var i = 0 + while(i < xs.length) { + val x = xs(i) + if(p(x)) b ++= f(xs(i)) + i += 1 + } + b.result() + } + + def flatMap[BS, B](f: A => BS)(implicit asIterable: BS => Iterable[B], m: ClassTag[B]): Array[B] = + flatMap[B](x => asIterable(f(x))) + + /** Creates a new non-strict filter which combines this filter with the given predicate. 
*/ + def withFilter(q: A => Boolean): WithFilter[A] = new WithFilter[A](a => p(a) && q(a), xs) + } + + @SerialVersionUID(3L) + private[collection] final class ArrayIterator[@specialized(Specializable.Everything) A](xs: Array[A]) extends AbstractIterator[A] with Serializable { + private[this] var pos = 0 + private[this] val len = xs.length + override def knownSize: Int = len - pos + def hasNext: Boolean = pos < len + def next(): A = { + if (pos >= xs.length) Iterator.empty.next() + val r = xs(pos) + pos += 1 + r + } + override def drop(n: Int): Iterator[A] = { + if (n > 0) { + val newPos = pos + n + pos = + if (newPos < 0 /* overflow */) len + else Math.min(len, newPos) + } + this + } + } + + @SerialVersionUID(3L) + private final class ReverseIterator[@specialized(Specializable.Everything) A](xs: Array[A]) extends AbstractIterator[A] with Serializable { + private[this] var pos = xs.length-1 + def hasNext: Boolean = pos >= 0 + def next(): A = { + if (pos < 0) Iterator.empty.next() + val r = xs(pos) + pos -= 1 + r + } + + override def drop(n: Int): Iterator[A] = { + if (n > 0) pos = Math.max( -1, pos - n) + this + } + } + + @SerialVersionUID(3L) + private final class GroupedIterator[A](xs: Array[A], groupSize: Int) extends AbstractIterator[Array[A]] with Serializable { + private[this] var pos = 0 + def hasNext: Boolean = pos < xs.length + def next(): Array[A] = { + if(pos >= xs.length) throw new NoSuchElementException + val r = new ArrayOps(xs).slice(pos, pos+groupSize) + pos += groupSize + r + } + } + + /** The cut-off point for the array size after which we switch from `Sorting.stableSort` to + * an implementation that copies the data to a boxed representation for use with `Arrays.sort`. + */ + private final val MaxStableSortLength = 300 + + /** Avoid an allocation in [[collect]]. */ + private val fallback: Any => Any = _ => fallback +} + +/** This class serves as a wrapper for `Array`s with many of the operations found in + * indexed sequences. Where needed, instances of arrays are implicitly converted + * into this class. There is generally no reason to create an instance explicitly or use + * an `ArrayOps` type. It is better to work with plain `Array` types instead and rely on + * the implicit conversion to `ArrayOps` when calling a method (which does not actually + * allocate an instance of `ArrayOps` because it is a value class). + * + * Neither `Array` nor `ArrayOps` are proper collection types + * (i.e. they do not extend `Iterable` or even `IterableOnce`). `mutable.ArraySeq` and + * `immutable.ArraySeq` serve this purpose. + * + * The difference between this class and `ArraySeq`s is that calling transformer methods such as + * `filter` and `map` will yield an array, whereas an `ArraySeq` will remain an `ArraySeq`. + * + * @tparam A type of the elements contained in this array. + */ +final class ArrayOps[A](private val xs: Array[A]) extends AnyVal { + + @`inline` private[this] implicit def elemTag: ClassTag[A] = ClassTag(xs.getClass.getComponentType) + + /** The size of this array. + * + * @return the number of elements in this array. + */ + @`inline` def size: Int = xs.length + + /** The size of this array. + * + * @return the number of elements in this array. + */ + @`inline` def knownSize: Int = xs.length + + /** Tests whether the array is empty. + * + * @return `true` if the array contains no elements, `false` otherwise. + */ + @`inline` def isEmpty: Boolean = xs.length == 0 + + /** Tests whether the array is not empty. 
+ * + * @return `true` if the array contains at least one element, `false` otherwise. + */ + @`inline` def nonEmpty: Boolean = xs.length != 0 + + /** Selects the first element of this array. + * + * @return the first element of this array. + * @throws NoSuchElementException if the array is empty. + */ + def head: A = if (nonEmpty) xs.apply(0) else throw new NoSuchElementException("head of empty array") + + /** Selects the last element. + * + * @return The last element of this array. + * @throws NoSuchElementException If the array is empty. + */ + def last: A = if (nonEmpty) xs.apply(xs.length-1) else throw new NoSuchElementException("last of empty array") + + /** Optionally selects the first element. + * + * @return the first element of this array if it is nonempty, + * `None` if it is empty. + */ + def headOption: Option[A] = if(isEmpty) None else Some(head) + + /** Optionally selects the last element. + * + * @return the last element of this array$ if it is nonempty, + * `None` if it is empty. + */ + def lastOption: Option[A] = if(isEmpty) None else Some(last) + + /** Compares the size of this array to a test value. + * + * @param otherSize the test value that gets compared with the size. + * @return A value `x` where + * {{{ + * x < 0 if this.size < otherSize + * x == 0 if this.size == otherSize + * x > 0 if this.size > otherSize + * }}} + */ + def sizeCompare(otherSize: Int): Int = Integer.compare(xs.length, otherSize) + + /** Compares the length of this array to a test value. + * + * @param len the test value that gets compared with the length. + * @return A value `x` where + * {{{ + * x < 0 if this.length < len + * x == 0 if this.length == len + * x > 0 if this.length > len + * }}} + */ + def lengthCompare(len: Int): Int = Integer.compare(xs.length, len) + + /** Method mirroring [[SeqOps.sizeIs]] for consistency, except it returns an `Int` + * because `size` is known and comparison is constant-time. + * + * These operations are equivalent to [[sizeCompare(Int) `sizeCompare(Int)`]], and + * allow the following more readable usages: + * + * {{{ + * this.sizeIs < size // this.sizeCompare(size) < 0 + * this.sizeIs <= size // this.sizeCompare(size) <= 0 + * this.sizeIs == size // this.sizeCompare(size) == 0 + * this.sizeIs != size // this.sizeCompare(size) != 0 + * this.sizeIs >= size // this.sizeCompare(size) >= 0 + * this.sizeIs > size // this.sizeCompare(size) > 0 + * }}} + */ + def sizeIs: Int = xs.length + + /** Method mirroring [[SeqOps.lengthIs]] for consistency, except it returns an `Int` + * because `length` is known and comparison is constant-time. + * + * These operations are equivalent to [[lengthCompare(Int) `lengthCompare(Int)`]], and + * allow the following more readable usages: + * + * {{{ + * this.lengthIs < len // this.lengthCompare(len) < 0 + * this.lengthIs <= len // this.lengthCompare(len) <= 0 + * this.lengthIs == len // this.lengthCompare(len) == 0 + * this.lengthIs != len // this.lengthCompare(len) != 0 + * this.lengthIs >= len // this.lengthCompare(len) >= 0 + * this.lengthIs > len // this.lengthCompare(len) > 0 + * }}} + */ + def lengthIs: Int = xs.length + + /** Selects an interval of elements. The returned array is made up + * of all elements `x` which satisfy the invariant: + * {{{ + * from <= indexOf(x) < until + * }}} + * + * @param from the lowest index to include from this array. + * @param until the lowest index to EXCLUDE from this array. 
+ * @return an array containing the elements greater than or equal to + * index `from` extending up to (but not including) index `until` + * of this array. + */ + def slice(from: Int, until: Int): Array[A] = { + import java.util.Arrays.copyOfRange + val lo = max(from, 0) + val hi = min(until, xs.length) + if (hi > lo) { + (((xs: Array[_]): @unchecked) match { + case x: Array[AnyRef] => copyOfRange(x, lo, hi) + case x: Array[Int] => copyOfRange(x, lo, hi) + case x: Array[Double] => copyOfRange(x, lo, hi) + case x: Array[Long] => copyOfRange(x, lo, hi) + case x: Array[Float] => copyOfRange(x, lo, hi) + case x: Array[Char] => copyOfRange(x, lo, hi) + case x: Array[Byte] => copyOfRange(x, lo, hi) + case x: Array[Short] => copyOfRange(x, lo, hi) + case x: Array[Boolean] => copyOfRange(x, lo, hi) + }).asInstanceOf[Array[A]] + } else new Array[A](0) + } + + /** The rest of the array without its first element. */ + def tail: Array[A] = + if(xs.length == 0) throw new UnsupportedOperationException("tail of empty array") else slice(1, xs.length) + + /** The initial part of the array without its last element. */ + def init: Array[A] = + if(xs.length == 0) throw new UnsupportedOperationException("init of empty array") else slice(0, xs.length-1) + + /** Iterates over the tails of this array. The first value will be this + * array and the final one will be an empty array, with the intervening + * values the results of successive applications of `tail`. + * + * @return an iterator over all the tails of this array + */ + def tails: Iterator[Array[A]] = iterateUntilEmpty(xs => new ArrayOps(xs).tail) + + /** Iterates over the inits of this array. The first value will be this + * array and the final one will be an empty array, with the intervening + * values the results of successive applications of `init`. + * + * @return an iterator over all the inits of this array + */ + def inits: Iterator[Array[A]] = iterateUntilEmpty(xs => new ArrayOps(xs).init) + + // A helper for tails and inits. + private[this] def iterateUntilEmpty(f: Array[A] => Array[A]): Iterator[Array[A]] = + Iterator.iterate(xs)(f).takeWhile(x => x.length != 0) ++ Iterator.single(Array.empty[A]) + + /** An array containing the first `n` elements of this array. */ + def take(n: Int): Array[A] = slice(0, n) + + /** The rest of the array without its `n` first elements. */ + def drop(n: Int): Array[A] = slice(n, xs.length) + + /** An array containing the last `n` elements of this array. */ + def takeRight(n: Int): Array[A] = drop(xs.length - max(n, 0)) + + /** The rest of the array without its `n` last elements. */ + def dropRight(n: Int): Array[A] = take(xs.length - max(n, 0)) + + /** Takes longest prefix of elements that satisfy a predicate. + * + * @param p The predicate used to test elements. + * @return the longest prefix of this array whose elements all satisfy + * the predicate `p`. + */ + def takeWhile(p: A => Boolean): Array[A] = { + val i = indexWhere(x => !p(x)) + val hi = if(i < 0) xs.length else i + slice(0, hi) + } + + /** Drops longest prefix of elements that satisfy a predicate. + * + * @param p The predicate used to test elements. + * @return the longest suffix of this array whose first element + * does not satisfy the predicate `p`. 
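+    *
+    * For example:
+    * {{{
+    *   Array(1, 2, 3, 1).dropWhile(_ < 3)   // Array(3, 1): dropping stops at the first element >= 3
+    * }}}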
+ */ + def dropWhile(p: A => Boolean): Array[A] = { + val i = indexWhere(x => !p(x)) + val lo = if(i < 0) xs.length else i + slice(lo, xs.length) + } + + def iterator: Iterator[A] = + ((xs: Any @unchecked) match { + case xs: Array[AnyRef] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Int] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Double] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Long] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Float] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Char] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Byte] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Short] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Boolean] => new ArrayOps.ArrayIterator(xs) + case xs: Array[Unit] => new ArrayOps.ArrayIterator(xs) + case null => throw new NullPointerException + }).asInstanceOf[Iterator[A]] + + def stepper[S <: Stepper[_]](implicit shape: StepperShape[A, S]): S with EfficientSplit = { + import convert.impl._ + val s = (shape.shape: @unchecked) match { + case StepperShape.ReferenceShape => (xs: Any) match { + case bs: Array[Boolean] => new BoxedBooleanArrayStepper(bs, 0, xs.length) + case _ => new ObjectArrayStepper[AnyRef](xs.asInstanceOf[Array[AnyRef ]], 0, xs.length) + } + case StepperShape.IntShape => new IntArrayStepper (xs.asInstanceOf[Array[Int ]], 0, xs.length) + case StepperShape.LongShape => new LongArrayStepper (xs.asInstanceOf[Array[Long ]], 0, xs.length) + case StepperShape.DoubleShape => new DoubleArrayStepper (xs.asInstanceOf[Array[Double ]], 0, xs.length) + case StepperShape.ByteShape => new WidenedByteArrayStepper (xs.asInstanceOf[Array[Byte ]], 0, xs.length) + case StepperShape.ShortShape => new WidenedShortArrayStepper (xs.asInstanceOf[Array[Short ]], 0, xs.length) + case StepperShape.CharShape => new WidenedCharArrayStepper (xs.asInstanceOf[Array[Char ]], 0, xs.length) + case StepperShape.FloatShape => new WidenedFloatArrayStepper (xs.asInstanceOf[Array[Float ]], 0, xs.length) + } + s.asInstanceOf[S with EfficientSplit] + } + + /** Partitions elements in fixed size arrays. + * @see [[scala.collection.Iterator]], method `grouped` + * + * @param size the number of elements per group + * @return An iterator producing arrays of size `size`, except the + * last will be less than size `size` if the elements don't divide evenly. + */ + def grouped(size: Int): Iterator[Array[A]] = new ArrayOps.GroupedIterator[A](xs, size) + + /** Splits this array into a prefix/suffix pair according to a predicate. + * + * Note: `c span p` is equivalent to (but more efficient than) + * `(c takeWhile p, c dropWhile p)`, provided the evaluation of the + * predicate `p` does not cause any side-effects. + * + * @param p the test predicate + * @return a pair consisting of the longest prefix of this array whose + * elements all satisfy `p`, and the rest of this array. + */ + def span(p: A => Boolean): (Array[A], Array[A]) = { + val i = indexWhere(x => !p(x)) + val idx = if(i < 0) xs.length else i + (slice(0, idx), slice(idx, xs.length)) + } + + /** Splits this array into two at a given position. + * Note: `c splitAt n` is equivalent to `(c take n, c drop n)`. + * + * @param n the position at which to split. + * @return a pair of arrays consisting of the first `n` + * elements of this array, and the other elements. + */ + def splitAt(n: Int): (Array[A], Array[A]) = (take(n), drop(n)) + + /** A pair of, first, all elements that satisfy predicate `p` and, second, all elements that do not. 
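+    *
+    * For instance (an illustrative example):
+    * {{{
+    *   Array(1, 2, 3, 4).partition(_ % 2 == 0)   // (Array(2, 4), Array(1, 3))
+    * }}}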
*/ + def partition(p: A => Boolean): (Array[A], Array[A]) = { + val res1, res2 = ArrayBuilder.make[A] + var i = 0 + while(i < xs.length) { + val x = xs(i) + (if(p(x)) res1 else res2) += x + i += 1 + } + (res1.result(), res2.result()) + } + + /** Applies a function `f` to each element of the array and returns a pair of arrays: the first one + * made of those values returned by `f` that were wrapped in [[scala.util.Left]], and the second + * one made of those wrapped in [[scala.util.Right]]. + * + * Example: + * {{{ + * val xs = Array(1, "one", 2, "two", 3, "three") partitionMap { + * case i: Int => Left(i) + * case s: String => Right(s) + * } + * // xs == (Array(1, 2, 3), + * // Array(one, two, three)) + * }}} + * + * @tparam A1 the element type of the first resulting collection + * @tparam A2 the element type of the second resulting collection + * @param f the 'split function' mapping the elements of this array to an [[scala.util.Either]] + * + * @return a pair of arrays: the first one made of those values returned by `f` that were wrapped in [[scala.util.Left]], + * and the second one made of those wrapped in [[scala.util.Right]]. */ + def partitionMap[A1: ClassTag, A2: ClassTag](f: A => Either[A1, A2]): (Array[A1], Array[A2]) = { + val res1 = ArrayBuilder.make[A1] + val res2 = ArrayBuilder.make[A2] + var i = 0 + while(i < xs.length) { + f(xs(i)) match { + case Left(x) => res1 += x + case Right(x) => res2 += x + } + i += 1 + } + (res1.result(), res2.result()) + } + + /** Returns a new array with the elements in reversed order. */ + @inline def reverse: Array[A] = { + val len = xs.length + val res = new Array[A](len) + var i = 0 + while(i < len) { + res(len-i-1) = xs(i) + i += 1 + } + res + } + + /** An iterator yielding elements in reversed order. + * + * Note: `xs.reverseIterator` is the same as `xs.reverse.iterator` but implemented more efficiently. + * + * @return an iterator yielding the elements of this array in reversed order + */ + def reverseIterator: Iterator[A] = + ((xs: Any @unchecked) match { + case xs: Array[AnyRef] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Int] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Double] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Long] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Float] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Char] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Byte] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Short] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Boolean] => new ArrayOps.ReverseIterator(xs) + case xs: Array[Unit] => new ArrayOps.ReverseIterator(xs) + case null => throw new NullPointerException + }).asInstanceOf[Iterator[A]] + + /** Selects all elements of this array which satisfy a predicate. + * + * @param p the predicate used to test elements. + * @return a new array consisting of all elements of this array that satisfy the given predicate `p`. + */ + def filter(p: A => Boolean): Array[A] = { + val res = ArrayBuilder.make[A] + var i = 0 + while(i < xs.length) { + val x = xs(i) + if(p(x)) res += x + i += 1 + } + res.result() + } + + /** Selects all elements of this array which do not satisfy a predicate. + * + * @param p the predicate used to test elements. + * @return a new array consisting of all elements of this array that do not satisfy the given predicate `p`. + */ + def filterNot(p: A => Boolean): Array[A] = filter(x => !p(x)) + + /** Sorts this array according to an Ordering. + * + * The sort is stable. 
That is, elements that are equal (as determined by + * `lt`) appear in the same order in the sorted sequence as in the original. + * + * @see [[scala.math.Ordering]] + * + * @param ord the ordering to be used to compare elements. + * @return an array consisting of the elements of this array + * sorted according to the ordering `ord`. + */ + def sorted[B >: A](implicit ord: Ordering[B]): Array[A] = { + val len = xs.length + def boxed = if(len < ArrayOps.MaxStableSortLength) { + val a = xs.clone() + Sorting.stableSort(a)(using ord.asInstanceOf[Ordering[A]]) + a + } else { + val a = Array.copyAs[AnyRef](xs, len)(ClassTag.AnyRef) + Arrays.sort(a, ord.asInstanceOf[Ordering[AnyRef]]) + Array.copyAs[A](a, len) + } + if(len <= 1) xs.clone() + else ((xs: Array[_]) match { + case xs: Array[AnyRef] => + val a = Arrays.copyOf(xs, len); Arrays.sort(a, ord.asInstanceOf[Ordering[AnyRef]]); a + case xs: Array[Int] => + if(ord eq Ordering.Int) { val a = Arrays.copyOf(xs, len); Arrays.sort(a); a } + else boxed + case xs: Array[Long] => + if(ord eq Ordering.Long) { val a = Arrays.copyOf(xs, len); Arrays.sort(a); a } + else boxed + case xs: Array[Char] => + if(ord eq Ordering.Char) { val a = Arrays.copyOf(xs, len); Arrays.sort(a); a } + else boxed + case xs: Array[Byte] => + if(ord eq Ordering.Byte) { val a = Arrays.copyOf(xs, len); Arrays.sort(a); a } + else boxed + case xs: Array[Short] => + if(ord eq Ordering.Short) { val a = Arrays.copyOf(xs, len); Arrays.sort(a); a } + else boxed + case xs: Array[Boolean] => + if(ord eq Ordering.Boolean) { val a = Arrays.copyOf(xs, len); Sorting.stableSort(a); a } + else boxed + case xs => boxed + }).asInstanceOf[Array[A]] + } + + /** Sorts this array according to a comparison function. + * + * The sort is stable. That is, elements that are equal (as determined by + * `lt`) appear in the same order in the sorted sequence as in the original. + * + * @param lt the comparison function which tests whether + * its first argument precedes its second argument in + * the desired ordering. + * @return an array consisting of the elements of this array + * sorted according to the comparison function `lt`. + */ + def sortWith(lt: (A, A) => Boolean): Array[A] = sorted(Ordering.fromLessThan(lt)) + + /** Sorts this array according to the Ordering which results from transforming + * an implicitly given Ordering with a transformation function. + * + * @see [[scala.math.Ordering]] + * @param f the transformation function mapping elements + * to some other domain `B`. + * @param ord the ordering assumed on domain `B`. + * @tparam B the target type of the transformation `f`, and the type where + * the ordering `ord` is defined. + * @return an array consisting of the elements of this array + * sorted according to the ordering where `x < y` if + * `ord.lt(f(x), f(y))`. + */ + def sortBy[B](f: A => B)(implicit ord: Ordering[B]): Array[A] = sorted(ord on f) + + /** Creates a non-strict filter of this array. + * + * Note: the difference between `c filter p` and `c withFilter p` is that + * the former creates a new array, whereas the latter only + * restricts the domain of subsequent `map`, `flatMap`, `foreach`, + * and `withFilter` operations. + * + * @param p the predicate used to test elements. + * @return an object of class `ArrayOps.WithFilter`, which supports + * `map`, `flatMap`, `foreach`, and `withFilter` operations. + * All these operations apply to those elements of this array + * which satisfy the predicate `p`. 
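+    *
+    * A small illustrative example: only the positive elements reach `map`, and no intermediate
+    * array is allocated for the filtered elements.
+    * {{{
+    *   Array(-1, 2, -3, 4).withFilter(_ > 0).map(_ * 10)   // Array(20, 40)
+    * }}}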
+ */ + def withFilter(p: A => Boolean): ArrayOps.WithFilter[A] = new ArrayOps.WithFilter[A](p, xs) + + /** Finds index of first occurrence of some value in this array after or at some start index. + * + * @param elem the element value to search for. + * @param from the start index + * @return the index `>= from` of the first element of this array that is equal (as determined by `==`) + * to `elem`, or `-1`, if none exists. + */ + def indexOf(elem: A, from: Int = 0): Int = { + var i = from + while(i < xs.length) { + if(elem == xs(i)) return i + i += 1 + } + -1 + } + + /** Finds index of the first element satisfying some predicate after or at some start index. + * + * @param p the predicate used to test elements. + * @param from the start index + * @return the index `>= from` of the first element of this array that satisfies the predicate `p`, + * or `-1`, if none exists. + */ + def indexWhere(@deprecatedName("f", "2.13.3") p: A => Boolean, from: Int = 0): Int = { + var i = from + while(i < xs.length) { + if(p(xs(i))) return i + i += 1 + } + -1 + } + + /** Finds index of last occurrence of some value in this array before or at a given end index. + * + * @param elem the element value to search for. + * @param end the end index. + * @return the index `<= end` of the last element of this array that is equal (as determined by `==`) + * to `elem`, or `-1`, if none exists. + */ + def lastIndexOf(elem: A, end: Int = xs.length - 1): Int = { + var i = min(end, xs.length-1) + while(i >= 0) { + if(elem == xs(i)) return i + i -= 1 + } + -1 + } + + /** Finds index of last element satisfying some predicate before or at given end index. + * + * @param p the predicate used to test elements. + * @return the index `<= end` of the last element of this array that satisfies the predicate `p`, + * or `-1`, if none exists. + */ + def lastIndexWhere(p: A => Boolean, end: Int = xs.length - 1): Int = { + var i = min(end, xs.length-1) + while(i >= 0) { + if(p(xs(i))) return i + i -= 1 + } + -1 + } + + /** Finds the first element of the array satisfying a predicate, if any. + * + * @param p the predicate used to test elements. + * @return an option value containing the first element in the array + * that satisfies `p`, or `None` if none exists. + */ + def find(@deprecatedName("f", "2.13.3") p: A => Boolean): Option[A] = { + val idx = indexWhere(p) + if(idx == -1) None else Some(xs(idx)) + } + + /** Tests whether a predicate holds for at least one element of this array. + * + * @param p the predicate used to test elements. + * @return `true` if the given predicate `p` is satisfied by at least one element of this array, otherwise `false` + */ + def exists(@deprecatedName("f", "2.13.3") p: A => Boolean): Boolean = indexWhere(p) >= 0 + + /** Tests whether a predicate holds for all elements of this array. + * + * @param p the predicate used to test elements. + * @return `true` if this array is empty or the given predicate `p` + * holds for all elements of this array, otherwise `false`. + */ + def forall(@deprecatedName("f", "2.13.3") p: A => Boolean): Boolean = { + var i = 0 + while(i < xs.length) { + if(!p(xs(i))) return false + i += 1 + } + true + } + + /** Applies a binary operator to a start value and all elements of this array, + * going left to right. + * + * @param z the start value. + * @param op the binary operator. + * @tparam B the result type of the binary operator. 
+ * @return the result of inserting `op` between consecutive elements of this array, + * going left to right with the start value `z` on the left: + * {{{ + * op(...op(z, x_1), x_2, ..., x_n) + * }}} + * where `x,,1,,, ..., x,,n,,` are the elements of this array. + * Returns `z` if this array is empty. + */ + def foldLeft[B](z: B)(op: (B, A) => B): B = { + def f[@specialized(Specializable.Everything) T](xs: Array[T], op: (Any, Any) => Any, z: Any): Any = { + val length = xs.length + var v: Any = z + var i = 0 + while(i < length) { + v = op(v, xs(i)) + i += 1 + } + v + } + ((xs: Any @unchecked) match { + case null => throw new NullPointerException // null-check first helps static analysis of instanceOf + case xs: Array[AnyRef] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Int] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Double] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Long] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Float] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Char] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Byte] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Short] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Boolean] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Unit] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + }).asInstanceOf[B] + } + + /** Produces an array containing cumulative results of applying the binary + * operator going left to right. + * + * @param z the start value. + * @param op the binary operator. + * @tparam B the result type of the binary operator. + * @return array with intermediate values. + * + * Example: + * {{{ + * Array(1, 2, 3, 4).scanLeft(0)(_ + _) == Array(0, 1, 3, 6, 10) + * }}} + * + */ + def scanLeft[ B : ClassTag ](z: B)(op: (B, A) => B): Array[B] = { + var v = z + var i = 0 + val res = new Array[B](xs.length + 1) + while(i < xs.length) { + res(i) = v + v = op(v, xs(i)) + i += 1 + } + res(i) = v + res + } + + /** Computes a prefix scan of the elements of the array. + * + * Note: The neutral element `z` may be applied more than once. + * + * @tparam B element type of the resulting array + * @param z neutral element for the operator `op` + * @param op the associative operator for the scan + * + * @return a new array containing the prefix scan of the elements in this array + */ + def scan[B >: A : ClassTag](z: B)(op: (B, B) => B): Array[B] = scanLeft(z)(op) + + /** Produces an array containing cumulative results of applying the binary + * operator going right to left. + * + * @param z the start value. + * @param op the binary operator. + * @tparam B the result type of the binary operator. + * @return array with intermediate values. + * + * Example: + * {{{ + * Array(4, 3, 2, 1).scanRight(0)(_ + _) == Array(10, 6, 3, 1, 0) + * }}} + * + */ + def scanRight[ B : ClassTag ](z: B)(op: (A, B) => B): Array[B] = { + var v = z + var i = xs.length - 1 + val res = new Array[B](xs.length + 1) + res(xs.length) = z + while(i >= 0) { + v = op(xs(i), v) + res(i) = v + i -= 1 + } + res + } + + /** Applies a binary operator to all elements of this array and a start value, + * going right to left. + * + * @param z the start value. + * @param op the binary operator. + * @tparam B the result type of the binary operator. 
+ * @return the result of inserting `op` between consecutive elements of this array, + * going right to left with the start value `z` on the right: + * {{{ + * op(x_1, op(x_2, ... op(x_n, z)...)) + * }}} + * where `x,,1,,, ..., x,,n,,` are the elements of this array. + * Returns `z` if this array is empty. + */ + def foldRight[B](z: B)(op: (A, B) => B): B = { + def f[@specialized(Specializable.Everything) T](xs: Array[T], op: (Any, Any) => Any, z: Any): Any = { + var v = z + var i = xs.length - 1 + while(i >= 0) { + v = op(xs(i), v) + i -= 1 + } + v + } + ((xs: Any @unchecked) match { + case null => throw new NullPointerException + case xs: Array[AnyRef] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Int] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Double] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Long] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Float] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Char] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Byte] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Short] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Boolean] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + case xs: Array[Unit] => f(xs, op.asInstanceOf[(Any, Any) => Any], z) + }).asInstanceOf[B] + + } + + /** Folds the elements of this array using the specified associative binary operator. + * + * @tparam A1 a type parameter for the binary operator, a supertype of `A`. + * @param z a neutral element for the fold operation; may be added to the result + * an arbitrary number of times, and must not change the result (e.g., `Nil` for list concatenation, + * 0 for addition, or 1 for multiplication). + * @param op a binary operator that must be associative. + * @return the result of applying the fold operator `op` between all the elements, or `z` if this array is empty. + */ + def fold[A1 >: A](z: A1)(op: (A1, A1) => A1): A1 = foldLeft(z)(op) + + /** Builds a new array by applying a function to all elements of this array. + * + * @param f the function to apply to each element. + * @tparam B the element type of the returned array. + * @return a new array resulting from applying the given function + * `f` to each element of this array and collecting the results. 
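+    *
+    * For example (note that a `ClassTag` for `B` must be available to build the result array):
+    * {{{
+    *   Array(1, 2, 3).map(_ * 2)        // Array(2, 4, 6)
+    *   Array(1, 2, 3).map(_.toString)   // Array("1", "2", "3")
+    * }}}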
+ */ + def map[B](f: A => B)(implicit ct: ClassTag[B]): Array[B] = { + val len = xs.length + val ys = new Array[B](len) + if(len > 0) { + var i = 0 + (xs: Any @unchecked) match { + case xs: Array[AnyRef] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Int] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Double] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Long] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Float] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Char] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Byte] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Short] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Boolean] => while (i < len) { ys(i) = f(xs(i).asInstanceOf[A]); i = i+1 } + } + } + ys + } + + def mapInPlace(f: A => A): Array[A] = { + var i = 0 + while (i < xs.length) { + xs.update(i, f(xs(i))) + i = i + 1 + } + xs + } + + /** Builds a new array by applying a function to all elements of this array + * and using the elements of the resulting collections. + * + * @param f the function to apply to each element. + * @tparam B the element type of the returned array. + * @return a new array resulting from applying the given collection-valued function + * `f` to each element of this array and concatenating the results. + */ + def flatMap[B : ClassTag](f: A => IterableOnce[B]): Array[B] = { + val b = ArrayBuilder.make[B] + var i = 0 + while(i < xs.length) { + b ++= f(xs(i)) + i += 1 + } + b.result() + } + + def flatMap[BS, B](f: A => BS)(implicit asIterable: BS => Iterable[B], m: ClassTag[B]): Array[B] = + flatMap[B](x => asIterable(f(x))) + + /** Flattens a two-dimensional array by concatenating all its rows + * into a single array. + * + * @tparam B Type of row elements. + * @param asIterable A function that converts elements of this array to rows - Iterables of type `B`. + * @return An array obtained by concatenating rows of this array. + */ + def flatten[B](implicit asIterable: A => IterableOnce[B], m: ClassTag[B]): Array[B] = { + val b = ArrayBuilder.make[B] + val len = xs.length + var size = 0 + var i = 0 + while(i < len) { + xs(i) match { + case it: IterableOnce[_] => + val k = it.knownSize + if(k > 0) size += k + case a: Array[_] => size += a.length + case _ => + } + i += 1 + } + if(size > 0) b.sizeHint(size) + i = 0 + while(i < len) { + b ++= asIterable(xs(i)) + i += 1 + } + b.result() + } + + /** Builds a new array by applying a partial function to all elements of this array + * on which the function is defined. + * + * @param pf the partial function which filters and maps the array. + * @tparam B the element type of the returned array. + * @return a new array resulting from applying the given partial function + * `pf` to each element on which it is defined and collecting the results. + * The order of the elements is preserved. + */ + def collect[B: ClassTag](pf: PartialFunction[A, B]): Array[B] = { + val fallback: Any => Any = ArrayOps.fallback + val b = ArrayBuilder.make[B] + var i = 0 + while (i < xs.length) { + val v = pf.applyOrElse(xs(i), fallback) + if (v.asInstanceOf[AnyRef] ne fallback) b.addOne(v.asInstanceOf[B]) + i += 1 + } + b.result() + } + + /** Finds the first element of the array for which the given partial function is defined, and applies the + * partial function to it. 
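+    *
+    * An illustrative example:
+    * {{{
+    *   Array[Any](1, "two", 3.0).collectFirst { case s: String => s.length }   // Some(3)
+    *   Array(1, 2, 3).collectFirst { case x if x > 10 => x }                   // None
+    * }}}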
*/ + def collectFirst[B](@deprecatedName("f","2.13.9") pf: PartialFunction[A, B]): Option[B] = { + val fallback: Any => Any = ArrayOps.fallback + var i = 0 + while (i < xs.length) { + val v = pf.applyOrElse(xs(i), fallback) + if (v.asInstanceOf[AnyRef] ne fallback) return Some(v.asInstanceOf[B]) + i += 1 + } + None + } + + /** Returns an array formed from this array and another iterable collection + * by combining corresponding elements in pairs. + * If one of the two collections is longer than the other, its remaining elements are ignored. + * + * @param that The iterable providing the second half of each result pair + * @tparam B the type of the second half of the returned pairs + * @return a new array containing pairs consisting of corresponding elements of this array and `that`. + * The length of the returned array is the minimum of the lengths of this array and `that`. + */ + def zip[B](that: IterableOnce[B]): Array[(A, B)] = { + val b = new ArrayBuilder.ofRef[(A, B)]() + val k = that.knownSize + b.sizeHint(if(k >= 0) min(k, xs.length) else xs.length) + var i = 0 + val it = that.iterator + while(i < xs.length && it.hasNext) { + b += ((xs(i), it.next())) + i += 1 + } + b.result() + } + + /** Analogous to `zip` except that the elements in each collection are not consumed until a strict operation is + * invoked on the returned `LazyZip2` decorator. + * + * Calls to `lazyZip` can be chained to support higher arities (up to 4) without incurring the expense of + * constructing and deconstructing intermediary tuples. + * + * {{{ + * val xs = List(1, 2, 3) + * val res = (xs lazyZip xs lazyZip xs lazyZip xs).map((a, b, c, d) => a + b + c + d) + * // res == List(4, 8, 12) + * }}} + * + * @param that the iterable providing the second element of each eventual pair + * @tparam B the type of the second element in each eventual pair + * @return a decorator `LazyZip2` that allows strict operations to be performed on the lazily evaluated pairs + * or chained calls to `lazyZip`. Implicit conversion to `Iterable[(A, B)]` is also supported. + */ + def lazyZip[B](that: Iterable[B]): LazyZip2[A, B, Array[A]] = new LazyZip2(xs, immutable.ArraySeq.unsafeWrapArray(xs), that) + + /** Returns an array formed from this array and another iterable collection + * by combining corresponding elements in pairs. + * If one of the two collections is shorter than the other, + * placeholder elements are used to extend the shorter collection to the length of the longer. + * + * @param that the iterable providing the second half of each result pair + * @param thisElem the element to be used to fill up the result if this array is shorter than `that`. + * @param thatElem the element to be used to fill up the result if `that` is shorter than this array. + * @return a new array containing pairs consisting of corresponding elements of this array and `that`. + * The length of the returned array is the maximum of the lengths of this array and `that`. + * If this array is shorter than `that`, `thisElem` values are used to pad the result. + * If `that` is shorter than this array, `thatElem` values are used to pad the result. 
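+    *
+    * For instance (an illustrative example):
+    * {{{
+    *   Array(1, 2, 3).zipAll(List("a"), 0, "z")   // Array((1, "a"), (2, "z"), (3, "z"))
+    *   Array(1).zipAll(List("a", "b"), 0, "z")    // Array((1, "a"), (0, "b"))
+    * }}}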
+ */ + def zipAll[A1 >: A, B](that: Iterable[B], thisElem: A1, thatElem: B): Array[(A1, B)] = { + val b = new ArrayBuilder.ofRef[(A1, B)]() + val k = that.knownSize + b.sizeHint(max(k, xs.length)) + var i = 0 + val it = that.iterator + while(i < xs.length && it.hasNext) { + b += ((xs(i), it.next())) + i += 1 + } + while(it.hasNext) { + b += ((thisElem, it.next())) + i += 1 + } + while(i < xs.length) { + b += ((xs(i), thatElem)) + i += 1 + } + b.result() + } + + /** Zips this array with its indices. + * + * @return A new array containing pairs consisting of all elements of this array paired with their index. + * Indices start at `0`. + */ + def zipWithIndex: Array[(A, Int)] = { + val b = new Array[(A, Int)](xs.length) + var i = 0 + while(i < xs.length) { + b(i) = ((xs(i), i)) + i += 1 + } + b + } + + /** A copy of this array with an element appended. */ + def appended[B >: A : ClassTag](x: B): Array[B] = { + val dest = Array.copyAs[B](xs, xs.length+1) + dest(xs.length) = x + dest + } + + @`inline` final def :+ [B >: A : ClassTag](x: B): Array[B] = appended(x) + + /** A copy of this array with an element prepended. */ + def prepended[B >: A : ClassTag](x: B): Array[B] = { + val dest = new Array[B](xs.length + 1) + dest(0) = x + Array.copy(xs, 0, dest, 1, xs.length) + dest + } + + @`inline` final def +: [B >: A : ClassTag](x: B): Array[B] = prepended(x) + + /** A copy of this array with all elements of a collection prepended. */ + def prependedAll[B >: A : ClassTag](prefix: IterableOnce[B]): Array[B] = { + val b = ArrayBuilder.make[B] + val k = prefix.knownSize + if(k >= 0) b.sizeHint(k + xs.length) + b.addAll(prefix) + if(k < 0) b.sizeHint(b.length + xs.length) + b.addAll(xs) + b.result() + } + + /** A copy of this array with all elements of an array prepended. */ + def prependedAll[B >: A : ClassTag](prefix: Array[_ <: B]): Array[B] = { + val dest = Array.copyAs[B](prefix, prefix.length+xs.length) + Array.copy(xs, 0, dest, prefix.length, xs.length) + dest + } + + @`inline` final def ++: [B >: A : ClassTag](prefix: IterableOnce[B]): Array[B] = prependedAll(prefix) + + @`inline` final def ++: [B >: A : ClassTag](prefix: Array[_ <: B]): Array[B] = prependedAll(prefix) + + /** A copy of this array with all elements of a collection appended. */ + def appendedAll[B >: A : ClassTag](suffix: IterableOnce[B]): Array[B] = { + val b = ArrayBuilder.make[B] + val k = suffix.knownSize + if(k >= 0) b.sizeHint(k + xs.length) + b.addAll(xs) + b.addAll(suffix) + b.result() + } + + /** A copy of this array with all elements of an array appended. */ + def appendedAll[B >: A : ClassTag](suffix: Array[_ <: B]): Array[B] = { + val dest = Array.copyAs[B](xs, xs.length+suffix.length) + Array.copy(suffix, 0, dest, xs.length, suffix.length) + dest + } + + @`inline` final def :++ [B >: A : ClassTag](suffix: IterableOnce[B]): Array[B] = appendedAll(suffix) + + @`inline` final def :++ [B >: A : ClassTag](suffix: Array[_ <: B]): Array[B] = appendedAll(suffix) + + @`inline` final def concat[B >: A : ClassTag](suffix: IterableOnce[B]): Array[B] = appendedAll(suffix) + + @`inline` final def concat[B >: A : ClassTag](suffix: Array[_ <: B]): Array[B] = appendedAll(suffix) + + @`inline` final def ++[B >: A : ClassTag](xs: IterableOnce[B]): Array[B] = appendedAll(xs) + + @`inline` final def ++[B >: A : ClassTag](xs: Array[_ <: B]): Array[B] = appendedAll(xs) + + /** Tests whether this array contains a given value as an element. + * + * @param elem the element to test. 
+ * @return `true` if this array has an element that is equal (as + * determined by `==`) to `elem`, `false` otherwise. + */ + def contains(elem: A): Boolean = exists (_ == elem) + + /** Returns a copy of this array with patched values. + * Patching at negative indices is the same as patching starting at 0. + * Patching at indices at or larger than the length of the original array appends the patch to the end. + * If more values are replaced than actually exist, the excess is ignored. + * + * @param from The start index from which to patch + * @param other The patch values + * @param replaced The number of values in the original array that are replaced by the patch. + */ + def patch[B >: A : ClassTag](from: Int, other: IterableOnce[B], replaced: Int): Array[B] = { + val b = ArrayBuilder.make[B] + val k = other.knownSize + val r = if(replaced < 0) 0 else replaced + if(k >= 0) b.sizeHint(xs.length + k - r) + val chunk1 = if(from > 0) min(from, xs.length) else 0 + if(chunk1 > 0) b.addAll(xs, 0, chunk1) + b ++= other + val remaining = xs.length - chunk1 - r + if(remaining > 0) b.addAll(xs, xs.length - remaining, remaining) + b.result() + } + + /** Converts an array of pairs into an array of first elements and an array of second elements. + * + * @tparam A1 the type of the first half of the element pairs + * @tparam A2 the type of the second half of the element pairs + * @param asPair an implicit conversion which asserts that the element type + * of this Array is a pair. + * @param ct1 a class tag for `A1` type parameter that is required to create an instance + * of `Array[A1]` + * @param ct2 a class tag for `A2` type parameter that is required to create an instance + * of `Array[A2]` + * @return a pair of Arrays, containing, respectively, the first and second half + * of each element pair of this Array. + */ + def unzip[A1, A2](implicit asPair: A => (A1, A2), ct1: ClassTag[A1], ct2: ClassTag[A2]): (Array[A1], Array[A2]) = { + val a1 = new Array[A1](xs.length) + val a2 = new Array[A2](xs.length) + var i = 0 + while (i < xs.length) { + val e = asPair(xs(i)) + a1(i) = e._1 + a2(i) = e._2 + i += 1 + } + (a1, a2) + } + + /** Converts an array of triples into three arrays, one containing the elements from each position of the triple. + * + * @tparam A1 the type of the first of three elements in the triple + * @tparam A2 the type of the second of three elements in the triple + * @tparam A3 the type of the third of three elements in the triple + * @param asTriple an implicit conversion which asserts that the element type + * of this Array is a triple. + * @param ct1 a class tag for T1 type parameter that is required to create an instance + * of Array[T1] + * @param ct2 a class tag for T2 type parameter that is required to create an instance + * of Array[T2] + * @param ct3 a class tag for T3 type parameter that is required to create an instance + * of Array[T3] + * @return a triple of Arrays, containing, respectively, the first, second, and third + * elements from each element triple of this Array. + */ + def unzip3[A1, A2, A3](implicit asTriple: A => (A1, A2, A3), ct1: ClassTag[A1], ct2: ClassTag[A2], + ct3: ClassTag[A3]): (Array[A1], Array[A2], Array[A3]) = { + val a1 = new Array[A1](xs.length) + val a2 = new Array[A2](xs.length) + val a3 = new Array[A3](xs.length) + var i = 0 + while (i < xs.length) { + val e = asTriple(xs(i)) + a1(i) = e._1 + a2(i) = e._2 + a3(i) = e._3 + i += 1 + } + (a1, a2, a3) + } + + /** Transposes a two dimensional array. + * + * @tparam B Type of row elements. 
+ * @param asArray A function that converts elements of this array to rows - arrays of type `B`. + * @return An array obtained by replacing elements of this arrays with rows the represent. + */ + def transpose[B](implicit asArray: A => Array[B]): Array[Array[B]] = { + val aClass = xs.getClass.getComponentType + val bb = new ArrayBuilder.ofRef[Array[B]]()(ClassTag[Array[B]](aClass)) + if (xs.length == 0) bb.result() + else { + def mkRowBuilder() = ArrayBuilder.make[B](using ClassTag[B](aClass.getComponentType)) + val bs = new ArrayOps(asArray(xs(0))).map((x: B) => mkRowBuilder()) + for (xs <- this) { + var i = 0 + for (x <- new ArrayOps(asArray(xs))) { + bs(i) += x + i += 1 + } + } + for (b <- new ArrayOps(bs)) bb += b.result() + bb.result() + } + } + + /** Apply `f` to each element for its side effects. + * Note: [U] parameter needed to help scalac's type inference. + */ + def foreach[U](f: A => U): Unit = { + val len = xs.length + var i = 0 + (xs: Any @unchecked) match { + case xs: Array[AnyRef] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Int] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Double] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Long] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Float] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Char] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Byte] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Short] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + case xs: Array[Boolean] => while (i < len) { f(xs(i).asInstanceOf[A]); i = i+1 } + } + } + + /** Selects all the elements of this array ignoring the duplicates. + * + * @return a new array consisting of all the elements of this array without duplicates. + */ + def distinct: Array[A] = distinctBy(identity) + + /** Selects all the elements of this array ignoring the duplicates as determined by `==` after applying + * the transforming function `f`. + * + * @param f The transforming function whose result is used to determine the uniqueness of each element + * @tparam B the type of the elements after being transformed by `f` + * @return a new array consisting of all the elements of this array without duplicates. + */ + def distinctBy[B](f: A => B): Array[A] = + ArrayBuilder.make[A].addAll(iterator.distinctBy(f)).result() + + /** A copy of this array with an element value appended until a given target length is reached. + * + * @param len the target length + * @param elem the padding value + * @tparam B the element type of the returned array. + * @return a new array consisting of + * all elements of this array followed by the minimal number of occurrences of `elem` so + * that the resulting collection has a length of at least `len`. + */ + def padTo[B >: A : ClassTag](len: Int, elem: B): Array[B] = { + var i = xs.length + val newlen = max(i, len) + val dest = Array.copyAs[B](xs, newlen) + while(i < newlen) { + dest(i) = elem + i += 1 + } + dest + } + + /** Produces the range of all indices of this sequence. + * + * @return a `Range` value from `0` to one less than the length of this array. + */ + def indices: Range = Range(0, xs.length) + + /** Partitions this array into a map of arrays according to some discriminator function. + * + * @param f the discriminator function. + * @tparam K the type of keys returned by the discriminator function. 
+ * @return A map from keys to arrays such that the following invariant holds: + * {{{ + * (xs groupBy f)(k) = xs filter (x => f(x) == k) + * }}} + * That is, every key `k` is bound to an array of those elements `x` + * for which `f(x)` equals `k`. + */ + def groupBy[K](f: A => K): immutable.Map[K, Array[A]] = { + val m = mutable.Map.empty[K, ArrayBuilder[A]] + val len = xs.length + var i = 0 + while(i < len) { + val elem = xs(i) + val key = f(elem) + val bldr = m.getOrElseUpdate(key, ArrayBuilder.make[A]) + bldr += elem + i += 1 + } + m.view.mapValues(_.result()).toMap + } + + /** + * Partitions this array into a map of arrays according to a discriminator function `key`. + * Each element in a group is transformed into a value of type `B` using the `value` function. + * + * It is equivalent to `groupBy(key).mapValues(_.map(f))`, but more efficient. + * + * {{{ + * case class User(name: String, age: Int) + * + * def namesByAge(users: Array[User]): Map[Int, Array[String]] = + * users.groupMap(_.age)(_.name) + * }}} + * + * @param key the discriminator function + * @param f the element transformation function + * @tparam K the type of keys returned by the discriminator function + * @tparam B the type of values returned by the transformation function + */ + def groupMap[K, B : ClassTag](key: A => K)(f: A => B): immutable.Map[K, Array[B]] = { + val m = mutable.Map.empty[K, ArrayBuilder[B]] + val len = xs.length + var i = 0 + while(i < len) { + val elem = xs(i) + val k = key(elem) + val bldr = m.getOrElseUpdate(k, ArrayBuilder.make[B]) + bldr += f(elem) + i += 1 + } + m.view.mapValues(_.result()).toMap + } + + @`inline` final def toSeq: immutable.Seq[A] = toIndexedSeq + + def toIndexedSeq: immutable.IndexedSeq[A] = + immutable.ArraySeq.unsafeWrapArray(Array.copyOf(xs, xs.length)) + + /** Copy elements of this array to another array. + * Fills the given array `xs` starting at index 0. + * Copying will stop once either all the elements of this array have been copied, + * or the end of the array is reached. + * + * @param xs the array to fill. + * @tparam B the type of the elements of the array. + */ + def copyToArray[B >: A](xs: Array[B]): Int = copyToArray(xs, 0) + + /** Copy elements of this array to another array. + * Fills the given array `xs` starting at index `start`. + * Copying will stop once either all the elements of this array have been copied, + * or the end of the array is reached. + * + * @param xs the array to fill. + * @param start the starting index within the destination array. + * @tparam B the type of the elements of the array. + */ + def copyToArray[B >: A](xs: Array[B], start: Int): Int = copyToArray(xs, start, Int.MaxValue) + + /** Copy elements of this array to another array. + * Fills the given array `xs` starting at index `start` with at most `len` values. + * Copying will stop once either all the elements of this array have been copied, + * or the end of the array is reached, or `len` elements have been copied. + * + * @param xs the array to fill. + * @param start the starting index within the destination array. + * @param len the maximal number of elements to copy. + * @tparam B the type of the elements of the array. + */ + def copyToArray[B >: A](xs: Array[B], start: Int, len: Int): Int = { + val copied = IterableOnce.elemsToCopyToArray(this.xs.length, xs.length, start, len) + if (copied > 0) { + Array.copy(this.xs, 0, xs, start, copied) + } + copied + } + + /** Create a copy of this array with the specified element type. 
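+    *
+    * One illustrative use is widening the element type; the original array is left untouched:
+    * {{{
+    *   val xs: Array[Int] = Array(1, 2, 3)
+    *   val ys: Array[Any] = xs.toArray[Any]   // a fresh Array[Any] copy of xs
+    * }}}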
*/ + def toArray[B >: A: ClassTag]: Array[B] = { + val destination = new Array[B](xs.length) + @annotation.unused val copied = copyToArray(destination, 0) + //assert(copied == xs.length) + destination + } + + /** Counts the number of elements in this array which satisfy a predicate */ + def count(p: A => Boolean): Int = { + var i, res = 0 + val len = xs.length + while(i < len) { + if(p(xs(i))) res += 1 + i += 1 + } + res + } + + // can't use a default arg because we already have another overload with a default arg + /** Tests whether this array starts with the given array. */ + @`inline` def startsWith[B >: A](that: Array[B]): Boolean = startsWith(that, 0) + + /** Tests whether this array contains the given array at a given index. + * + * @param that the array to test + * @param offset the index where the array is searched. + * @return `true` if the array `that` is contained in this array at + * index `offset`, otherwise `false`. + */ + def startsWith[B >: A](that: Array[B], offset: Int): Boolean = { + val safeOffset = offset.max(0) + val thatl = that.length + if(thatl > xs.length-safeOffset) thatl == 0 + else { + var i = 0 + while(i < thatl) { + if(xs(i+safeOffset) != that(i)) return false + i += 1 + } + true + } + } + + /** Tests whether this array ends with the given array. + * + * @param that the array to test + * @return `true` if this array has `that` as a suffix, `false` otherwise. + */ + def endsWith[B >: A](that: Array[B]): Boolean = { + val thatl = that.length + val off = xs.length - thatl + if(off < 0) false + else { + var i = 0 + while(i < thatl) { + if(xs(i+off) != that(i)) return false + i += 1 + } + true + } + } + + /** A copy of this array with one single replaced element. + * @param index the position of the replacement + * @param elem the replacing element + * @return a new array which is a copy of this array with the element at position `index` replaced by `elem`. + * @throws IndexOutOfBoundsException if `index` does not satisfy `0 <= index < length`. + */ + def updated[B >: A : ClassTag](index: Int, elem: B): Array[B] = { + if(index < 0 || index >= xs.length) throw new IndexOutOfBoundsException(s"$index is out of bounds (min 0, max ${xs.length-1})") + val dest = toArray[B] + dest(index) = elem + dest + } + + @`inline` def view: IndexedSeqView[A] = new ArrayOps.ArrayView[A](xs) + + + /* ************************************************************************************************************ + The remaining methods are provided for completeness but they delegate to mutable.ArraySeq implementations which + may not provide the best possible performance. We need them in `ArrayOps` because their return type + mentions `C` (which is `Array[A]` in `StringOps` and `mutable.ArraySeq[A]` in `mutable.ArraySeq`). + ************************************************************************************************************ */ + + + /** Computes the multiset difference between this array and another sequence. + * + * @param that the sequence of elements to remove + * @return a new array which contains all elements of this array + * except some of occurrences of elements that also appear in `that`. + * If an element value `x` appears + * ''n'' times in `that`, then the first ''n'' occurrences of `x` will not form + * part of the result, but any following occurrences will. + */ + def diff[B >: A](that: Seq[B]): Array[A] = mutable.ArraySeq.make(xs).diff(that).toArray[A] + + /** Computes the multiset intersection between this array and another sequence. 
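+    *
+    * For instance (an illustrative example of the multiset semantics):
+    * {{{
+    *   Array(1, 2, 2, 2, 3).intersect(Seq(2, 2, 4))   // Array(2, 2)
+    * }}}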
+ * + * @param that the sequence of elements to intersect with. + * @return a new array which contains all elements of this array + * which also appear in `that`. + * If an element value `x` appears + * ''n'' times in `that`, then the first ''n'' occurrences of `x` will be retained + * in the result, but any following occurrences will be omitted. + */ + def intersect[B >: A](that: Seq[B]): Array[A] = mutable.ArraySeq.make(xs).intersect(that).toArray[A] + + /** Groups elements in fixed size blocks by passing a "sliding window" + * over them (as opposed to partitioning them, as is done in grouped.) + * @see [[scala.collection.Iterator]], method `sliding` + * + * @param size the number of elements per group + * @param step the distance between the first elements of successive groups + * @return An iterator producing arrays of size `size`, except the + * last element (which may be the only element) will be truncated + * if there are fewer than `size` elements remaining to be grouped. + */ + def sliding(size: Int, step: Int = 1): Iterator[Array[A]] = mutable.ArraySeq.make(xs).sliding(size, step).map(_.toArray[A]) + + /** Iterates over combinations of elements. + * + * A '''combination''' of length `n` is a sequence of `n` elements selected in order of their first index in this sequence. + * + * For example, `"xyx"` has two combinations of length 2. The `x` is selected first: `"xx"`, `"xy"`. + * The sequence `"yx"` is not returned as a combination because it is subsumed by `"xy"`. + * + * If there is more than one way to generate the same combination, only one will be returned. + * + * For example, the result `"xy"` arbitrarily selected one of the `x` elements. + * + * As a further illustration, `"xyxx"` has three different ways to generate `"xy"` because there are three elements `x` + * to choose from. Moreover, there are three unordered pairs `"xx"` but only one is returned. + * + * It is not specified which of these equal combinations is returned. It is an implementation detail + * that should not be relied on. For example, the combination `"xx"` does not necessarily contain + * the first `x` in this sequence. This behavior is observable if the elements compare equal + * but are not identical. + * + * As a consequence, `"xyx".combinations(3).next()` is `"xxy"`: the combination does not reflect the order + * of the original sequence, but the order in which elements were selected, by "first index"; + * the order of each `x` element is also arbitrary. + * + * @return An Iterator which traverses the n-element combinations of this array + * @example {{{ + * Array('a', 'b', 'b', 'b', 'c').combinations(2).map(runtime.ScalaRunTime.stringOf).foreach(println) + * // Array(a, b) + * // Array(a, c) + * // Array(b, b) + * // Array(b, c) + * Array('b', 'a', 'b').combinations(2).map(runtime.ScalaRunTime.stringOf).foreach(println) + * // Array(b, b) + * // Array(b, a) + * }}} + */ + def combinations(n: Int): Iterator[Array[A]] = mutable.ArraySeq.make(xs).combinations(n).map(_.toArray[A]) + + /** Iterates over distinct permutations of elements. + * + * @return An Iterator which traverses the distinct permutations of this array. 
+ * @example {{{ + * Array('a', 'b', 'b').permutations.map(runtime.ScalaRunTime.stringOf).foreach(println) + * // Array(a, b, b) + * // Array(b, a, b) + * // Array(b, b, a) + * }}} + */ + def permutations: Iterator[Array[A]] = mutable.ArraySeq.make(xs).permutations.map(_.toArray[A]) + + // we have another overload here, so we need to duplicate this method + /** Tests whether this array contains the given sequence at a given index. + * + * @param that the sequence to test + * @param offset the index where the sequence is searched. + * @return `true` if the sequence `that` is contained in this array at + * index `offset`, otherwise `false`. + */ + def startsWith[B >: A](that: IterableOnce[B], offset: Int = 0): Boolean = mutable.ArraySeq.make(xs).startsWith(that, offset) + + // we have another overload here, so we need to duplicate this method + /** Tests whether this array ends with the given sequence. + * + * @param that the sequence to test + * @return `true` if this array has `that` as a suffix, `false` otherwise. + */ + def endsWith[B >: A](that: Iterable[B]): Boolean = mutable.ArraySeq.make(xs).endsWith(that) +} diff --git a/scala2-library-bootstrapped/src/scala/collection/Factory.scala b/scala2-library-bootstrapped/src/scala/collection/Factory.scala new file mode 100644 index 000000000000..6006f292bb19 --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/Factory.scala @@ -0,0 +1,784 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala +package collection + +import scala.collection.immutable.NumericRange +import scala.language.implicitConversions +import scala.collection.mutable.Builder +import scala.annotation.unchecked.uncheckedVariance +import scala.reflect.ClassTag + +/** + * A factory that builds a collection of type `C` with elements of type `A`. + * + * This is a general form of any factory ([[IterableFactory]], + * [[SortedIterableFactory]], [[MapFactory]] and [[SortedMapFactory]]) whose + * element type is fixed. + * + * @tparam A Type of elements (e.g. `Int`, `Boolean`, etc.) + * @tparam C Type of collection (e.g. `List[Int]`, `TreeMap[Int, String]`, etc.) + */ +trait Factory[-A, +C] extends Any { + + /** + * @return A collection of type `C` containing the same elements + * as the source collection `it`. + * @param it Source collection + */ + def fromSpecific(it: IterableOnce[A]): C + + /** Get a Builder for the collection. For non-strict collection types this will use an intermediate buffer. + * Building collections with `fromSpecific` is preferred because it can be lazy for lazy collections. 
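+    *
+    * An illustrative sketch using the implicit `Factory[A, Array[A]]` instance from the companion object:
+    * {{{
+    *   val fac = implicitly[Factory[Int, Array[Int]]]
+    *   fac.fromSpecific(List(1, 2, 3))            // Array(1, 2, 3)
+    *   (fac.newBuilder ++= List(4, 5)).result()   // Array(4, 5)
+    * }}}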
*/ + def newBuilder: Builder[A, C] +} + +object Factory { + + implicit val stringFactory: Factory[Char, String] = new StringFactory + @SerialVersionUID(3L) + private class StringFactory extends Factory[Char, String] with Serializable { + def fromSpecific(it: IterableOnce[Char]): String = { + val b = new mutable.StringBuilder(scala.math.max(0, it.knownSize)) + b ++= it + b.result() + } + def newBuilder: Builder[Char, String] = new mutable.StringBuilder() + } + + implicit def arrayFactory[A: ClassTag]: Factory[A, Array[A]] = new ArrayFactory[A] + @SerialVersionUID(3L) + private class ArrayFactory[A: ClassTag] extends Factory[A, Array[A]] with Serializable { + def fromSpecific(it: IterableOnce[A]): Array[A] = { + val b = newBuilder + b.sizeHint(scala.math.max(0, it.knownSize)) + b ++= it + b.result() + } + def newBuilder: Builder[A, Array[A]] = mutable.ArrayBuilder.make[A] + } + +} + +/** Base trait for companion objects of unconstrained collection types that may require + * multiple traversals of a source collection to build a target collection `CC`. + * + * @tparam CC Collection type constructor (e.g. `List`) + * @define factoryInfo + * This object provides a set of operations to create $Coll values. + * + * @define coll collection + * @define Coll `Iterable` + */ +trait IterableFactory[+CC[_]] extends Serializable { + + /** Creates a target $coll from an existing source collection + * + * @param source Source collection + * @tparam A the type of the collection’s elements + * @return a new $coll with the elements of `source` + */ + def from[A](source: IterableOnce[A]): CC[A] + + /** An empty collection + * @tparam A the type of the ${coll}'s elements + */ + def empty[A]: CC[A] + + /** Creates a $coll with the specified elements. + * @tparam A the type of the ${coll}'s elements + * @param elems the elements of the created $coll + * @return a new $coll with elements `elems` + */ + def apply[A](elems: A*): CC[A] = from(elems) + + /** Produces a $coll containing repeated applications of a function to a start value. + * + * @param start the start value of the $coll + * @param len the number of elements contained in the $coll + * @param f the function that's repeatedly applied + * @return a $coll with `len` values in the sequence `start, f(start), f(f(start)), ...` + */ + def iterate[A](start: A, len: Int)(f: A => A): CC[A] = from(new View.Iterate(start, len)(f)) + + /** Produces a $coll that uses a function `f` to produce elements of type `A` + * and update an internal state of type `S`. + * + * @param init State initial value + * @param f Computes the next element (or returns `None` to signal + * the end of the collection) + * @tparam A Type of the elements + * @tparam S Type of the internal state + * @return a $coll that produces elements using `f` until `f` returns `None` + */ + def unfold[A, S](init: S)(f: S => Option[(A, S)]): CC[A] = from(new View.Unfold(init)(f)) + + /** Produces a $coll containing a sequence of increasing of integers. + * + * @param start the first element of the $coll + * @param end the end value of the $coll (the first value NOT contained) + * @return a $coll with values `start, start + 1, ..., end - 1` + */ + def range[A : Integral](start: A, end: A): CC[A] = from(NumericRange(start, end, implicitly[Integral[A]].one)) + + /** Produces a $coll containing equally spaced values in some integer interval. 
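+    *
+    * For example, with `List` standing in for the abstract factory:
+    * {{{
+    *   List.range(0, 10, 3)   // List(0, 3, 6, 9)
+    *   List.range(5, 0, -2)   // List(5, 3, 1)
+    * }}}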
+ * @param start the start value of the $coll + * @param end the end value of the $coll (the first value NOT contained) + * @param step the difference between successive elements of the $coll (must be positive or negative) + * @return a $coll with values `start, start + step, ...` up to, but excluding `end` + */ + def range[A : Integral](start: A, end: A, step: A): CC[A] = from(NumericRange(start, end, step)) + + /** + * @return A builder for $Coll objects. + * @tparam A the type of the ${coll}’s elements + */ + def newBuilder[A]: Builder[A, CC[A]] + + /** Produces a $coll containing the results of some element computation a number of times. + * @param n the number of elements contained in the $coll. + * @param elem the element computation + * @return A $coll that contains the results of `n` evaluations of `elem`. + */ + def fill[A](n: Int)(elem: => A): CC[A] = from(new View.Fill(n)(elem)) + + /** Produces a two-dimensional $coll containing the results of some element computation a number of times. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2` evaluations of `elem`. + */ + def fill[A](n1: Int, n2: Int)(elem: => A): CC[CC[A] @uncheckedVariance] = fill(n1)(fill(n2)(elem)) + + /** Produces a three-dimensional $coll containing the results of some element computation a number of times. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2 x n3` evaluations of `elem`. + */ + def fill[A](n1: Int, n2: Int, n3: Int)(elem: => A): CC[CC[CC[A]] @uncheckedVariance] = fill(n1)(fill(n2, n3)(elem)) + + /** Produces a four-dimensional $coll containing the results of some element computation a number of times. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2 x n3 x n4` evaluations of `elem`. + */ + def fill[A](n1: Int, n2: Int, n3: Int, n4: Int)(elem: => A): CC[CC[CC[CC[A]]] @uncheckedVariance] = + fill(n1)(fill(n2, n3, n4)(elem)) + + /** Produces a five-dimensional $coll containing the results of some element computation a number of times. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2 x n3 x n4 x n5` evaluations of `elem`. + */ + def fill[A](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(elem: => A): CC[CC[CC[CC[CC[A]]]] @uncheckedVariance] = + fill(n1)(fill(n2, n3, n4, n5)(elem)) + + /** Produces a $coll containing values of a given function over a range of integer values starting from 0. 
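+ *
+ * For instance (an illustrative sketch, using `Vector` as a concrete factory):
+ * {{{
+ * Vector.tabulate(4)(i => i * i) // Vector(0, 1, 4, 9)
+ * }}}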
+ * @param n The number of elements in the $coll + * @param f The function computing element values + * @return A $coll consisting of elements `f(0), ..., f(n -1)` + */ + def tabulate[A](n: Int)(f: Int => A): CC[A] = from(new View.Tabulate(n)(f)) + + /** Produces a two-dimensional $coll containing values of a given function over ranges of integer values starting from 0. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2)` + * for `0 <= i1 < n1` and `0 <= i2 < n2`. + */ + def tabulate[A](n1: Int, n2: Int)(f: (Int, Int) => A): CC[CC[A] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2)(f(i1, _))) + + /** Produces a three-dimensional $coll containing values of a given function over ranges of integer values starting from 0. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2, i3)` + * for `0 <= i1 < n1`, `0 <= i2 < n2`, and `0 <= i3 < n3`. + */ + def tabulate[A](n1: Int, n2: Int, n3: Int)(f: (Int, Int, Int) => A): CC[CC[CC[A]] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2, n3)(f(i1, _, _))) + + /** Produces a four-dimensional $coll containing values of a given function over ranges of integer values starting from 0. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2, i3, i4)` + * for `0 <= i1 < n1`, `0 <= i2 < n2`, `0 <= i3 < n3`, and `0 <= i4 < n4`. + */ + def tabulate[A](n1: Int, n2: Int, n3: Int, n4: Int)(f: (Int, Int, Int, Int) => A): CC[CC[CC[CC[A]]] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2, n3, n4)(f(i1, _, _, _))) + + /** Produces a five-dimensional $coll containing values of a given function over ranges of integer values starting from 0. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2, i3, i4, i5)` + * for `0 <= i1 < n1`, `0 <= i2 < n2`, `0 <= i3 < n3`, `0 <= i4 < n4`, and `0 <= i5 < n5`. + */ + def tabulate[A](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(f: (Int, Int, Int, Int, Int) => A): CC[CC[CC[CC[CC[A]]]] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2, n3, n4, n5)(f(i1, _, _, _, _))) + + /** Concatenates all argument collections into a single $coll. + * + * @param xss the collections that are to be concatenated. + * @return the concatenation of all the collections. 
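+ * @example {{{
+ * // an illustrative sketch, with `List` as the target collection
+ * List.concat(Seq(1, 2), Vector(3), Nil) // List(1, 2, 3)
+ * }}}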
+ */ + def concat[A](xss: Iterable[A]*): CC[A] = { + from(xss.foldLeft(View.empty[A])(_ ++ _)) + } + + implicit def iterableFactory[A]: Factory[A, CC[A]] = IterableFactory.toFactory(this) +} + +object IterableFactory { + + /** + * Fixes the element type of `factory` to `A` + * @param factory The factory to fix the element type + * @tparam A Type of elements + * @tparam CC Collection type constructor of the factory (e.g. `Seq`, `List`) + * @return A [[Factory]] that uses the given `factory` to build a collection of elements + * of type `A` + */ + implicit def toFactory[A, CC[_]](factory: IterableFactory[CC]): Factory[A, CC[A]] = new ToFactory[A, CC](factory) + + @SerialVersionUID(3L) + private[this] class ToFactory[A, CC[_]](factory: IterableFactory[CC]) extends Factory[A, CC[A]] with Serializable { + def fromSpecific(it: IterableOnce[A]): CC[A] = factory.from[A](it) + def newBuilder: Builder[A, CC[A]] = factory.newBuilder[A] + } + + implicit def toBuildFrom[A, CC[_]](factory: IterableFactory[CC]): BuildFrom[Any, A, CC[A]] = + new BuildFrom[Any, A, CC[A]] { + def fromSpecific(from: Any)(it: IterableOnce[A]) = factory.from(it) + def newBuilder(from: Any) = factory.newBuilder + } + + @SerialVersionUID(3L) + class Delegate[CC[_]](delegate: IterableFactory[CC]) extends IterableFactory[CC] { + override def apply[A](elems: A*): CC[A] = delegate.apply(elems: _*) + def empty[A]: CC[A] = delegate.empty + def from[E](it: IterableOnce[E]): CC[E] = delegate.from(it) + def newBuilder[A]: Builder[A, CC[A]] = delegate.newBuilder[A] + } +} + +/** + * @tparam CC Collection type constructor (e.g. `List`) + */ +trait SeqFactory[+CC[A] <: SeqOps[A, Seq, Seq[A]]] extends IterableFactory[CC] { + import SeqFactory.UnapplySeqWrapper + final def unapplySeq[A](x: CC[A] @uncheckedVariance): UnapplySeqWrapper[A] = new UnapplySeqWrapper(x) // TODO is uncheckedVariance sound here? +} + +object SeqFactory { + @SerialVersionUID(3L) + class Delegate[CC[A] <: SeqOps[A, Seq, Seq[A]]](delegate: SeqFactory[CC]) extends SeqFactory[CC] { + override def apply[A](elems: A*): CC[A] = delegate.apply(elems: _*) + def empty[A]: CC[A] = delegate.empty + def from[E](it: IterableOnce[E]): CC[E] = delegate.from(it) + def newBuilder[A]: Builder[A, CC[A]] = delegate.newBuilder[A] + } + + final class UnapplySeqWrapper[A](private val c: SeqOps[A, Seq, Seq[A]]) extends AnyVal { + def isEmpty: false = false + def get: UnapplySeqWrapper[A] = this + def lengthCompare(len: Int): Int = c.lengthCompare(len) + def apply(i: Int): A = c(i) + def drop(n: Int): scala.Seq[A] = c match { + case seq: scala.Seq[A] => seq.drop(n) + case _ => c.view.drop(n).toSeq + } + def toSeq: scala.Seq[A] = c.toSeq + } +} + +trait StrictOptimizedSeqFactory[+CC[A] <: SeqOps[A, Seq, Seq[A]]] extends SeqFactory[CC] { + + override def fill[A](n: Int)(elem: => A): CC[A] = { + val b = newBuilder[A] + b.sizeHint(n) + var i = 0 + while (i < n) { + b += elem + i += 1 + } + b.result() + } + + override def tabulate[A](n: Int)(f: Int => A): CC[A] = { + val b = newBuilder[A] + b.sizeHint(n) + var i = 0 + while (i < n) { + b += f(i) + i += 1 + } + b.result() + } + + override def concat[A](xss: Iterable[A]*): CC[A] = { + val b = newBuilder[A] + val knownSizes = xss.view.map(_.knownSize) + if (knownSizes forall (_ >= 0)) { + b.sizeHint(knownSizes.sum) + } + for (xs <- xss) b ++= xs + b.result() + } + +} + +/** + * @tparam A Type of elements (e.g. `Int`, `Boolean`, etc.) + * @tparam C Type of collection (e.g. `List[Int]`, `TreeMap[Int, String]`, etc.) 
+ * @define factoryInfo + * This object provides a set of operations to create $Coll values. + * + * @define coll collection + * @define Coll `Iterable` + */ +trait SpecificIterableFactory[-A, +C] extends Factory[A, C] { + def empty: C + def apply(xs: A*): C = fromSpecific(xs) + def fill(n: Int)(elem: => A): C = fromSpecific(new View.Fill(n)(elem)) + def newBuilder: Builder[A, C] + + implicit def specificIterableFactory: Factory[A, C] = this +} + +/** + * @define factoryInfo + * This object provides a set of operations to create $Coll values. + * + * @define coll collection + * @define Coll `Iterable` + */ +trait MapFactory[+CC[_, _]] extends Serializable { + + /** + * An empty Map + */ + def empty[K, V]: CC[K, V] + + /** + * A collection of type Map generated from given iterable object. + */ + def from[K, V](it: IterableOnce[(K, V)]): CC[K, V] + + /** + * A collection of type Map that contains given key/value bindings. + */ + def apply[K, V](elems: (K, V)*): CC[K, V] = from(elems) + + /** + * The default builder for Map objects. + */ + def newBuilder[K, V]: Builder[(K, V), CC[K, V]] + + /** + * The default Factory instance for maps. + */ + implicit def mapFactory[K, V]: Factory[(K, V), CC[K, V]] = MapFactory.toFactory(this) +} + +object MapFactory { + + /** + * Fixes the key and value types of `factory` to `K` and `V`, respectively + * @param factory The factory to fix the key and value types + * @tparam K Type of keys + * @tparam V Type of values + * @tparam CC Collection type constructor of the factory (e.g. `Map`, `HashMap`, etc.) + * @return A [[Factory]] that uses the given `factory` to build a map with keys of type `K` + * and values of type `V` + */ + implicit def toFactory[K, V, CC[_, _]](factory: MapFactory[CC]): Factory[(K, V), CC[K, V]] = new ToFactory[K, V, CC](factory) + + @SerialVersionUID(3L) + private[this] class ToFactory[K, V, CC[_, _]](factory: MapFactory[CC]) extends Factory[(K, V), CC[K, V]] with Serializable { + def fromSpecific(it: IterableOnce[(K, V)]): CC[K, V] = factory.from[K, V](it) + def newBuilder: Builder[(K, V), CC[K, V]] = factory.newBuilder[K, V] + } + + implicit def toBuildFrom[K, V, CC[_, _]](factory: MapFactory[CC]): BuildFrom[Any, (K, V), CC[K, V]] = + new BuildFrom[Any, (K, V), CC[K, V]] { + def fromSpecific(from: Any)(it: IterableOnce[(K, V)]) = factory.from(it) + def newBuilder(from: Any) = factory.newBuilder[K, V] + } + + @SerialVersionUID(3L) + class Delegate[C[_, _]](delegate: MapFactory[C]) extends MapFactory[C] { + override def apply[K, V](elems: (K, V)*): C[K, V] = delegate.apply(elems: _*) + def from[K, V](it: IterableOnce[(K, V)]): C[K, V] = delegate.from(it) + def empty[K, V]: C[K, V] = delegate.empty + def newBuilder[K, V]: Builder[(K, V), C[K, V]] = delegate.newBuilder + } +} + +/** Base trait for companion objects of collections that require an implicit evidence. + * @tparam CC Collection type constructor (e.g. `ArraySeq`) + * @tparam Ev Unary type constructor for the implicit evidence required for an element type + * (typically `Ordering` or `ClassTag`) + * + * @define factoryInfo + * This object provides a set of operations to create $Coll values. + * + * @define coll collection + * @define Coll `Iterable` + */ +trait EvidenceIterableFactory[+CC[_], Ev[_]] extends Serializable { + + def from[E : Ev](it: IterableOnce[E]): CC[E] + + def empty[A : Ev]: CC[A] + + def apply[A : Ev](xs: A*): CC[A] = from(xs) + + /** Produces a $coll containing the results of some element computation a number of times. 
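+ *
+ * For example (an illustrative sketch, using the `ClassTag`-backed `mutable.ArraySeq` factory):
+ * {{{
+ * scala.collection.mutable.ArraySeq.fill(3)(0) // ArraySeq(0, 0, 0)
+ * }}}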
+ * @param n the number of elements contained in the $coll. + * @param elem the element computation + * @return A $coll that contains the results of `n` evaluations of `elem`. + */ + def fill[A : Ev](n: Int)(elem: => A): CC[A] = from(new View.Fill(n)(elem)) + + /** Produces a $coll containing values of a given function over a range of integer values starting from 0. + * @param n The number of elements in the $coll + * @param f The function computing element values + * @return A $coll consisting of elements `f(0), ..., f(n -1)` + */ + def tabulate[A : Ev](n: Int)(f: Int => A): CC[A] = from(new View.Tabulate(n)(f)) + + /** Produces a $coll containing repeated applications of a function to a start value. + * + * @param start the start value of the $coll + * @param len the number of elements contained in the $coll + * @param f the function that's repeatedly applied + * @return a $coll with `len` values in the sequence `start, f(start), f(f(start)), ...` + */ + def iterate[A : Ev](start: A, len: Int)(f: A => A): CC[A] = from(new View.Iterate(start, len)(f)) + + /** Produces a $coll that uses a function `f` to produce elements of type `A` + * and update an internal state of type `S`. + * + * @param init State initial value + * @param f Computes the next element (or returns `None` to signal + * the end of the collection) + * @tparam A Type of the elements + * @tparam S Type of the internal state + * @return a $coll that produces elements using `f` until `f` returns `None` + */ + def unfold[A : Ev, S](init: S)(f: S => Option[(A, S)]): CC[A] = from(new View.Unfold(init)(f)) + + def newBuilder[A : Ev]: Builder[A, CC[A]] + + implicit def evidenceIterableFactory[A : Ev]: Factory[A, CC[A]] = EvidenceIterableFactory.toFactory(this) +} + +object EvidenceIterableFactory { + + /** + * Fixes the element type of `factory` to `A` + * @param factory The factory to fix the element type + * @tparam A Type of elements + * @tparam CC Collection type constructor of the factory (e.g. 
`TreeSet`) + * @tparam Ev Type constructor of the evidence (usually `Ordering` or `ClassTag`) + * @return A [[Factory]] that uses the given `factory` to build a collection of elements + * of type `A` + */ + implicit def toFactory[Ev[_], A: Ev, CC[_]](factory: EvidenceIterableFactory[CC, Ev]): Factory[A, CC[A]] = new ToFactory[Ev, A, CC](factory) + + @SerialVersionUID(3L) + private[this] class ToFactory[Ev[_], A: Ev, CC[_]](factory: EvidenceIterableFactory[CC, Ev]) extends Factory[A, CC[A]] with Serializable { + def fromSpecific(it: IterableOnce[A]): CC[A] = factory.from[A](it) + def newBuilder: Builder[A, CC[A]] = factory.newBuilder[A] + } + + implicit def toBuildFrom[Ev[_], A: Ev, CC[_]](factory: EvidenceIterableFactory[CC, Ev]): BuildFrom[Any, A, CC[A]] = new EvidenceIterableFactoryToBuildFrom(factory) + private class EvidenceIterableFactoryToBuildFrom[Ev[_], A: Ev, CC[_]](factory: EvidenceIterableFactory[CC, Ev]) extends BuildFrom[Any, A, CC[A]] { + def fromSpecific(from: Any)(it: IterableOnce[A]): CC[A] = factory.from[A](it) + def newBuilder(from: Any): Builder[A, CC[A]] = factory.newBuilder[A] + } + + @SerialVersionUID(3L) + class Delegate[CC[_], Ev[_]](delegate: EvidenceIterableFactory[CC, Ev]) extends EvidenceIterableFactory[CC, Ev] { + override def apply[A: Ev](xs: A*): CC[A] = delegate.apply(xs: _*) + def empty[A : Ev]: CC[A] = delegate.empty + def from[E : Ev](it: IterableOnce[E]): CC[E] = delegate.from(it) + def newBuilder[A : Ev]: Builder[A, CC[A]] = delegate.newBuilder[A] + } +} + +/** Base trait for companion objects of collections that require an implicit `Ordering`. + * @tparam CC Collection type constructor (e.g. `SortedSet`) + */ +trait SortedIterableFactory[+CC[_]] extends EvidenceIterableFactory[CC, Ordering] + +object SortedIterableFactory { + @SerialVersionUID(3L) + class Delegate[CC[_]](delegate: EvidenceIterableFactory[CC, Ordering]) + extends EvidenceIterableFactory.Delegate[CC, Ordering](delegate) with SortedIterableFactory[CC] +} + +/** Base trait for companion objects of collections that require an implicit `ClassTag`. + * @tparam CC Collection type constructor (e.g. `ArraySeq`) + */ +trait ClassTagIterableFactory[+CC[_]] extends EvidenceIterableFactory[CC, ClassTag] { + + @`inline` private[this] implicit def ccClassTag[X]: ClassTag[CC[X]] = + ClassTag.AnyRef.asInstanceOf[ClassTag[CC[X]]] // Good enough for boxed vs primitive arrays + + /** Produces a $coll containing a sequence of increasing of integers. + * + * @param start the first element of the $coll + * @param end the end value of the $coll (the first value NOT contained) + * @return a $coll with values `start, start + 1, ..., end - 1` + */ + def range[A : Integral : ClassTag](start: A, end: A): CC[A] = from(NumericRange(start, end, implicitly[Integral[A]].one)) + + /** Produces a $coll containing equally spaced values in some integer interval. + * @param start the start value of the $coll + * @param end the end value of the $coll (the first value NOT contained) + * @param step the difference between successive elements of the $coll (must be positive or negative) + * @return a $coll with values `start, start + step, ...` up to, but excluding `end` + */ + def range[A : Integral : ClassTag](start: A, end: A, step: A): CC[A] = from(NumericRange(start, end, step)) + + /** Produces a two-dimensional $coll containing the results of some element computation a number of times. 
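+ *
+ * For example (an illustrative sketch):
+ * {{{
+ * scala.collection.mutable.ArraySeq.fill(2, 3)(0)
+ * // ArraySeq(ArraySeq(0, 0, 0), ArraySeq(0, 0, 0))
+ * }}}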
+ * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2` evaluations of `elem`. + */ + def fill[A : ClassTag](n1: Int, n2: Int)(elem: => A): CC[CC[A] @uncheckedVariance] = fill(n1)(fill(n2)(elem)) + + /** Produces a three-dimensional $coll containing the results of some element computation a number of times. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2 x n3` evaluations of `elem`. + */ + def fill[A : ClassTag](n1: Int, n2: Int, n3: Int)(elem: => A): CC[CC[CC[A]] @uncheckedVariance] = fill(n1)(fill(n2, n3)(elem)) + + /** Produces a four-dimensional $coll containing the results of some element computation a number of times. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2 x n3 x n4` evaluations of `elem`. + */ + def fill[A : ClassTag](n1: Int, n2: Int, n3: Int, n4: Int)(elem: => A): CC[CC[CC[CC[A]]] @uncheckedVariance] = + fill(n1)(fill(n2, n3, n4)(elem)) + + /** Produces a five-dimensional $coll containing the results of some element computation a number of times. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param elem the element computation + * @return A $coll that contains the results of `n1 x n2 x n3 x n4 x n5` evaluations of `elem`. + */ + def fill[A : ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(elem: => A): CC[CC[CC[CC[CC[A]]]] @uncheckedVariance] = + fill(n1)(fill(n2, n3, n4, n5)(elem)) + + /** Produces a two-dimensional $coll containing values of a given function over ranges of integer values starting from 0. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2)` + * for `0 <= i1 < n1` and `0 <= i2 < n2`. + */ + def tabulate[A : ClassTag](n1: Int, n2: Int)(f: (Int, Int) => A): CC[CC[A] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2)(f(i1, _))) + + /** Produces a three-dimensional $coll containing values of a given function over ranges of integer values starting from 0. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2, i3)` + * for `0 <= i1 < n1`, `0 <= i2 < n2`, and `0 <= i3 < n3`. + */ + def tabulate[A : ClassTag](n1: Int, n2: Int, n3: Int)(f: (Int, Int, Int) => A): CC[CC[CC[A]] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2, n3)(f(i1, _, _))) + + /** Produces a four-dimensional $coll containing values of a given function over ranges of integer values starting from 0. 
+ * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2, i3, i4)` + * for `0 <= i1 < n1`, `0 <= i2 < n2`, `0 <= i3 < n3`, and `0 <= i4 < n4`. + */ + def tabulate[A : ClassTag](n1: Int, n2: Int, n3: Int, n4: Int)(f: (Int, Int, Int, Int) => A): CC[CC[CC[CC[A]]] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2, n3, n4)(f(i1, _, _, _))) + + /** Produces a five-dimensional $coll containing values of a given function over ranges of integer values starting from 0. + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param f The function computing element values + * @return A $coll consisting of elements `f(i1, i2, i3, i4, i5)` + * for `0 <= i1 < n1`, `0 <= i2 < n2`, `0 <= i3 < n3`, `0 <= i4 < n4`, and `0 <= i5 < n5`. + */ + def tabulate[A : ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(f: (Int, Int, Int, Int, Int) => A): CC[CC[CC[CC[CC[A]]]] @uncheckedVariance] = + tabulate(n1)(i1 => tabulate(n2, n3, n4, n5)(f(i1, _, _, _, _))) +} + +object ClassTagIterableFactory { + @SerialVersionUID(3L) + class Delegate[CC[_]](delegate: EvidenceIterableFactory[CC, ClassTag]) + extends EvidenceIterableFactory.Delegate[CC, ClassTag](delegate) with ClassTagIterableFactory[CC] + + /** An IterableFactory that uses ClassTag.Any as the evidence for every element type. This may or may not be + * sound depending on the use of the `ClassTag` by the collection implementation. */ + @SerialVersionUID(3L) + class AnyIterableDelegate[CC[_]](delegate: ClassTagIterableFactory[CC]) extends IterableFactory[CC] { + def empty[A]: CC[A] = delegate.empty(using ClassTag.Any).asInstanceOf[CC[A]] + def from[A](it: IterableOnce[A]): CC[A] = delegate.from[Any](it)(using ClassTag.Any).asInstanceOf[CC[A]] + def newBuilder[A]: Builder[A, CC[A]] = delegate.newBuilder(using ClassTag.Any).asInstanceOf[Builder[A, CC[A]]] + override def apply[A](elems: A*): CC[A] = delegate.apply[Any](elems: _*)(using ClassTag.Any).asInstanceOf[CC[A]] + override def iterate[A](start: A, len: Int)(f: A => A): CC[A] = delegate.iterate[A](start, len)(f)(using ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def unfold[A, S](init: S)(f: S => Option[(A, S)]): CC[A] = delegate.unfold[A, S](init)(f)(using ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def range[A](start: A, end: A)(implicit i: Integral[A]): CC[A] = delegate.range[A](start, end)(using i, ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def range[A](start: A, end: A, step: A)(implicit i: Integral[A]): CC[A] = delegate.range[A](start, end, step)(using i, ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def fill[A](n: Int)(elem: => A): CC[A] = delegate.fill[Any](n)(elem)(using ClassTag.Any).asInstanceOf[CC[A]] + override def tabulate[A](n: Int)(f: Int => A): CC[A] = delegate.tabulate[Any](n)(f)(using ClassTag.Any).asInstanceOf[CC[A]] + } +} + +/** + * @tparam CC Collection type constructor (e.g. 
`ArraySeq`) + */ +trait ClassTagSeqFactory[+CC[A] <: SeqOps[A, Seq, Seq[A]]] extends ClassTagIterableFactory[CC] { + import SeqFactory.UnapplySeqWrapper + final def unapplySeq[A](x: CC[A] @uncheckedVariance): UnapplySeqWrapper[A] = new UnapplySeqWrapper(x) // TODO is uncheckedVariance sound here? +} + +object ClassTagSeqFactory { + @SerialVersionUID(3L) + class Delegate[CC[A] <: SeqOps[A, Seq, Seq[A]]](delegate: ClassTagSeqFactory[CC]) + extends ClassTagIterableFactory.Delegate[CC](delegate) with ClassTagSeqFactory[CC] + + /** A SeqFactory that uses ClassTag.Any as the evidence for every element type. This may or may not be + * sound depending on the use of the `ClassTag` by the collection implementation. */ + @SerialVersionUID(3L) + class AnySeqDelegate[CC[A] <: SeqOps[A, Seq, Seq[A]]](delegate: ClassTagSeqFactory[CC]) + extends ClassTagIterableFactory.AnyIterableDelegate[CC](delegate) with SeqFactory[CC] +} + +trait StrictOptimizedClassTagSeqFactory[+CC[A] <: SeqOps[A, Seq, Seq[A]]] extends ClassTagSeqFactory[CC] { + + override def fill[A : ClassTag](n: Int)(elem: => A): CC[A] = { + val b = newBuilder[A] + b.sizeHint(n) + var i = 0 + while (i < n) { + b += elem + i += 1 + } + b.result() + } + + override def tabulate[A : ClassTag](n: Int)(f: Int => A): CC[A] = { + val b = newBuilder[A] + b.sizeHint(n) + var i = 0 + while (i < n) { + b += f(i) + i += 1 + } + b.result() + } + +} + +/** + * @define factoryInfo + * This object provides a set of operations to create $Coll values. + * + * @define coll collection + * @define Coll `Iterable` + */ +trait SortedMapFactory[+CC[_, _]] extends Serializable { + + def empty[K : Ordering, V]: CC[K, V] + + def from[K : Ordering, V](it: IterableOnce[(K, V)]): CC[K, V] + + def apply[K : Ordering, V](elems: (K, V)*): CC[K, V] = from(elems) + + def newBuilder[K : Ordering, V]: Builder[(K, V), CC[K, V]] + + implicit def sortedMapFactory[K : Ordering, V]: Factory[(K, V), CC[K, V]] = SortedMapFactory.toFactory(this) + +} + +object SortedMapFactory { + + /** + * Implicit conversion that fixes the key and value types of `factory` to `K` and `V`, + * respectively. + * + * @param factory The factory to fix the key and value types + * @tparam K Type of keys + * @tparam V Type of values + * @tparam CC Collection type constructor of the factory (e.g. 
`TreeMap`) + * @return A [[Factory]] that uses the given `factory` to build a map with keys of + * type `K` and values of type `V` + */ + implicit def toFactory[K : Ordering, V, CC[_, _]](factory: SortedMapFactory[CC]): Factory[(K, V), CC[K, V]] = new ToFactory[K, V, CC](factory) + + @SerialVersionUID(3L) + private[this] class ToFactory[K : Ordering, V, CC[_, _]](factory: SortedMapFactory[CC]) extends Factory[(K, V), CC[K, V]] with Serializable { + def fromSpecific(it: IterableOnce[(K, V)]): CC[K, V] = factory.from[K, V](it) + def newBuilder: Builder[(K, V), CC[K, V]] = factory.newBuilder[K, V] + } + + implicit def toBuildFrom[K : Ordering, V, CC[_, _]](factory: SortedMapFactory[CC]): BuildFrom[Any, (K, V), CC[K, V]] = new SortedMapFactoryToBuildFrom(factory) + private class SortedMapFactoryToBuildFrom[K : Ordering, V, CC[_, _]](factory: SortedMapFactory[CC]) extends BuildFrom[Any, (K, V), CC[K, V]] { + def fromSpecific(from: Any)(it: IterableOnce[(K, V)]) = factory.from(it) + def newBuilder(from: Any) = factory.newBuilder[K, V] + } + + @SerialVersionUID(3L) + class Delegate[CC[_, _]](delegate: SortedMapFactory[CC]) extends SortedMapFactory[CC] { + override def apply[K: Ordering, V](elems: (K, V)*): CC[K, V] = delegate.apply(elems: _*) + def from[K : Ordering, V](it: IterableOnce[(K, V)]): CC[K, V] = delegate.from(it) + def empty[K : Ordering, V]: CC[K, V] = delegate.empty + def newBuilder[K : Ordering, V]: Builder[(K, V), CC[K, V]] = delegate.newBuilder + } +} diff --git a/scala2-library-bootstrapped/src/scala/collection/Iterable.scala b/scala2-library-bootstrapped/src/scala/collection/Iterable.scala new file mode 100644 index 000000000000..8f9142583b29 --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/Iterable.scala @@ -0,0 +1,1043 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala +package collection + +import scala.annotation.nowarn +import scala.annotation.unchecked.uncheckedVariance +import scala.collection.mutable.Builder +import scala.collection.View.{LeftPartitionMapped, RightPartitionMapped} + +/** Base trait for generic collections. + * + * @tparam A the element type of the collection + * + * @define Coll `Iterable` + * @define coll iterable collection + */ +trait Iterable[+A] extends IterableOnce[A] + with IterableOps[A, Iterable, Iterable[A]] + with IterableFactoryDefaults[A, Iterable] { + + // The collection itself + @deprecated("toIterable is internal and will be made protected; its name is similar to `toList` or `toSeq`, but it doesn't copy non-immutable collections", "2.13.7") + final def toIterable: this.type = this + + final protected def coll: this.type = this + + def iterableFactory: IterableFactory[Iterable] = Iterable + + @deprecated("Iterable.seq always returns the iterable itself", "2.13.0") + def seq: this.type = this + + /** Defines the prefix of this object's `toString` representation. + * + * It is recommended to return the name of the concrete collection type, but + * not implementation subclasses. For example, for `ListMap` this method should + * return `"ListMap"`, not `"Map"` (the supertype) or `"Node"` (an implementation + * subclass). + * + * The default implementation returns "Iterable". 
It is overridden for the basic + * collection kinds "Seq", "IndexedSeq", "LinearSeq", "Buffer", "Set", "Map", + * "SortedSet", "SortedMap" and "View". + * + * @return a string representation which starts the result of `toString` + * applied to this $coll. By default the string prefix is the + * simple name of the collection class $coll. + */ + protected[this] def className: String = stringPrefix + + /** Forwarder to `className` for use in `scala.runtime.ScalaRunTime`. + * + * This allows the proper visibility for `className` to be + * published, but provides the exclusive access needed by + * `scala.runtime.ScalaRunTime.stringOf` (and a few tests in + * the test suite). + */ + private[scala] final def collectionClassName: String = className + + @deprecatedOverriding("Override className instead", "2.13.0") + protected[this] def stringPrefix: String = "Iterable" + + /** Converts this $coll to a string. + * + * @return a string representation of this collection. By default this + * string consists of the `className` of this $coll, followed + * by all elements separated by commas and enclosed in parentheses. + */ + override def toString = mkString(className + "(", ", ", ")") + + /** Analogous to `zip` except that the elements in each collection are not consumed until a strict operation is + * invoked on the returned `LazyZip2` decorator. + * + * Calls to `lazyZip` can be chained to support higher arities (up to 4) without incurring the expense of + * constructing and deconstructing intermediary tuples. + * + * {{{ + * val xs = List(1, 2, 3) + * val res = (xs lazyZip xs lazyZip xs lazyZip xs).map((a, b, c, d) => a + b + c + d) + * // res == List(4, 8, 12) + * }}} + * + * @param that the iterable providing the second element of each eventual pair + * @tparam B the type of the second element in each eventual pair + * @return a decorator `LazyZip2` that allows strict operations to be performed on the lazily evaluated pairs + * or chained calls to `lazyZip`. Implicit conversion to `Iterable[(A, B)]` is also supported. + */ + def lazyZip[B](that: Iterable[B]): LazyZip2[A, B, this.type] = new LazyZip2(this, this, that) +} + +/** Base trait for Iterable operations + * + * =VarianceNote= + * + * We require that for all child classes of Iterable the variance of + * the child class and the variance of the `C` parameter passed to `IterableOps` + * are the same. We cannot express this since we lack variance polymorphism. That's + * why we have to resort at some places to write `C[A @uncheckedVariance]`. + * + * @tparam CC type constructor of the collection (e.g. `List`, `Set`). Operations returning a collection + * with a different type of element `B` (e.g. `map`) return a `CC[B]`. + * @tparam C type of the collection (e.g. `List[Int]`, `String`, `BitSet`). Operations returning a collection + * with the same type of element (e.g. `drop`, `filter`) return a `C`. + * + * @define Coll Iterable + * @define coll iterable collection + * @define orderDependent + * + * Note: might return different results for different runs, unless the underlying collection type is ordered. + * @define orderDependentFold + * + * Note: might return different results for different runs, unless the + * underlying collection type is ordered or the operator is associative + * and commutative. + * @define mayNotTerminateInf + * + * Note: may not terminate for infinite-sized collections. + * @define willNotTerminateInf + * + * Note: will not terminate for infinite-sized collections. 
+ * @define undefinedorder + * The order in which operations are performed on elements is unspecified + * and may be nondeterministic. + */ +trait IterableOps[+A, +CC[_], +C] extends Any with IterableOnce[A] with IterableOnceOps[A, CC, C] { + /** + * @return This collection as an `Iterable[A]`. No new collection will be built if `this` is already an `Iterable[A]`. + */ + // Should be `protected def asIterable`, or maybe removed altogether if it's not needed + @deprecated("toIterable is internal and will be made protected; its name is similar to `toList` or `toSeq`, but it doesn't copy non-immutable collections", "2.13.7") + def toIterable: Iterable[A] + + /** Converts this $coll to an unspecified Iterable. Will return + * the same collection if this instance is already Iterable. + * @return An Iterable containing all elements of this $coll. + */ + @deprecated("toTraversable is internal and will be made protected; its name is similar to `toList` or `toSeq`, but it doesn't copy non-immutable collections", "2.13.0") + final def toTraversable: Traversable[A] = toIterable + + override def isTraversableAgain: Boolean = true + + /** + * @return This collection as a `C`. + */ + protected def coll: C + + @deprecated("Use coll instead of repr in a collection implementation, use the collection value itself from the outside", "2.13.0") + final def repr: C = coll + + /** + * Defines how to turn a given `Iterable[A]` into a collection of type `C`. + * + * This process can be done in a strict way or a non-strict way (ie. without evaluating + * the elements of the resulting collections). In other words, this methods defines + * the evaluation model of the collection. + * + * @note When implementing a custom collection type and refining `C` to the new type, this + * method needs to be overridden (the compiler will issue an error otherwise). In the + * common case where `C =:= CC[A]`, this can be done by mixing in the + * [[scala.collection.IterableFactoryDefaults]] trait, which implements the method using + * [[iterableFactory]]. + * + * @note As witnessed by the `@uncheckedVariance` annotation, using this method + * might be unsound. However, as long as it is called with an + * `Iterable[A]` obtained from `this` collection (as it is the case in the + * implementations of operations where we use a `View[A]`), it is safe. + */ + protected def fromSpecific(coll: IterableOnce[A @uncheckedVariance]): C + + /** The companion object of this ${coll}, providing various factory methods. + * + * @note When implementing a custom collection type and refining `CC` to the new type, this + * method needs to be overridden to return a factory for the new type (the compiler will + * issue an error otherwise). + */ + def iterableFactory: IterableFactory[CC] + + @deprecated("Use iterableFactory instead", "2.13.0") + @deprecatedOverriding("Use iterableFactory instead", "2.13.0") + @`inline` def companion: IterableFactory[CC] = iterableFactory + + /** + * @return a strict builder for the same collection type. + * + * Note that in the case of lazy collections (e.g. [[scala.collection.View]] or [[scala.collection.immutable.LazyList]]), + * it is possible to implement this method but the resulting `Builder` will break laziness. + * As a consequence, operations should preferably be implemented with `fromSpecific` + * instead of this method. + * + * @note When implementing a custom collection type and refining `C` to the new type, this + * method needs to be overridden (the compiler will issue an error otherwise). 
In the + * common case where `C =:= CC[A]`, this can be done by mixing in the + * [[scala.collection.IterableFactoryDefaults]] trait, which implements the method using + * [[iterableFactory]]. + * + * @note As witnessed by the `@uncheckedVariance` annotation, using this method might + * be unsound. However, as long as the returned builder is only fed + * with `A` values taken from `this` instance, it is safe. + */ + protected def newSpecificBuilder: Builder[A @uncheckedVariance, C] + + /** The empty iterable of the same type as this iterable + * + * @return an empty iterable of type `C`. + */ + def empty: C = fromSpecific(Nil) + + /** Selects the first element of this $coll. + * $orderDependent + * @return the first element of this $coll. + * @throws NoSuchElementException if the $coll is empty. + */ + def head: A = iterator.next() + + /** Optionally selects the first element. + * $orderDependent + * @return the first element of this $coll if it is nonempty, + * `None` if it is empty. + */ + def headOption: Option[A] = { + val it = iterator + if (it.hasNext) Some(it.next()) else None + } + + /** Selects the last element. + * $orderDependent + * @return The last element of this $coll. + * @throws NoSuchElementException If the $coll is empty. + */ + def last: A = { + val it = iterator + var lst = it.next() + while (it.hasNext) lst = it.next() + lst + } + + /** Optionally selects the last element. + * $orderDependent + * @return the last element of this $coll$ if it is nonempty, + * `None` if it is empty. + */ + def lastOption: Option[A] = if (isEmpty) None else Some(last) + + /** A view over the elements of this collection. */ + def view: View[A] = View.fromIteratorProvider(() => iterator) + + /** Compares the size of this $coll to a test value. + * + * @param otherSize the test value that gets compared with the size. + * @return A value `x` where + * {{{ + * x < 0 if this.size < otherSize + * x == 0 if this.size == otherSize + * x > 0 if this.size > otherSize + * }}} + * + * The method as implemented here does not call `size` directly; its running time + * is `O(size min otherSize)` instead of `O(size)`. The method should be overridden + * if computing `size` is cheap and `knownSize` returns `-1`. + * + * @see [[sizeIs]] + */ + def sizeCompare(otherSize: Int): Int = { + if (otherSize < 0) 1 + else { + val known = knownSize + if (known >= 0) Integer.compare(known, otherSize) + else { + var i = 0 + val it = iterator + while (it.hasNext) { + if (i == otherSize) return 1 + it.next() + i += 1 + } + i - otherSize + } + } + } + + /** Returns a value class containing operations for comparing the size of this $coll to a test value. + * + * These operations are implemented in terms of [[sizeCompare(Int) `sizeCompare(Int)`]], and + * allow the following more readable usages: + * + * {{{ + * this.sizeIs < size // this.sizeCompare(size) < 0 + * this.sizeIs <= size // this.sizeCompare(size) <= 0 + * this.sizeIs == size // this.sizeCompare(size) == 0 + * this.sizeIs != size // this.sizeCompare(size) != 0 + * this.sizeIs >= size // this.sizeCompare(size) >= 0 + * this.sizeIs > size // this.sizeCompare(size) > 0 + * }}} + */ + @inline final def sizeIs: IterableOps.SizeCompareOps = new IterableOps.SizeCompareOps(this) + + /** Compares the size of this $coll to the size of another `Iterable`. + * + * @param that the `Iterable` whose size is compared with this $coll's size. 
+ * @return A value `x` where
+ * {{{
+ * x < 0 if this.size < that.size
+ * x == 0 if this.size == that.size
+ * x > 0 if this.size > that.size
+ * }}}
+ *
+ * The method as implemented here does not call `size` directly; its running time
+ * is `O(this.size min that.size)` instead of `O(this.size + that.size)`.
+ * The method should be overridden if computing `size` is cheap and `knownSize` returns `-1`.
+ */
+ def sizeCompare(that: Iterable[_]): Int = {
+ val thatKnownSize = that.knownSize
+
+ if (thatKnownSize >= 0) this sizeCompare thatKnownSize
+ else {
+ val thisKnownSize = this.knownSize
+
+ if (thisKnownSize >= 0) {
+ val res = that sizeCompare thisKnownSize
+ // can't just invert the result, because `-Int.MinValue == Int.MinValue`
+ if (res == Int.MinValue) 1 else -res
+ } else {
+ val thisIt = this.iterator
+ val thatIt = that.iterator
+ while (thisIt.hasNext && thatIt.hasNext) {
+ thisIt.next()
+ thatIt.next()
+ }
+ java.lang.Boolean.compare(thisIt.hasNext, thatIt.hasNext)
+ }
+ }
+ }
+
+ /** A view over a slice of the elements of this collection. */
+ @deprecated("Use .view.slice(from, until) instead of .view(from, until)", "2.13.0")
+ def view(from: Int, until: Int): View[A] = view.slice(from, until)
+
+ /** Transposes this $coll of iterable collections into
+ * a $coll of ${coll}s.
+ *
+ * The resulting collection's type will be guided by the
+ * static type of $coll. For example:
+ *
+ * {{{
+ * val xs = List(
+ * Set(1, 2, 3),
+ * Set(4, 5, 6)).transpose
+ * // xs == List(
+ * // List(1, 4),
+ * // List(2, 5),
+ * // List(3, 6))
+ *
+ * val ys = Vector(
+ * List(1, 2, 3),
+ * List(4, 5, 6)).transpose
+ * // ys == Vector(
+ * // Vector(1, 4),
+ * // Vector(2, 5),
+ * // Vector(3, 6))
+ * }}}
+ *
+ * $willForceEvaluation
+ *
+ * @tparam B the type of the elements of each iterable collection.
+ * @param asIterable an implicit conversion which asserts that the
+ * element type of this $coll is an `Iterable`.
+ * @return a two-dimensional $coll of ${coll}s which has as ''n''th row
+ * the ''n''th column of this $coll.
+ * @throws IllegalArgumentException if all collections in this $coll
+ * are not of the same size.
+ */
+ def transpose[B](implicit asIterable: A => /*<:<!!!*/ Iterable[B]): CC[CC[B] @uncheckedVariance] = {
+ if (isEmpty)
+ return iterableFactory.empty[CC[B]]
+
+ def fail = throw new IllegalArgumentException("transpose requires all collections have the same size")
+
+ val headSize = asIterable(head).size
+ val bs: immutable.IndexedSeq[Builder[B, CC[B]]] = immutable.IndexedSeq.fill(headSize)(iterableFactory.newBuilder[B])
+ for (xs <- this) {
+ var i = 0
+ for (x <- asIterable(xs)) {
+ if (i >= headSize) fail
+ bs(i) += x
+ i += 1
+ }
+ if (i != headSize)
+ fail
+ }
+ iterableFactory.from(bs.map(_.result()))
+ }
+
+ def filter(pred: A => Boolean): C = fromSpecific(new View.Filter(this, pred, isFlipped = false))
+
+ def filterNot(pred: A => Boolean): C = fromSpecific(new View.Filter(this, pred, isFlipped = true))
+
+ /** Creates a non-strict filter of this $coll.
+ *
+ * Note: the difference between `c filter p` and `c withFilter p` is that
+ * the former creates a new collection, whereas the latter only
+ * restricts the domain of subsequent `map`, `flatMap`, `foreach`,
+ * and `withFilter` operations.
+ * $orderDependent
+ *
+ * @param p the predicate used to test elements.
+ * @return an object of class `WithFilter`, which supports
+ * `map`, `flatMap`, `foreach`, and `withFilter` operations.
+ * All these operations apply to those elements of this $coll
+ * which satisfy the predicate `p`.
+ */
+ def withFilter(p: A => Boolean): collection.WithFilter[A, CC] = new IterableOps.WithFilter(this, p)
+
+ /** A pair of, first, all elements that satisfy predicate `p` and, second,
+ * all elements that do not. Interesting because it splits a collection in two.
+ *
+ * The default implementation provided here needs to traverse the collection twice.
+ * Strict collections have an overridden version of `partition` in `StrictOptimizedIterableOps`, + * which requires only a single traversal. + */ + def partition(p: A => Boolean): (C, C) = { + val first = new View.Filter(this, p, false) + val second = new View.Filter(this, p, true) + (fromSpecific(first), fromSpecific(second)) + } + + override def splitAt(n: Int): (C, C) = (take(n), drop(n)) + + def take(n: Int): C = fromSpecific(new View.Take(this, n)) + + /** Selects the last ''n'' elements. + * $orderDependent + * @param n the number of elements to take from this $coll. + * @return a $coll consisting only of the last `n` elements of this $coll, + * or else the whole $coll, if it has less than `n` elements. + * If `n` is negative, returns an empty $coll. + */ + def takeRight(n: Int): C = fromSpecific(new View.TakeRight(this, n)) + + /** Takes longest prefix of elements that satisfy a predicate. + * $orderDependent + * @param p The predicate used to test elements. + * @return the longest prefix of this $coll whose elements all satisfy + * the predicate `p`. + */ + def takeWhile(p: A => Boolean): C = fromSpecific(new View.TakeWhile(this, p)) + + def span(p: A => Boolean): (C, C) = (takeWhile(p), dropWhile(p)) + + def drop(n: Int): C = fromSpecific(new View.Drop(this, n)) + + /** Selects all elements except last ''n'' ones. + * $orderDependent + * @param n the number of elements to drop from this $coll. + * @return a $coll consisting of all elements of this $coll except the last `n` ones, or else the + * empty $coll, if this $coll has less than `n` elements. + * If `n` is negative, don't drop any elements. + */ + def dropRight(n: Int): C = fromSpecific(new View.DropRight(this, n)) + + def dropWhile(p: A => Boolean): C = fromSpecific(new View.DropWhile(this, p)) + + /** Partitions elements in fixed size ${coll}s. + * @see [[scala.collection.Iterator]], method `grouped` + * + * @param size the number of elements per group + * @return An iterator producing ${coll}s of size `size`, except the + * last will be less than size `size` if the elements don't divide evenly. + */ + def grouped(size: Int): Iterator[C] = + iterator.grouped(size).map(fromSpecific) + + /** Groups elements in fixed size blocks by passing a "sliding window" + * over them (as opposed to partitioning them, as is done in `grouped`.) + * + * An empty collection returns an empty iterator, and a non-empty + * collection containing fewer elements than the window size returns + * an iterator that will produce the original collection as its only + * element. + * @see [[scala.collection.Iterator]], method `sliding` + * + * @param size the number of elements per group + * @return An iterator producing ${coll}s of size `size`, except for a + * non-empty collection with less than `size` elements, which + * returns an iterator that produces the source collection itself + * as its only element. + * @example `List().sliding(2) = empty iterator` + * @example `List(1).sliding(2) = Iterator(List(1))` + * @example `List(1, 2).sliding(2) = Iterator(List(1, 2))` + * @example `List(1, 2, 3).sliding(2) = Iterator(List(1, 2), List(2, 3))` + */ + def sliding(size: Int): Iterator[C] = sliding(size, 1) + + /** Groups elements in fixed size blocks by passing a "sliding window" + * over them (as opposed to partitioning them, as is done in grouped.) + * + * The returned iterator will be empty when called on an empty collection. 
+ * The last element the iterator produces may be smaller than the window + * size when the original collection isn't exhausted by the window before + * it and its last element isn't skipped by the step before it. + * + * @see [[scala.collection.Iterator]], method `sliding` + * + * @param size the number of elements per group + * @param step the distance between the first elements of successive + * groups + * @return An iterator producing ${coll}s of size `size`, except the last + * element (which may be the only element) will be smaller + * if there are fewer than `size` elements remaining to be grouped. + * @example `List(1, 2, 3, 4, 5).sliding(2, 2) = Iterator(List(1, 2), List(3, 4), List(5))` + * @example `List(1, 2, 3, 4, 5, 6).sliding(2, 3) = Iterator(List(1, 2), List(4, 5))` + */ + def sliding(size: Int, step: Int): Iterator[C] = + iterator.sliding(size, step).map(fromSpecific) + + /** The rest of the collection without its first element. */ + def tail: C = { + if (isEmpty) throw new UnsupportedOperationException + drop(1) + } + + /** The initial part of the collection without its last element. + * $willForceEvaluation + */ + def init: C = { + if (isEmpty) throw new UnsupportedOperationException + dropRight(1) + } + + def slice(from: Int, until: Int): C = + fromSpecific(new View.Drop(new View.Take(this, until), from)) + + /** Partitions this $coll into a map of ${coll}s according to some discriminator function. + * + * $willForceEvaluation + * + * @param f the discriminator function. + * @tparam K the type of keys returned by the discriminator function. + * @return A map from keys to ${coll}s such that the following invariant holds: + * {{{ + * (xs groupBy f)(k) = xs filter (x => f(x) == k) + * }}} + * That is, every key `k` is bound to a $coll of those elements `x` + * for which `f(x)` equals `k`. + * + */ + def groupBy[K](f: A => K): immutable.Map[K, C] = { + val m = mutable.Map.empty[K, Builder[A, C]] + val it = iterator + while (it.hasNext) { + val elem = it.next() + val key = f(elem) + val bldr = m.getOrElseUpdate(key, newSpecificBuilder) + bldr += elem + } + var result = immutable.HashMap.empty[K, C] + val mapIt = m.iterator + while (mapIt.hasNext) { + val (k, v) = mapIt.next() + result = result.updated(k, v.result()) + } + result + } + + /** + * Partitions this $coll into a map of ${coll}s according to a discriminator function `key`. + * Each element in a group is transformed into a value of type `B` using the `value` function. + * + * It is equivalent to `groupBy(key).mapValues(_.map(f))`, but more efficient. 
+ * + * {{{ + * case class User(name: String, age: Int) + * + * def namesByAge(users: Seq[User]): Map[Int, Seq[String]] = + * users.groupMap(_.age)(_.name) + * }}} + * + * $willForceEvaluation + * + * @param key the discriminator function + * @param f the element transformation function + * @tparam K the type of keys returned by the discriminator function + * @tparam B the type of values returned by the transformation function + */ + def groupMap[K, B](key: A => K)(f: A => B): immutable.Map[K, CC[B]] = { + val m = mutable.Map.empty[K, Builder[B, CC[B]]] + for (elem <- this) { + val k = key(elem) + val bldr = m.getOrElseUpdate(k, iterableFactory.newBuilder[B]) + bldr += f(elem) + } + class Result extends runtime.AbstractFunction1[(K, Builder[B, CC[B]]), Unit] { + var built = immutable.Map.empty[K, CC[B]] + def apply(kv: (K, Builder[B, CC[B]])) = + built = built.updated(kv._1, kv._2.result()) + } + val result = new Result + m.foreach(result) + result.built + } + + /** + * Partitions this $coll into a map according to a discriminator function `key`. All the values that + * have the same discriminator are then transformed by the `f` function and then reduced into a + * single value with the `reduce` function. + * + * It is equivalent to `groupBy(key).mapValues(_.map(f).reduce(reduce))`, but more efficient. + * + * {{{ + * def occurrences[A](as: Seq[A]): Map[A, Int] = + * as.groupMapReduce(identity)(_ => 1)(_ + _) + * }}} + * + * $willForceEvaluation + */ + def groupMapReduce[K, B](key: A => K)(f: A => B)(reduce: (B, B) => B): immutable.Map[K, B] = { + val m = mutable.Map.empty[K, B] + for (elem <- this) { + val k = key(elem) + val v = + m.get(k) match { + case Some(b) => reduce(b, f(elem)) + case None => f(elem) + } + m.put(k, v) + } + m.to(immutable.Map) + } + + /** Computes a prefix scan of the elements of the collection. + * + * Note: The neutral element `z` may be applied more than once. + * + * @tparam B element type of the resulting collection + * @param z neutral element for the operator `op` + * @param op the associative operator for the scan + * + * @return a new $coll containing the prefix scan of the elements in this $coll + */ + def scan[B >: A](z: B)(op: (B, B) => B): CC[B] = scanLeft(z)(op) + + def scanLeft[B](z: B)(op: (B, A) => B): CC[B] = iterableFactory.from(new View.ScanLeft(this, z, op)) + + /** Produces a collection containing cumulative results of applying the operator going right to left. + * The head of the collection is the last cumulative result. 
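+ *
+ * (For comparison, an illustrative sketch of the left-to-right counterpart:
+ * `List(1, 2, 3, 4).scanLeft(0)(_ + _) == List(0, 1, 3, 6, 10)`.)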
+ * $willNotTerminateInf + * $orderDependent + * $willForceEvaluation + * + * Example: + * {{{ + * List(1, 2, 3, 4).scanRight(0)(_ + _) == List(10, 9, 7, 4, 0) + * }}} + * + * @tparam B the type of the elements in the resulting collection + * @param z the initial value + * @param op the binary operator applied to the intermediate result and the element + * @return collection with intermediate results + */ + def scanRight[B](z: B)(op: (A, B) => B): CC[B] = { + class Scanner extends runtime.AbstractFunction1[A, Unit] { + var acc = z + var scanned = acc :: immutable.Nil + def apply(x: A) = { + acc = op(x, acc) + scanned ::= acc + } + } + val scanner = new Scanner + reversed.foreach(scanner) + iterableFactory.from(scanner.scanned) + } + + def map[B](f: A => B): CC[B] = iterableFactory.from(new View.Map(this, f)) + + def flatMap[B](f: A => IterableOnce[B]): CC[B] = iterableFactory.from(new View.FlatMap(this, f)) + + def flatten[B](implicit asIterable: A => IterableOnce[B]): CC[B] = flatMap(asIterable) + + def collect[B](pf: PartialFunction[A, B]): CC[B] = + iterableFactory.from(new View.Collect(this, pf)) + + /** Applies a function `f` to each element of the $coll and returns a pair of ${coll}s: the first one + * made of those values returned by `f` that were wrapped in [[scala.util.Left]], and the second + * one made of those wrapped in [[scala.util.Right]]. + * + * Example: + * {{{ + * val xs = $Coll(1, "one", 2, "two", 3, "three") partitionMap { + * case i: Int => Left(i) + * case s: String => Right(s) + * } + * // xs == ($Coll(1, 2, 3), + * // $Coll(one, two, three)) + * }}} + * + * @tparam A1 the element type of the first resulting collection + * @tparam A2 the element type of the second resulting collection + * @param f the 'split function' mapping the elements of this $coll to an [[scala.util.Either]] + * + * @return a pair of ${coll}s: the first one made of those values returned by `f` that were wrapped in [[scala.util.Left]], + * and the second one made of those wrapped in [[scala.util.Right]]. + */ + def partitionMap[A1, A2](f: A => Either[A1, A2]): (CC[A1], CC[A2]) = { + val left: View[A1] = new LeftPartitionMapped(this, f) + val right: View[A2] = new RightPartitionMapped(this, f) + (iterableFactory.from(left), iterableFactory.from(right)) + } + + /** Returns a new $coll containing the elements from the left hand operand followed by the elements from the + * right hand operand. The element type of the $coll is the most specific superclass encompassing + * the element types of the two operands. + * + * @param suffix the iterable to append. + * @tparam B the element type of the returned collection. + * @return a new $coll which contains all elements + * of this $coll followed by all elements of `suffix`. + */ + def concat[B >: A](suffix: IterableOnce[B]): CC[B] = iterableFactory.from(suffix match { + case xs: Iterable[B] => new View.Concat(this, xs) + case xs => iterator ++ suffix.iterator + }) + + /** Alias for `concat` */ + @`inline` final def ++ [B >: A](suffix: IterableOnce[B]): CC[B] = concat(suffix) + + /** Returns a $coll formed from this $coll and another iterable collection + * by combining corresponding elements in pairs. + * If one of the two collections is longer than the other, its remaining elements are ignored. + * + * @param that The iterable providing the second half of each result pair + * @tparam B the type of the second half of the returned pairs + * @return a new $coll containing pairs consisting of corresponding elements of this $coll and `that`. 
+ * The length of the returned collection is the minimum of the lengths of this $coll and `that`. + */ + def zip[B](that: IterableOnce[B]): CC[(A @uncheckedVariance, B)] = iterableFactory.from(that match { // sound bcs of VarianceNote + case that: Iterable[B] => new View.Zip(this, that) + case _ => iterator.zip(that) + }) + + def zipWithIndex: CC[(A @uncheckedVariance, Int)] = iterableFactory.from(new View.ZipWithIndex(this)) + + /** Returns a $coll formed from this $coll and another iterable collection + * by combining corresponding elements in pairs. + * If one of the two collections is shorter than the other, + * placeholder elements are used to extend the shorter collection to the length of the longer. + * + * @param that the iterable providing the second half of each result pair + * @param thisElem the element to be used to fill up the result if this $coll is shorter than `that`. + * @param thatElem the element to be used to fill up the result if `that` is shorter than this $coll. + * @return a new collection of type `That` containing pairs consisting of + * corresponding elements of this $coll and `that`. The length + * of the returned collection is the maximum of the lengths of this $coll and `that`. + * If this $coll is shorter than `that`, `thisElem` values are used to pad the result. + * If `that` is shorter than this $coll, `thatElem` values are used to pad the result. + */ + def zipAll[A1 >: A, B](that: Iterable[B], thisElem: A1, thatElem: B): CC[(A1, B)] = iterableFactory.from(new View.ZipAll(this, that, thisElem, thatElem)) + + /** Converts this $coll of pairs into two collections of the first and second + * half of each pair. + * + * {{{ + * val xs = $Coll( + * (1, "one"), + * (2, "two"), + * (3, "three")).unzip + * // xs == ($Coll(1, 2, 3), + * // $Coll(one, two, three)) + * }}} + * + * @tparam A1 the type of the first half of the element pairs + * @tparam A2 the type of the second half of the element pairs + * @param asPair an implicit conversion which asserts that the element type + * of this $coll is a pair. + * @return a pair of ${coll}s, containing the first, respectively second + * half of each element pair of this $coll. + */ + def unzip[A1, A2](implicit asPair: A => (A1, A2)): (CC[A1], CC[A2]) = { + val first: View[A1] = new View.Map[A, A1](this, asPair(_)._1) + val second: View[A2] = new View.Map[A, A2](this, asPair(_)._2) + (iterableFactory.from(first), iterableFactory.from(second)) + } + + /** Converts this $coll of triples into three collections of the first, second, + * and third element of each triple. + * + * {{{ + * val xs = $Coll( + * (1, "one", '1'), + * (2, "two", '2'), + * (3, "three", '3')).unzip3 + * // xs == ($Coll(1, 2, 3), + * // $Coll(one, two, three), + * // $Coll(1, 2, 3)) + * }}} + * + * @tparam A1 the type of the first member of the element triples + * @tparam A2 the type of the second member of the element triples + * @tparam A3 the type of the third member of the element triples + * @param asTriple an implicit conversion which asserts that the element type + * of this $coll is a triple. + * @return a triple of ${coll}s, containing the first, second, respectively + * third member of each element triple of this $coll. 
+ */ + def unzip3[A1, A2, A3](implicit asTriple: A => (A1, A2, A3)): (CC[A1], CC[A2], CC[A3]) = { + val first: View[A1] = new View.Map[A, A1](this, asTriple(_)._1) + val second: View[A2] = new View.Map[A, A2](this, asTriple(_)._2) + val third: View[A3] = new View.Map[A, A3](this, asTriple(_)._3) + (iterableFactory.from(first), iterableFactory.from(second), iterableFactory.from(third)) + } + + /** Iterates over the tails of this $coll. The first value will be this + * $coll and the final one will be an empty $coll, with the intervening + * values the results of successive applications of `tail`. + * + * @return an iterator over all the tails of this $coll + * @example `List(1,2,3).tails = Iterator(List(1,2,3), List(2,3), List(3), Nil)` + */ + def tails: Iterator[C] = iterateUntilEmpty(_.tail) + + /** Iterates over the inits of this $coll. The first value will be this + * $coll and the final one will be an empty $coll, with the intervening + * values the results of successive applications of `init`. + * + * $willForceEvaluation + * + * @return an iterator over all the inits of this $coll + * @example `List(1,2,3).inits = Iterator(List(1,2,3), List(1,2), List(1), Nil)` + */ + def inits: Iterator[C] = iterateUntilEmpty(_.init) + + override def tapEach[U](f: A => U): C = fromSpecific(new View.Map(this, { (a: A) => f(a); a })) + + // A helper for tails and inits. + private[this] def iterateUntilEmpty(f: Iterable[A] => Iterable[A]): Iterator[C] = { + // toIterable ties the knot between `this: IterableOnceOps[A, CC, C]` and `this.tail: C` + // `this.tail.tail` doesn't compile as `C` is unbounded + // `Iterable.from(this)` would eagerly copy non-immutable collections + val it = Iterator.iterate(toIterable: @nowarn("cat=deprecation"))(f).takeWhile(_.nonEmpty) + (it ++ Iterator.single(Iterable.empty)).map(fromSpecific) + } + + @deprecated("Use ++ instead of ++: for collections of type Iterable", "2.13.0") + def ++:[B >: A](that: IterableOnce[B]): CC[B] = iterableFactory.from(that match { + case xs: Iterable[B] => new View.Concat(xs, this) + case _ => that.iterator ++ iterator + }) +} + +object IterableOps { + + /** Operations for comparing the size of a collection to a test value. + * + * These operations are implemented in terms of + * [[scala.collection.IterableOps.sizeCompare(Int) `sizeCompare(Int)`]]. + */ + final class SizeCompareOps private[collection](val it: IterableOps[_, AnyConstr, _]) extends AnyVal { + /** Tests if the size of the collection is less than some value. */ + @inline def <(size: Int): Boolean = it.sizeCompare(size) < 0 + /** Tests if the size of the collection is less than or equal to some value. */ + @inline def <=(size: Int): Boolean = it.sizeCompare(size) <= 0 + /** Tests if the size of the collection is equal to some value. */ + @inline def ==(size: Int): Boolean = it.sizeCompare(size) == 0 + /** Tests if the size of the collection is not equal to some value. */ + @inline def !=(size: Int): Boolean = it.sizeCompare(size) != 0 + /** Tests if the size of the collection is greater than or equal to some value. */ + @inline def >=(size: Int): Boolean = it.sizeCompare(size) >= 0 + /** Tests if the size of the collection is greater than some value. */ + @inline def >(size: Int): Boolean = it.sizeCompare(size) > 0 + } + + /** A trait that contains just the `map`, `flatMap`, `foreach` and `withFilter` methods + * of trait `Iterable`. + * + * @tparam A Element type (e.g. `Int`) + * @tparam CC Collection type constructor (e.g. 
`List`) + * + * @define coll collection + */ + @SerialVersionUID(3L) + class WithFilter[+A, +CC[_]]( + self: IterableOps[A, CC, _], + p: A => Boolean + ) extends collection.WithFilter[A, CC] with Serializable { + + protected def filtered: Iterable[A] = + new View.Filter(self, p, isFlipped = false) + + def map[B](f: A => B): CC[B] = + self.iterableFactory.from(new View.Map(filtered, f)) + + def flatMap[B](f: A => IterableOnce[B]): CC[B] = + self.iterableFactory.from(new View.FlatMap(filtered, f)) + + def foreach[U](f: A => U): Unit = filtered.foreach(f) + + def withFilter(q: A => Boolean): WithFilter[A, CC] = + new WithFilter(self, (a: A) => p(a) && q(a)) + + } + +} + +@SerialVersionUID(3L) +object Iterable extends IterableFactory.Delegate[Iterable](immutable.Iterable) { + + def single[A](a: A): Iterable[A] = new AbstractIterable[A] { + override def iterator = Iterator.single(a) + override def knownSize = 1 + override def head = a + override def headOption: Some[A] = Some(a) + override def last = a + override def lastOption: Some[A] = Some(a) + override def view: View.Single[A] = new View.Single(a) + override def take(n: Int) = if (n > 0) this else Iterable.empty + override def takeRight(n: Int) = if (n > 0) this else Iterable.empty + override def drop(n: Int) = if (n > 0) Iterable.empty else this + override def dropRight(n: Int) = if (n > 0) Iterable.empty else this + override def tail: Iterable[Nothing] = Iterable.empty + override def init: Iterable[Nothing] = Iterable.empty + } +} + +/** Explicit instantiation of the `Iterable` trait to reduce class file size in subclasses. */ +abstract class AbstractIterable[+A] extends Iterable[A] + +/** This trait provides default implementations for the factory methods `fromSpecific` and + * `newSpecificBuilder` that need to be refined when implementing a collection type that refines + * the `CC` and `C` type parameters. + * + * The default implementations in this trait can be used in the common case when `CC[A]` is the + * same as `C`. + */ +trait IterableFactoryDefaults[+A, +CC[x] <: IterableOps[x, CC, CC[x]]] extends IterableOps[A, CC, CC[A @uncheckedVariance]] { + protected def fromSpecific(coll: IterableOnce[A @uncheckedVariance]): CC[A @uncheckedVariance] = iterableFactory.from(coll) + protected def newSpecificBuilder: Builder[A @uncheckedVariance, CC[A @uncheckedVariance]] = iterableFactory.newBuilder[A] + + // overridden for efficiency, since we know CC[A] =:= C + override def empty: CC[A @uncheckedVariance] = iterableFactory.empty +} + +/** This trait provides default implementations for the factory methods `fromSpecific` and + * `newSpecificBuilder` that need to be refined when implementing a collection type that refines + * the `CC` and `C` type parameters. It is used for collections that have an additional constraint, + * expressed by the `evidenceIterableFactory` method. + * + * The default implementations in this trait can be used in the common case when `CC[A]` is the + * same as `C`. 
+ */ +trait EvidenceIterableFactoryDefaults[+A, +CC[x] <: IterableOps[x, CC, CC[x]], Ev[_]] extends IterableOps[A, CC, CC[A @uncheckedVariance]] { + protected def evidenceIterableFactory: EvidenceIterableFactory[CC, Ev] + implicit protected def iterableEvidence: Ev[A @uncheckedVariance] + override protected def fromSpecific(coll: IterableOnce[A @uncheckedVariance]): CC[A @uncheckedVariance] = evidenceIterableFactory.from(coll) + override protected def newSpecificBuilder: Builder[A @uncheckedVariance, CC[A @uncheckedVariance]] = evidenceIterableFactory.newBuilder[A] + override def empty: CC[A @uncheckedVariance] = evidenceIterableFactory.empty +} + +/** This trait provides default implementations for the factory methods `fromSpecific` and + * `newSpecificBuilder` that need to be refined when implementing a collection type that refines + * the `CC` and `C` type parameters. It is used for sorted sets. + * + * Note that in sorted sets, the `CC` type of the set is not the same as the `CC` type for the + * underlying iterable (which is fixed to `Set` in [[SortedSetOps]]). This trait has therefore + * two type parameters `CC` and `WithFilterCC`. The `withFilter` method inherited from + * `IterableOps` is overridden with a compatible default implementation. + * + * The default implementations in this trait can be used in the common case when `CC[A]` is the + * same as `C`. + */ +trait SortedSetFactoryDefaults[+A, + +CC[X] <: SortedSet[X] with SortedSetOps[X, CC, CC[X]], + +WithFilterCC[x] <: IterableOps[x, WithFilterCC, WithFilterCC[x]] with Set[x]] extends SortedSetOps[A @uncheckedVariance, CC, CC[A @uncheckedVariance]] { + self: IterableOps[A, WithFilterCC, _] => + + override protected def fromSpecific(coll: IterableOnce[A @uncheckedVariance]): CC[A @uncheckedVariance] = sortedIterableFactory.from(coll)(using ordering) + override protected def newSpecificBuilder: mutable.Builder[A @uncheckedVariance, CC[A @uncheckedVariance]] = sortedIterableFactory.newBuilder[A](using ordering) + override def empty: CC[A @uncheckedVariance] = sortedIterableFactory.empty(using ordering) + + override def withFilter(p: A => Boolean): SortedSetOps.WithFilter[A, WithFilterCC, CC] = + new SortedSetOps.WithFilter[A, WithFilterCC, CC](this, p) +} + + +/** This trait provides default implementations for the factory methods `fromSpecific` and + * `newSpecificBuilder` that need to be refined when implementing a collection type that refines + * the `CC` and `C` type parameters. It is used for maps. + * + * Note that in maps, the `CC` type of the map is not the same as the `CC` type for the + * underlying iterable (which is fixed to `Map` in [[MapOps]]). This trait has therefore + * two type parameters `CC` and `WithFilterCC`. The `withFilter` method inherited from + * `IterableOps` is overridden with a compatible default implementation. + * + * The default implementations in this trait can be used in the common case when `CC[A]` is the + * same as `C`. 
+ */ +trait MapFactoryDefaults[K, +V, + +CC[x, y] <: IterableOps[(x, y), Iterable, Iterable[(x, y)]], + +WithFilterCC[x] <: IterableOps[x, WithFilterCC, WithFilterCC[x]] with Iterable[x]] extends MapOps[K, V, CC, CC[K, V @uncheckedVariance]] with IterableOps[(K, V), WithFilterCC, CC[K, V @uncheckedVariance]] { + override protected def fromSpecific(coll: IterableOnce[(K, V @uncheckedVariance)]): CC[K, V @uncheckedVariance] = mapFactory.from(coll) + override protected def newSpecificBuilder: mutable.Builder[(K, V @uncheckedVariance), CC[K, V @uncheckedVariance]] = mapFactory.newBuilder[K, V] + override def empty: CC[K, V @uncheckedVariance] = (this: AnyRef) match { + // Implemented here instead of in TreeSeqMap since overriding empty in TreeSeqMap is not forwards compatible (should be moved) + case self: immutable.TreeSeqMap[_, _] => immutable.TreeSeqMap.empty(self.orderedBy).asInstanceOf[CC[K, V]] + case _ => mapFactory.empty + } + + override def withFilter(p: ((K, V)) => Boolean): MapOps.WithFilter[K, V, WithFilterCC, CC] = + new MapOps.WithFilter[K, V, WithFilterCC, CC](this, p) +} + +/** This trait provides default implementations for the factory methods `fromSpecific` and + * `newSpecificBuilder` that need to be refined when implementing a collection type that refines + * the `CC` and `C` type parameters. It is used for sorted maps. + * + * Note that in sorted maps, the `CC` type of the map is not the same as the `CC` type for the + * underlying map (which is fixed to `Map` in [[SortedMapOps]]). This trait has therefore + * three type parameters `CC`, `WithFilterCC` and `UnsortedCC`. The `withFilter` method inherited + * from `IterableOps` is overridden with a compatible default implementation. + * + * The default implementations in this trait can be used in the common case when `CC[A]` is the + * same as `C`. + */ +trait SortedMapFactoryDefaults[K, +V, + +CC[x, y] <: Map[x, y] with SortedMapOps[x, y, CC, CC[x, y]] with UnsortedCC[x, y], + +WithFilterCC[x] <: IterableOps[x, WithFilterCC, WithFilterCC[x]] with Iterable[x], + +UnsortedCC[x, y] <: Map[x, y]] extends SortedMapOps[K, V, CC, CC[K, V @uncheckedVariance]] with MapOps[K, V, UnsortedCC, CC[K, V @uncheckedVariance]] { + self: IterableOps[(K, V), WithFilterCC, _] => + + override def empty: CC[K, V @uncheckedVariance] = sortedMapFactory.empty(using ordering) + override protected def fromSpecific(coll: IterableOnce[(K, V @uncheckedVariance)]): CC[K, V @uncheckedVariance] = sortedMapFactory.from(coll)(using ordering) + override protected def newSpecificBuilder: mutable.Builder[(K, V @uncheckedVariance), CC[K, V @uncheckedVariance]] = sortedMapFactory.newBuilder[K, V](using ordering) + + override def withFilter(p: ((K, V)) => Boolean): collection.SortedMapOps.WithFilter[K, V, WithFilterCC, UnsortedCC, CC] = + new collection.SortedMapOps.WithFilter[K, V, WithFilterCC, UnsortedCC, CC](this, p) +} diff --git a/scala2-library-bootstrapped/src/scala/collection/SortedMap.scala b/scala2-library-bootstrapped/src/scala/collection/SortedMap.scala new file mode 100644 index 000000000000..5beb811ed0b2 --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/SortedMap.scala @@ -0,0 +1,220 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. 
+ */ + +package scala +package collection + +import scala.annotation.{implicitNotFound, nowarn} + +/** A Map whose keys are sorted according to a [[scala.math.Ordering]]*/ +trait SortedMap[K, +V] + extends Map[K, V] + with SortedMapOps[K, V, SortedMap, SortedMap[K, V]] + with SortedMapFactoryDefaults[K, V, SortedMap, Iterable, Map]{ + + def unsorted: Map[K, V] = this + + def sortedMapFactory: SortedMapFactory[SortedMap] = SortedMap + + @nowarn("""cat=deprecation&origin=scala\.collection\.Iterable\.stringPrefix""") + override protected[this] def stringPrefix: String = "SortedMap" + + override def equals(that: Any): Boolean = that match { + case _ if this eq that.asInstanceOf[AnyRef] => true + case sm: SortedMap[K @unchecked, _] if sm.ordering == this.ordering => + (sm canEqual this) && + (this.size == sm.size) && { + val i1 = this.iterator + val i2 = sm.iterator + var allEqual = true + while (allEqual && i1.hasNext) { + val kv1 = i1.next() + val kv2 = i2.next() + allEqual = ordering.equiv(kv1._1, kv2._1) && kv1._2 == kv2._2 + } + allEqual + } + case _ => super.equals(that) + } +} + +trait SortedMapOps[K, +V, +CC[X, Y] <: Map[X, Y] with SortedMapOps[X, Y, CC, _], +C <: SortedMapOps[K, V, CC, C]] + extends MapOps[K, V, Map, C] + with SortedOps[K, C] { + + /** The companion object of this sorted map, providing various factory methods. + * + * @note When implementing a custom collection type and refining `CC` to the new type, this + * method needs to be overridden to return a factory for the new type (the compiler will + * issue an error otherwise). + */ + def sortedMapFactory: SortedMapFactory[CC] + + /** Similar to `mapFromIterable`, but returns a SortedMap collection type. + * Note that the return type is now `CC[K2, V2]`. + */ + @`inline` protected final def sortedMapFromIterable[K2, V2](it: Iterable[(K2, V2)])(implicit ordering: Ordering[K2]): CC[K2, V2] = sortedMapFactory.from(it) + + def unsorted: Map[K, V] + + /** + * Creates an iterator over all the key/value pairs + * contained in this map having a key greater than or + * equal to `start` according to the ordering of + * this map. x.iteratorFrom(y) is equivalent + * to but often more efficient than x.from(y).iterator. + * + * @param start The lower bound (inclusive) + * on the keys to be returned + */ + def iteratorFrom(start: K): Iterator[(K, V)] + + /** + * Creates an iterator over all the keys(or elements) contained in this + * collection greater than or equal to `start` + * according to the ordering of this collection. x.keysIteratorFrom(y) + * is equivalent to but often more efficient than + * x.from(y).keysIterator. + * + * @param start The lower bound (inclusive) + * on the keys to be returned + */ + def keysIteratorFrom(start: K): Iterator[K] + + /** + * Creates an iterator over all the values contained in this + * map that are associated with a key greater than or equal to `start` + * according to the ordering of this map. x.valuesIteratorFrom(y) is + * equivalent to but often more efficient than + * x.from(y).valuesIterator. + * + * @param start The lower bound (inclusive) + * on the keys to be returned + */ + def valuesIteratorFrom(start: K): Iterator[V] = iteratorFrom(start).map(_._2) + + def firstKey: K = head._1 + def lastKey: K = last._1 + + /** Find the element with smallest key larger than or equal to a given key. + * @param key The given key. + * @return `None` if there is no such node. 
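+   *
+   * For example, with an `immutable.TreeMap` (a standard `SortedMap` implementation):
+   * {{{
+   *   import scala.collection.immutable.TreeMap
+   *
+   *   TreeMap(1 -> "a", 3 -> "c").minAfter(2)   // Some((3, "c"))
+   *   TreeMap(1 -> "a", 3 -> "c").minAfter(4)   // None
+   * }}}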
+ */ + def minAfter(key: K): Option[(K, V)] = rangeFrom(key).headOption + + /** Find the element with largest key less than a given key. + * @param key The given key. + * @return `None` if there is no such node. + */ + def maxBefore(key: K): Option[(K, V)] = rangeUntil(key).lastOption + + def rangeTo(to: K): C = { + val i = keySet.rangeFrom(to).iterator + if (i.isEmpty) return coll + val next = i.next() + if (ordering.compare(next, to) == 0) + if (i.isEmpty) coll + else rangeUntil(i.next()) + else + rangeUntil(next) + } + + override def keySet: SortedSet[K] = new KeySortedSet + + /** The implementation class of the set returned by `keySet` */ + protected class KeySortedSet extends SortedSet[K] with GenKeySet with GenKeySortedSet { + def diff(that: Set[K]): SortedSet[K] = fromSpecific(view.filterNot(that)) + def rangeImpl(from: Option[K], until: Option[K]): SortedSet[K] = { + val map = SortedMapOps.this.rangeImpl(from, until) + new map.KeySortedSet + } + } + + /** A generic trait that is reused by sorted keyset implementations */ + protected trait GenKeySortedSet extends GenKeySet { this: SortedSet[K] => + implicit def ordering: Ordering[K] = SortedMapOps.this.ordering + def iteratorFrom(start: K): Iterator[K] = SortedMapOps.this.keysIteratorFrom(start) + } + + // And finally, we add new overloads taking an ordering + /** Builds a new sorted map by applying a function to all elements of this $coll. + * + * @param f the function to apply to each element. + * @return a new $coll resulting from applying the given function + * `f` to each element of this $coll and collecting the results. + */ + def map[K2, V2](f: ((K, V)) => (K2, V2))(implicit @implicitNotFound(SortedMapOps.ordMsg) ordering: Ordering[K2]): CC[K2, V2] = + sortedMapFactory.from(new View.Map[(K, V), (K2, V2)](this, f)) + + /** Builds a new sorted map by applying a function to all elements of this $coll + * and using the elements of the resulting collections. + * + * @param f the function to apply to each element. + * @return a new $coll resulting from applying the given collection-valued function + * `f` to each element of this $coll and concatenating the results. + */ + def flatMap[K2, V2](f: ((K, V)) => IterableOnce[(K2, V2)])(implicit @implicitNotFound(SortedMapOps.ordMsg) ordering: Ordering[K2]): CC[K2, V2] = + sortedMapFactory.from(new View.FlatMap(this, f)) + + /** Builds a new sorted map by applying a partial function to all elements of this $coll + * on which the function is defined. + * + * @param pf the partial function which filters and maps the $coll. + * @return a new $coll resulting from applying the given partial function + * `pf` to each element on which it is defined and collecting the results. + * The order of the elements is preserved. 
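+   *
+   * For example, with an `immutable.TreeMap`:
+   * {{{
+   *   import scala.collection.immutable.TreeMap
+   *
+   *   TreeMap(1 -> "a", 2 -> "b", 3 -> "c").collect {
+   *     case (k, v) if k % 2 == 1 => (k * 10, v.toUpperCase)
+   *   }
+   *   // TreeMap(10 -> A, 30 -> C)
+   * }}}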
+ */ + def collect[K2, V2](pf: PartialFunction[(K, V), (K2, V2)])(implicit @implicitNotFound(SortedMapOps.ordMsg) ordering: Ordering[K2]): CC[K2, V2] = + sortedMapFactory.from(new View.Collect(this, pf)) + + override def concat[V2 >: V](suffix: IterableOnce[(K, V2)]): CC[K, V2] = sortedMapFactory.from(suffix match { + case it: Iterable[(K, V2)] => new View.Concat(this, it) + case _ => iterator.concat(suffix.iterator) + })(using ordering) + + /** Alias for `concat` */ + @`inline` override final def ++ [V2 >: V](xs: IterableOnce[(K, V2)]): CC[K, V2] = concat(xs) + + @deprecated("Consider requiring an immutable Map or fall back to Map.concat", "2.13.0") + override def + [V1 >: V](kv: (K, V1)): CC[K, V1] = sortedMapFactory.from(new View.Appended(this, kv))(using ordering) + + @deprecated("Use ++ with an explicit collection argument instead of + with varargs", "2.13.0") + override def + [V1 >: V](elem1: (K, V1), elem2: (K, V1), elems: (K, V1)*): CC[K, V1] = sortedMapFactory.from(new View.Concat(new View.Appended(new View.Appended(this, elem1), elem2), elems))(using ordering) +} + +object SortedMapOps { + private[collection] final val ordMsg = "No implicit Ordering[${K2}] found to build a SortedMap[${K2}, ${V2}]. You may want to upcast to a Map[${K}, ${V}] first by calling `unsorted`." + + /** Specializes `MapWithFilter` for sorted Map collections + * + * @define coll sorted map collection + */ + class WithFilter[K, +V, +IterableCC[_], +MapCC[X, Y] <: Map[X, Y], +CC[X, Y] <: Map[X, Y] with SortedMapOps[X, Y, CC, _]]( + self: SortedMapOps[K, V, CC, _] with MapOps[K, V, MapCC, _] with IterableOps[(K, V), IterableCC, _], + p: ((K, V)) => Boolean + ) extends MapOps.WithFilter[K, V, IterableCC, MapCC](self, p) { + + def map[K2 : Ordering, V2](f: ((K, V)) => (K2, V2)): CC[K2, V2] = + self.sortedMapFactory.from(new View.Map(filtered, f)) + + def flatMap[K2 : Ordering, V2](f: ((K, V)) => IterableOnce[(K2, V2)]): CC[K2, V2] = + self.sortedMapFactory.from(new View.FlatMap(filtered, f)) + + override def withFilter(q: ((K, V)) => Boolean): WithFilter[K, V, IterableCC, MapCC, CC] = + new WithFilter[K, V, IterableCC, MapCC, CC](self, (kv: (K, V)) => p(kv) && q(kv)) + + } + +} + +@SerialVersionUID(3L) +object SortedMap extends SortedMapFactory.Delegate[SortedMap](immutable.SortedMap) diff --git a/scala2-library-bootstrapped/src/scala/collection/StrictOptimizedSortedMapOps.scala b/scala2-library-bootstrapped/src/scala/collection/StrictOptimizedSortedMapOps.scala new file mode 100644 index 000000000000..ad5d67a64635 --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/StrictOptimizedSortedMapOps.scala @@ -0,0 +1,46 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala.collection + +import scala.annotation.implicitNotFound + +/** + * Trait that overrides sorted map operations to take advantage of strict builders. 
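+ *
+ * The overridden `map`, `flatMap`, `collect` and `concat` below build their results
+ * eagerly with `sortedMapFactory.newBuilder` instead of going through a lazy `View`.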
+ * + * @tparam K Type of keys + * @tparam V Type of values + * @tparam CC Collection type constructor + * @tparam C Collection type + */ +trait StrictOptimizedSortedMapOps[K, +V, +CC[X, Y] <: Map[X, Y] with SortedMapOps[X, Y, CC, _], +C <: SortedMapOps[K, V, CC, C]] + extends SortedMapOps[K, V, CC, C] + with StrictOptimizedMapOps[K, V, Map, C] { + + override def map[K2, V2](f: ((K, V)) => (K2, V2))(implicit @implicitNotFound(SortedMapOps.ordMsg) ordering: Ordering[K2]): CC[K2, V2] = + strictOptimizedMap(sortedMapFactory.newBuilder, f) + + override def flatMap[K2, V2](f: ((K, V)) => IterableOnce[(K2, V2)])(implicit @implicitNotFound(SortedMapOps.ordMsg) ordering: Ordering[K2]): CC[K2, V2] = + strictOptimizedFlatMap(sortedMapFactory.newBuilder, f) + + override def concat[V2 >: V](xs: IterableOnce[(K, V2)]): CC[K, V2] = + strictOptimizedConcat(xs, sortedMapFactory.newBuilder(using ordering)) + + override def collect[K2, V2](pf: PartialFunction[(K, V), (K2, V2)])(implicit @implicitNotFound(SortedMapOps.ordMsg) ordering: Ordering[K2]): CC[K2, V2] = + strictOptimizedCollect(sortedMapFactory.newBuilder, pf) + + @deprecated("Use ++ with an explicit collection argument instead of + with varargs", "2.13.0") + override def + [V1 >: V](elem1: (K, V1), elem2: (K, V1), elems: (K, V1)*): CC[K, V1] = { + val m = ((this + elem1).asInstanceOf[Map[K, V]] + elem2).asInstanceOf[CC[K, V1]] + if(elems.isEmpty) m else m.concat(elems).asInstanceOf[CC[K, V1]] + } +} diff --git a/scala2-library-bootstrapped/src/scala/collection/generic/DefaultSerializationProxy.scala b/scala2-library-bootstrapped/src/scala/collection/generic/DefaultSerializationProxy.scala new file mode 100644 index 000000000000..e794044a1af9 --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/generic/DefaultSerializationProxy.scala @@ -0,0 +1,87 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala.collection.generic + +import java.io.{ObjectInputStream, ObjectOutputStream} + +import scala.collection.{Factory, Iterable} +import scala.collection.mutable.Builder + +/** The default serialization proxy for collection implementations. + * + * This class is `final` and requires an extra `Factory` object rather than leaving the details of creating a `Builder` + * to an abstract method that could be implemented by a subclass. This is necessary because the factory is needed + * for deserializing this class's private state, which happens before any subclass fields would be deserialized. Any + * additional state required to create the proper `Builder` needs to be captured by the `factory`. 
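+ *
+ * A collection normally installs this proxy from its `writeReplace` method, as the
+ * `DefaultSerializable` mix-in below does. A minimal sketch, assuming the collection's
+ * companion is reachable through `iterableFactory`:
+ * {{{
+ *   protected[this] def writeReplace(): AnyRef =
+ *     new DefaultSerializationProxy(iterableFactory.iterableFactory[A], this)
+ * }}}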
+ */ +@SerialVersionUID(3L) +final class DefaultSerializationProxy[A](factory: Factory[A, Any], @transient private[this] val coll: Iterable[A]) extends Serializable { + + @transient protected var builder: Builder[A, Any] = _ + + private[this] def writeObject(out: ObjectOutputStream): Unit = { + out.defaultWriteObject() + val k = coll.knownSize + out.writeInt(k) + var count = 0 + coll.foreach { x => + out.writeObject(x) + count += 1 + } + if(k >= 0) { + if(count != k) throw new IllegalStateException(s"Illegal size $count of collection, expected $k") + } else out.writeObject(SerializeEnd) + } + + private[this] def readObject(in: ObjectInputStream): Unit = { + in.defaultReadObject() + builder = factory.newBuilder + val k = in.readInt() + if(k >= 0) { + builder.sizeHint(k) + var count = 0 + while(count < k) { + builder += in.readObject().asInstanceOf[A] + count += 1 + } + } else { + while (true) in.readObject match { + case SerializeEnd => return + case a => builder += a.asInstanceOf[A] + } + } + } + + protected[this] def readResolve(): Any = builder.result() +} + +@SerialVersionUID(3L) +private[collection] case object SerializeEnd + +/** Mix-in trait to enable DefaultSerializationProxy for the standard collection types. Depending on the type + * it is mixed into, it will dynamically choose `iterableFactory`, `mapFactory`, `sortedIterableFactory` or + * `sortedMapFactory` for deserialization into the respective `CC` type. Override `writeReplace` or implement + * it directly without using this trait if you need a non-standard factory or if you want to use a different + * serialization scheme. + */ +trait DefaultSerializable extends Serializable { this: scala.collection.Iterable[_] => + protected[this] def writeReplace(): AnyRef = { + val f: Factory[Any, Any] = this match { + case it: scala.collection.SortedMap[_, _] => it.sortedMapFactory.sortedMapFactory[Any, Any](using it.ordering.asInstanceOf[Ordering[Any]]).asInstanceOf[Factory[Any, Any]] + case it: scala.collection.Map[_, _] => it.mapFactory.mapFactory[Any, Any].asInstanceOf[Factory[Any, Any]] + case it: scala.collection.SortedSet[_] => it.sortedIterableFactory.evidenceIterableFactory[Any](using it.ordering.asInstanceOf[Ordering[Any]]) + case it => it.iterableFactory.iterableFactory + } + new DefaultSerializationProxy(f, this) + } +} diff --git a/scala2-library-bootstrapped/src/scala/collection/mutable/ArraySeq.scala b/scala2-library-bootstrapped/src/scala/collection/mutable/ArraySeq.scala new file mode 100644 index 000000000000..ebefa4c3c17a --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/mutable/ArraySeq.scala @@ -0,0 +1,354 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala.collection +package mutable +import java.util.Arrays +import scala.collection.Stepper.EfficientSplit +import scala.collection.convert.impl._ +import scala.reflect.ClassTag +import scala.util.hashing.MurmurHash3 + +/** + * A collection representing `Array[T]`. Unlike `ArrayBuffer` it is always backed by the same + * underlying `Array`, therefore it is not growable or shrinkable. + * + * @tparam T type of the elements in this wrapped array. 
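+ *
+ * For example (the factory methods require a `ClassTag` for the element type):
+ * {{{
+ *   val xs = scala.collection.mutable.ArraySeq(1, 2, 3)
+ *   xs(0) = 42    // updates the backing array in place
+ *   xs.array      // the underlying Array
+ * }}}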
+ * + * @define Coll `ArraySeq` + * @define coll wrapped array + * @define orderDependent + * @define orderDependentFold + * @define mayNotTerminateInf + * @define willNotTerminateInf + */ +@SerialVersionUID(3L) +sealed abstract class ArraySeq[T] + extends AbstractSeq[T] + with IndexedSeq[T] + with IndexedSeqOps[T, ArraySeq, ArraySeq[T]] + with StrictOptimizedSeqOps[T, ArraySeq, ArraySeq[T]] + with Serializable { + + override def iterableFactory: scala.collection.SeqFactory[ArraySeq] = ArraySeq.untagged + + override protected def fromSpecific(coll: scala.collection.IterableOnce[T]): ArraySeq[T] = { + val b = ArrayBuilder.make(using elemTag).asInstanceOf[ArrayBuilder[T]] + val s = coll.knownSize + if(s > 0) b.sizeHint(s) + b ++= coll + ArraySeq.make(b.result()) + } + override protected def newSpecificBuilder: Builder[T, ArraySeq[T]] = ArraySeq.newBuilder(using elemTag).asInstanceOf[Builder[T, ArraySeq[T]]] + override def empty: ArraySeq[T] = ArraySeq.empty(using elemTag.asInstanceOf[ClassTag[T]]) + + /** The tag of the element type. This does not have to be equal to the element type of this ArraySeq. A primitive + * ArraySeq can be backed by an array of boxed values and a reference ArraySeq can be backed by an array of a supertype + * or subtype of the element type. */ + def elemTag: ClassTag[_] + + /** Update element at given index */ + def update(@deprecatedName("idx", "2.13.0") index: Int, elem: T): Unit + + /** The underlying array. Its element type does not have to be equal to the element type of this ArraySeq. A primitive + * ArraySeq can be backed by an array of boxed values and a reference ArraySeq can be backed by an array of a supertype + * or subtype of the element type. */ + def array: Array[_] + + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[T, S]): S with EfficientSplit + + override protected[this] def className = "ArraySeq" + + /** Clones this object, including the underlying Array. */ + override def clone(): ArraySeq[T] = ArraySeq.make(array.clone()).asInstanceOf[ArraySeq[T]] + + override def copyToArray[B >: T](xs: Array[B], start: Int, len: Int): Int = { + val copied = IterableOnce.elemsToCopyToArray(length, xs.length, start, len) + if(copied > 0) { + Array.copy(array, 0, xs, start, copied) + } + copied + } + + override def equals(other: Any): Boolean = other match { + case that: ArraySeq[_] if this.array.length != that.array.length => + false + case _ => + super.equals(other) + } + + override def sorted[B >: T](implicit ord: Ordering[B]): ArraySeq[T] = + ArraySeq.make(array.sorted(ord.asInstanceOf[Ordering[Any]])).asInstanceOf[ArraySeq[T]] + + override def sortInPlace[B >: T]()(implicit ord: Ordering[B]): this.type = { + if (length > 1) scala.util.Sorting.stableSort(array.asInstanceOf[Array[B]]) + this + } +} + +/** A companion object used to create instances of `ArraySeq`. + */ +@SerialVersionUID(3L) +object ArraySeq extends StrictOptimizedClassTagSeqFactory[ArraySeq] { self => + val untagged: SeqFactory[ArraySeq] = new ClassTagSeqFactory.AnySeqDelegate(self) + + // This is reused for all calls to empty. 
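+  // `empty[T]` returns this single zero-length `ofRef[AnyRef]` instance for every element
+  // type, cast to `ArraySeq[T]`. The cast is safe here: the sequence has no elements, and
+  // `elemTag` is documented above as not having to match the element type.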
+ private[this] val EmptyArraySeq = new ofRef[AnyRef](new Array[AnyRef](0)) + def empty[T : ClassTag]: ArraySeq[T] = EmptyArraySeq.asInstanceOf[ArraySeq[T]] + + def from[A : ClassTag](it: scala.collection.IterableOnce[A]): ArraySeq[A] = make(Array.from[A](it)) + + def newBuilder[A : ClassTag]: Builder[A, ArraySeq[A]] = ArrayBuilder.make[A].mapResult(make) + + /** + * Wrap an existing `Array` into a `ArraySeq` of the proper primitive specialization type + * without copying. + * + * Note that an array containing boxed primitives can be converted to a `ArraySeq` without + * copying. For example, `val a: Array[Any] = Array(1)` is an array of `Object` at runtime, + * containing `Integer`s. An `ArraySeq[Int]` can be obtained with a cast: + * `ArraySeq.make(a).asInstanceOf[ArraySeq[Int]]`. The values are still + * boxed, the resulting instance is an [[ArraySeq.ofRef]]. Writing + * `ArraySeq.make(a.asInstanceOf[Array[Int]])` does not work, it throws a `ClassCastException` + * at runtime. + */ + def make[T](x: Array[T]): ArraySeq[T] = ((x.asInstanceOf[Array[_]]: @unchecked) match { + case null => null + case x: Array[AnyRef] => new ofRef[AnyRef](x) + case x: Array[Int] => new ofInt(x) + case x: Array[Double] => new ofDouble(x) + case x: Array[Long] => new ofLong(x) + case x: Array[Float] => new ofFloat(x) + case x: Array[Char] => new ofChar(x) + case x: Array[Byte] => new ofByte(x) + case x: Array[Short] => new ofShort(x) + case x: Array[Boolean] => new ofBoolean(x) + case x: Array[Unit] => new ofUnit(x) + }).asInstanceOf[ArraySeq[T]] + + @SerialVersionUID(3L) + final class ofRef[T <: AnyRef](val array: Array[T]) extends ArraySeq[T] { + def elemTag: ClassTag[T] = ClassTag[T](array.getClass.getComponentType) + def length: Int = array.length + def apply(index: Int): T = array(index) + def update(index: Int, elem: T): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofRef[_] => + Array.equals( + this.array.asInstanceOf[Array[AnyRef]], + that.array.asInstanceOf[Array[AnyRef]]) + case _ => super.equals(that) + } + override def iterator: Iterator[T] = new ArrayOps.ArrayIterator[T](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[T, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + new ObjectArrayStepper(array, 0, array.length) + else shape.parUnbox(new ObjectArrayStepper(array, 0, array.length).asInstanceOf[AnyStepper[T] with EfficientSplit]) + ).asInstanceOf[S with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofByte(val array: Array[Byte]) extends ArraySeq[Byte] { + // Type erases to `ManifestFactory.ByteManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Byte.type = ClassTag.Byte + def length: Int = array.length + def apply(index: Int): Byte = array(index) + def update(index: Int, elem: Byte): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofByte => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Byte] = new ArrayOps.ArrayIterator[Byte](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Byte, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + AnyStepper.ofParIntStepper(new WidenedByteArrayStepper(array, 0, array.length)) + else new WidenedByteArrayStepper(array, 0, array.length) + 
).asInstanceOf[S with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofShort(val array: Array[Short]) extends ArraySeq[Short] { + // Type erases to `ManifestFactory.ShortManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Short.type = ClassTag.Short + def length: Int = array.length + def apply(index: Int): Short = array(index) + def update(index: Int, elem: Short): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofShort => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Short] = new ArrayOps.ArrayIterator[Short](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Short, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + AnyStepper.ofParIntStepper(new WidenedShortArrayStepper(array, 0, array.length)) + else new WidenedShortArrayStepper(array, 0, array.length) + ).asInstanceOf[S with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofChar(val array: Array[Char]) extends ArraySeq[Char] { + // Type erases to `ManifestFactory.CharManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Char.type = ClassTag.Char + def length: Int = array.length + def apply(index: Int): Char = array(index) + def update(index: Int, elem: Char): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofChar => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Char] = new ArrayOps.ArrayIterator[Char](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Char, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + AnyStepper.ofParIntStepper(new WidenedCharArrayStepper(array, 0, array.length)) + else new WidenedCharArrayStepper(array, 0, array.length) + ).asInstanceOf[S with EfficientSplit] + + override def addString(sb: StringBuilder, start: String, sep: String, end: String): sb.type = { + val jsb = sb.underlying + if (start.length != 0) jsb.append(start) + val len = array.length + if (len != 0) { + if (sep.isEmpty) jsb.append(array) + else { + jsb.ensureCapacity(jsb.length + len + end.length + (len - 1) * sep.length) + jsb.append(array(0)) + var i = 1 + while (i < len) { + jsb.append(sep) + jsb.append(array(i)) + i += 1 + } + } + } + if (end.length != 0) jsb.append(end) + sb + } + } + + @SerialVersionUID(3L) + final class ofInt(val array: Array[Int]) extends ArraySeq[Int] { + // Type erases to `ManifestFactory.IntManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Int.type = ClassTag.Int + def length: Int = array.length + def apply(index: Int): Int = array(index) + def update(index: Int, elem: Int): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofInt => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Int] = new ArrayOps.ArrayIterator[Int](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Int, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + AnyStepper.ofParIntStepper(new IntArrayStepper(array, 0, array.length)) + else new IntArrayStepper(array, 0, array.length) + ).asInstanceOf[S 
with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofLong(val array: Array[Long]) extends ArraySeq[Long] { + // Type erases to `ManifestFactory.LongManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Long.type = ClassTag.Long + def length: Int = array.length + def apply(index: Int): Long = array(index) + def update(index: Int, elem: Long): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofLong => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Long] = new ArrayOps.ArrayIterator[Long](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Long, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + AnyStepper.ofParLongStepper(new LongArrayStepper(array, 0, array.length)) + else new LongArrayStepper(array, 0, array.length) + ).asInstanceOf[S with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofFloat(val array: Array[Float]) extends ArraySeq[Float] { + // Type erases to `ManifestFactory.FloatManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Float.type = ClassTag.Float + def length: Int = array.length + def apply(index: Int): Float = array(index) + def update(index: Int, elem: Float): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofFloat => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Float] = new ArrayOps.ArrayIterator[Float](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Float, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + AnyStepper.ofParDoubleStepper(new WidenedFloatArrayStepper(array, 0, array.length)) + else new WidenedFloatArrayStepper(array, 0, array.length) + ).asInstanceOf[S with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofDouble(val array: Array[Double]) extends ArraySeq[Double] { + // Type erases to `ManifestFactory.DoubleManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Double.type = ClassTag.Double + def length: Int = array.length + def apply(index: Int): Double = array(index) + def update(index: Int, elem: Double): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofDouble => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Double] = new ArrayOps.ArrayIterator[Double](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Double, S]): S with EfficientSplit = ( + if(shape.shape == StepperShape.ReferenceShape) + AnyStepper.ofParDoubleStepper(new DoubleArrayStepper(array, 0, array.length)) + else new DoubleArrayStepper(array, 0, array.length) + ).asInstanceOf[S with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofBoolean(val array: Array[Boolean]) extends ArraySeq[Boolean] { + // Type erases to `ManifestFactory.BooleanManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Boolean.type = ClassTag.Boolean + def length: Int = array.length + def apply(index: Int): Boolean = array(index) + def update(index: Int, elem: Boolean): Unit = { array(index) = elem } + override def hashCode = 
MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofBoolean => Arrays.equals(array, that.array) + case _ => super.equals(that) + } + override def iterator: Iterator[Boolean] = new ArrayOps.ArrayIterator[Boolean](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Boolean, S]): S with EfficientSplit = + new BoxedBooleanArrayStepper(array, 0, array.length).asInstanceOf[S with EfficientSplit] + } + + @SerialVersionUID(3L) + final class ofUnit(val array: Array[Unit]) extends ArraySeq[Unit] { + // Type erases to `ManifestFactory.UnitManifest`, but can't annotate that because it's not accessible + def elemTag: ClassTag.Unit.type = ClassTag.Unit + def length: Int = array.length + def apply(index: Int): Unit = array(index) + def update(index: Int, elem: Unit): Unit = { array(index) = elem } + override def hashCode = MurmurHash3.arraySeqHash(array) + override def equals(that: Any) = that match { + case that: ofUnit => array.length == that.array.length + case _ => super.equals(that) + } + override def iterator: Iterator[Unit] = new ArrayOps.ArrayIterator[Unit](array) + override def stepper[S <: Stepper[_]](implicit shape: StepperShape[Unit, S]): S with EfficientSplit = + new ObjectArrayStepper[AnyRef](array.asInstanceOf[Array[AnyRef]], 0, array.length).asInstanceOf[S with EfficientSplit] + } +} diff --git a/scala2-library-bootstrapped/src/scala/collection/mutable/CollisionProofHashMap.scala b/scala2-library-bootstrapped/src/scala/collection/mutable/CollisionProofHashMap.scala new file mode 100644 index 000000000000..36b53d1e433b --- /dev/null +++ b/scala2-library-bootstrapped/src/scala/collection/mutable/CollisionProofHashMap.scala @@ -0,0 +1,888 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala.collection +package mutable + +import scala.{unchecked => uc} +import scala.annotation.{implicitNotFound, tailrec, unused} +import scala.annotation.unchecked.uncheckedVariance +import scala.collection.generic.DefaultSerializationProxy +import scala.runtime.Statics + +/** This class implements mutable maps using a hashtable with red-black trees in the buckets for good + * worst-case performance on hash collisions. An `Ordering` is required for the element type. Equality + * as determined by the `Ordering` has to be consistent with `equals` and `hashCode`. Universal equality + * of numeric types is not supported (similar to `AnyRefMap`). + * + * @see [[https://docs.scala-lang.org/overviews/collections-2.13/concrete-mutable-collection-classes.html#hash-tables "Scala's Collection Library overview"]] + * section on `Hash Tables` for more information. 
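+ *
+ * For example (keys must have an `Ordering`, here the implicit `Ordering[String]`):
+ * {{{
+ *   import scala.collection.mutable.CollisionProofHashMap
+ *
+ *   val m = CollisionProofHashMap("a" -> 1, "b" -> 2)
+ *   m("b") = 20
+ *   m.getOrElse("c", 0)   // 0
+ * }}}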
+ * + * @define Coll `mutable.CollisionProofHashMap` + * @define coll mutable collision-proof hash map + * @define mayNotTerminateInf + * @define willNotTerminateInf + */ +final class CollisionProofHashMap[K, V](initialCapacity: Int, loadFactor: Double)(implicit ordering: Ordering[K]) + extends AbstractMap[K, V] + with MapOps[K, V, Map, CollisionProofHashMap[K, V]] //-- + with StrictOptimizedIterableOps[(K, V), Iterable, CollisionProofHashMap[K, V]] + with StrictOptimizedMapOps[K, V, Map, CollisionProofHashMap[K, V]] { //-- + + private[this] final def sortedMapFactory: SortedMapFactory[CollisionProofHashMap] = CollisionProofHashMap + + def this()(implicit ordering: Ordering[K]) = this(CollisionProofHashMap.defaultInitialCapacity, CollisionProofHashMap.defaultLoadFactor)(ordering) + + import CollisionProofHashMap.Node + private[this] type RBNode = CollisionProofHashMap.RBNode[K, V] + private[this] type LLNode = CollisionProofHashMap.LLNode[K, V] + + /** The actual hash table. */ + private[this] var table: Array[Node] = new Array[Node](tableSizeFor(initialCapacity)) + + /** The next size value at which to resize (capacity * load factor). */ + private[this] var threshold: Int = newThreshold(table.length) + + private[this] var contentSize = 0 + + override def size: Int = contentSize + + @`inline` private[this] final def computeHash(o: K): Int = { + val h = if(o.asInstanceOf[AnyRef] eq null) 0 else o.hashCode + h ^ (h >>> 16) + } + + @`inline` private[this] final def index(hash: Int) = hash & (table.length - 1) + + override protected def fromSpecific(coll: IterableOnce[(K, V)] @uncheckedVariance): CollisionProofHashMap[K, V] @uncheckedVariance = CollisionProofHashMap.from(coll) + override protected def newSpecificBuilder: Builder[(K, V), CollisionProofHashMap[K, V]] @uncheckedVariance = CollisionProofHashMap.newBuilder[K, V] + + override def empty: CollisionProofHashMap[K, V] = new CollisionProofHashMap[K, V] + + override def contains(key: K): Boolean = findNode(key) ne null + + def get(key: K): Option[V] = findNode(key) match { + case null => None + case nd => Some(nd match { + case nd: LLNode @uc => nd.value + case nd: RBNode @uc => nd.value + }) + } + + @throws[NoSuchElementException] + override def apply(key: K): V = findNode(key) match { + case null => default(key) + case nd => nd match { + case nd: LLNode @uc => nd.value + case nd: RBNode @uc => nd.value + } + } + + override def getOrElse[V1 >: V](key: K, default: => V1): V1 = { + val nd = findNode(key) + if (nd eq null) default else nd match { + case nd: LLNode @uc => nd.value + case n => n.asInstanceOf[RBNode].value + } + } + + @`inline` private[this] def findNode(elem: K): Node = { + val hash = computeHash(elem) + table(index(hash)) match { + case null => null + case n: LLNode @uc => n.getNode(elem, hash) + case n => n.asInstanceOf[RBNode].getNode(elem, hash) + } + } + + override def sizeHint(size: Int): Unit = { + val target = tableSizeFor(((size + 1).toDouble / loadFactor).toInt) + if(target > table.length) { + if(size == 0) reallocTable(target) + else growTable(target) + } + } + + override def update(key: K, value: V): Unit = put0(key, value, false) + + override def put(key: K, value: V): Option[V] = put0(key, value, true) match { + case null => None + case sm => sm + } + + def addOne(elem: (K, V)): this.type = { put0(elem._1, elem._2, false); this } + + @`inline` private[this] def put0(key: K, value: V, getOld: Boolean): Some[V] = { + if(contentSize + 1 >= threshold) growTable(table.length * 2) + val hash = computeHash(key) + val 
idx = index(hash) + put0(key, value, getOld, hash, idx) + } + + private[this] def put0(key: K, value: V, getOld: Boolean, hash: Int, idx: Int): Some[V] = { + val res = table(idx) match { + case n: RBNode @uc => + insert(n, idx, key, hash, value) + case _old => + val old: LLNode = _old.asInstanceOf[LLNode] + if(old eq null) { + table(idx) = new LLNode(key, hash, value, null) + } else { + var remaining = CollisionProofHashMap.treeifyThreshold + var prev: LLNode = null + var n = old + while((n ne null) && n.hash <= hash && remaining > 0) { + if(n.hash == hash && key == n.key) { + val old = n.value + n.value = value + return (if(getOld) Some(old) else null) + } + prev = n + n = n.next + remaining -= 1 + } + if(remaining == 0) { + treeify(old, idx) + return put0(key, value, getOld, hash, idx) + } + if(prev eq null) table(idx) = new LLNode(key, hash, value, old) + else prev.next = new LLNode(key, hash, value, prev.next) + } + true + } + if(res) contentSize += 1 + if(res) Some(null.asInstanceOf[V]) else null //TODO + } + + private[this] def treeify(old: LLNode, idx: Int): Unit = { + table(idx) = CollisionProofHashMap.leaf(old.key, old.hash, old.value, red = false, null) + var n: LLNode = old.next + while(n ne null) { + val root = table(idx).asInstanceOf[RBNode] + insertIntoExisting(root, idx, n.key, n.hash, n.value, root) + n = n.next + } + } + + override def addAll(xs: IterableOnce[(K, V)]): this.type = { + val k = xs.knownSize + if(k > 0) sizeHint(contentSize + k) + super.addAll(xs) + } + + // returns the old value or Statics.pfMarker if not found + private[this] def remove0(elem: K) : Any = { + val hash = computeHash(elem) + val idx = index(hash) + table(idx) match { + case null => Statics.pfMarker + case t: RBNode @uc => + val v = delete(t, idx, elem, hash) + if(v.asInstanceOf[AnyRef] ne Statics.pfMarker) contentSize -= 1 + v + case nd: LLNode @uc if nd.hash == hash && nd.key == elem => + // first element matches + table(idx) = nd.next + contentSize -= 1 + nd.value + case nd: LLNode @uc => + // find an element that matches + var prev = nd + var next = nd.next + while((next ne null) && next.hash <= hash) { + if(next.hash == hash && next.key == elem) { + prev.next = next.next + contentSize -= 1 + return next.value + } + prev = next + next = next.next + } + Statics.pfMarker + } + } + + private[this] abstract class MapIterator[R] extends AbstractIterator[R] { + protected[this] def extract(node: LLNode): R + protected[this] def extract(node: RBNode): R + + private[this] var i = 0 + private[this] var node: Node = null + private[this] val len = table.length + + def hasNext: Boolean = { + if(node ne null) true + else { + while(i < len) { + val n = table(i) + i += 1 + n match { + case null => + case n: RBNode @uc => + node = CollisionProofHashMap.minNodeNonNull(n) + return true + case n: LLNode @uc => + node = n + return true + } + } + false + } + } + + def next(): R = + if(!hasNext) Iterator.empty.next() + else node match { + case n: RBNode @uc => + val r = extract(n) + node = CollisionProofHashMap.successor(n ) + r + case n: LLNode @uc => + val r = extract(n) + node = n.next + r + } + } + + override def keysIterator: Iterator[K] = { + if (isEmpty) Iterator.empty + else new MapIterator[K] { + protected[this] def extract(node: LLNode) = node.key + protected[this] def extract(node: RBNode) = node.key + } + } + + override def iterator: Iterator[(K, V)] = { + if (isEmpty) Iterator.empty + else new MapIterator[(K, V)] { + protected[this] def extract(node: LLNode) = (node.key, node.value) + protected[this] 
def extract(node: RBNode) = (node.key, node.value) + } + } + + private[this] def growTable(newlen: Int) = { + var oldlen = table.length + table = java.util.Arrays.copyOf(table, newlen) + threshold = newThreshold(table.length) + while(oldlen < newlen) { + var i = 0 + while (i < oldlen) { + val old = table(i) + if(old ne null) splitBucket(old, i, i + oldlen, oldlen) + i += 1 + } + oldlen *= 2 + } + } + + @`inline` private[this] def reallocTable(newlen: Int) = { + table = new Array(newlen) + threshold = newThreshold(table.length) + } + + @`inline` private[this] def splitBucket(tree: Node, lowBucket: Int, highBucket: Int, mask: Int): Unit = tree match { + case t: LLNode @uc => splitBucket(t, lowBucket, highBucket, mask) + case t: RBNode @uc => splitBucket(t, lowBucket, highBucket, mask) + } + + private[this] def splitBucket(list: LLNode, lowBucket: Int, highBucket: Int, mask: Int): Unit = { + val preLow: LLNode = new LLNode(null.asInstanceOf[K], 0, null.asInstanceOf[V], null) + val preHigh: LLNode = new LLNode(null.asInstanceOf[K], 0, null.asInstanceOf[V], null) + //preLow.next = null + //preHigh.next = null + var lastLow: LLNode = preLow + var lastHigh: LLNode = preHigh + var n = list + while(n ne null) { + val next = n.next + if((n.hash & mask) == 0) { // keep low + lastLow.next = n + lastLow = n + } else { // move to high + lastHigh.next = n + lastHigh = n + } + n = next + } + lastLow.next = null + if(list ne preLow.next) table(lowBucket) = preLow.next + if(preHigh.next ne null) { + table(highBucket) = preHigh.next + lastHigh.next = null + } + } + + private[this] def splitBucket(tree: RBNode, lowBucket: Int, highBucket: Int, mask: Int): Unit = { + var lowCount, highCount = 0 + tree.foreachNode((n: RBNode) => if((n.hash & mask) != 0) highCount += 1 else lowCount += 1) + if(highCount != 0) { + if(lowCount == 0) { + table(lowBucket) = null + table(highBucket) = tree + } else { + table(lowBucket) = fromNodes(new CollisionProofHashMap.RBNodesIterator(tree).filter(n => (n.hash & mask) == 0), lowCount) + table(highBucket) = fromNodes(new CollisionProofHashMap.RBNodesIterator(tree).filter(n => (n.hash & mask) != 0), highCount) + } + } + } + + private[this] def tableSizeFor(capacity: Int) = + (Integer.highestOneBit((capacity-1).max(4))*2).min(1 << 30) + + private[this] def newThreshold(size: Int) = (size.toDouble * loadFactor).toInt + + override def clear(): Unit = { + java.util.Arrays.fill(table.asInstanceOf[Array[AnyRef]], null) + contentSize = 0 + } + + override def remove(key: K): Option[V] = { + val v = remove0(key) + if(v.asInstanceOf[AnyRef] eq Statics.pfMarker) None else Some(v.asInstanceOf[V]) + } + + def subtractOne(elem: K): this.type = { remove0(elem); this } + + override def knownSize: Int = size + + override def isEmpty: Boolean = size == 0 + + override def foreach[U](f: ((K, V)) => U): Unit = { + val len = table.length + var i = 0 + while(i < len) { + val n = table(i) + if(n ne null) n match { + case n: LLNode @uc => n.foreach(f) + case n: RBNode @uc => n.foreach(f) + } + i += 1 + } + } + + override def foreachEntry[U](f: (K, V) => U): Unit = { + val len = table.length + var i = 0 + while(i < len) { + val n = table(i) + if(n ne null) n match { + case n: LLNode @uc => n.foreachEntry(f) + case n: RBNode @uc => n.foreachEntry(f) + } + i += 1 + } + } + + protected[this] def writeReplace(): AnyRef = new DefaultSerializationProxy(new CollisionProofHashMap.DeserializationFactory[K, V](table.length, loadFactor, ordering), this) + + override protected[this] def className = 
"CollisionProofHashMap" + + override def getOrElseUpdate(key: K, defaultValue: => V): V = { + val hash = computeHash(key) + val idx = index(hash) + table(idx) match { + case null => () + case n: LLNode @uc => + val nd = n.getNode(key, hash) + if(nd != null) return nd.value + case n => + val nd = n.asInstanceOf[RBNode].getNode(key, hash) + if(nd != null) return nd.value + } + val table0 = table + val default = defaultValue + if(contentSize + 1 >= threshold) growTable(table.length * 2) + // Avoid recomputing index if the `defaultValue()` or new element hasn't triggered a table resize. + val newIdx = if (table0 eq table) idx else index(hash) + put0(key, default, false, hash, newIdx) + default + } + + ///////////////////// Overrides code from SortedMapOps + + /** Builds a new `CollisionProofHashMap` by applying a function to all elements of this $coll. + * + * @param f the function to apply to each element. + * @return a new $coll resulting from applying the given function + * `f` to each element of this $coll and collecting the results. + */ + def map[K2, V2](f: ((K, V)) => (K2, V2)) + (implicit @implicitNotFound(CollisionProofHashMap.ordMsg) ordering: Ordering[K2]): CollisionProofHashMap[K2, V2] = + sortedMapFactory.from(new View.Map[(K, V), (K2, V2)](this, f)) + + /** Builds a new `CollisionProofHashMap` by applying a function to all elements of this $coll + * and using the elements of the resulting collections. + * + * @param f the function to apply to each element. + * @return a new $coll resulting from applying the given collection-valued function + * `f` to each element of this $coll and concatenating the results. + */ + def flatMap[K2, V2](f: ((K, V)) => IterableOnce[(K2, V2)]) + (implicit @implicitNotFound(CollisionProofHashMap.ordMsg) ordering: Ordering[K2]): CollisionProofHashMap[K2, V2] = + sortedMapFactory.from(new View.FlatMap(this, f)) + + /** Builds a new sorted map by applying a partial function to all elements of this $coll + * on which the function is defined. + * + * @param pf the partial function which filters and maps the $coll. + * @return a new $coll resulting from applying the given partial function + * `pf` to each element on which it is defined and collecting the results. + * The order of the elements is preserved. 
+ */ + def collect[K2, V2](pf: PartialFunction[(K, V), (K2, V2)]) + (implicit @implicitNotFound(CollisionProofHashMap.ordMsg) ordering: Ordering[K2]): CollisionProofHashMap[K2, V2] = + sortedMapFactory.from(new View.Collect(this, pf)) + + override def concat[V2 >: V](suffix: IterableOnce[(K, V2)]): CollisionProofHashMap[K, V2] = sortedMapFactory.from(suffix match { + case it: Iterable[(K, V2)] => new View.Concat(this, it) + case _ => iterator.concat(suffix.iterator) + }) + + /** Alias for `concat` */ + @`inline` override final def ++ [V2 >: V](xs: IterableOnce[(K, V2)]): CollisionProofHashMap[K, V2] = concat(xs) + + @deprecated("Consider requiring an immutable Map or fall back to Map.concat", "2.13.0") + override def + [V1 >: V](kv: (K, V1)): CollisionProofHashMap[K, V1] = + sortedMapFactory.from(new View.Appended(this, kv)) + + @deprecated("Use ++ with an explicit collection argument instead of + with varargs", "2.13.0") + override def + [V1 >: V](elem1: (K, V1), elem2: (K, V1), elems: (K, V1)*): CollisionProofHashMap[K, V1] = + sortedMapFactory.from(new View.Concat(new View.Appended(new View.Appended(this, elem1), elem2), elems)) + + ///////////////////// RedBlackTree code derived from mutable.RedBlackTree: + + @`inline` private[this] def isRed(node: RBNode) = (node ne null) && node.red + @`inline` private[this] def isBlack(node: RBNode) = (node eq null) || !node.red + + @unused @`inline` private[this] def compare(key: K, hash: Int, node: LLNode): Int = { + val i = hash - node.hash + if(i != 0) i else ordering.compare(key, node.key) + } + + @`inline` private[this] def compare(key: K, hash: Int, node: RBNode): Int = { + /*val i = hash - node.hash + if(i != 0) i else*/ ordering.compare(key, node.key) + } + + // ---- insertion ---- + + @tailrec private[this] final def insertIntoExisting(_root: RBNode, bucket: Int, key: K, hash: Int, value: V, x: RBNode): Boolean = { + val cmp = compare(key, hash, x) + if(cmp == 0) { + x.value = value + false + } else { + val next = if(cmp < 0) x.left else x.right + if(next eq null) { + val z = CollisionProofHashMap.leaf(key, hash, value, red = true, x) + if (cmp < 0) x.left = z else x.right = z + table(bucket) = fixAfterInsert(_root, z) + return true + } + else insertIntoExisting(_root, bucket, key, hash, value, next) + } + } + + private[this] final def insert(tree: RBNode, bucket: Int, key: K, hash: Int, value: V): Boolean = { + if(tree eq null) { + table(bucket) = CollisionProofHashMap.leaf(key, hash, value, red = false, null) + true + } else insertIntoExisting(tree, bucket, key, hash, value, tree) + } + + private[this] def fixAfterInsert(_root: RBNode, node: RBNode): RBNode = { + var root = _root + var z = node + while (isRed(z.parent)) { + if (z.parent eq z.parent.parent.left) { + val y = z.parent.parent.right + if (isRed(y)) { + z.parent.red = false + y.red = false + z.parent.parent.red = true + z = z.parent.parent + } else { + if (z eq z.parent.right) { + z = z.parent + root = rotateLeft(root, z) + } + z.parent.red = false + z.parent.parent.red = true + root = rotateRight(root, z.parent.parent) + } + } else { // symmetric cases + val y = z.parent.parent.left + if (isRed(y)) { + z.parent.red = false + y.red = false + z.parent.parent.red = true + z = z.parent.parent + } else { + if (z eq z.parent.left) { + z = z.parent + root = rotateRight(root, z) + } + z.parent.red = false + z.parent.parent.red = true + root = rotateLeft(root, z.parent.parent) + } + } + } + root.red = false + root + } + + // ---- deletion ---- + + // returns the old value or 
Statics.pfMarker if not found + private[this] def delete(_root: RBNode, bucket: Int, key: K, hash: Int): Any = { + var root = _root + val z = root.getNode(key, hash: Int) + if (z ne null) { + val oldValue = z.value + var y = z + var yIsRed = y.red + var x: RBNode = null + var xParent: RBNode = null + + if (z.left eq null) { + x = z.right + root = transplant(root, z, z.right) + xParent = z.parent + } + else if (z.right eq null) { + x = z.left + root = transplant(root, z, z.left) + xParent = z.parent + } + else { + y = CollisionProofHashMap.minNodeNonNull(z.right) + yIsRed = y.red + x = y.right + + if (y.parent eq z) xParent = y + else { + xParent = y.parent + root = transplant(root, y, y.right) + y.right = z.right + y.right.parent = y + } + root = transplant(root, z, y) + y.left = z.left + y.left.parent = y + y.red = z.red + } + + if (!yIsRed) root = fixAfterDelete(root, x, xParent) + if(root ne _root) table(bucket) = root + oldValue + } else Statics.pfMarker + } + + private[this] def fixAfterDelete(_root: RBNode, node: RBNode, parent: RBNode): RBNode = { + var root = _root + var x = node + var xParent = parent + while ((x ne root) && isBlack(x)) { + if (x eq xParent.left) { + var w = xParent.right + // assert(w ne null) + + if (w.red) { + w.red = false + xParent.red = true + root = rotateLeft(root, xParent) + w = xParent.right + } + if (isBlack(w.left) && isBlack(w.right)) { + w.red = true + x = xParent + } else { + if (isBlack(w.right)) { + w.left.red = false + w.red = true + root = rotateRight(root, w) + w = xParent.right + } + w.red = xParent.red + xParent.red = false + w.right.red = false + root = rotateLeft(root, xParent) + x = root + } + } else { // symmetric cases + var w = xParent.left + // assert(w ne null) + + if (w.red) { + w.red = false + xParent.red = true + root = rotateRight(root, xParent) + w = xParent.left + } + if (isBlack(w.right) && isBlack(w.left)) { + w.red = true + x = xParent + } else { + if (isBlack(w.left)) { + w.right.red = false + w.red = true + root = rotateLeft(root, w) + w = xParent.left + } + w.red = xParent.red + xParent.red = false + w.left.red = false + root = rotateRight(root, xParent) + x = root + } + } + xParent = x.parent + } + if (x ne null) x.red = false + root + } + + // ---- helpers ---- + + @`inline` private[this] def rotateLeft(_root: RBNode, x: RBNode): RBNode = { + var root = _root + val y = x.right + x.right = y.left + + val xp = x.parent + if (y.left ne null) y.left.parent = x + y.parent = xp + + if (xp eq null) root = y + else if (x eq xp.left) xp.left = y + else xp.right = y + + y.left = x + x.parent = y + root + } + + @`inline` private[this] def rotateRight(_root: RBNode, x: RBNode): RBNode = { + var root = _root + val y = x.left + x.left = y.right + + val xp = x.parent + if (y.right ne null) y.right.parent = x + y.parent = xp + + if (xp eq null) root = y + else if (x eq xp.right) xp.right = y + else xp.left = y + + y.right = x + x.parent = y + root + } + + /** + * Transplant the node `from` to the place of node `to`. This is done by setting `from` as a child of `to`'s previous + * parent and setting `from`'s parent to the `to`'s previous parent. The children of `from` are left unchanged. 
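+ * A sketch of the effect: if `to` was the root, `from` becomes the new root; otherwise the child link of
+ * `to.parent` that pointed at `to` is redirected to `from`. `to` itself is not modified, so the caller
+ * (see `delete`) is responsible for re-linking or discarding it.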
+ */ + private[this] def transplant(_root: RBNode, to: RBNode, from: RBNode): RBNode = { + var root = _root + if (to.parent eq null) root = from + else if (to eq to.parent.left) to.parent.left = from + else to.parent.right = from + if (from ne null) from.parent = to.parent + root + } + + // building + + def fromNodes(xs: Iterator[Node], size: Int): RBNode = { + val maxUsedDepth = 32 - Integer.numberOfLeadingZeros(size) // maximum depth of non-leaf nodes + def f(level: Int, size: Int): RBNode = size match { + case 0 => null + case 1 => + val nn = xs.next() + val (key, hash, value) = nn match { + case nn: LLNode @uc => (nn.key, nn.hash, nn.value) + case nn: RBNode @uc => (nn.key, nn.hash, nn.value) + } + new RBNode(key, hash, value, level == maxUsedDepth && level != 1, null, null, null) + case n => + val leftSize = (size-1)/2 + val left = f(level+1, leftSize) + val nn = xs.next() + val right = f(level+1, size-1-leftSize) + val (key, hash, value) = nn match { + case nn: LLNode @uc => (nn.key, nn.hash, nn.value) + case nn: RBNode @uc => (nn.key, nn.hash, nn.value) + } + val n = new RBNode(key, hash, value, false, left, right, null) + if(left ne null) left.parent = n + right.parent = n + n + } + f(1, size) + } +} + +/** + * $factoryInfo + * @define Coll `mutable.CollisionProofHashMap` + * @define coll mutable collision-proof hash map + */ +@SerialVersionUID(3L) +object CollisionProofHashMap extends SortedMapFactory[CollisionProofHashMap] { + private[collection] final val ordMsg = "No implicit Ordering[${K2}] found to build a CollisionProofHashMap[${K2}, ${V2}]. You may want to upcast to a Map[${K}, ${V}] first by calling `unsorted`." + + def from[K : Ordering, V](it: scala.collection.IterableOnce[(K, V)]): CollisionProofHashMap[K, V] = { + val k = it.knownSize + val cap = if(k > 0) ((k + 1).toDouble / defaultLoadFactor).toInt else defaultInitialCapacity + new CollisionProofHashMap[K, V](cap, defaultLoadFactor) ++= it + } + + def empty[K : Ordering, V]: CollisionProofHashMap[K, V] = new CollisionProofHashMap[K, V] + + def newBuilder[K : Ordering, V]: Builder[(K, V), CollisionProofHashMap[K, V]] = newBuilder(defaultInitialCapacity, defaultLoadFactor) + + def newBuilder[K : Ordering, V](initialCapacity: Int, loadFactor: Double): Builder[(K, V), CollisionProofHashMap[K, V]] = + new GrowableBuilder[(K, V), CollisionProofHashMap[K, V]](new CollisionProofHashMap[K, V](initialCapacity, loadFactor)) { + override def sizeHint(size: Int) = elems.sizeHint(size) + } + + /** The default load factor for the hash table */ + final def defaultLoadFactor: Double = 0.75 + + /** The default initial capacity for the hash table */ + final def defaultInitialCapacity: Int = 16 + + @SerialVersionUID(3L) + private final class DeserializationFactory[K, V](val tableLength: Int, val loadFactor: Double, val ordering: Ordering[K]) extends Factory[(K, V), CollisionProofHashMap[K, V]] with Serializable { + def fromSpecific(it: IterableOnce[(K, V)]): CollisionProofHashMap[K, V] = new CollisionProofHashMap[K, V](tableLength, loadFactor)(ordering) ++= it + def newBuilder: Builder[(K, V), CollisionProofHashMap[K, V]] = CollisionProofHashMap.newBuilder(tableLength, loadFactor)(using ordering) + } + + @unused @`inline` private def compare[K, V](key: K, hash: Int, node: LLNode[K, V])(implicit ord: Ordering[K]): Int = { + val i = hash - node.hash + if(i != 0) i else ord.compare(key, node.key) + } + + @`inline` private def compare[K, V](key: K, hash: Int, node: RBNode[K, V])(implicit ord: Ordering[K]): Int = { + /*val i = hash - 
node.hash + if(i != 0) i else*/ ord.compare(key, node.key) + } + + private final val treeifyThreshold = 8 + + // Superclass for RBNode and LLNode to help the JIT with optimizing instance checks, but no shared common fields. + // Keeping calls monomorphic where possible and dispatching manually where needed is faster. + sealed abstract class Node + + /////////////////////////// Red-Black Tree Node + + final class RBNode[K, V](var key: K, var hash: Int, var value: V, var red: Boolean, var left: RBNode[K, V], var right: RBNode[K, V], var parent: RBNode[K, V]) extends Node { + override def toString: String = "RBNode(" + key + ", " + hash + ", " + value + ", " + red + ", " + left + ", " + right + ")" + + @tailrec def getNode(k: K, h: Int)(implicit ord: Ordering[K]): RBNode[K, V] = { + val cmp = compare(k, h, this) + if (cmp < 0) { + if(left ne null) left.getNode(k, h) else null + } else if (cmp > 0) { + if(right ne null) right.getNode(k, h) else null + } else this + } + + def foreach[U](f: ((K, V)) => U): Unit = { + if(left ne null) left.foreach(f) + f((key, value)) + if(right ne null) right.foreach(f) + } + + def foreachEntry[U](f: (K, V) => U): Unit = { + if(left ne null) left.foreachEntry(f) + f(key, value) + if(right ne null) right.foreachEntry(f) + } + + def foreachNode[U](f: RBNode[K, V] => U): Unit = { + if(left ne null) left.foreachNode(f) + f(this) + if(right ne null) right.foreachNode(f) + } + } + + @`inline` private def leaf[A, B](key: A, hash: Int, value: B, red: Boolean, parent: RBNode[A, B]): RBNode[A, B] = + new RBNode(key, hash, value, red, null, null, parent) + + @tailrec private def minNodeNonNull[A, B](node: RBNode[A, B]): RBNode[A, B] = + if (node.left eq null) node else minNodeNonNull(node.left) + + /** + * Returns the node that follows `node` in an in-order tree traversal. If `node` has the maximum key (and is, + * therefore, the last node), this method returns `null`. 
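+ * This is what the tree-aware iterators (`MapIterator` above and `RBNodesIterator` below) rely on to walk
+ * entries in ascending key order using only parent links, without an explicit stack.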
+ */ + private def successor[A, B](node: RBNode[A, B]): RBNode[A, B] = { + if (node.right ne null) minNodeNonNull(node.right) + else { + var x = node + var y = x.parent + while ((y ne null) && (x eq y.right)) { + x = y + y = y.parent + } + y + } + } + + private final class RBNodesIterator[A, B](tree: RBNode[A, B])(implicit @unused ord: Ordering[A]) extends AbstractIterator[RBNode[A, B]] { + private[this] var nextNode: RBNode[A, B] = if(tree eq null) null else minNodeNonNull(tree) + + def hasNext: Boolean = nextNode ne null + + @throws[NoSuchElementException] + def next(): RBNode[A, B] = nextNode match { + case null => Iterator.empty.next() + case node => + nextNode = successor(node) + node + } + } + + /////////////////////////// Linked List Node + + private final class LLNode[K, V](var key: K, var hash: Int, var value: V, var next: LLNode[K, V]) extends Node { + override def toString = s"LLNode($key, $value, $hash) -> $next" + + private[this] def eq(a: Any, b: Any): Boolean = + if(a.asInstanceOf[AnyRef] eq null) b.asInstanceOf[AnyRef] eq null else a.asInstanceOf[AnyRef].equals(b) + + @tailrec def getNode(k: K, h: Int)(implicit ord: Ordering[K]): LLNode[K, V] = { + if(h == hash && eq(k, key) /*ord.compare(k, key) == 0*/) this + else if((next eq null) || (hash > h)) null + else next.getNode(k, h) + } + + @tailrec def foreach[U](f: ((K, V)) => U): Unit = { + f((key, value)) + if(next ne null) next.foreach(f) + } + + @tailrec def foreachEntry[U](f: (K, V) => U): Unit = { + f(key, value) + if(next ne null) next.foreachEntry(f) + } + + @tailrec def foreachNode[U](f: LLNode[K, V] => U): Unit = { + f(this) + if(next ne null) next.foreachNode(f) + } + } +} diff --git a/scala2-library-cc/src/scala/Array.scala b/scala2-library-cc/src/scala/Array.scala new file mode 100644 index 000000000000..d2098a76f32f --- /dev/null +++ b/scala2-library-cc/src/scala/Array.scala @@ -0,0 +1,690 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package scala + +//import scala.collection.generic._ +import scala.collection.{Factory, immutable, mutable} +import mutable.ArrayBuilder +import immutable.ArraySeq +import scala.language.implicitConversions +import scala.reflect.{ClassTag, classTag} +import scala.runtime.BoxedUnit +import scala.runtime.ScalaRunTime +import scala.runtime.ScalaRunTime.{array_apply, array_update} + +/** Utility methods for operating on arrays. + * For example: + * {{{ + * val a = Array(1, 2) + * val b = Array.ofDim[Int](2) + * val c = Array.concat(a, b) + * }}} + * where the array objects `a`, `b` and `c` have respectively the values + * `Array(1, 2)`, `Array(0, 0)` and `Array(1, 2, 0, 0)`. 
+ */ +object Array { + val emptyBooleanArray = new Array[Boolean](0) + val emptyByteArray = new Array[Byte](0) + val emptyCharArray = new Array[Char](0) + val emptyDoubleArray = new Array[Double](0) + val emptyFloatArray = new Array[Float](0) + val emptyIntArray = new Array[Int](0) + val emptyLongArray = new Array[Long](0) + val emptyShortArray = new Array[Short](0) + val emptyObjectArray = new Array[Object](0) + + /** Provides an implicit conversion from the Array object to a collection Factory */ + implicit def toFactory[A : ClassTag](dummy: Array.type): Factory[A, Array[A]] = new ArrayFactory(dummy) + @SerialVersionUID(3L) + private class ArrayFactory[A : ClassTag](dummy: Array.type) extends Factory[A, Array[A]] with Serializable { + def fromSpecific(it: IterableOnce[A]): Array[A] = Array.from[A](it) + def newBuilder: mutable.Builder[A, Array[A]] = Array.newBuilder[A] + } + + /** + * Returns a new [[scala.collection.mutable.ArrayBuilder]]. + */ + def newBuilder[T](implicit t: ClassTag[T]): ArrayBuilder[T] = ArrayBuilder.make[T](using t) + + /** Build an array from the iterable collection. + * + * {{{ + * scala> val a = Array.from(Seq(1, 5)) + * val a: Array[Int] = Array(1, 5) + * + * scala> val b = Array.from(Range(1, 5)) + * val b: Array[Int] = Array(1, 2, 3, 4) + * }}} + * + * @param it the iterable collection + * @return an array consisting of elements of the iterable collection + */ + def from[A : ClassTag](it: IterableOnce[A]): Array[A] = it match { + case it: Iterable[A] => it.toArray[A] + case _ => it.iterator.toArray[A] + } + + private def slowcopy(src : AnyRef, + srcPos : Int, + dest : AnyRef, + destPos : Int, + length : Int): Unit = { + var i = srcPos + var j = destPos + val srcUntil = srcPos + length + while (i < srcUntil) { + array_update(dest, j, array_apply(src, i)) + i += 1 + j += 1 + } + } + + /** Copy one array to another. + * Equivalent to Java's + * `System.arraycopy(src, srcPos, dest, destPos, length)`, + * except that this also works for polymorphic and boxed arrays. + * + * Note that the passed-in `dest` array will be modified by this call. + * + * @param src the source array. + * @param srcPos starting position in the source array. + * @param dest destination array. + * @param destPos starting position in the destination array. + * @param length the number of array elements to be copied. + * + * @see `java.lang.System#arraycopy` + */ + def copy(src: AnyRef, srcPos: Int, dest: AnyRef, destPos: Int, length: Int): Unit = { + val srcClass = src.getClass + if (srcClass.isArray && dest.getClass.isAssignableFrom(srcClass)) + java.lang.System.arraycopy(src, srcPos, dest, destPos, length) + else + slowcopy(src, srcPos, dest, destPos, length) + } + + /** Copy one array to another, truncating or padding with default values (if + * necessary) so the copy has the specified length. + * + * Equivalent to Java's + * `java.util.Arrays.copyOf(original, newLength)`, + * except that this works for primitive and object arrays in a single method. 
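+ * For example (an illustrative sketch):
+ * {{{
+ *   Array.copyOf(Array(1, 2, 3), 5)        // Array(1, 2, 3, 0, 0) -- padded with default values
+ *   Array.copyOf(Array("a", "b", "c"), 2)  // Array("a", "b")      -- truncated
+ * }}}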
+ * + * @see `java.util.Arrays#copyOf` + */ + def copyOf[A](original: Array[A], newLength: Int): Array[A] = ((original: @unchecked) match { + case x: Array[BoxedUnit] => newUnitArray(newLength).asInstanceOf[Array[A]] + case x: Array[AnyRef] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Int] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Double] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Long] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Float] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Char] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Byte] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Short] => java.util.Arrays.copyOf(x, newLength) + case x: Array[Boolean] => java.util.Arrays.copyOf(x, newLength) + }).asInstanceOf[Array[A]] + + /** Copy one array to another, truncating or padding with default values (if + * necessary) so the copy has the specified length. The new array can have + * a different type than the original one as long as the values are + * assignment-compatible. When copying between primitive and object arrays, + * boxing and unboxing are supported. + * + * Equivalent to Java's + * `java.util.Arrays.copyOf(original, newLength, newType)`, + * except that this works for all combinations of primitive and object arrays + * in a single method. + * + * @see `java.util.Arrays#copyOf` + */ + def copyAs[A](original: Array[_], newLength: Int)(implicit ct: ClassTag[A]): Array[A] = { + val runtimeClass = ct.runtimeClass + if (runtimeClass == Void.TYPE) newUnitArray(newLength).asInstanceOf[Array[A]] + else { + val destClass = runtimeClass.asInstanceOf[Class[A]] + if (destClass.isAssignableFrom(original.getClass.getComponentType)) { + if (destClass.isPrimitive) copyOf[A](original.asInstanceOf[Array[A]], newLength) + else { + val destArrayClass = java.lang.reflect.Array.newInstance(destClass, 0).getClass.asInstanceOf[Class[Array[AnyRef]]] + java.util.Arrays.copyOf(original.asInstanceOf[Array[AnyRef]], newLength, destArrayClass).asInstanceOf[Array[A]] + } + } else { + val dest = new Array[A](newLength) + Array.copy(original, 0, dest, 0, original.length) + dest + } + } + } + + private def newUnitArray(len: Int): Array[Unit] = { + val result = new Array[Unit](len) + java.util.Arrays.fill(result.asInstanceOf[Array[AnyRef]], ()) + result + } + + /** Returns an array of length 0 */ + def empty[T: ClassTag]: Array[T] = new Array[T](0) + + /** Creates an array with given elements. + * + * @param xs the elements to put in the array + * @return an array containing all elements from xs. + */ + // Subject to a compiler optimization in Cleanup. + // Array(e0, ..., en) is translated to { val a = new Array(3); a(i) = ei; a } + def apply[T: ClassTag](xs: T*): Array[T] = { + val len = xs.length + xs match { + case wa: immutable.ArraySeq[_] if wa.unsafeArray.getClass.getComponentType == classTag[T].runtimeClass => + // We get here in test/files/run/sd760a.scala, `Array[T](t)` for + // a specialized type parameter `T`. While we still pay for two + // copies of the array it is better than before when we also boxed + // each element when populating the result. + ScalaRunTime.array_clone(wa.unsafeArray).asInstanceOf[Array[T]] + case _ => + val array = new Array[T](len) + val iterator = xs.iterator + var i = 0 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + } + + /** Creates an array of `Boolean` objects */ + // Subject to a compiler optimization in Cleanup, see above. 
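+ // For example, `Array(true, false)` resolves to this overload and yields a primitive `Array[Boolean]`
+ // of length 2 (an illustrative note; the same applies to the other primitive overloads below).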
+ def apply(x: Boolean, xs: Boolean*): Array[Boolean] = { + val array = new Array[Boolean](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Byte` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Byte, xs: Byte*): Array[Byte] = { + val array = new Array[Byte](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Short` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Short, xs: Short*): Array[Short] = { + val array = new Array[Short](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Char` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Char, xs: Char*): Array[Char] = { + val array = new Array[Char](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Int` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Int, xs: Int*): Array[Int] = { + val array = new Array[Int](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Long` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Long, xs: Long*): Array[Long] = { + val array = new Array[Long](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Float` objects */ + // Subject to a compiler optimization in Cleanup, see above. + def apply(x: Float, xs: Float*): Array[Float] = { + val array = new Array[Float](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Double` objects */ + // Subject to a compiler optimization in Cleanup, see above. 
+ def apply(x: Double, xs: Double*): Array[Double] = { + val array = new Array[Double](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates an array of `Unit` objects */ + def apply(x: Unit, xs: Unit*): Array[Unit] = { + val array = new Array[Unit](xs.length + 1) + array(0) = x + val iterator = xs.iterator + var i = 1 + while (iterator.hasNext) { + array(i) = iterator.next(); i += 1 + } + array + } + + /** Creates array with given dimensions */ + def ofDim[T: ClassTag](n1: Int): Array[T] = + new Array[T](n1) + /** Creates a 2-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int): Array[Array[T]] = { + val arr: Array[Array[T]] = (new Array[Array[T]](n1): Array[Array[T]]) + for (i <- 0 until n1) arr(i) = new Array[T](n2) + arr + // tabulate(n1)(_ => ofDim[T](n2)) + } + /** Creates a 3-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int, n3: Int): Array[Array[Array[T]]] = + tabulate(n1)(_ => ofDim[T](n2, n3)) + /** Creates a 4-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int): Array[Array[Array[Array[T]]]] = + tabulate(n1)(_ => ofDim[T](n2, n3, n4)) + /** Creates a 5-dimensional array */ + def ofDim[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int): Array[Array[Array[Array[Array[T]]]]] = + tabulate(n1)(_ => ofDim[T](n2, n3, n4, n5)) + + /** Concatenates all arrays into a single array. + * + * @param xss the given arrays + * @return the array created from concatenating `xss` + */ + def concat[T: ClassTag](xss: Array[T]*): Array[T] = { + val b = newBuilder[T] + b.sizeHint(xss.map(_.length).sum) + for (xs <- xss) b ++= xs + b.result() + } + + /** Returns an array that contains the results of some element computation a number + * of times. + * + * Note that this means that `elem` is computed a total of n times: + * {{{ + * scala> Array.fill(3){ math.random } + * res3: Array[Double] = Array(0.365461167592537, 1.550395944913685E-4, 0.7907242137333306) + * }}} + * + * @param n the number of elements desired + * @param elem the element computation + * @return an Array of size n, where each element contains the result of computing + * `elem`. + */ + def fill[T: ClassTag](n: Int)(elem: => T): Array[T] = { + if (n <= 0) { + empty[T] + } else { + val array = new Array[T](n) + var i = 0 + while (i < n) { + array(i) = elem + i += 1 + } + array + } + } + + /** Returns a two-dimensional array that contains the results of some element + * computation a number of times. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int)(elem: => T): Array[Array[T]] = + tabulate(n1)(_ => fill(n2)(elem)) + + /** Returns a three-dimensional array that contains the results of some element + * computation a number of times. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int, n3: Int)(elem: => T): Array[Array[Array[T]]] = + tabulate(n1)(_ => fill(n2, n3)(elem)) + + /** Returns a four-dimensional array that contains the results of some element + * computation a number of times. 
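+ * For example (an illustrative sketch), `Array.fill(2, 3, 4, 5)(0)` nests four levels of arrays and
+ * evaluates the element computation `2 * 3 * 4 * 5` times in total.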
+ * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int)(elem: => T): Array[Array[Array[Array[T]]]] = + tabulate(n1)(_ => fill(n2, n3, n4)(elem)) + + /** Returns a five-dimensional array that contains the results of some element + * computation a number of times. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param elem the element computation + */ + def fill[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(elem: => T): Array[Array[Array[Array[Array[T]]]]] = + tabulate(n1)(_ => fill(n2, n3, n4, n5)(elem)) + + /** Returns an array containing values of a given function over a range of integer + * values starting from 0. + * + * @param n The number of elements in the array + * @param f The function computing element values + * @return An `Array` consisting of elements `f(0),f(1), ..., f(n - 1)` + */ + def tabulate[T: ClassTag](n: Int)(f: Int => T): Array[T] = { + if (n <= 0) { + empty[T] + } else { + val array = new Array[T](n) + var i = 0 + while (i < n) { + array(i) = f(i) + i += 1 + } + array + } + } + + /** Returns a two-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int)(f: (Int, Int) => T): Array[Array[T]] = + tabulate(n1)(i1 => tabulate(n2)(f(i1, _))) + + /** Returns a three-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int, n3: Int)(f: (Int, Int, Int) => T): Array[Array[Array[T]]] = + tabulate(n1)(i1 => tabulate(n2, n3)(f(i1, _, _))) + + /** Returns a four-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. + * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int)(f: (Int, Int, Int, Int) => T): Array[Array[Array[Array[T]]]] = + tabulate(n1)(i1 => tabulate(n2, n3, n4)(f(i1, _, _, _))) + + /** Returns a five-dimensional array containing values of a given function + * over ranges of integer values starting from `0`. 
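+ * For example (an illustrative sketch), `Array.tabulate(2, 2, 2, 2, 2)((i1, i2, i3, i4, i5) => i1 + i2 + i3 + i4 + i5)`
+ * produces a five-level nested array whose innermost elements are the index sums.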
+ * + * @param n1 the number of elements in the 1st dimension + * @param n2 the number of elements in the 2nd dimension + * @param n3 the number of elements in the 3rd dimension + * @param n4 the number of elements in the 4th dimension + * @param n5 the number of elements in the 5th dimension + * @param f The function computing element values + */ + def tabulate[T: ClassTag](n1: Int, n2: Int, n3: Int, n4: Int, n5: Int)(f: (Int, Int, Int, Int, Int) => T): Array[Array[Array[Array[Array[T]]]]] = + tabulate(n1)(i1 => tabulate(n2, n3, n4, n5)(f(i1, _, _, _, _))) + + /** Returns an array containing a sequence of increasing integers in a range. + * + * @param start the start value of the array + * @param end the end value of the array, exclusive (in other words, this is the first value '''not''' returned) + * @return the array with values in range `start, start + 1, ..., end - 1` + * up to, but excluding, `end`. + */ + def range(start: Int, end: Int): Array[Int] = range(start, end, 1) + + /** Returns an array containing equally spaced values in some integer interval. + * + * @param start the start value of the array + * @param end the end value of the array, exclusive (in other words, this is the first value '''not''' returned) + * @param step the increment value of the array (may not be zero) + * @return the array with values in `start, start + step, ...` up to, but excluding `end` + */ + def range(start: Int, end: Int, step: Int): Array[Int] = { + if (step == 0) throw new IllegalArgumentException("zero step") + val array = new Array[Int](immutable.Range.count(start, end, step, isInclusive = false)) + + var n = 0 + var i = start + while (if (step < 0) end < i else i < end) { + array(n) = i + i += step + n += 1 + } + array + } + + /** Returns an array containing repeated applications of a function to a start value. + * + * @param start the start value of the array + * @param len the number of elements returned by the array + * @param f the function that is repeatedly applied + * @return the array returning `len` values in the sequence `start, f(start), f(f(start)), ...` + */ + def iterate[T: ClassTag](start: T, len: Int)(f: T => T): Array[T] = { + if (len > 0) { + val array = new Array[T](len) + var acc = start + var i = 1 + array(0) = acc + + while (i < len) { + acc = f(acc) + array(i) = acc + i += 1 + } + array + } else { + empty[T] + } + } + + /** Compare two arrays per element. + * + * A more efficient version of `xs.sameElements(ys)`. + * + * Note that arrays are invariant in Scala, but it may + * be sound to cast an array of arbitrary reference type + * to `Array[AnyRef]`. Arrays on the JVM are covariant + * in their element type. + * + * `Array.equals(xs.asInstanceOf[Array[AnyRef]], ys.asInstanceOf[Array[AnyRef]])` + * + * @param xs an array of AnyRef + * @param ys an array of AnyRef + * @return true if corresponding elements are equal + */ + def equals(xs: Array[AnyRef], ys: Array[AnyRef]): Boolean = + (xs eq ys) || + (xs.length == ys.length) && { + var i = 0 + while (i < xs.length && xs(i) == ys(i)) i += 1 + i >= xs.length + } + + /** Called in a pattern match like `{ case Array(x,y,z) => println('3 elements')}`. 
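+ * For example (an illustrative sketch):
+ * {{{
+ *   Array(1, 2, 3) match {
+ *     case Array(x, y, z) => x + y + z  // 6
+ *   }
+ * }}}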
+ * + * @param x the selector value + * @return sequence wrapped in a [[scala.Some]], if `x` is an Array, otherwise `None` + */ + def unapplySeq[T](x: Array[T]): UnapplySeqWrapper[T] = new UnapplySeqWrapper(x) + + final class UnapplySeqWrapper[T](private val a: Array[T]) extends AnyVal { + def isEmpty: false = false + def get: UnapplySeqWrapper[T] = this + def lengthCompare(len: Int): Int = a.lengthCompare(len) + def apply(i: Int): T = a(i) + def drop(n: Int): scala.Seq[T] = ArraySeq.unsafeWrapArray(a.drop(n)) // clones the array, also if n == 0 + def toSeq: scala.Seq[T] = a.toSeq // clones the array + } +} + +/** Arrays are mutable, indexed collections of values. `Array[T]` is Scala's representation + * for Java's `T[]`. + * + * {{{ + * val numbers = Array(1, 2, 3, 4) + * val first = numbers(0) // read the first element + * numbers(3) = 100 // replace the 4th array element with 100 + * val biggerNumbers = numbers.map(_ * 2) // multiply all numbers by two + * }}} + * + * Arrays make use of two common pieces of Scala syntactic sugar, shown on lines 2 and 3 of the above + * example code. + * Line 2 is translated into a call to `apply(Int)`, while line 3 is translated into a call to + * `update(Int, T)`. + * + * Two implicit conversions exist in [[scala.Predef]] that are frequently applied to arrays: a conversion + * to [[scala.collection.ArrayOps]] (shown on line 4 of the example above) and a conversion + * to [[scala.collection.mutable.ArraySeq]] (a subtype of [[scala.collection.Seq]]). + * Both types make available many of the standard operations found in the Scala collections API. + * The conversion to `ArrayOps` is temporary, as all operations defined on `ArrayOps` return an `Array`, + * while the conversion to `ArraySeq` is permanent as all operations return a `ArraySeq`. + * + * The conversion to `ArrayOps` takes priority over the conversion to `ArraySeq`. For instance, + * consider the following code: + * + * {{{ + * val arr = Array(1, 2, 3) + * val arrReversed = arr.reverse + * val seqReversed : collection.Seq[Int] = arr.reverse + * }}} + * + * Value `arrReversed` will be of type `Array[Int]`, with an implicit conversion to `ArrayOps` occurring + * to perform the `reverse` operation. The value of `seqReversed`, on the other hand, will be computed + * by converting to `ArraySeq` first and invoking the variant of `reverse` that returns another + * `ArraySeq`. + * + * @see [[https://www.scala-lang.org/files/archive/spec/2.13/ Scala Language Specification]], for in-depth information on the transformations the Scala compiler makes on Arrays (Sections 6.6 and 6.15 respectively.) + * @see [[https://docs.scala-lang.org/sips/scala-2-8-arrays.html "Scala 2.8 Arrays"]] the Scala Improvement Document detailing arrays since Scala 2.8. + * @see [[https://docs.scala-lang.org/overviews/collections-2.13/arrays.html "The Scala 2.8 Collections' API"]] section on `Array` by Martin Odersky for more information. 
+ * @hideImplicitConversion scala.Predef.booleanArrayOps + * @hideImplicitConversion scala.Predef.byteArrayOps + * @hideImplicitConversion scala.Predef.charArrayOps + * @hideImplicitConversion scala.Predef.doubleArrayOps + * @hideImplicitConversion scala.Predef.floatArrayOps + * @hideImplicitConversion scala.Predef.intArrayOps + * @hideImplicitConversion scala.Predef.longArrayOps + * @hideImplicitConversion scala.Predef.refArrayOps + * @hideImplicitConversion scala.Predef.shortArrayOps + * @hideImplicitConversion scala.Predef.unitArrayOps + * @hideImplicitConversion scala.LowPriorityImplicits.wrapRefArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapIntArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapDoubleArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapLongArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapFloatArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapCharArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapByteArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapShortArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapBooleanArray + * @hideImplicitConversion scala.LowPriorityImplicits.wrapUnitArray + * @hideImplicitConversion scala.LowPriorityImplicits.genericWrapArray + * @define coll array + * @define Coll `Array` + * @define orderDependent + * @define orderDependentFold + * @define mayNotTerminateInf + * @define willNotTerminateInf + * @define collectExample + * @define undefinedorder + */ +final class Array[T](_length: Int) extends java.io.Serializable with java.lang.Cloneable { + + /** The length of the array */ + def length: Int = throw new Error() + + /** The element at given index. + * + * Indices start at `0`; `xs.apply(0)` is the first element of array `xs`. + * Note the indexing syntax `xs(i)` is a shorthand for `xs.apply(i)`. + * + * @param i the index + * @return the element at the given index + * @throws ArrayIndexOutOfBoundsException if `i < 0` or `length <= i` + */ + def apply(i: Int): T = throw new Error() + + /** Update the element at given index. + * + * Indices start at `0`; `xs.update(i, x)` replaces the i^th^ element in the array. + * Note the syntax `xs(i) = x` is a shorthand for `xs.update(i, x)`. + * + * @param i the index + * @param x the value to be written at index `i` + * @throws ArrayIndexOutOfBoundsException if `i < 0` or `length <= i` + */ + def update(i: Int, x: T): Unit = { throw new Error() } + + /** Clone the Array. + * + * @return A clone of the Array. 
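+ * @note The copy is shallow: for an array of references the referenced objects themselves are shared, not cloned.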
+ */ + override def clone(): Array[T] = throw new Error() +} diff --git a/scala2-library-cc/src/scala/collection/ArrayOps.scala b/scala2-library-cc/src/scala/collection/ArrayOps.scala index e8548c12751f..72ec66a0bc86 100644 --- a/scala2-library-cc/src/scala/collection/ArrayOps.scala +++ b/scala2-library-cc/src/scala/collection/ArrayOps.scala @@ -590,7 +590,7 @@ final class ArrayOps[A](private val xs: Array[A]) extends AnyVal { val len = xs.length def boxed = if(len < ArrayOps.MaxStableSortLength) { val a = xs.clone() - Sorting.stableSort(a)(ord.asInstanceOf[Ordering[A]]) + Sorting.stableSort(a)(using ord.asInstanceOf[Ordering[A]]) a } else { val a = Array.copyAs[AnyRef](xs, len)(ClassTag.AnyRef) @@ -1300,7 +1300,7 @@ final class ArrayOps[A](private val xs: Array[A]) extends AnyVal { val bb = new ArrayBuilder.ofRef[Array[B]]()(ClassTag[Array[B]](aClass)) if (xs.length == 0) bb.result() else { - def mkRowBuilder() = ArrayBuilder.make[B](ClassTag[B](aClass.getComponentType)) + def mkRowBuilder() = ArrayBuilder.make[B](using ClassTag[B](aClass.getComponentType)) val bs = new ArrayOps(asArray(xs(0))).map((x: B) => mkRowBuilder()) for (xs <- this) { var i = 0 diff --git a/scala2-library-cc/src/scala/collection/Factory.scala b/scala2-library-cc/src/scala/collection/Factory.scala index 99f584b972fc..96f39bafc905 100644 --- a/scala2-library-cc/src/scala/collection/Factory.scala +++ b/scala2-library-cc/src/scala/collection/Factory.scala @@ -675,16 +675,16 @@ object ClassTagIterableFactory { * sound depending on the use of the `ClassTag` by the collection implementation. */ @SerialVersionUID(3L) class AnyIterableDelegate[CC[_]](delegate: ClassTagIterableFactory[CC]) extends IterableFactory[CC] { - def empty[A]: CC[A] = delegate.empty(ClassTag.Any).asInstanceOf[CC[A]] - def from[A](it: IterableOnce[A]^): CC[A] = delegate.from[Any](it)(ClassTag.Any).asInstanceOf[CC[A]] - def newBuilder[A]: Builder[A, CC[A]] = delegate.newBuilder(ClassTag.Any).asInstanceOf[Builder[A, CC[A]]] - override def apply[A](elems: A*): CC[A] = delegate.apply[Any](elems: _*)(ClassTag.Any).asInstanceOf[CC[A]] - override def iterate[A](start: A, len: Int)(f: A => A): CC[A] = delegate.iterate[A](start, len)(f)(ClassTag.Any.asInstanceOf[ClassTag[A]]) - override def unfold[A, S](init: S)(f: S => Option[(A, S)]): CC[A] = delegate.unfold[A, S](init)(f)(ClassTag.Any.asInstanceOf[ClassTag[A]]) - override def range[A](start: A, end: A)(implicit i: Integral[A]): CC[A] = delegate.range[A](start, end)(i, ClassTag.Any.asInstanceOf[ClassTag[A]]) - override def range[A](start: A, end: A, step: A)(implicit i: Integral[A]): CC[A] = delegate.range[A](start, end, step)(i, ClassTag.Any.asInstanceOf[ClassTag[A]]) - override def fill[A](n: Int)(elem: => A): CC[A] = delegate.fill[Any](n)(elem)(ClassTag.Any).asInstanceOf[CC[A]] - override def tabulate[A](n: Int)(f: Int => A): CC[A] = delegate.tabulate[Any](n)(f)(ClassTag.Any).asInstanceOf[CC[A]] + def empty[A]: CC[A] = delegate.empty(using ClassTag.Any).asInstanceOf[CC[A]] + def from[A](it: IterableOnce[A]^): CC[A] = delegate.from[Any](it)(using ClassTag.Any).asInstanceOf[CC[A]] + def newBuilder[A]: Builder[A, CC[A]] = delegate.newBuilder(using ClassTag.Any).asInstanceOf[Builder[A, CC[A]]] + override def apply[A](elems: A*): CC[A] = delegate.apply[Any](elems: _*)(using ClassTag.Any).asInstanceOf[CC[A]] + override def iterate[A](start: A, len: Int)(f: A => A): CC[A] = delegate.iterate[A](start, len)(f)(using ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def unfold[A, S](init: S)(f: S => Option[(A, 
S)]): CC[A] = delegate.unfold[A, S](init)(f)(using ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def range[A](start: A, end: A)(implicit i: Integral[A]): CC[A] = delegate.range[A](start, end)(using i, ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def range[A](start: A, end: A, step: A)(implicit i: Integral[A]): CC[A] = delegate.range[A](start, end, step)(using i, ClassTag.Any.asInstanceOf[ClassTag[A]]) + override def fill[A](n: Int)(elem: => A): CC[A] = delegate.fill[Any](n)(elem)(using ClassTag.Any).asInstanceOf[CC[A]] + override def tabulate[A](n: Int)(f: Int => A): CC[A] = delegate.tabulate[Any](n)(f)(using ClassTag.Any).asInstanceOf[CC[A]] } } diff --git a/scala2-library-cc/src/scala/collection/Iterable.scala b/scala2-library-cc/src/scala/collection/Iterable.scala index 5afc14f4ceef..6556f31d378d 100644 --- a/scala2-library-cc/src/scala/collection/Iterable.scala +++ b/scala2-library-cc/src/scala/collection/Iterable.scala @@ -985,9 +985,9 @@ trait SortedSetFactoryDefaults[+A, +WithFilterCC[x] <: IterableOps[x, WithFilterCC, WithFilterCC[x]] with Set[x]] extends SortedSetOps[A @uncheckedVariance, CC, CC[A @uncheckedVariance]] { self: IterableOps[A, WithFilterCC, _] => - override protected def fromSpecific(coll: IterableOnce[A @uncheckedVariance]^): CC[A @uncheckedVariance]^{coll} = sortedIterableFactory.from(coll)(ordering) - override protected def newSpecificBuilder: mutable.Builder[A @uncheckedVariance, CC[A @uncheckedVariance]] = sortedIterableFactory.newBuilder[A](ordering) - override def empty: CC[A @uncheckedVariance] = sortedIterableFactory.empty(ordering) + override protected def fromSpecific(coll: IterableOnce[A @uncheckedVariance]^): CC[A @uncheckedVariance]^{coll} = sortedIterableFactory.from(coll)(using ordering) + override protected def newSpecificBuilder: mutable.Builder[A @uncheckedVariance, CC[A @uncheckedVariance]] = sortedIterableFactory.newBuilder[A](using ordering) + override def empty: CC[A @uncheckedVariance] = sortedIterableFactory.empty(using ordering) override def withFilter(p: A => Boolean): SortedSetOps.WithFilter[A, WithFilterCC, CC]^{p} = new SortedSetOps.WithFilter[A, WithFilterCC, CC](this, p) @@ -1040,9 +1040,9 @@ trait SortedMapFactoryDefaults[K, +V, +UnsortedCC[x, y] <: Map[x, y]] extends SortedMapOps[K, V, CC, CC[K, V @uncheckedVariance]] with MapOps[K, V, UnsortedCC, CC[K, V @uncheckedVariance]] { self: IterableOps[(K, V), WithFilterCC, _] => - override def empty: CC[K, V @uncheckedVariance] = sortedMapFactory.empty(ordering) - override protected def fromSpecific(coll: IterableOnce[(K, V @uncheckedVariance)]^): CC[K, V @uncheckedVariance]^{coll} = sortedMapFactory.from(coll)(ordering) - override protected def newSpecificBuilder: mutable.Builder[(K, V @uncheckedVariance), CC[K, V @uncheckedVariance]] = sortedMapFactory.newBuilder[K, V](ordering) + override def empty: CC[K, V @uncheckedVariance] = sortedMapFactory.empty(using ordering) + override protected def fromSpecific(coll: IterableOnce[(K, V @uncheckedVariance)]^): CC[K, V @uncheckedVariance]^{coll} = sortedMapFactory.from(coll)(using ordering) + override protected def newSpecificBuilder: mutable.Builder[(K, V @uncheckedVariance), CC[K, V @uncheckedVariance]] = sortedMapFactory.newBuilder[K, V](using ordering) override def withFilter(p: ((K, V)) => Boolean): collection.SortedMapOps.WithFilter[K, V, WithFilterCC, UnsortedCC, CC]^{p} = new collection.SortedMapOps.WithFilter[K, V, WithFilterCC, UnsortedCC, CC](this, p) diff --git a/scala2-library-cc/src/scala/collection/SortedMap.scala 
b/scala2-library-cc/src/scala/collection/SortedMap.scala index 7b9381ebb078..876a83b2709c 100644 --- a/scala2-library-cc/src/scala/collection/SortedMap.scala +++ b/scala2-library-cc/src/scala/collection/SortedMap.scala @@ -181,16 +181,16 @@ trait SortedMapOps[K, +V, +CC[X, Y] <: Map[X, Y] with SortedMapOps[X, Y, CC, _], override def concat[V2 >: V](suffix: IterableOnce[(K, V2)]^): CC[K, V2] = sortedMapFactory.from(suffix match { case it: Iterable[(K, V2)] => new View.Concat(this, it) case _ => iterator.concat(suffix.iterator) - })(ordering) + })(using ordering) /** Alias for `concat` */ @`inline` override final def ++ [V2 >: V](xs: IterableOnce[(K, V2)]^): CC[K, V2] = concat(xs) @deprecated("Consider requiring an immutable Map or fall back to Map.concat", "2.13.0") - override def + [V1 >: V](kv: (K, V1)): CC[K, V1] = sortedMapFactory.from(new View.Appended(this, kv))(ordering) + override def + [V1 >: V](kv: (K, V1)): CC[K, V1] = sortedMapFactory.from(new View.Appended(this, kv))(using ordering) @deprecated("Use ++ with an explicit collection argument instead of + with varargs", "2.13.0") - override def + [V1 >: V](elem1: (K, V1), elem2: (K, V1), elems: (K, V1)*): CC[K, V1] = sortedMapFactory.from(new View.Concat(new View.Appended(new View.Appended(this, elem1), elem2), elems))(ordering) + override def + [V1 >: V](elem1: (K, V1), elem2: (K, V1), elems: (K, V1)*): CC[K, V1] = sortedMapFactory.from(new View.Concat(new View.Appended(new View.Appended(this, elem1), elem2), elems))(using ordering) } object SortedMapOps { diff --git a/scala2-library-cc/src/scala/collection/StrictOptimizedSortedMapOps.scala b/scala2-library-cc/src/scala/collection/StrictOptimizedSortedMapOps.scala index 9a9e6e367922..411a86c7cc5c 100644 --- a/scala2-library-cc/src/scala/collection/StrictOptimizedSortedMapOps.scala +++ b/scala2-library-cc/src/scala/collection/StrictOptimizedSortedMapOps.scala @@ -34,7 +34,7 @@ trait StrictOptimizedSortedMapOps[K, +V, +CC[X, Y] <: Map[X, Y] with SortedMapOp strictOptimizedFlatMap(sortedMapFactory.newBuilder, f) override def concat[V2 >: V](xs: IterableOnce[(K, V2)]^): CC[K, V2] = - strictOptimizedConcat(xs, sortedMapFactory.newBuilder(ordering)) + strictOptimizedConcat(xs, sortedMapFactory.newBuilder(using ordering)) override def collect[K2, V2](pf: PartialFunction[(K, V), (K2, V2)])(implicit @implicitNotFound(SortedMapOps.ordMsg) ordering: Ordering[K2]): CC[K2, V2] = strictOptimizedCollect(sortedMapFactory.newBuilder, pf) diff --git a/scala2-library-cc/src/scala/collection/generic/DefaultSerializationProxy.scala b/scala2-library-cc/src/scala/collection/generic/DefaultSerializationProxy.scala index e36bb77ebdb8..1f0e6164731c 100644 --- a/scala2-library-cc/src/scala/collection/generic/DefaultSerializationProxy.scala +++ b/scala2-library-cc/src/scala/collection/generic/DefaultSerializationProxy.scala @@ -78,9 +78,9 @@ private[collection] case object SerializeEnd trait DefaultSerializable extends Serializable { this: scala.collection.Iterable[_] => protected[this] def writeReplace(): AnyRef = { val f: Factory[Any, Any] = this match { - case it: scala.collection.SortedMap[_, _] => it.sortedMapFactory.sortedMapFactory[Any, Any](it.ordering.asInstanceOf[Ordering[Any]]).asInstanceOf[Factory[Any, Any]] + case it: scala.collection.SortedMap[_, _] => it.sortedMapFactory.sortedMapFactory[Any, Any](using it.ordering.asInstanceOf[Ordering[Any]]).asInstanceOf[Factory[Any, Any]] case it: scala.collection.Map[_, _] => it.mapFactory.mapFactory[Any, Any].asInstanceOf[Factory[Any, Any]] - case it: 
scala.collection.SortedSet[_] => it.sortedIterableFactory.evidenceIterableFactory[Any](it.ordering.asInstanceOf[Ordering[Any]]) + case it: scala.collection.SortedSet[_] => it.sortedIterableFactory.evidenceIterableFactory[Any](using it.ordering.asInstanceOf[Ordering[Any]]) case it => it.iterableFactory.iterableFactory } new DefaultSerializationProxy(f, this) diff --git a/scala2-library-cc/src/scala/collection/mutable/ArraySeq.scala b/scala2-library-cc/src/scala/collection/mutable/ArraySeq.scala index 70762e5b340d..d1c5b5c9ce72 100644 --- a/scala2-library-cc/src/scala/collection/mutable/ArraySeq.scala +++ b/scala2-library-cc/src/scala/collection/mutable/ArraySeq.scala @@ -46,15 +46,15 @@ sealed abstract class ArraySeq[T] override def iterableFactory: scala.collection.SeqFactory[ArraySeq] = ArraySeq.untagged override protected def fromSpecific(coll: scala.collection.IterableOnce[T]^): ArraySeq[T] = { - val b = ArrayBuilder.make(elemTag).asInstanceOf[ArrayBuilder[T]] + val b = ArrayBuilder.make(using elemTag).asInstanceOf[ArrayBuilder[T]] val s = coll.knownSize if(s > 0) b.sizeHint(s) b ++= coll ArraySeq.make(b.result()) } override protected def newSpecificBuilder: Builder[T, ArraySeq[T]] = - ArraySeq.newBuilder[T](elemTag.asInstanceOf[ClassTag[T]]).asInstanceOf[Builder[T, ArraySeq[T]]] - override def empty: ArraySeq[T] = ArraySeq.empty(elemTag.asInstanceOf[ClassTag[T]]) + ArraySeq.newBuilder[T](using elemTag.asInstanceOf[ClassTag[T]]).asInstanceOf[Builder[T, ArraySeq[T]]] + override def empty: ArraySeq[T] = ArraySeq.empty(using elemTag.asInstanceOf[ClassTag[T]]) /** The tag of the element type. This does not have to be equal to the element type of this ArraySeq. A primitive * ArraySeq can be backed by an array of boxed values and a reference ArraySeq can be backed by an array of a supertype diff --git a/scala2-library-cc/src/scala/collection/mutable/CollisionProofHashMap.scala b/scala2-library-cc/src/scala/collection/mutable/CollisionProofHashMap.scala index ff3bab1dd818..05c3124a3323 100644 --- a/scala2-library-cc/src/scala/collection/mutable/CollisionProofHashMap.scala +++ b/scala2-library-cc/src/scala/collection/mutable/CollisionProofHashMap.scala @@ -768,7 +768,7 @@ object CollisionProofHashMap extends SortedMapFactory[CollisionProofHashMap] { @SerialVersionUID(3L) private final class DeserializationFactory[K, V](val tableLength: Int, val loadFactor: Double, val ordering: Ordering[K]) extends Factory[(K, V), CollisionProofHashMap[K, V]] with Serializable { def fromSpecific(it: IterableOnce[(K, V)]^): CollisionProofHashMap[K, V] = new CollisionProofHashMap[K, V](tableLength, loadFactor)(ordering) ++= it - def newBuilder: Builder[(K, V), CollisionProofHashMap[K, V]] = CollisionProofHashMap.newBuilder(tableLength, loadFactor)(ordering) + def newBuilder: Builder[(K, V), CollisionProofHashMap[K, V]] = CollisionProofHashMap.newBuilder(tableLength, loadFactor)(using ordering) } @unused @`inline` private def compare[K, V](key: K, hash: Int, node: LLNode[K, V])(implicit ord: Ordering[K]): Int = { diff --git a/tests/neg/given-loop-prevention.check b/tests/neg/given-loop-prevention.check new file mode 100644 index 000000000000..460adf03be49 --- /dev/null +++ b/tests/neg/given-loop-prevention.check @@ -0,0 +1,14 @@ +-- Error: tests/neg/given-loop-prevention.scala:10:36 ------------------------------------------------------------------ +10 | given List[Foo] = List(summon[Foo]) // error + | ^ + | Result of implicit search for Foo will change. 
+ | Current result Baz.given_Foo will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: No Matching Implicit. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that Baz.given_Foo comes earlier, + | - use an explicit argument. diff --git a/tests/neg/given-loop-prevention.scala b/tests/neg/given-loop-prevention.scala new file mode 100644 index 000000000000..9d404b8c6d8e --- /dev/null +++ b/tests/neg/given-loop-prevention.scala @@ -0,0 +1,12 @@ + +class Foo + +object Bar { + given Foo with {} + given List[Foo] = List(summon[Foo]) // ok +} + +object Baz { + given List[Foo] = List(summon[Foo]) // error + given Foo with {} +} diff --git a/tests/neg/i6716.check b/tests/neg/i6716.check index 4684842e73fe..0144f539f53c 100644 --- a/tests/neg/i6716.check +++ b/tests/neg/i6716.check @@ -1,5 +1,5 @@ --- Warning: tests/neg/i6716.scala:12:39 -------------------------------------------------------------------------------- -12 | given Monad[Bar] = summon[Monad[Foo]] // warn +-- Error: tests/neg/i6716.scala:11:39 ---------------------------------------------------------------------------------- +11 | given Monad[Bar] = summon[Monad[Foo]] // error | ^ | Result of implicit search for Monad[Foo] will change. | Current result Bar.given_Monad_Bar will be no longer eligible @@ -12,5 +12,3 @@ | - use a `given ... with` clause as the enclosing given, | - rearrange definitions so that Bar.given_Monad_Bar comes earlier, | - use an explicit argument. - | This will be an error in Scala 3.5 and later. -No warnings can be incurred under -Werror (or -Xfatal-warnings) diff --git a/tests/neg/i6716.scala b/tests/neg/i6716.scala index 311209fd9006..8b37d4e223ac 100644 --- a/tests/neg/i6716.scala +++ b/tests/neg/i6716.scala @@ -1,4 +1,3 @@ -//> using options -Xfatal-warnings trait Monad[T]: def id: String @@ -9,11 +8,10 @@ object Foo { opaque type Bar = Foo object Bar { - given Monad[Bar] = summon[Monad[Foo]] // warn + given Monad[Bar] = summon[Monad[Foo]] // error } object Test extends App { println(summon[Monad[Foo]].id) println(summon[Monad[Bar]].id) } -// nopos-error: No warnings can be incurred under -Werror (or -Xfatal-warnings) \ No newline at end of file diff --git a/tests/neg/i7294-a.check b/tests/neg/i7294-a.check deleted file mode 100644 index c33735258ad0..000000000000 --- a/tests/neg/i7294-a.check +++ /dev/null @@ -1,27 +0,0 @@ --- [E007] Type Mismatch Error: tests/neg/i7294-a.scala:10:20 ----------------------------------------------------------- -10 | case x: T => x.g(10) // error - | ^^^^^^^ - | Found: Any - | Required: T - | - | where: T is a type in given instance f with bounds <: foo.Foo - | - | longer explanation available when compiling with `-explain` --- Warning: tests/neg/i7294-a.scala:10:12 ------------------------------------------------------------------------------ -10 | case x: T => x.g(10) // error - | ^ - | Result of implicit search for scala.reflect.TypeTest[Nothing, T] will change. - | Current result foo.Test.f will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: No Matching Implicit. - | To opt into the new rules, compile with `-source future` or use - | the `scala.language.future` language import. 
- | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that foo.Test.f comes earlier, - | - use an explicit argument. - | This will be an error in Scala 3.5 and later. - | - | where: T is a type in given instance f with bounds <: foo.Foo -No warnings can be incurred under -Werror (or -Xfatal-warnings) diff --git a/tests/neg/i7294-a.scala b/tests/neg/i7294-a.scala deleted file mode 100644 index a5193097e941..000000000000 --- a/tests/neg/i7294-a.scala +++ /dev/null @@ -1,14 +0,0 @@ -//> using options -Xfatal-warnings - -package foo - -trait Foo { def g(x: Int): Any } - -object Test: - - inline given f[T <: Foo]: T = ??? match { - case x: T => x.g(10) // error - } - - @main def Test = f -// nopos-error: No warnings can be incurred under -Werror (or -Xfatal-warnings) diff --git a/tests/neg/i7294-b.scala b/tests/neg/i7294-b.scala deleted file mode 100644 index 17cd7f07c3f7..000000000000 --- a/tests/neg/i7294-b.scala +++ /dev/null @@ -1,12 +0,0 @@ -//> using options -Xfatal-warnings - -package foo - -trait Foo { def g(x: Any): Any } - -inline given f[T <: Foo]: T = ??? match { - case x: T => x.g(10) // error -} - -@main def Test = f -// nopos-error: No warnings can be incurred under -Werror (or -Xfatal-warnings) diff --git a/tests/neg/i7294.check b/tests/neg/i7294.check new file mode 100644 index 000000000000..d6e559997f78 --- /dev/null +++ b/tests/neg/i7294.check @@ -0,0 +1,25 @@ +-- Error: tests/neg/i7294.scala:7:10 ----------------------------------------------------------------------------------- +7 | case x: T => x.g(10) // error // error + | ^ + | Result of implicit search for scala.reflect.TypeTest[Nothing, T] will change. + | Current result foo.f will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: No Matching Implicit. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that foo.f comes earlier, + | - use an explicit argument. + | + | where: T is a type in given instance f with bounds <: foo.Foo +-- [E007] Type Mismatch Error: tests/neg/i7294.scala:7:18 -------------------------------------------------------------- +7 | case x: T => x.g(10) // error // error + | ^^^^^^^ + | Found: Any + | Required: T + | + | where: T is a type in given instance f with bounds <: foo.Foo + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i7294.scala b/tests/neg/i7294.scala new file mode 100644 index 000000000000..fbb00f9b7e89 --- /dev/null +++ b/tests/neg/i7294.scala @@ -0,0 +1,10 @@ + +package foo + +trait Foo { def g(x: Any): Any } + +inline given f[T <: Foo]: T = ??? match { + case x: T => x.g(10) // error // error +} + +@main def Test = f diff --git a/tests/neg/looping-givens.check b/tests/neg/looping-givens.check new file mode 100644 index 000000000000..1e7ee08d79df --- /dev/null +++ b/tests/neg/looping-givens.check @@ -0,0 +1,48 @@ +-- Error: tests/neg/looping-givens.scala:9:22 -------------------------------------------------------------------------- +9 | given aa: A = summon // error + | ^ + | Result of implicit search for T will change. 
+ | Current result ab will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: a. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that ab comes earlier, + | - use an explicit argument. + | + | where: T is a type variable with constraint <: A +-- Error: tests/neg/looping-givens.scala:10:22 ------------------------------------------------------------------------- +10 | given bb: B = summon // error + | ^ + | Result of implicit search for T will change. + | Current result ab will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: b. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that ab comes earlier, + | - use an explicit argument. + | + | where: T is a type variable with constraint <: B +-- Error: tests/neg/looping-givens.scala:11:28 ------------------------------------------------------------------------- +11 | given ab: (A & B) = summon // error + | ^ + | Result of implicit search for T will change. + | Current result ab will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: Search Failure: joint(ab, ab). + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that ab comes earlier, + | - use an explicit argument. + | + | where: T is a type variable with constraint <: A & B diff --git a/tests/neg/looping-givens.scala b/tests/neg/looping-givens.scala new file mode 100644 index 000000000000..57dc95f99aab --- /dev/null +++ b/tests/neg/looping-givens.scala @@ -0,0 +1,11 @@ +//> options -source 3.4 + +class A +class B + +given joint(using a: A, b: B): (A & B) = ??? 
+ +def foo(using a: A, b: B) = + given aa: A = summon // error + given bb: B = summon // error + given ab: (A & B) = summon // error diff --git a/tests/pos-deep-subtype/CollectionStrawMan6.scala b/tests/pos-deep-subtype/CollectionStrawMan6.scala index 9f189afbcf3a..99f634a66622 100644 --- a/tests/pos-deep-subtype/CollectionStrawMan6.scala +++ b/tests/pos-deep-subtype/CollectionStrawMan6.scala @@ -754,11 +754,11 @@ object CollectionStrawMan6 extends LowPriority { def elemTag: ClassTag[A] = ClassTag(xs.getClass.getComponentType) - protected def fromIterableWithSameElemType(coll: Iterable[A]): Array[A] = coll.toArray[A](elemTag) + protected def fromIterableWithSameElemType(coll: Iterable[A]): Array[A] = coll.toArray[A](using elemTag) def fromIterable[B: ClassTag](coll: Iterable[B]): Array[B] = coll.toArray[B] - protected[this] def newBuilder = new ArrayBuffer[A].mapResult(_.toArray(elemTag)) + protected[this] def newBuilder = new ArrayBuffer[A].mapResult(_.toArray(using elemTag)) override def knownSize = xs.length diff --git a/tests/pos/extmethods.scala b/tests/pos/extmethods.scala index 368b4f439916..40683c56c694 100644 --- a/tests/pos/extmethods.scala +++ b/tests/pos/extmethods.scala @@ -17,7 +17,7 @@ object CollectionStrawMan { def elemTag: ClassTag[A] = ClassTag(xs.getClass.getComponentType) - protected[this] def newBuilder = new ArrayBuffer[A].mapResult(_.toArray(elemTag)) + protected[this] def newBuilder = new ArrayBuffer[A].mapResult(_.toArray(using elemTag)) } } diff --git a/tests/pos/given-loop-prevention.scala b/tests/pos/given-loop-prevention.scala deleted file mode 100644 index 0bae0bb24fed..000000000000 --- a/tests/pos/given-loop-prevention.scala +++ /dev/null @@ -1,14 +0,0 @@ -//> using options -Xfatal-warnings - -class Foo - -object Bar { - given Foo with {} - given List[Foo] = List(summon[Foo]) // ok -} - -object Baz { - @annotation.nowarn - given List[Foo] = List(summon[Foo]) // gives a warning, which is suppressed - given Foo with {} -} diff --git a/tests/pos/i17245.scala b/tests/pos/i17245.scala index 3b5b3a74108d..8609a8293670 100644 --- a/tests/pos/i17245.scala +++ b/tests/pos/i17245.scala @@ -14,7 +14,7 @@ type OnChannel = Channel => Any val case1: OnChannel = Mockito.mock[OnChannel] val case2: OnChannel = Mockito.mock val case3 = Mockito.mock[OnChannel] - val case4: OnChannel = Mockito.mock[OnChannel](summon[ClassTag[OnChannel]]) + val case4: OnChannel = Mockito.mock[OnChannel](using summon[ClassTag[OnChannel]]) // not a regressive case, but an added improvement with the fix for the above val case5: Channel => Any = Mockito.mock[Channel => Any] diff --git a/tests/pos/i9967.scala b/tests/pos/i9967.scala index 4e915a27bfbf..d8cbf99b9d6e 100644 --- a/tests/pos/i9967.scala +++ b/tests/pos/i9967.scala @@ -1,6 +1,6 @@ import collection.mutable class MaxSizeMap[K, V](maxSize: Int)(using o: Ordering[K]): - val sortedMap: mutable.TreeMap[K, V] = mutable.TreeMap.empty[K, V](o) + val sortedMap: mutable.TreeMap[K, V] = mutable.TreeMap.empty[K, V](using o) export sortedMap._ diff --git a/tests/pos/t5643.scala b/tests/pos/t5643.scala index 1ce34ba36226..9866f8d399c2 100644 --- a/tests/pos/t5643.scala +++ b/tests/pos/t5643.scala @@ -13,7 +13,7 @@ object TupledEvidenceTest { def f[T : GetResult] = "" - f[(String,String)](getTuple[(String, String)]) + f[(String,String)](using getTuple[(String, String)]) f[(String,String)] } diff --git a/tests/run/colltest6/CollectionStrawMan6_1.scala b/tests/run/colltest6/CollectionStrawMan6_1.scala index bed5c476b96d..0bf0cbddffc9 100644 --- 
a/tests/run/colltest6/CollectionStrawMan6_1.scala +++ b/tests/run/colltest6/CollectionStrawMan6_1.scala @@ -755,11 +755,11 @@ object CollectionStrawMan6 extends LowPriority { def elemTag: ClassTag[A] = ClassTag(xs.getClass.getComponentType) - protected def fromIterableWithSameElemType(coll: Iterable[A]): Array[A] = coll.toArray[A](elemTag) + protected def fromIterableWithSameElemType(coll: Iterable[A]): Array[A] = coll.toArray[A](using elemTag) def fromIterable[B: ClassTag](coll: Iterable[B]): Array[B] = coll.toArray[B] - protected[this] def newBuilder = new ArrayBuffer[A].mapResult(_.toArray(elemTag)) + protected[this] def newBuilder = new ArrayBuffer[A].mapResult(_.toArray(using elemTag)) override def knownSize = xs.length diff --git a/tests/run/i502.scala b/tests/run/i502.scala index 71176d9660cd..20ed1f43b840 100644 --- a/tests/run/i502.scala +++ b/tests/run/i502.scala @@ -6,13 +6,13 @@ object Test extends App { Array[Int](1, 2) try { - Array[Int](1, 2)(null) + Array[Int](1, 2)(using null) ??? } catch { case _: NullPointerException => println("Ok") } - Array[Int](1, 2)({println("foo"); summon[ClassTag[Int]]}) + Array[Int](1, 2)(using {println("foo"); summon[ClassTag[Int]]}) - Array[Int](1, 2)(ClassTag.apply({ println("bar"); classOf[Int]})) + Array[Int](1, 2)(using ClassTag.apply({ println("bar"); classOf[Int]})) } diff --git a/tests/run/t2029.scala b/tests/run/t2029.scala index d4ab0f02b67f..d5bc478fa0b3 100644 --- a/tests/run/t2029.scala +++ b/tests/run/t2029.scala @@ -5,7 +5,7 @@ object Test{ val mainSet = TreeSet(1 to 5 :_*) var compareCalled = false; - val smallerSet = TreeSet(2 to 4 :_*)(Ordering[Int].reverse) + val smallerSet = TreeSet(2 to 4 :_*)(using Ordering[Int].reverse) println(mainSet.mkString(",")) println(smallerSet.mkString(",")) diff --git a/tests/run/t3326.scala b/tests/run/t3326.scala index 3d7d83068f92..1f8c04394682 100644 --- a/tests/run/t3326.scala +++ b/tests/run/t3326.scala @@ -28,8 +28,8 @@ object Test { def testCollectionSorted(): Unit = { import collection.* val order = implicitly[Ordering[Int]].reverse - var m1: SortedMap[Int, String] = SortedMap.empty[Int, String](order) - var m2: SortedMap[Int, String] = SortedMap.empty[Int, String](order) + var m1: SortedMap[Int, String] = SortedMap.empty[Int, String](using order) + var m2: SortedMap[Int, String] = SortedMap.empty[Int, String](using order) m1 ++= List(1 -> "World") m1 ++= List(2 -> "Hello") @@ -49,8 +49,8 @@ object Test { def testImmutableSorted(): Unit = { import collection.immutable.* val order = implicitly[Ordering[Int]].reverse - var m1: SortedMap[Int, String] = SortedMap.empty[Int, String](order) - var m2: SortedMap[Int, String] = SortedMap.empty[Int, String](order) + var m1: SortedMap[Int, String] = SortedMap.empty[Int, String](using order) + var m2: SortedMap[Int, String] = SortedMap.empty[Int, String](using order) m1 += (1 -> "World") m1 += (2 -> "Hello") diff --git a/tests/semanticdb/expect/InventedNames.expect.scala b/tests/semanticdb/expect/InventedNames.expect.scala index 7c5b008209c2..b92e9aa940a7 100644 --- a/tests/semanticdb/expect/InventedNames.expect.scala +++ b/tests/semanticdb/expect/InventedNames.expect.scala @@ -32,7 +32,7 @@ given [T/*<-givens::InventedNames$package.given_Z_T#[T]*/]: Z/*->givens::Z#*/[T/ val a/*<-givens::InventedNames$package.a.*/ = intValue/*->givens::InventedNames$package.intValue.*/ val b/*<-givens::InventedNames$package.b.*/ = given_String/*->givens::InventedNames$package.given_String.*/ -val c/*<-givens::InventedNames$package.c.*/ = 
given_Double/*->givens::InventedNames$package.given_Double().*/ +//val c = given_Double val d/*<-givens::InventedNames$package.d.*/ = given_List_T/*->givens::InventedNames$package.given_List_T().*/[Int/*->scala::Int#*/] val e/*<-givens::InventedNames$package.e.*/ = given_Char/*->givens::InventedNames$package.given_Char.*/ val f/*<-givens::InventedNames$package.f.*/ = given_Float/*->givens::InventedNames$package.given_Float.*/ diff --git a/tests/semanticdb/expect/InventedNames.scala b/tests/semanticdb/expect/InventedNames.scala index 42c14c90e370..61baae46c832 100644 --- a/tests/semanticdb/expect/InventedNames.scala +++ b/tests/semanticdb/expect/InventedNames.scala @@ -32,7 +32,7 @@ given [T]: Z[T] with val a = intValue val b = given_String -val c = given_Double +//val c = given_Double val d = given_List_T[Int] val e = given_Char val f = given_Float diff --git a/tests/semanticdb/metac.expect b/tests/semanticdb/metac.expect index 84c3e7c6a110..98657f122255 100644 --- a/tests/semanticdb/metac.expect +++ b/tests/semanticdb/metac.expect @@ -2093,16 +2093,15 @@ Schema => SemanticDB v4 Uri => InventedNames.scala Text => empty Language => Scala -Symbols => 45 entries -Occurrences => 66 entries -Synthetics => 3 entries +Symbols => 44 entries +Occurrences => 64 entries +Synthetics => 2 entries Symbols: -givens/InventedNames$package. => final package object givens extends Object { self: givens.type => +24 decls } +givens/InventedNames$package. => final package object givens extends Object { self: givens.type => +23 decls } givens/InventedNames$package.`* *`. => final implicit lazy val given method * * Long givens/InventedNames$package.a. => val method a Int givens/InventedNames$package.b. => val method b String -givens/InventedNames$package.c. => val method c Double givens/InventedNames$package.d. => val method d List[Int] givens/InventedNames$package.e. => val method e Char givens/InventedNames$package.f. => val method f Float @@ -2193,8 +2192,6 @@ Occurrences: [32:8..32:16): intValue -> givens/InventedNames$package.intValue. [33:4..33:5): b <- givens/InventedNames$package.b. [33:8..33:20): given_String -> givens/InventedNames$package.given_String. -[34:4..34:5): c <- givens/InventedNames$package.c. -[34:8..34:20): given_Double -> givens/InventedNames$package.given_Double(). [35:4..35:5): d <- givens/InventedNames$package.d. [35:8..35:20): given_List_T -> givens/InventedNames$package.given_List_T(). 
[35:21..35:24): Int -> scala/Int# @@ -2214,7 +2211,6 @@ Occurrences: Synthetics: [24:0..24:0): => *(x$1) -[34:8..34:20):given_Double => *(intValue) [40:8..40:15):given_Y => *(given_X) expect/Issue1749.scala diff --git a/tests/warn/context-bounds-migration.scala b/tests/warn/context-bounds-migration.scala deleted file mode 100644 index cdd3eca62b5c..000000000000 --- a/tests/warn/context-bounds-migration.scala +++ /dev/null @@ -1,9 +0,0 @@ - -class C[T] -def foo[X: C] = () - -given [T]: C[T] = C[T]() - -def Test = - foo(C[Int]()) // warning - foo(using C[Int]()) // ok diff --git a/tests/warn/i15474.scala b/tests/warn/i15474.scala index d7c41130a1bb..0d8fc111ac6a 100644 --- a/tests/warn/i15474.scala +++ b/tests/warn/i15474.scala @@ -1,4 +1,4 @@ - +//> using options -source 3.4 import scala.language.implicitConversions diff --git a/tests/warn/looping-givens.check b/tests/warn/looping-givens.check new file mode 100644 index 000000000000..eec348c19d11 --- /dev/null +++ b/tests/warn/looping-givens.check @@ -0,0 +1,45 @@ +-- Warning: tests/warn/looping-givens.scala:9:22 ----------------------------------------------------------------------- +9 | given aa: A = summon // warn + | ^ + | Result of implicit search for A & B will change. + | Current result ab will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: a. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that ab comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. +-- Warning: tests/warn/looping-givens.scala:10:22 ---------------------------------------------------------------------- +10 | given bb: B = summon // warn + | ^ + | Result of implicit search for A & B will change. + | Current result ab will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: b. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that ab comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. +-- Warning: tests/warn/looping-givens.scala:11:28 ---------------------------------------------------------------------- +11 | given ab: (A & B) = summon // warn + | ^ + | Result of implicit search for A & B will change. + | Current result ab will be no longer eligible + | because it is not defined before the search position. + | Result with new rules: joint. + | To opt into the new rules, compile with `-source future` or use + | the `scala.language.future` language import. + | + | To fix the problem without the language import, you could try one of the following: + | - use a `given ... with` clause as the enclosing given, + | - rearrange definitions so that ab comes earlier, + | - use an explicit argument. + | This will be an error in Scala 3.5 and later. 
diff --git a/tests/warn/looping-givens.scala b/tests/warn/looping-givens.scala index 6b6a32002331..2f737206f64e 100644 --- a/tests/warn/looping-givens.scala +++ b/tests/warn/looping-givens.scala @@ -1,3 +1,5 @@ +//> using options -source 3.4 + class A class B From 6c75005b627512f6aeee96120b19862f94bd501b Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Tue, 11 Jun 2024 14:52:24 +0100 Subject: [PATCH 268/371] Disable ClasspathTests.unglobClasspathVerifyTest (#20551) cc @bishabosha @Gedochao [test_scala2_library_tasty] [test_windows_full] [test_java8] --- compiler/test/dotty/tools/scripting/ClasspathTests.scala | 1 + 1 file changed, 1 insertion(+) diff --git a/compiler/test/dotty/tools/scripting/ClasspathTests.scala b/compiler/test/dotty/tools/scripting/ClasspathTests.scala index 5107af5eee43..24c6c297a777 100755 --- a/compiler/test/dotty/tools/scripting/ClasspathTests.scala +++ b/compiler/test/dotty/tools/scripting/ClasspathTests.scala @@ -77,6 +77,7 @@ class ClasspathTests: /* * verify classpath is unglobbed by MainGenericRunner. */ + @Ignore @Test def unglobClasspathVerifyTest = { val testScriptName = "unglobClasspath.sc" val testScript = scripts("/scripting").find { _.name.matches(testScriptName) } match From aac98c9df002ae061fc95f45d6568035c7be7e4e Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Fri, 14 Jun 2024 09:30:43 +0100 Subject: [PATCH 269/371] Adapt the release workflow to SIP-46 (#20565) --- .github/workflows/ci.yaml | 185 ++++++++++++++++++++++++++++++++++++-- 1 file changed, 176 insertions(+), 9 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index b606e6ae1732..92df4a190ec7 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -748,13 +748,35 @@ jobs: - name: Add SBT proxy repositories run: cp -vf .github/workflows/repositories /root/.sbt/ ; true - - - name: Prepare Release - run: | + # Extract the release tag + - name: Extract the release tag + run : echo "RELEASE_TAG=${GITHUB_REF#*refs/tags/}" >> $GITHUB_ENV + # BUILD THE SDKs + - name: Build and pack the SDK (universal) + run : | ./project/scripts/sbt dist/packArchive sha256sum dist/target/scala3-* > dist/target/sha256sum.txt - echo "RELEASE_TAG=${GITHUB_REF#*refs/tags/}" >> $GITHUB_ENV - + - name: Build and pack the SDK (linux x86-64) + run : | + ./project/scripts/sbt dist-linux-x86_64/packArchive + sha256sum dist/linux-x86_64/target/scala3-* > dist/linux-x86_64/target/sha256sum.txt + - name: Build and pack the SDK (linux aarch64) + run : | + ./project/scripts/sbt dist-linux-aarch64/packArchive + sha256sum dist/linux-aarch64/target/scala3-* > dist/linux-aarch64/target/sha256sum.txt + - name: Build and pack the SDK (mac x86-64) + run : | + ./project/scripts/sbt dist-mac-x86_64/packArchive + sha256sum dist/mac-x86_64/target/scala3-* > dist/mac-x86_64/target/sha256sum.txt + - name: Build and pack the SDK (mac aarch64) + run : | + ./project/scripts/sbt dist-mac-aarch64/packArchive + sha256sum dist/mac-aarch64/target/scala3-* > dist/mac-aarch64/target/sha256sum.txt + - name: Build and pack the SDK (win x86-64) + run : | + ./project/scripts/sbt dist-win-x86_64/packArchive + sha256sum dist/win-x86_64/target/scala3-* > dist/win-x86_64/target/sha256sum.txt + # Create the GitHub release - name: Create GitHub Release id: create_gh_release uses: actions/create-release@latest @@ -767,7 +789,7 @@ jobs: draft: true prerelease: ${{ contains(env.RELEASE_TAG, '-') }} - - name: Upload zip archive to GitHub Release + - name: Upload zip archive to GitHub Release (universal) uses: 
actions/upload-release-asset@v1 env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} @@ -776,8 +798,7 @@ jobs: asset_path: ./dist/target/scala3-${{ env.RELEASE_TAG }}.zip asset_name: scala3-${{ env.RELEASE_TAG }}.zip asset_content_type: application/zip - - - name: Upload tar.gz archive to GitHub Release + - name: Upload tar.gz archive to GitHub Release (universal) uses: actions/upload-release-asset@v1 env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} @@ -787,7 +808,103 @@ jobs: asset_name: scala3-${{ env.RELEASE_TAG }}.tar.gz asset_content_type: application/gzip - - name: Upload SHA256 sum of the release artefacts to GitHub Release + - name: Upload zip archive to GitHub Release (linux x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/linux-x86_64/target/scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip + asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip + asset_content_type: application/zip + - name: Upload tar.gz archive to GitHub Release (linux x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/linux-x86_64/target/scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz + asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz + asset_content_type: application/gzip + + - name: Upload zip archive to GitHub Release (linux aarch64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/linux-aarch64/target/scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip + asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip + asset_content_type: application/zip + - name: Upload tar.gz archive to GitHub Release (linux aarch64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/linux-aarch64/target/scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz + asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz + asset_content_type: application/gzip + + - name: Upload zip archive to GitHub Release (mac x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/mac-x86_64/target/scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip + asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip + asset_content_type: application/zip + - name: Upload tar.gz archive to GitHub Release (mac x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/mac-x86_64/target/scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz + asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz + asset_content_type: application/gzip + + - name: Upload zip archive to GitHub Release (mac aarch64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/mac-aarch64/target/scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip + asset_name: scala3-${{ env.RELEASE_TAG 
}}-aarch64-apple-darwin.zip + asset_content_type: application/zip + - name: Upload tar.gz archive to GitHub Release (mac aarch64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/mac-aarch64/target/scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz + asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz + asset_content_type: application/gzip + + - name: Upload zip archive to GitHub Release (win x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/win-x86_64/target/scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip + asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip + asset_content_type: application/zip + - name: Upload tar.gz archive to GitHub Release (win x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/win-x86_64/target/scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz + asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz + asset_content_type: application/gzip + + + - name: Upload SHA256 sum of the release artefacts to GitHub Release (universal) uses: actions/upload-release-asset@v1 env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} @@ -797,6 +914,56 @@ jobs: asset_name: sha256sum.txt asset_content_type: text/plain + - name: Upload SHA256 sum of the release artefacts to GitHub Release (linux x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/linux-x86_64/target/sha256sum.txt + asset_name: sha256sum-x86_64-pc-linux.txt + asset_content_type: text/plain + + - name: Upload SHA256 sum of the release artefacts to GitHub Release (linux aarch64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/linux-aarch64/target/sha256sum-aarch64-pc-linux.txt + asset_name: sha256sum.txt + asset_content_type: text/plain + + - name: Upload SHA256 sum of the release artefacts to GitHub Release (mac x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/mac-x86_64/target/sha256sum.txt + asset_name: sha256sum-x86_64-apple-darwin.txt + asset_content_type: text/plain + + - name: Upload SHA256 sum of the release artefacts to GitHub Release (mac aarch64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/mac-aarch64/target/sha256sum.txt + asset_name: sha256sum-aarch64-apple-darwin.txt + asset_content_type: text/plain + + - name: Upload SHA256 sum of the release artefacts to GitHub Release (win x86-64) + uses: actions/upload-release-asset@v1 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + upload_url: ${{ steps.create_gh_release.outputs.upload_url }} + asset_path: ./dist/win-x86_64/target/sha256sum.txt + asset_name: sha256sum-x86_64-pc-win32.txt + asset_content_type: text/plain + - name: Publish Release run: 
./project/scripts/sbtPublish ";project scala3-bootstrapped ;publishSigned ;sonatypeBundleRelease" From edbb7c4fcde5e53c43dcb508d64b82f8902c5449 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 17 Jun 2024 13:13:45 +0200 Subject: [PATCH 270/371] Adapts the workflow to the changes in #20351 --- .github/workflows/publish-sdkman.yml | 69 +++++++++++++++++++++ .github/workflows/releases.yml | 57 ++++++++--------- .github/workflows/scripts/publish-sdkman.sh | 50 --------------- 3 files changed, 98 insertions(+), 78 deletions(-) create mode 100644 .github/workflows/publish-sdkman.yml delete mode 100755 .github/workflows/scripts/publish-sdkman.sh diff --git a/.github/workflows/publish-sdkman.yml b/.github/workflows/publish-sdkman.yml new file mode 100644 index 000000000000..2126a3237d83 --- /dev/null +++ b/.github/workflows/publish-sdkman.yml @@ -0,0 +1,69 @@ +################################################################################################### +### THIS IS A REUSABLE WORKFLOW TO PUBLISH SCALA TO SDKMAN! ### +### HOW TO USE: ### +### - THE RELEASE WORKFLOW SHOULD CALL THIS WORKFLOW ### +### - IT WILL PUBLISH TO SDKMAN! THE BINARIES TO EACH SUPPORTED PLATFORM AND A UNIVERSAL JAR ### +### - IT CHANGES THE DEFAULT VERSION IN SDKMAN! ### +### ### +### NOTE: ### +### - WE SHOULD KEEP IN SYNC THE NAME OF THE ARCHIVES WITH THE ACTUAL BUILD ### +### - WE SHOULD KEEP IN SYNC THE URL OF THE RELEASE ### +################################################################################################### + + +name: Publish Scala to SDKMAN! +run-name: Publish Scala ${{ inputs.version }} to SDKMAN! + +on: + workflow_call: + inputs: + version: + required: true + type: string + secrets: + CONSUMER-KEY: + required: true + CONSUMER-TOKEN: + required: true + +env: + RELEASE-URL: 'https://github.com/scala/scala3/releases/download/${{ inputs.version }}' + +jobs: + publish: + runs-on: ubuntu-latest + strategy: + matrix: + include: + - platform: LINUX_64 + archive : 'scala3-${{ inputs.version }}-x86_64-pc-linux.tar.gz' + - platform: LINUX_ARM64 + archive : 'scala3-${{ inputs.version }}-aarch64-pc-linux.tar.gz' + - platform: MAC_OSX + archive : 'scala3-${{ inputs.version }}-x86_64-apple-darwin.tar.gz' + - platform: MAC_ARM64 + archive : 'scala3-${{ inputs.version }}-aarch64-apple-darwin.tar.gz' + - platform: WINDOWS_64 + archive : 'scala3-${{ inputs.version }}-x86_64-pc-win32.tar.gz' + - platform: UNIVERSAL + archive : 'scala3-${{ inputs.version }}.zip' + steps: + - uses: hamzaremmal/sdkman-release-action@7e437233a6bd79bc4cb0fa9071b685e94bdfdba6 + with: + CONSUMER-KEY : ${{ secrets.CONSUMER-KEY }} + CONSUMER-TOKEN : ${{ secrets.CONSUMER-TOKEN }} + CANDIDATE : scala + VERSION : ${{ inputs.version }} + URL : '${{ env.RELEASE-URL }}/${{ matrix.archive }}' + PLATFORM : ${{ matrix.platform }} + + default: + runs-on: ubuntu-latest + needs: publish + steps: + - uses: hamzaremmal/sdkman-default-action@866bc79fc5bd397eeb48f9cedda2f15221c8515d + with: + CONSUMER-KEY : ${{ secrets.CONSUMER-KEY }} + CONSUMER-TOKEN : ${{ secrets.CONSUMER-TOKEN }} + CANDIDATE : scala + VERSION : ${{ inputs.version }} diff --git a/.github/workflows/releases.yml b/.github/workflows/releases.yml index dde8b0372d52..4b75dd1b737d 100644 --- a/.github/workflows/releases.yml +++ b/.github/workflows/releases.yml @@ -1,32 +1,33 @@ -name: Releases +################################################################################################### +### OFFICIAL RELEASE WORKFLOW ### +### HOW TO USE: ### +### - THIS WORKFLOW WILL NEED TO 
BE TRIGGERED MANUALLY ### +### ### +### NOTE: ### +### - THIS WORKFLOW SHOULD ONLY BE RUN ON STABLE RELEASES ### +### - IT ASSUMES THAT THE PRE-RELEASE WORKFLOW WAS PREVIOUSLY EXECUTED ### +### ### +################################################################################################### + +name: Official release of Scala +run-name: Official release of Scala ${{ inputs.version }} + on: workflow_dispatch: - -permissions: - contents: read + inputs: + version: + description: 'The version to officially release' + required: true + type: string jobs: - publish_release: - runs-on: [self-hosted, Linux] - container: - image: lampepfl/dotty:2021-03-22 - options: --cpu-shares 4096 - - env: - SDKMAN_KEY: ${{ secrets.SDKMAN_KEY }} - SDKMAN_TOKEN: ${{ secrets.SDKMAN_TOKEN }} - - steps: - - name: Reset existing repo - run: | - git config --global --add safe.directory /__w/dotty/dotty - git -c "http.https://github.com/.extraheader=" fetch --recurse-submodules=no "https://github.com/lampepfl/dotty" && git reset --hard FETCH_HEAD || true - - - name: Cleanup - run: .github/workflows/cleanup.sh - - - name: Git Checkout - uses: actions/checkout@v4 - - - name: Publish to SDKMAN - run: .github/workflows/scripts/publish-sdkman.sh + # TODO: ADD JOB TO SWITCH THE GITHUB RELEASE FROM DRAFT TO LATEST + publish-sdkman: + uses: ./.github/workflows/publish-sdkman.yml + with: + version: ${{ inputs.version }} + secrets: + CONSUMER-KEY: ${{ secrets.SDKMAN_KEY }} + CONSUMER-TOKEN: ${{ secrets.SDKMAN_TOKEN }} + + # TODO: ADD RELEASE WORKFLOW TO CHOCOLATEY AND OTHER PACKAGE MANAGERS HERE \ No newline at end of file diff --git a/.github/workflows/scripts/publish-sdkman.sh b/.github/workflows/scripts/publish-sdkman.sh deleted file mode 100755 index f959c426e9d8..000000000000 --- a/.github/workflows/scripts/publish-sdkman.sh +++ /dev/null @@ -1,50 +0,0 @@ -#!/usr/bin/env bash - -# This is script for publishing scala on SDKMAN. -# Script resolves the latest stable version of scala and then send REST request to SDKMAN Vendor API. -# It's releasing and announcing the release of scala on SDKMAN. -# -# Requirement: -# - the latest stable version of scala should be available in github artifacts - -set -u - -# latest stable dotty version -DOTTY_VERSION=$(curl -s https://api.github.com/repos/scala/scala3/releases/latest | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/') -DOTTY_URL="https://github.com/scala/scala3/releases/download/$DOTTY_VERSION/scala3-$DOTTY_VERSION.zip" - -# checking if dotty version is available -if ! curl --output /dev/null --silent --head --fail "$DOTTY_URL"; then - echo "URL doesn't exist: $DOTTY_URL" - exit 1 -fi - -# Release a new Candidate Version -curl --silent --show-error --fail \ - -X POST \ - -H "Consumer-Key: $SDKMAN_KEY" \ - -H "Consumer-Token: $SDKMAN_TOKEN" \ - -H "Content-Type: application/json" \ - -H "Accept: application/json" \ - -d '{"candidate": "scala", "version": "'"$DOTTY_VERSION"'", "url": "'"$DOTTY_URL"'"}' \ - https://vendors.sdkman.io/release - -if [[ $? -ne 0 ]]; then - echo "Fail sending POST request to releasing scala on SDKMAN." - exit 1 -fi - -# Set DOTTY_VERSION as Default for Candidate -curl --silent --show-error --fail \ - -X PUT \ - -H "Consumer-Key: $SDKMAN_KEY" \ - -H "Consumer-Token: $SDKMAN_TOKEN" \ - -H "Content-Type: application/json" \ - -H "Accept: application/json" \ - -d '{"candidate": "scala", "version": "'"$DOTTY_VERSION"'"}' \ - https://vendors.sdkman.io/default - -if [[ $? 
-ne 0 ]]; then - echo "Fail sending PUT request to announcing the release of scala on SDKMAN." - exit 1 -fi From e005369f41f05bc2224650958207480667329b4e Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 28 May 2024 20:38:39 +0200 Subject: [PATCH 271/371] Avoid stacked thisCall contexts AddImplicitArgs can recursively add several implicit parameter lists. We need to make sure we don't perform a thisCallContext search in another thisCall context in this case. Fixes #20483 The original code would back out further and further in the context chain for every implicit parameter section on the secondary constructor. Eventually (in this case after 3 times) bad things happen. --- .../src/dotty/tools/dotc/core/Contexts.scala | 2 +- .../dotty/tools/dotc/typer/Implicits.scala | 2 +- .../src/dotty/tools/dotc/typer/Typer.scala | 27 ++++++++++++------- tests/pos/i20483.scala | 13 +++++++++ 4 files changed, 32 insertions(+), 12 deletions(-) create mode 100644 tests/pos/i20483.scala diff --git a/compiler/src/dotty/tools/dotc/core/Contexts.scala b/compiler/src/dotty/tools/dotc/core/Contexts.scala index a5b0e2dba254..79a0b279aefe 100644 --- a/compiler/src/dotty/tools/dotc/core/Contexts.scala +++ b/compiler/src/dotty/tools/dotc/core/Contexts.scala @@ -477,7 +477,7 @@ object Contexts { /** Is the flexible types option set? */ def flexibleTypes: Boolean = base.settings.YexplicitNulls.value && !base.settings.YnoFlexibleTypes.value - + /** Is the best-effort option set? */ def isBestEffort: Boolean = base.settings.YbestEffort.value diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 54821444aed6..74bd59d4992f 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1067,7 +1067,7 @@ trait Implicits: trace(s"search implicit ${pt.show}, arg = ${argument.show}: ${argument.tpe.show}", implicits, show = true) { record("inferImplicit") assert(ctx.phase.allowsImplicitSearch, - if (argument.isEmpty) i"missing implicit parameter of type $pt after typer at phase ${ctx.phase.phaseName}" + if (argument.isEmpty) i"missing implicit parameter of type $pt after typer at phase ${ctx.phase}" else i"type error: ${argument.tpe} does not conform to $pt${err.whyNoMatchStr(argument.tpe, pt)}") val usableForInference = pt.exists && !pt.unusableForInference diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index ae50d626cb1f..ae62ebbc4a3f 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -4058,7 +4058,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def dummyArg(tp: Type) = untpd.Ident(nme.???).withTypeUnchecked(tp) - def addImplicitArgs(using Context) = { + val origCtx = ctx + + def addImplicitArgs(using Context) = def hasDefaultParams = methPart(tree).symbol.hasDefaultParams def implicitArgs(formals: List[Type], argIndex: Int, pt: Type): List[Tree] = formals match case Nil => Nil @@ -4181,15 +4183,20 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case _ => retyped else issueErrors(tree, args) } - else tree match { - case tree: Block => - readaptSimplified(tpd.Block(tree.stats, tpd.Apply(tree.expr, args))) - case tree: NamedArg => - readaptSimplified(tpd.NamedArg(tree.name, tpd.Apply(tree.arg, args))) - case _ => - readaptSimplified(tpd.Apply(tree, args)) - } - } + else + inContext(origCtx): + // Reset context in case it was 
set to a supercall context before. + // otherwise the invariant for taking another this or super call context is not met. + // Test case is i20483.scala + tree match + case tree: Block => + readaptSimplified(tpd.Block(tree.stats, tpd.Apply(tree.expr, args))) + case tree: NamedArg => + readaptSimplified(tpd.NamedArg(tree.name, tpd.Apply(tree.arg, args))) + case _ => + readaptSimplified(tpd.Apply(tree, args)) + end addImplicitArgs + pt.revealIgnored match { case pt: FunProto if pt.applyKind == ApplyKind.Using => // We can end up here if extension methods are called with explicit given arguments. diff --git a/tests/pos/i20483.scala b/tests/pos/i20483.scala new file mode 100644 index 000000000000..a01a77327181 --- /dev/null +++ b/tests/pos/i20483.scala @@ -0,0 +1,13 @@ + +class Foo + (x: Option[String]) + (using Boolean) + (using Int) + (using Double): + + def this + (x: String) + (using Boolean) + (using Int) + (using Double) = + this(Some(x)) \ No newline at end of file From 665bd20cd3ea0b5c69faa86e272cb1cf6bbf15a3 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 11 Jun 2024 18:42:42 +0900 Subject: [PATCH 272/371] Bundle scala cli in scala command (#20351) fixes #20098 Proposed changes to zip/targz archive: - in the `/bin` directory store an extra launcher for Scala CLI (either JAR, or native per platform). - `/bin/scala[.bat]` is modified to invoke Scala CLI stored in `/bin` - new `/maven2` directory, which stores all the Jars and POM files necessary (in maven repo style) for scala-cli to invoke scala compiler offline (using the `-r` launcher option). - CHOICE: either replace jar files in `/lib` by aliases to the corresponding jar in `/maven2`, OR delete `/lib` and update references from scripts. (Looks like symlinks are not portable, so probably we should encode the classpath in a file, or adjust slightly how we build the toolchain) - add platform specific suffixes to artefacts: - e.g. 
`scala-3.5.0-x86_64-pc-linux.tar.gz` (for the artefact that bundles the x64 linux launcher) --------- Co-authored-by: Hamza REMMAL --- .github/workflows/ci.yaml | 14 +- .github/workflows/launchers.yml | 96 +++++ bin/common | 9 +- bin/common-platform | 63 +++ bin/scala | 35 +- bin/scalac | 2 +- bin/scaladoc | 2 +- build.sbt | 5 + .../src/dotty/tools/MainGenericRunner.scala | 2 +- .../scripting/argfileClasspath.sc | 9 - ...hReport.sc => classpathReport_scalacli.sc} | 2 +- .../scripting/cpArgumentsFile.txt | 1 - compiler/test-resources/scripting/envtest.sc | 2 + .../scripting/envtest_scalacli.sc | 3 + compiler/test-resources/scripting/hashBang.sc | 2 +- .../test-resources/scripting/hashBang.scala | 4 +- .../test-resources/scripting/scriptName.scala | 2 +- .../test-resources/scripting/scriptPath.sc | 2 +- .../scripting/scriptPath_scalacli.sc | 13 + compiler/test-resources/scripting/showArgs.sc | 2 +- .../scripting/showArgs_scalacli.sc | 7 + .../test-resources/scripting/sqlDateError.sc | 2 +- .../scripting/sqlDateError_scalacli.sc | 6 + .../test-resources/scripting/touchFile.sc | 2 +- .../scripting/unglobClasspath.sc | 6 - .../scripting/unglobClasspath_scalacli.sc | 9 + .../test/dotty/tools/io/ClasspathTest.scala | 4 +- .../tools/scripting/BashExitCodeTests.scala | 10 +- .../tools/scripting/BashScriptsTests.scala | 20 +- .../tools/scripting/ClasspathTests.scala | 18 +- .../tools/scripting/ExpressionTest.scala | 4 + .../dotty/tools/scripting/ScriptTestEnv.scala | 67 ++- .../tools/scripting/ScriptingTests.scala | 6 +- compiler/test/dotty/tools/utils.scala | 13 +- dist/bin-native-overrides/cli-common-platform | 16 + .../cli-common-platform.bat | 18 + dist/bin/cli-common-platform | 3 + dist/bin/cli-common-platform.bat | 5 + dist/bin/common | 132 +----- dist/bin/common-shared | 139 +++++++ dist/bin/scala | 10 +- dist/bin/scala.bat | 20 +- project/Build.scala | 80 +++- project/RepublishPlugin.scala | 388 +++++++++++++----- project/scripts/bootstrappedOnlyCmdTests | 17 +- project/scripts/buildScalaBinary | 12 + project/scripts/cmdTestsCommon.inc.sh | 17 + project/scripts/echoArgs.sc | 6 + project/scripts/native-integration/bashTests | 84 ++++ .../reportScalaVersion.scala | 4 + .../scripts/native-integration/winTests.bat | 19 + project/scripts/winCmdTests | 2 +- project/scripts/winCmdTests.bat | 2 +- .../src/main/scala/a/zz.scala | 6 + 54 files changed, 1088 insertions(+), 336 deletions(-) create mode 100644 .github/workflows/launchers.yml create mode 100755 bin/common-platform delete mode 100755 compiler/test-resources/scripting/argfileClasspath.sc rename compiler/test-resources/scripting/{classpathReport.sc => classpathReport_scalacli.sc} (91%) delete mode 100755 compiler/test-resources/scripting/cpArgumentsFile.txt create mode 100755 compiler/test-resources/scripting/envtest_scalacli.sc create mode 100755 compiler/test-resources/scripting/scriptPath_scalacli.sc create mode 100755 compiler/test-resources/scripting/showArgs_scalacli.sc create mode 100755 compiler/test-resources/scripting/sqlDateError_scalacli.sc delete mode 100755 compiler/test-resources/scripting/unglobClasspath.sc create mode 100755 compiler/test-resources/scripting/unglobClasspath_scalacli.sc create mode 100644 dist/bin-native-overrides/cli-common-platform create mode 100644 dist/bin-native-overrides/cli-common-platform.bat create mode 100644 dist/bin/cli-common-platform create mode 100644 dist/bin/cli-common-platform.bat create mode 100644 dist/bin/common-shared create mode 100755 project/scripts/buildScalaBinary create mode 
100644 project/scripts/echoArgs.sc create mode 100755 project/scripts/native-integration/bashTests create mode 100644 project/scripts/native-integration/reportScalaVersion.scala create mode 100755 project/scripts/native-integration/winTests.bat create mode 100644 tests/cmdTest-sbt-tests/sourcepath-with-inline/src/main/scala/a/zz.scala diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index 92df4a190ec7..cad7caec490d 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -141,7 +141,8 @@ jobs: - name: Cmd Tests run: | - ./project/scripts/sbt ";dist/pack; scala3-bootstrapped/compile; scala3-bootstrapped/test ;sbt-test/scripted scala2-compat/*; scala3-compiler-bootstrapped/scala3CompilerCoursierTest:test" + ./project/scripts/buildScalaBinary + ./project/scripts/sbt ";scala3-bootstrapped/compile ;scala3-bootstrapped/test ;sbt-test/scripted scala2-compat/* ;scala3-compiler-bootstrapped/scala3CompilerCoursierTest:test" ./project/scripts/cmdTests ./project/scripts/bootstrappedOnlyCmdTests @@ -221,7 +222,7 @@ jobs: shell: cmd - name: build binary - run: sbt "dist/pack" & bash -version + run: sbt "dist-win-x86_64/pack" & bash -version shell: cmd - name: cygwin tests @@ -254,8 +255,12 @@ jobs: - name: Git Checkout uses: actions/checkout@v4 + - name: build binary + run: sbt "dist-win-x86_64/pack" + shell: cmd + - name: Test - run: sbt ";dist/pack ;scala3-bootstrapped/compile ;scala3-bootstrapped/test" + run: sbt ";scala3-bootstrapped/compile ;scala3-bootstrapped/test" shell: cmd - name: Scala.js Test @@ -581,7 +586,8 @@ jobs: - name: Test run: | - ./project/scripts/sbt ";dist/pack ;scala3-bootstrapped/compile ;scala3-bootstrapped/test ;sbt-test/scripted scala2-compat/*" + ./project/scripts/buildScalaBinary + ./project/scripts/sbt ";scala3-bootstrapped/compile ;scala3-bootstrapped/test ;sbt-test/scripted scala2-compat/*" ./project/scripts/cmdTests ./project/scripts/bootstrappedOnlyCmdTests diff --git a/.github/workflows/launchers.yml b/.github/workflows/launchers.yml new file mode 100644 index 000000000000..818e3b72b06b --- /dev/null +++ b/.github/workflows/launchers.yml @@ -0,0 +1,96 @@ +name: Test CLI Launchers on all the platforms +on: + pull_request: + workflow_dispatch: + +jobs: + linux-x86_64: + name: Deploy and Test on Linux x64 architecture + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - name: Set up JDK 17 + uses: actions/setup-java@v4 + with: + java-version: '17' + distribution: 'temurin' + cache: 'sbt' + - name: Build and test launcher command + run: ./project/scripts/native-integration/bashTests + env: + LAUNCHER_EXPECTED_PROJECT: "dist-linux-x86_64" + + linux-aarch64: + name: Deploy and Test on Linux ARM64 architecture + runs-on: macos-latest + if: ${{ false }} + steps: + - uses: actions/checkout@v4 + - name: Set up JDK 17 + uses: actions/setup-java@v4 + with: + java-version: '17' + distribution: 'temurin' + cache: 'sbt' + # https://github.com/actions/runner-images/issues/9369 + - name: Install sbt + run: brew install sbt + - name: Build and test launcher command + run: ./project/scripts/native-integration/bashTests + env: + LAUNCHER_EXPECTED_PROJECT: "dist-linux-aarch64" + + mac-x86_64: + name: Deploy and Test on Mac x64 architecture + runs-on: macos-13 + steps: + - uses: actions/checkout@v4 + - name: Set up JDK 17 + uses: actions/setup-java@v4 + with: + java-version: '17' + distribution: 'temurin' + cache: 'sbt' + # https://github.com/actions/runner-images/issues/9369 + - name: Install sbt + run: brew install sbt + - 
name: Build and test launcher command + run: ./project/scripts/native-integration/bashTests + env: + LAUNCHER_EXPECTED_PROJECT: "dist-mac-x86_64" + + mac-aarch64: + name: Deploy and Test on Mac ARM64 architecture + runs-on: macos-latest + steps: + - uses: actions/checkout@v4 + - name: Set up JDK 17 + uses: actions/setup-java@v4 + with: + java-version: '17' + distribution: 'temurin' + cache: 'sbt' + # https://github.com/actions/runner-images/issues/9369 + - name: Install sbt + run: brew install sbt + - name: Build and test launcher command + run: ./project/scripts/native-integration/bashTests + env: + LAUNCHER_EXPECTED_PROJECT: "dist-mac-aarch64" + + win-x86_64: + name: Deploy and Test on Windows x64 architecture + runs-on: windows-latest + steps: + - uses: actions/checkout@v4 + - name: Set up JDK 17 + uses: actions/setup-java@v4 + with: + java-version: '17' + distribution: 'temurin' + cache: 'sbt' + - name: Build the launcher command + run: sbt "dist-win-x86_64/pack" + - name: Run the launcher command tests + run: './project/scripts/native-integration/winTests.bat' + shell: cmd diff --git a/bin/common b/bin/common index 7d3aa7148265..37b2ebd1ff93 100755 --- a/bin/common +++ b/bin/common @@ -9,15 +9,18 @@ target="$1" shift # Mutates $@ by deleting the first element ($1) +# set the $DIST_PROJECT and $DIST_DIR variables +source "$ROOT/bin/common-platform" + # Marker file used to obtain the date of latest call to sbt-back -version="$ROOT/dist/target/pack/VERSION" +version="$ROOT/$DIST_DIR/target/pack/VERSION" # Create the target if absent or if file changed in ROOT/compiler new_files="$(find "$ROOT/compiler" \( -iname "*.scala" -o -iname "*.java" \) -newer "$version" 2> /dev/null)" if [ ! -f "$version" ] || [ ! -z "$new_files" ]; then echo "Building Dotty..." - (cd $ROOT && sbt "dist/pack") + (cd $ROOT && sbt "$DIST_PROJECT/pack") fi -"$target" "$@" +"$ROOT/$DIST_DIR/target/pack/bin/$target" "$@" diff --git a/bin/common-platform b/bin/common-platform new file mode 100755 index 000000000000..648e0195e7e6 --- /dev/null +++ b/bin/common-platform @@ -0,0 +1,63 @@ +#!/usr/bin/env bash + +unset cygwin mingw msys darwin + +# COLUMNS is used together with command line option '-pageWidth'. 
+if command -v tput >/dev/null 2>&1; then + export COLUMNS="$(tput -Tdumb cols)" +fi + +case "`uname`" in + CYGWIN*) cygwin=true + ;; + MINGW*) mingw=true + ;; + MSYS*) msys=true + ;; + Darwin*) darwin=true + ;; +esac + +unset DIST_PROJECT DIST_DIR + +if [[ ${cygwin-} || ${mingw-} || ${msys-} ]]; then + DIST_PROJECT="dist-win-x86_64" + DIST_DIR="dist/win-x86_64" +else + # OS and arch logic taken from https://github.com/VirtusLab/scala-cli/blob/main/scala-cli.sh + unset arch ARCH_NORM + arch=$(uname -m) + if [[ "$arch" == "aarch64" ]] || [[ "$arch" == "x86_64" ]]; then + ARCH_NORM="$arch" + elif [[ "$arch" == "amd64" ]]; then + ARCH_NORM="x86_64" + elif [[ "$arch" == "arm64" ]]; then + ARCH_NORM="aarch64" + else + ARCH_NORM="unknown" + fi + + if [ "$(expr substr $(uname -s) 1 5 2>/dev/null)" == "Linux" ]; then + if [[ "$ARCH_NORM" == "unknown" ]]; then + echo >&2 "unknown Linux CPU architecture, defaulting to JVM launcher" + DIST_PROJECT="dist" + DIST_DIR="dist" + else + DIST_PROJECT="dist-linux-$ARCH_NORM" + DIST_DIR="dist/linux-$ARCH_NORM" + fi + elif [ "$(uname)" == "Darwin" ]; then + if [[ "$ARCH_NORM" == "unknown" ]]; then + echo >&2 "unknown Darwin CPU architecture, defaulting to JVM launcher" + DIST_PROJECT="dist" + DIST_DIR="dist" + else + DIST_PROJECT="dist-mac-$ARCH_NORM" + DIST_DIR="dist/mac-$ARCH_NORM" + fi + else + echo >&2 "unknown OS, defaulting to JVM launcher" + DIST_PROJECT="dist" + DIST_DIR="dist" + fi +fi diff --git a/bin/scala b/bin/scala index 85c1ac91d08f..e87c4391806b 100755 --- a/bin/scala +++ b/bin/scala @@ -2,4 +2,37 @@ ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" >& /dev/null && pwd)/.." -"$ROOT/bin/common" "$ROOT/dist/target/pack/bin/scala" "--power" "$@" "--offline" "--server=false" +scala_args() { + + declare -a CLI_ARGS + declare -a SCRIPT_ARGS + declare DISABLE_BLOOP=1 + + while (( "$#" )); do + case "$1" in + "--") + shift + SCRIPT_ARGS+=("--") + SCRIPT_ARGS+=("$@") + break + ;; + "clean" | "version" | "--version" | "-version" | "help" | "--help" | "-help") + CLI_ARGS+=("$1") + DISABLE_BLOOP=0 # clean command should not add --offline --server=false + shift + ;; + *) + CLI_ARGS+=("$1") + shift + ;; + esac + done + + if [ $DISABLE_BLOOP -eq 1 ]; then + CLI_ARGS+=("--offline" "--server=false") + fi + + echo "--power ${CLI_ARGS[@]} ${SCRIPT_ARGS[@]}" +} + +"$ROOT/bin/common" "scala" $(scala_args "$@") diff --git a/bin/scalac b/bin/scalac index faeb48d92d87..d141b9a6c6bb 100755 --- a/bin/scalac +++ b/bin/scalac @@ -2,4 +2,4 @@ ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" >& /dev/null && pwd)/.." -"$ROOT/bin/common" "$ROOT/dist/target/pack/bin/scalac" "$@" +"$ROOT/bin/common" "scalac" "$@" diff --git a/bin/scaladoc b/bin/scaladoc index 11a754c6579f..02decabb9ae3 100755 --- a/bin/scaladoc +++ b/bin/scaladoc @@ -2,4 +2,4 @@ ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" >& /dev/null && pwd)/.." 
-"$ROOT/bin/common" "$ROOT/dist/target/pack/bin/scaladoc" "$@" +"$ROOT/bin/common" "scaladoc" "$@" diff --git a/build.sbt b/build.sbt index 1bc74e5e23fb..f357044c91ca 100644 --- a/build.sbt +++ b/build.sbt @@ -28,6 +28,11 @@ val `scaladoc-js-main` = Build.`scaladoc-js-main` val `scaladoc-js-contributors` = Build.`scaladoc-js-contributors` val `scala3-bench-run` = Build.`scala3-bench-run` val dist = Build.dist +val `dist-mac-x86_64` = Build.`dist-mac-x86_64` +val `dist-mac-aarch64` = Build.`dist-mac-aarch64` +val `dist-win-x86_64` = Build.`dist-win-x86_64` +val `dist-linux-x86_64` = Build.`dist-linux-x86_64` +val `dist-linux-aarch64` = Build.`dist-linux-aarch64` val `community-build` = Build.`community-build` val `sbt-community-build` = Build.`sbt-community-build` val `scala3-presentation-compiler` = Build.`scala3-presentation-compiler` diff --git a/compiler/src/dotty/tools/MainGenericRunner.scala b/compiler/src/dotty/tools/MainGenericRunner.scala index 5b238693a135..bf477f019cba 100644 --- a/compiler/src/dotty/tools/MainGenericRunner.scala +++ b/compiler/src/dotty/tools/MainGenericRunner.scala @@ -270,7 +270,7 @@ object MainGenericRunner { val ranByCoursierBootstrap = sys.props.isDefinedAt("coursier.mainJar") - || sys.props.get("bootstrap.mainClass").filter(_ == "dotty.tools.MainGenericRunner").isDefined + || sys.props.get("bootstrap.mainClass").contains("dotty.tools.MainGenericRunner") val silenced = sys.props.get("scala.use_legacy_launcher") == Some("true") diff --git a/compiler/test-resources/scripting/argfileClasspath.sc b/compiler/test-resources/scripting/argfileClasspath.sc deleted file mode 100755 index c31371ba8934..000000000000 --- a/compiler/test-resources/scripting/argfileClasspath.sc +++ /dev/null @@ -1,9 +0,0 @@ -#!dist/target/pack/bin/scala @compiler/test-resources/scripting/cpArgumentsFile.txt - -import java.nio.file.Paths - -def main(args: Array[String]): Unit = - val cwd = Paths.get(".").toAbsolutePath.toString.replace('\\', '/').replaceAll("/$", "") - printf("cwd: %s\n", cwd) - printf("classpath: %s\n", sys.props("java.class.path")) - diff --git a/compiler/test-resources/scripting/classpathReport.sc b/compiler/test-resources/scripting/classpathReport_scalacli.sc similarity index 91% rename from compiler/test-resources/scripting/classpathReport.sc rename to compiler/test-resources/scripting/classpathReport_scalacli.sc index cc68c4b1d52e..0b2552b3ac84 100755 --- a/compiler/test-resources/scripting/classpathReport.sc +++ b/compiler/test-resources/scripting/classpathReport_scalacli.sc @@ -1,5 +1,5 @@ #!/usr/bin/env bin/scala - +// This file is a Scala CLI script. 
import java.nio.file.Paths // def main(args: Array[String]): Unit = // MIGRATION: Scala CLI expects `*.sc` files to be straight-line code diff --git a/compiler/test-resources/scripting/cpArgumentsFile.txt b/compiler/test-resources/scripting/cpArgumentsFile.txt deleted file mode 100755 index 73037eb7d9bc..000000000000 --- a/compiler/test-resources/scripting/cpArgumentsFile.txt +++ /dev/null @@ -1 +0,0 @@ --classpath dist/target/pack/lib/* diff --git a/compiler/test-resources/scripting/envtest.sc b/compiler/test-resources/scripting/envtest.sc index b2fde1b32339..724580449229 100755 --- a/compiler/test-resources/scripting/envtest.sc +++ b/compiler/test-resources/scripting/envtest.sc @@ -1,2 +1,4 @@ +// this file is intended to be ran as an argument to the dotty.tools.scripting.ScriptingDriver class + def main(args: Array[String]): Unit = println("Hello " + util.Properties.propOrNull("key")) diff --git a/compiler/test-resources/scripting/envtest_scalacli.sc b/compiler/test-resources/scripting/envtest_scalacli.sc new file mode 100755 index 000000000000..993ea1691640 --- /dev/null +++ b/compiler/test-resources/scripting/envtest_scalacli.sc @@ -0,0 +1,3 @@ +// This file is a Scala CLI script. + +println("Hello " + util.Properties.propOrNull("key")) diff --git a/compiler/test-resources/scripting/hashBang.sc b/compiler/test-resources/scripting/hashBang.sc index d767bd1a1592..98884bc050c0 100755 --- a/compiler/test-resources/scripting/hashBang.sc +++ b/compiler/test-resources/scripting/hashBang.sc @@ -1,4 +1,4 @@ -#!/usr/bin/env scala +#!/usr/bin/env fake-program-to-test-hashbang-removal # comment STUFF=nada !# diff --git a/compiler/test-resources/scripting/hashBang.scala b/compiler/test-resources/scripting/hashBang.scala index 1aab26269f86..b7bf6b541854 100755 --- a/compiler/test-resources/scripting/hashBang.scala +++ b/compiler/test-resources/scripting/hashBang.scala @@ -1,8 +1,8 @@ -#!/usr/bin/env scala +#!/usr/bin/env fake-program-to-test-hashbang-removal # comment STUFF=nada !# - +// everything above this point should be ignored by the compiler def main(args: Array[String]): Unit = System.err.printf("mainClassFromStack: %s\n",mainFromStack) assert(mainFromStack.contains("hashBang"),s"fromStack[$mainFromStack]") diff --git a/compiler/test-resources/scripting/scriptName.scala b/compiler/test-resources/scripting/scriptName.scala index 21aec32fe0bb..7e479197d567 100755 --- a/compiler/test-resources/scripting/scriptName.scala +++ b/compiler/test-resources/scripting/scriptName.scala @@ -1,4 +1,4 @@ -#!/usr/bin/env scala +// this file is intended to be ran as an argument to the dotty.tools.scripting.ScriptingDriver class def main(args: Array[String]): Unit = val name = Option(sys.props("script.name")) match { diff --git a/compiler/test-resources/scripting/scriptPath.sc b/compiler/test-resources/scripting/scriptPath.sc index 46cd5e8a7385..e29e659d09d4 100755 --- a/compiler/test-resources/scripting/scriptPath.sc +++ b/compiler/test-resources/scripting/scriptPath.sc @@ -1,4 +1,4 @@ -#!dist/target/pack/bin/scala +// this file is intended to be ran as an argument to the dotty.tools.scripting.ScriptingDriver class def main(args: Array[String]): Unit = args.zipWithIndex.foreach { case (arg,i) => printf("arg %d: [%s]\n",i,arg) } diff --git a/compiler/test-resources/scripting/scriptPath_scalacli.sc b/compiler/test-resources/scripting/scriptPath_scalacli.sc new file mode 100755 index 000000000000..c13888d0e4b1 --- /dev/null +++ b/compiler/test-resources/scripting/scriptPath_scalacli.sc @@ -0,0 +1,13 @@ 
+#!/usr/bin/env bin/scala + +// THIS FILE IS RUN WITH SCALA CLI, which wraps scripts exposing scriptPath and args variables + +args.zipWithIndex.foreach { case (arg,i) => printf("arg %d: [%s]\n",i,arg) } + +if !scriptPath.endsWith("scriptPath_scalacli.sc") then + printf( s"incorrect script.path defined as [$scriptPath]") +else + printf("scriptPath: %s\n", scriptPath) // report the value + +extension(s: String) + def norm: String = s.replace('\\', '/') diff --git a/compiler/test-resources/scripting/showArgs.sc b/compiler/test-resources/scripting/showArgs.sc index 8ef08f8962b0..69d552b9cf5f 100755 --- a/compiler/test-resources/scripting/showArgs.sc +++ b/compiler/test-resources/scripting/showArgs.sc @@ -1,4 +1,4 @@ -#!/usr/bin/env bin/scala +// this file is intended to be run as an argument to the dotty.tools.scripting.ScriptingDriver class // precise output format expected by BashScriptsTests.scala def main(args: Array[String]): Unit = diff --git a/compiler/test-resources/scripting/showArgs_scalacli.sc b/compiler/test-resources/scripting/showArgs_scalacli.sc new file mode 100755 index 000000000000..4591ac159345 --- /dev/null +++ b/compiler/test-resources/scripting/showArgs_scalacli.sc @@ -0,0 +1,7 @@ +#!/usr/bin/env bin/scala + +// This file is a Scala CLI script. + +// precise output format expected by BashScriptsTests.scala +for (a,i) <- args.zipWithIndex do + printf(s"arg %2d:[%s]\n",i,a) diff --git a/compiler/test-resources/scripting/sqlDateError.sc b/compiler/test-resources/scripting/sqlDateError.sc index 35160fd6fcd5..e7c3a623c6c1 100755 --- a/compiler/test-resources/scripting/sqlDateError.sc +++ b/compiler/test-resources/scripting/sqlDateError.sc @@ -1,4 +1,4 @@ -#!/usr/bin/env bin/scala +// this file is intended to be run as an argument to the dotty.tools.scripting.ScriptingDriver class def main(args: Array[String]): Unit = { println(new java.sql.Date(100L)) diff --git a/compiler/test-resources/scripting/sqlDateError_scalacli.sc b/compiler/test-resources/scripting/sqlDateError_scalacli.sc new file mode 100755 index 000000000000..10b58821a6e4 --- /dev/null +++ b/compiler/test-resources/scripting/sqlDateError_scalacli.sc @@ -0,0 +1,6 @@ +#!/usr/bin/env bin/scala + +// This file is a Scala CLI script.
+ +println(new java.sql.Date(100L)) +System.err.println("SCALA_OPTS="+Option(System.getenv("SCALA_OPTS")).getOrElse("")) diff --git a/compiler/test-resources/scripting/touchFile.sc b/compiler/test-resources/scripting/touchFile.sc index 974f8a64d192..b46b3c99d786 100755 --- a/compiler/test-resources/scripting/touchFile.sc +++ b/compiler/test-resources/scripting/touchFile.sc @@ -1,4 +1,4 @@ -#!/usr/bin/env scala +// this file is intended to be run as an argument to the dotty.tools.scripting.ScriptingDriver class import java.io.File diff --git a/compiler/test-resources/scripting/unglobClasspath.sc b/compiler/test-resources/scripting/unglobClasspath.sc deleted file mode 100755 index deab2b8982ac..000000000000 --- a/compiler/test-resources/scripting/unglobClasspath.sc +++ /dev/null @@ -1,6 +0,0 @@ -// won't compile unless classpath is set correctly -import dotty.tools.tasty.TastyFormat - -// def main(args: Array[String]) = // MIGRATION: Scala CLI expects `*.sc` files to be straight-line code - val cp = sys.props("java.class.path") - printf("unglobbed classpath: %s\n", cp) diff --git a/compiler/test-resources/scripting/unglobClasspath_scalacli.sc b/compiler/test-resources/scripting/unglobClasspath_scalacli.sc new file mode 100755 index 000000000000..ccc4cf667085 --- /dev/null +++ b/compiler/test-resources/scripting/unglobClasspath_scalacli.sc @@ -0,0 +1,9 @@ +// This file is a Scala CLI script. + +import dotty.tools.tasty.TastyFormat +// ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +// not visible on default classpath, "compiler/test/dotty/tools/scripting/ClasspathTests.scala" +// adds it to classpath via a compiler argument `-classpath 'org/scala-lang/tasty-core_3/$VERSION/*'` + +val cp = sys.props("java.class.path") +printf("unglobbed classpath: %s\n", cp) diff --git a/compiler/test/dotty/tools/io/ClasspathTest.scala b/compiler/test/dotty/tools/io/ClasspathTest.scala index a0fef65afdec..333f2b8062b0 100755 --- a/compiler/test/dotty/tools/io/ClasspathTest.scala +++ b/compiler/test/dotty/tools/io/ClasspathTest.scala @@ -15,6 +15,8 @@ class ClasspathTest { def pathsep = sys.props("path.separator") + def isWindows: Boolean = scala.util.Properties.isWin + // // Cope with wildcard classpath entries, exercised with -classpath // @@ -23,7 +25,7 @@ class ClasspathTest { @Test def testWildcards(): Unit = val outDir = Files.createTempDirectory("classpath-test") try - val compilerLib = "dist/target/pack/lib" + val compilerLib = s"${if isWindows then "dist-win-x86_64" else "dist"}/target/pack/lib" val libdir = Paths.get(compilerLib).toFile if libdir.exists then val libjarFiles = libdir.listFiles.toList.take(5) diff --git a/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala b/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala index 90a8d80330b4..857f5ef378e7 100644 --- a/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala +++ b/compiler/test/dotty/tools/scripting/BashExitCodeTests.scala @@ -16,7 +16,11 @@ import ScriptTestEnv.* class BashExitCodeTests: private var myTmpDir: String | Null = null private lazy val tmpDir = { myTmpDir = Files.createTempDirectory("exit-code-tests").toFile.absPath; myTmpDir } - @After def cleanup(): Unit = if myTmpDir != null then io.Directory(myTmpDir).deleteRecursively() + @After def cleanup(): Unit = { + if myTmpDir != null then io.Directory(myTmpDir).deleteRecursively() + + cleanupScalaCLIDirs() + } /** Verify the exit code of running `cmd args*`.
*/ def verifyExit(cmd: String, args: String*)(expectedExitCode: Int): Unit = @@ -28,8 +32,8 @@ class BashExitCodeTests: s"expected $expectedExitCode but got $exitCode${pp("out", stdout)}${pp("err", stderr)}" }, expectedExitCode, exitCode) - // Helpers for running scala, scalac, and scalac without the the output directory ("raw") - def scala(args: String*) = verifyExit(scalaPath, ("--power" +: "--offline" +: "--server=false" +: args)*) + // Helpers for running scala, scalac, and scalac without the output directory ("raw") + def scala(args: String*) = verifyExit(scalaPath, ("--power" +: args :+ "--offline" :+ "--server=false")*) def scalacRaw(args: String*) = verifyExit(scalacPath, args*) def scalac(args: String*) = scalacRaw(("-d" +: tmpDir +: args)*) diff --git a/compiler/test/dotty/tools/scripting/BashScriptsTests.scala b/compiler/test/dotty/tools/scripting/BashScriptsTests.scala index 25bc54e2dcbe..6af863f0fccd 100644 --- a/compiler/test/dotty/tools/scripting/BashScriptsTests.scala +++ b/compiler/test/dotty/tools/scripting/BashScriptsTests.scala @@ -25,11 +25,13 @@ object BashScriptsTests: def testFiles = scripts("/scripting") @AfterClass def cleanup: Unit = { + cleanupScalaCLIDirs() + val af = argsfile.toFile - if (af.exists) { + if af.exists then af.delete() - } } + printf("osname[%s]\n", osname) printf("uname[%s]\n", ostypeFull) printf("using JAVA_HOME=%s\n", envJavaHome) @@ -50,7 +52,7 @@ object BashScriptsTests: val testScriptArgs = Seq( "a", "b", "c", "-repl", "-run", "-script", "-debug" ) - val Seq(showArgsScript, showArgsScalaCli) = Seq("showArgs.sc", "showArgsNu.sc").map { name => + val Seq(showArgsScript, showArgsScalaCli) = Seq("showArgs.sc", "showArgs_scalacli.sc").map { name => testFiles.find(_.getName == name).get.absPath } @@ -66,7 +68,7 @@ object BashScriptsTests: } file - val Seq(envtestNuSc, envtestScala) = Seq("envtestNu.sc", "envtest.scala").map { testFile(_) } + val Seq(envtestNuSc, envtestScala) = Seq("envtest_scalacli.sc", "envtest.scala").map { testFile(_) } // create command line with given options, execute specified script, return stdout def callScript(tag: String, script: String, keyPre: String): String = @@ -173,13 +175,13 @@ class BashScriptsTests: assert(stdout == expectedOutput) /* - * verify that scriptPathNu.sc sees a valid script.path property, - * and that it's value is the path to "scriptPathNu.sc". + * verify that scriptPath_scalacli.sc sees a valid script.path property, + * and that its value is the path to "scriptPath_scalacli.sc".
*/ @Category(Array(classOf[BootstrappedOnlyTests])) @Test def verifyScriptPathProperty = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - val scriptFile = testFiles.find(_.getName == "scriptPathNu.sc").get + val scriptFile = testFiles.find(_.getName == "scriptPath_scalacli.sc").get val expected = s"${scriptFile.getName}" printf("===> verify valid system property script.path is reported by script [%s]\n", scriptFile.getName) printf("calling scriptFile: %s\n", scriptFile) @@ -196,7 +198,7 @@ class BashScriptsTests: */ @Test def verifyScalaOpts = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - val scriptFile = testFiles.find(_.getName == "classpathReport.sc").get + val scriptFile = testFiles.find(_.getName == "classpathReport_scalacli.sc").get printf("===> verify SCALA_OPTS='@argsfile' is properly handled by `dist/bin/scala`\n") val envPairs = List(("SCALA_OPTS", s"@$argsfile")) val (validTest, exitCode, stdout, stderr) = bashCommand(scriptFile.absPath, envPairs) @@ -219,7 +221,7 @@ class BashScriptsTests: */ @Test def sqlDateTest = assumeFalse("Scripts do not yet support Scala 2 library TASTy", Properties.usingScalaLibraryTasty) - val scriptBase = "sqlDateErrorNu" + val scriptBase = "sqlDateError_scalacli" val scriptFile = testFiles.find(_.getName == s"$scriptBase.sc").get val testJar = testFile(s"$scriptBase.jar") // jar should not be created when scriptFile runs val tj = Paths.get(testJar).toFile diff --git a/compiler/test/dotty/tools/scripting/ClasspathTests.scala b/compiler/test/dotty/tools/scripting/ClasspathTests.scala index 24c6c297a777..a946e509aeb3 100755 --- a/compiler/test/dotty/tools/scripting/ClasspathTests.scala +++ b/compiler/test/dotty/tools/scripting/ClasspathTests.scala @@ -11,8 +11,12 @@ import org.junit.{Test, Ignore, AfterClass} import vulpix.TestConfiguration import ScriptTestEnv.* -/** Test java command line generated by bin/scala and bin/scalac */ +object ClasspathTests: + @AfterClass def cleanup: Unit = { + cleanupScalaCLIDirs() + } +/** Test java command line generated by bin/scala and bin/scalac */ class ClasspathTests: /* * Test disabled (temporarily). @@ -24,7 +28,7 @@ class ClasspathTests: @Ignore @Test def hashbangClasspathVerifyTest = { // only interested in classpath test scripts - val testScriptName = "classpathReport.sc" + val testScriptName = "classpathReport_scalacli.sc" val testScript = scripts("/scripting").find { _.getName.matches(testScriptName) } match case None => sys.error(s"test script not found: ${testScriptName}") case Some(file) => file @@ -39,7 +43,7 @@ class ClasspathTests: cmd.foreach { printf("[%s]\n", _) } - // classpathReport.sc is expected to produce two lines: + // classpathReport_scalacli.sc is expected to produce two lines: // cwd: // classpath: @@ -51,10 +55,10 @@ class ClasspathTests: // convert scriptCp to a list of files val hashbangJars: List[File] = scriptCp.split(psep).map { _.toFile }.toList val hashbangClasspathJars = hashbangJars.map { _.name }.sorted.distinct // get jar basenames, remove duplicates - val packlibDir: String = ??? /* ??? was s"$scriptCwd/$packLibDir" */ // classpathReport.sc specifies a wildcard classpath in this directory + val packlibDir: String = ??? /* ??? 
was s"$scriptCwd/$packLibDir" */ // classpathReport_scalacli.sc specifies a wildcard classpath in this directory val packlibJars: List[File] = listJars(packlibDir) // classpath entries expected to have been reported by the script - printf("%d jar files in dist/target/pack/lib\n", packlibJars.size) + printf(s"%d jar files in $packDir/lib\n", packlibJars.size) printf("%d test script jars in classpath\n", hashbangClasspathJars.size) val (diff: Set[File], msg: String) = if (packlibJars.size > hashbangClasspathJars.size) { @@ -63,7 +67,7 @@ class ClasspathTests: (hashbangJars.toSet -- packlibJars.toSet , "only in hashbang classpath") } // verify that the script hasbang classpath setting was effective at supplementing the classpath - // (a minimal subset of jars below dist/target/pack/lib are always be in the classpath) + // (a minimal subset of jars below dist*/target/pack/lib are always be in the classpath) val missingClasspathEntries = if hashbangClasspathJars.size != packlibJars.size then printf("packlib dir [%s]\n", packlibDir) printf("hashbangClasspathJars: %s\n", hashbangJars.map { _.relpath.norm }.mkString("\n ", "\n ", "")) @@ -79,7 +83,7 @@ class ClasspathTests: */ @Ignore @Test def unglobClasspathVerifyTest = { - val testScriptName = "unglobClasspath.sc" + val testScriptName = "unglobClasspath_scalacli.sc" val testScript = scripts("/scripting").find { _.name.matches(testScriptName) } match case None => sys.error(s"test script not found: ${testScriptName}") case Some(file) => file diff --git a/compiler/test/dotty/tools/scripting/ExpressionTest.scala b/compiler/test/dotty/tools/scripting/ExpressionTest.scala index 02963f50ee52..bc42860253b0 100755 --- a/compiler/test/dotty/tools/scripting/ExpressionTest.scala +++ b/compiler/test/dotty/tools/scripting/ExpressionTest.scala @@ -55,6 +55,10 @@ class ExpressionTest: object ExpressionTest: + @AfterClass def cleanup(): Unit = { + cleanupScalaCLIDirs() + } + def main(args: Array[String]): Unit = val tests = new ExpressionTest println("\n=== verifyCommandLineExpression ===") diff --git a/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala b/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala index a52014f14704..dd1cc04bb58a 100644 --- a/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala +++ b/compiler/test/dotty/tools/scripting/ScriptTestEnv.scala @@ -5,6 +5,7 @@ package scripting import scala.language.unsafeNulls import java.io.File +import java.util.Locale import java.nio.file.{Path, Paths, Files} import dotty.tools.dotc.config.Properties.* @@ -15,7 +16,7 @@ import scala.jdk.CollectionConverters.* /** * Common Code for supporting scripting tests. * To override the path to the bash executable, set TEST_BASH= - * To specify where `dist/target/pack/bin` resides, set TEST_CWD= + * To specify where `dist[*]/target/pack/bin` resides, set TEST_CWD= * Test scripts run in a bash env, so paths are converted to forward slash via .norm. 
*/ object ScriptTestEnv { @@ -28,6 +29,44 @@ object ScriptTestEnv { def whichJava: String = whichExe("java") def whichBash: String = whichExe("bash") + def cleanupScalaCLIDirs(): Unit = { + val scriptingDir = io.Directory(scriptsDir("/scripting").getPath) + val dottyDir = io.Directory(workingDirectory) + + val residueDirs = Seq( + (scriptingDir / ".bsp"), + (scriptingDir / ".scala-build"), + (dottyDir / ".scala-build") + ) + + for f <- residueDirs do + f.deleteRecursively() + + val bspDir = dottyDir / ".bsp" + (bspDir / "scala.json").delete() + if bspDir.isEmpty then bspDir.delete() + } + + lazy val nativePackDir: Option[String] = { + def nativeDir(os: String, arch: String) = Some(s"dist/$os-$arch/target/pack") + def nativeOs(os: String) = archNorm match + case arch @ ("aarch64" | "x86_64") => nativeDir(os, arch) + case _ => None + + if winshell then nativeDir("win", "x86_64") // assume x86_64 for now + else if linux then nativeOs("linux") + else if mac then nativeOs("mac") + else None + } + + def jvmPackDir() = + println("warning: unknown OS architecture combination, defaulting to JVM launcher.") + "dist/target/pack" + + def packDir: String = nativePackDir.getOrElse(jvmPackDir()) + + def packBinDir: String = s"$packDir/bin" + lazy val workingDirectory: String = { val dirstr = if testCwd.nonEmpty then if verbose then printf("TEST_CWD set to [%s]\n", testCwd) @@ -36,7 +75,7 @@ object ScriptTestEnv { userDir // userDir, if TEST_CWD not set // issue warning if things don't look right - val test = Paths.get(s"$dirstr/dist/target/pack/bin").normalize + val test = Paths.get(s"$dirstr/$packBinDir").normalize if !test.isDirectory then printf("warning: not found below working directory: %s\n", test.norm) @@ -46,7 +85,7 @@ object ScriptTestEnv { def envPath: String = envOrElse("PATH", "") // remove duplicate entries in path - def supplementedPath: String = s"dist/target/pack/bin$psep$envJavaHome/bin$psep$envScalaHome/bin$psep$envPath".norm + def supplementedPath: String = s"$packBinDir$psep$envJavaHome/bin$psep$envScalaHome/bin$psep$envPath".norm def adjustedPathEntries: List[String] = supplementedPath.norm.split(psep).toList.distinct def adjustedPath: String = adjustedPathEntries.mkString(psep) def envPathEntries: List[String] = envPath.split(psep).toList.distinct @@ -55,11 +94,18 @@ object ScriptTestEnv { def unameExe = which("uname") def ostypeFull = if unameExe.nonEmpty then exec(unameExe).mkString else "" - def ostype = ostypeFull.toLowerCase.takeWhile{ cc => cc >= 'a' && cc <='z' || cc >= 'A' && cc <= 'Z' } + def ostype = ostypeFull.toLowerCase(Locale.ROOT).takeWhile{ cc => cc >= 'a' && cc <='z' || cc >= 'A' && cc <= 'Z' } + def archFull = if unameExe.nonEmpty then exec(unameExe, "-m").mkString else "" + def archNorm = archFull match + case "arm64" => "aarch64" + case "amd64" => "x86_64" + case id => id def cygwin = ostype == "cygwin" def mingw = ostype == "mingw" def msys = ostype == "msys" + def linux = ostype == "linux" + def mac = ostype == "darwin" def winshell: Boolean = cygwin || mingw || msys def which(str: String) = @@ -124,10 +170,9 @@ object ScriptTestEnv { } yield line - def packBinDir = "dist/target/pack/bin" - // def packLibDir = "dist/target/pack/lib" // replaced by packMavenDir - def packMavenDir = "dist/target/pack/maven2" - def packVersionFile = "dist/target/pack/VERSION" + // def packLibDir = s"$packDir/lib" // replaced by packMavenDir + def packMavenDir = s"$packDir/maven2" + def packVersionFile = s"$packDir/VERSION" def packBinScalaExists: Boolean = 
Files.exists(Paths.get(s"$packBinDir/scala")) def packScalaVersion: String = { @@ -248,8 +293,8 @@ object ScriptTestEnv { lazy val cwd: Path = Paths.get(".").toAbsolutePath.normalize lazy val (scalacPath: String, scalaPath: String) = { - val scalac = s"$workingDirectory/dist/target/pack/bin/scalac".toPath.normalize - val scala = s"$workingDirectory/dist/target/pack/bin/scala".toPath.normalize + val scalac = s"$workingDirectory/$packBinDir/scalac".toPath.normalize + val scala = s"$workingDirectory/$packBinDir/scala".toPath.normalize (scalac.norm, scala.norm) } @@ -257,7 +302,7 @@ object ScriptTestEnv { // use optional TEST_BASH if defined, otherwise, bash must be in PATH // envScalaHome is: - // dist/target/pack, if present + // dist[*]/target/pack, if present // else, SCALA_HOME if defined // else, not defined lazy val envScalaHome = diff --git a/compiler/test/dotty/tools/scripting/ScriptingTests.scala b/compiler/test/dotty/tools/scripting/ScriptingTests.scala index 713695b62f4a..4dc193f0efe4 100644 --- a/compiler/test/dotty/tools/scripting/ScriptingTests.scala +++ b/compiler/test/dotty/tools/scripting/ScriptingTests.scala @@ -17,7 +17,11 @@ import org.junit.Assume.assumeFalse /** Runs all tests contained in `compiler/test-resources/scripting/` */ class ScriptingTests: // classpath tests managed by scripting.ClasspathTests.scala - def testFiles = scripts("/scripting").filter { ! _.getName.toLowerCase.contains("classpath") } + def testFiles = scripts("/scripting").filter { sc => + val name = sc.getName.toLowerCase + !name.contains("classpath") + && !name.contains("_scalacli") + } /* * Call .scala scripts without -save option, verify no jar created diff --git a/compiler/test/dotty/tools/utils.scala b/compiler/test/dotty/tools/utils.scala index a8c480088e08..d17edbaa855e 100644 --- a/compiler/test/dotty/tools/utils.scala +++ b/compiler/test/dotty/tools/utils.scala @@ -20,14 +20,19 @@ import dotc.config.CommandLineParser object Dummy def scripts(path: String): Array[File] = { - val dir = new File(Dummy.getClass.getResource(path).getPath) - assert(dir.exists && dir.isDirectory, "Couldn't load scripts dir") + val dir = scriptsDir(path) dir.listFiles.filter { f => val path = if f.isDirectory then f.getPath + "/" else f.getPath Properties.testsFilter.isEmpty || Properties.testsFilter.exists(path.contains) } } +def scriptsDir(path: String): File = { + val dir = new File(Dummy.getClass.getResource(path).getPath) + assert(dir.exists && dir.isDirectory, "Couldn't load scripts dir") + dir +} + extension (f: File) def absPath = f.getAbsolutePath.replace('\\', '/') @@ -101,10 +106,10 @@ def toolArgsParse(lines: List[String], filename: Option[String]): List[(String,S case toolArg(name, args) => List((name, args)) case _ => Nil } ++ - lines.flatMap { + lines.flatMap { case directiveOptionsArg(args) => List(("scalac", args)) case directiveJavacOptions(args) => List(("javac", args)) - case _ => Nil + case _ => Nil } import org.junit.Test diff --git a/dist/bin-native-overrides/cli-common-platform b/dist/bin-native-overrides/cli-common-platform new file mode 100644 index 000000000000..1a11c770f91a --- /dev/null +++ b/dist/bin-native-overrides/cli-common-platform @@ -0,0 +1,16 @@ +#!/usr/bin/env bash + +if [[ ${cygwin-} || ${mingw-} || ${msys-} ]]; then + SCALA_CLI_VERSION="" + # iterate through lines in EXTRA_PROPERTIES + while IFS= read -r line; do + # if line starts with "cli_version:=" then extract the version + if [[ "$line" == cli_version:=* ]]; then + SCALA_CLI_VERSION="${line#cli_version:=}" + break + fi + done
< "$PROG_HOME/EXTRA_PROPERTIES" + SCALA_CLI_CMD_BASH=("\"$PROG_HOME/bin/scala-cli\"" "--cli-version \"$SCALA_CLI_VERSION\"") +else + SCALA_CLI_CMD_BASH=("\"$PROG_HOME/bin/scala-cli\"") +fi diff --git a/dist/bin-native-overrides/cli-common-platform.bat b/dist/bin-native-overrides/cli-common-platform.bat new file mode 100644 index 000000000000..e0cfa40692b5 --- /dev/null +++ b/dist/bin-native-overrides/cli-common-platform.bat @@ -0,0 +1,18 @@ +@echo off + +setlocal enabledelayedexpansion + +set "_SCALA_CLI_VERSION=" +@rem read for cli_version:=_SCALA_CLI_VERSION in EXTRA_PROPERTIES file +FOR /F "usebackq delims=" %%G IN ("%_PROG_HOME%\EXTRA_PROPERTIES") DO ( + SET "line=%%G" + IF "!line:~0,13!"=="cli_version:=" ( + SET "_SCALA_CLI_VERSION=!line:~13!" + GOTO :foundCliVersion + ) +) + +:foundCliVersion +endlocal & set "SCALA_CLI_VERSION=%_SCALA_CLI_VERSION%" + +set SCALA_CLI_CMD_WIN="%_PROG_HOME%\bin\scala-cli.exe" "--cli-version" "%SCALA_CLI_VERSION%" \ No newline at end of file diff --git a/dist/bin/cli-common-platform b/dist/bin/cli-common-platform new file mode 100644 index 000000000000..a5906e882bb4 --- /dev/null +++ b/dist/bin/cli-common-platform @@ -0,0 +1,3 @@ +#!/usr/bin/env bash + +SCALA_CLI_CMD_BASH=("\"$JAVACMD\"" "-jar \"$PROG_HOME/bin/scala-cli.jar\"") diff --git a/dist/bin/cli-common-platform.bat b/dist/bin/cli-common-platform.bat new file mode 100644 index 000000000000..99103266c1d9 --- /dev/null +++ b/dist/bin/cli-common-platform.bat @@ -0,0 +1,5 @@ +@echo off + +@rem we need to escape % in the java command path, for some reason this doesnt work in common.bat +set "_JAVACMD=!_JAVACMD:%%=%%%%!" +set SCALA_CLI_CMD_WIN="%_JAVACMD%" "-jar" "%_PROG_HOME%\bin\scala-cli.jar" \ No newline at end of file diff --git a/dist/bin/common b/dist/bin/common index e3e4253938fb..4a0152fbc4cb 100755 --- a/dist/bin/common +++ b/dist/bin/common @@ -1,132 +1,6 @@ #!/usr/bin/env bash -#/*-------------------------------------------------------------------------- -# * Credits: This script is based on the script generated by sbt-pack. -# *--------------------------------------------------------------------------*/ - -# save terminal settings -saved_stty=$(stty -g 2>/dev/null) -# clear on error so we don't later try to restore them -if [[ ! $? ]]; then - saved_stty="" -fi - -# restore stty settings (echo in particular) -function restoreSttySettings() { - stty $saved_stty - saved_stty="" -} - -scala_exit_status=127 -function onExit() { - [[ "$saved_stty" != "" ]] && restoreSttySettings - exit $scala_exit_status -} - -# to reenable echo if we are interrupted before completing. -trap onExit INT TERM EXIT - -unset cygwin mingw msys darwin conemu - -# COLUMNS is used together with command line option '-pageWidth'. -if command -v tput >/dev/null 2>&1; then - export COLUMNS="$(tput -Tdumb cols)" -fi - -case "`uname`" in - CYGWIN*) cygwin=true - ;; - MINGW*) mingw=true - ;; - MSYS*) msys=true - ;; - Darwin*) darwin=true - if [ -z "$JAVA_VERSION" ] ; then - JAVA_VERSION="CurrentJDK" - else - echo "Using Java version: $JAVA_VERSION" 1>&2 - fi - if [ -z "$JAVA_HOME" ] ; then - JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/${JAVA_VERSION}/Home - fi - JAVACMD="`which java`" - ;; -esac - -unset CYGPATHCMD -if [[ ${cygwin-} || ${mingw-} || ${msys-} ]]; then - # ConEmu terminal is incompatible with jna-5.*.jar - [[ (${CONEMUANSI-} || ${ConEmuANSI-}) ]] && conemu=true - # cygpath is used by various windows shells: cygwin, git-sdk, gitbash, msys, etc. 
- CYGPATHCMD=`which cygpath 2>/dev/null` - case "$TERM" in - rxvt* | xterm* | cygwin*) - stty -icanon min 1 -echo - JAVA_OPTS="$JAVA_OPTS -Djline.terminal=unix" - ;; - esac -fi - -# Resolve JAVA_HOME from javac command path -if [ -z "$JAVA_HOME" ]; then - javaExecutable="`which javac`" - if [ -n "$javaExecutable" -a -f "$javaExecutable" -a ! "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then - # readlink(1) is not available as standard on Solaris 10. - readLink=`which readlink` - if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then - javaExecutable="`readlink -f \"$javaExecutable\"`" - javaHome="`dirname \"$javaExecutable\"`" - javaHome=`expr "$javaHome" : '\(.*\)/bin'` - JAVA_HOME="$javaHome" - export JAVA_HOME - fi - fi -fi - -if [ -z "${JAVACMD-}" ] ; then - if [ -n "${JAVA_HOME-}" ] ; then - if [ -x "$JAVA_HOME/jre/sh/java" ] ; then - # IBM's JDK on AIX uses strange locations for the executables - JAVACMD="$JAVA_HOME/jre/sh/java" - else - JAVACMD="$JAVA_HOME/bin/java" - fi - else - JAVACMD="`which java`" - fi -fi - -if [ ! -x "$JAVACMD" ] ; then - echo "Error: JAVA_HOME is not defined correctly." - echo " We cannot execute $JAVACMD" - exit 1 -fi - -if [ -z "$JAVA_HOME" ] ; then - echo "Warning: JAVA_HOME environment variable is not set." -fi - -CLASSPATH_SUFFIX="" -# Path separator used in EXTRA_CLASSPATH -PSEP=":" - -# translate paths to Windows-mixed format before running java -if [ -n "${CYGPATHCMD-}" ]; then - [ -n "${PROG_HOME-}" ] && - PROG_HOME=`"$CYGPATHCMD" -am "$PROG_HOME"` - [ -n "$JAVA_HOME" ] && - JAVA_HOME=`"$CYGPATHCMD" -am "$JAVA_HOME"` - CLASSPATH_SUFFIX=";" - PSEP=";" -elif [[ ${mingw-} || ${msys-} ]]; then - # For Mingw / Msys, convert paths from UNIX format before anything is touched - [ -n "$PROG_HOME" ] && - PROG_HOME="`(cd "$PROG_HOME"; pwd -W | sed 's|/|\\\\|g')`" - [ -n "$JAVA_HOME" ] && - JAVA_HOME="`(cd "$JAVA_HOME"; pwd -W | sed 's|/|\\\\|g')`" - CLASSPATH_SUFFIX=";" - PSEP=";" -fi +source "$PROG_HOME/bin/common-shared" #/*-------------------------------------------------- # * The code below is for Dotty @@ -205,16 +79,12 @@ ReplMain=dotty.tools.repl.Main ScriptingMain=dotty.tools.scripting.Main declare -a java_args -declare -a scala_args declare -a residual_args declare -a script_args addJava () { java_args+=("'$1'") } -addScala () { - scala_args+=("'$1'") -} addResidual () { residual_args+=("'$1'") } diff --git a/dist/bin/common-shared b/dist/bin/common-shared new file mode 100644 index 000000000000..8c85993a5283 --- /dev/null +++ b/dist/bin/common-shared @@ -0,0 +1,139 @@ +#!/usr/bin/env bash + +# Common options for both scala-cli and java based launchers + +#/*-------------------------------------------------------------------------- +# * Credits: This script is based on the script generated by sbt-pack. +# *--------------------------------------------------------------------------*/ + +# save terminal settings +saved_stty=$(stty -g 2>/dev/null) +# clear on error so we don't later try to restore them +if [[ ! $? ]]; then + saved_stty="" +fi + +# restore stty settings (echo in particular) +function restoreSttySettings() { + stty $saved_stty + saved_stty="" +} + +scala_exit_status=127 +function onExit() { + [[ "$saved_stty" != "" ]] && restoreSttySettings + exit $scala_exit_status +} + +# to reenable echo if we are interrupted before completing. +trap onExit INT TERM EXIT + +unset cygwin mingw msys darwin conemu + +# COLUMNS is used together with command line option '-pageWidth'. 
+if command -v tput >/dev/null 2>&1; then + export COLUMNS="$(tput -Tdumb cols)" +fi + +case "`uname`" in + CYGWIN*) cygwin=true + ;; + MINGW*) mingw=true + ;; + MSYS*) msys=true + ;; + Darwin*) darwin=true + if [ -z "$JAVA_VERSION" ] ; then + JAVA_VERSION="CurrentJDK" + else + echo "Using Java version: $JAVA_VERSION" 1>&2 + fi + if [ -z "$JAVA_HOME" ] ; then + JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/${JAVA_VERSION}/Home + fi + JAVACMD="`which java`" + ;; +esac + +unset CYGPATHCMD +if [[ ${cygwin-} || ${mingw-} || ${msys-} ]]; then + # ConEmu terminal is incompatible with jna-5.*.jar + [[ (${CONEMUANSI-} || ${ConEmuANSI-}) ]] && conemu=true + # cygpath is used by various windows shells: cygwin, git-sdk, gitbash, msys, etc. + CYGPATHCMD=`which cygpath 2>/dev/null` + case "$TERM" in + rxvt* | xterm* | cygwin*) + stty -icanon min 1 -echo + JAVA_OPTS="$JAVA_OPTS -Djline.terminal=unix" + ;; + esac +fi + +# Resolve JAVA_HOME from javac command path +if [ -z "$JAVA_HOME" ]; then + javaExecutable="`which javac`" + if [ -n "$javaExecutable" -a -f "$javaExecutable" -a ! "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then + # readlink(1) is not available as standard on Solaris 10. + readLink=`which readlink` + if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then + javaExecutable="`readlink -f \"$javaExecutable\"`" + javaHome="`dirname \"$javaExecutable\"`" + javaHome=`expr "$javaHome" : '\(.*\)/bin'` + JAVA_HOME="$javaHome" + export JAVA_HOME + fi + fi +fi + +if [ -z "${JAVACMD-}" ] ; then + if [ -n "${JAVA_HOME-}" ] ; then + if [ -x "$JAVA_HOME/jre/sh/java" ] ; then + # IBM's JDK on AIX uses strange locations for the executables + JAVACMD="$JAVA_HOME/jre/sh/java" + else + JAVACMD="$JAVA_HOME/bin/java" + fi + else + JAVACMD="`which java`" + fi +fi + +if [ ! -x "$JAVACMD" ] ; then + echo "Error: JAVA_HOME is not defined correctly." + echo " We cannot execute $JAVACMD" + exit 1 +fi + +if [ -z "$JAVA_HOME" ] ; then + echo "Warning: JAVA_HOME environment variable is not set." 
+fi + +CLASSPATH_SUFFIX="" +# Path separator used in EXTRA_CLASSPATH +PSEP=":" +PROG_HOME_URI="file://$PROG_HOME" + +# translate paths to Windows-mixed format before running java +if [ -n "${CYGPATHCMD-}" ]; then + [ -n "${PROG_HOME-}" ] && + PROG_HOME=`"$CYGPATHCMD" -am "$PROG_HOME"` + PROG_HOME_URI="file:///$PROG_HOME" # Add extra root dir prefix + [ -n "$JAVA_HOME" ] && + JAVA_HOME=`"$CYGPATHCMD" -am "$JAVA_HOME"` + CLASSPATH_SUFFIX=";" + PSEP=";" +elif [[ ${mingw-} || ${msys-} ]]; then + # For Mingw / Msys, convert paths from UNIX format before anything is touched + [ -n "$PROG_HOME" ] && + PROG_HOME="`(cd "$PROG_HOME"; pwd -W | sed 's|/|\\\\|g')`" + PROG_HOME_URI="file:///$PROG_HOME" # Add extra root dir prefix + [ -n "$JAVA_HOME" ] && + JAVA_HOME="`(cd "$JAVA_HOME"; pwd -W | sed 's|/|\\\\|g')`" + CLASSPATH_SUFFIX=";" + PSEP=";" +fi + +declare -a scala_args +addScala () { + scala_args+=("'$1'") +} diff --git a/dist/bin/scala b/dist/bin/scala index 3040c5a9a0f3..71747a8e9e20 100755 --- a/dist/bin/scala +++ b/dist/bin/scala @@ -26,7 +26,8 @@ if [ -z "${PROG_HOME-}" ] ; then cd "$saveddir" fi -source "$PROG_HOME/bin/cli-common" +source "$PROG_HOME/bin/common-shared" +source "$PROG_HOME/bin/cli-common-platform" SCALA_VERSION="" # iterate through lines in VERSION_SRC @@ -44,7 +45,7 @@ if [ -z "$SCALA_VERSION" ]; then exit 1 fi -MVN_REPOSITORY="file://$PROG_HOME/maven2" +MVN_REPOSITORY="$PROG_HOME_URI/maven2" # escape all script arguments while [[ $# -gt 0 ]]; do @@ -54,8 +55,9 @@ done # exec here would prevent onExit from being called, leaving terminal in unusable state [ -z "${ConEmuPID-}" -o -n "${cygwin-}" ] && export MSYSTEM= PWD= # workaround for #12405 -eval "\"$JAVACMD\"" \ - "-jar \"$SCALA_CLI_JAR\"" \ + +# SCALA_CLI_CMD_BASH is an array, set by cli-common-platform +eval "${SCALA_CLI_CMD_BASH[@]}" \ "--prog-name scala" \ "--cli-default-scala-version \"$SCALA_VERSION\"" \ "-r \"$MVN_REPOSITORY\"" \ diff --git a/dist/bin/scala.bat b/dist/bin/scala.bat index 78336272055b..d473facbbb1c 100644 --- a/dist/bin/scala.bat +++ b/dist/bin/scala.bat @@ -19,10 +19,11 @@ if not %_EXITCODE%==0 goto end call :setScalaOpts -@rem we need to escape % in the java command path, for some reason this doesnt work in common.bat -set "_JAVACMD=!_JAVACMD:%%=%%%%!" +call "%_PROG_HOME%\bin\cli-common-platform.bat" + +@rem SCALA_CLI_CMD_WIN is an array, set in cli-common-platform.bat +call %SCALA_CLI_CMD_WIN% "--prog-name" "scala" "--cli-default-scala-version" "%_SCALA_VERSION%" "-r" "%MVN_REPOSITORY%" %* -call "%_JAVACMD%" "-jar" "%SCALA_CLI_JAR%" "--prog-name" "scala" "--cli-default-scala-version" "%_SCALA_VERSION%" "-r" "%MVN_REPOSITORY%" %* if not %ERRORLEVEL%==0 ( set _EXITCODE=1& goto end ) goto end @@ -42,19 +43,8 @@ if not "%char%"==":" ( goto :findColon ) -@REM set _PROG_HOME to the substring from the first colon to the end -set "_PROG_HOME_SUB=!_PROG_HOME:~%index%!" -@REM strip initial character -set "_PROG_HOME_SUB=!_PROG_HOME_SUB:~1!" - -@REM set drive to substring from 0 to the first colon -set "_PROG_HOME_DRIVE=!_PROG_HOME:~0,%index%!" 
- - - set "_SCALA_VERSION=" -set "MVN_REPOSITORY=file://%_PROG_HOME_DRIVE%\%_PROG_HOME_SUB:\=/%/maven2" -set "SCALA_CLI_JAR=%_PROG_HOME%\etc\scala-cli.jar" +set "MVN_REPOSITORY=file:///%_PROG_HOME:\=/%/maven2" @rem read for version:=_SCALA_VERSION in VERSION_FILE FOR /F "usebackq delims=" %%G IN ("%_PROG_HOME%\VERSION") DO ( diff --git a/project/Build.scala b/project/Build.scala index 0876353a6a2f..99871c4c87e8 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -28,7 +28,6 @@ import sbttastymima.TastyMiMaPlugin import sbttastymima.TastyMiMaPlugin.autoImport._ import scala.util.Properties.isJavaAtLeast -import scala.collection.mutable import org.portablescala.sbtplatformdeps.PlatformDepsPlugin.autoImport._ import org.scalajs.linker.interface.{ModuleInitializer, StandardConfig} @@ -119,7 +118,11 @@ object Build { val mimaPreviousLTSDottyVersion = "3.3.0" /** Version of Scala CLI to download */ - val scalaCliLauncherVersion = "1.3.1" + val scalaCliLauncherVersion = "1.3.2" + /** Version of Scala CLI to download (on Windows - last known validated version) */ + val scalaCliLauncherVersionWindows = "1.3.2" + /** Version of Coursier to download for initializing the local maven repo of Scala command */ + val coursierJarVersion = "2.1.10" object CompatMode { final val BinaryCompatible = 0 @@ -2121,22 +2124,72 @@ object Build { packMain := Map(), publishArtifact := false, packGenerateMakefile := false, - packArchiveName := "scala3-" + dottyVersion, republishRepo := target.value / "republish", - republishLaunchers := { - val cliV = scalaCliLauncherVersion - Seq( - ("scala-cli.jar", cliV, url(s"https://github.com/VirtusLab/scala-cli/releases/download/v$cliV/scala-cli.jar")) - ) - }, + packResourceDir += (republishRepo.value / "bin" -> "bin"), + packResourceDir += (republishRepo.value / "maven2" -> "maven2"), Compile / pack := (Compile / pack).dependsOn(republish).value, ) lazy val dist = project.asDist(Bootstrapped) .settings( - packResourceDir += (baseDirectory.value / "bin" -> "bin"), - packResourceDir += (republishRepo.value / "maven2" -> "maven2"), - packResourceDir += (republishRepo.value / "etc" -> "etc"), + packArchiveName := "scala3-" + dottyVersion, + republishBinDir := baseDirectory.value / "bin", + republishCoursier += + ("coursier.jar" -> s"https://github.com/coursier/coursier/releases/download/v$coursierJarVersion/coursier.jar"), + republishLaunchers += + ("scala-cli.jar" -> s"https://github.com/VirtusLab/scala-cli/releases/download/v$scalaCliLauncherVersion/scala-cli.jar"), + ) + + lazy val `dist-mac-x86_64` = project.in(file("dist/mac-x86_64")).asDist(Bootstrapped) + .settings( + republishBinDir := (dist / republishBinDir).value, + packArchiveName := (dist / packArchiveName).value + "-x86_64-apple-darwin", + republishBinOverrides += (dist / baseDirectory).value / "bin-native-overrides", + republishFetchCoursier := (dist / republishFetchCoursier).value, + republishLaunchers += + ("scala-cli" -> s"gz+https://github.com/VirtusLab/scala-cli/releases/download/v$scalaCliLauncherVersion/scala-cli-x86_64-apple-darwin.gz") + ) + + lazy val `dist-mac-aarch64` = project.in(file("dist/mac-aarch64")).asDist(Bootstrapped) + .settings( + republishBinDir := (dist / republishBinDir).value, + packArchiveName := (dist / packArchiveName).value + "-aarch64-apple-darwin", + republishBinOverrides += (dist / baseDirectory).value / "bin-native-overrides", + republishFetchCoursier := (dist / republishFetchCoursier).value, + republishLaunchers += + ("scala-cli" -> 
s"gz+https://github.com/VirtusLab/scala-cli/releases/download/v$scalaCliLauncherVersion/scala-cli-aarch64-apple-darwin.gz") + ) + + lazy val `dist-win-x86_64` = project.in(file("dist/win-x86_64")).asDist(Bootstrapped) + .settings( + republishBinDir := (dist / republishBinDir).value, + packArchiveName := (dist / packArchiveName).value + "-x86_64-pc-win32", + republishBinOverrides += (dist / baseDirectory).value / "bin-native-overrides", + republishFetchCoursier := (dist / republishFetchCoursier).value, + republishExtraProps += ("cli_version" -> scalaCliLauncherVersion), + mappings += (republishRepo.value / "etc" / "EXTRA_PROPERTIES" -> "EXTRA_PROPERTIES"), + republishLaunchers += + ("scala-cli.exe" -> s"zip+https://github.com/VirtusLab/scala-cli/releases/download/v$scalaCliLauncherVersionWindows/scala-cli-x86_64-pc-win32.zip!/scala-cli.exe") + ) + + lazy val `dist-linux-x86_64` = project.in(file("dist/linux-x86_64")).asDist(Bootstrapped) + .settings( + republishBinDir := (dist / republishBinDir).value, + packArchiveName := (dist / packArchiveName).value + "-x86_64-pc-linux", + republishBinOverrides += (dist / baseDirectory).value / "bin-native-overrides", + republishFetchCoursier := (dist / republishFetchCoursier).value, + republishLaunchers += + ("scala-cli" -> s"gz+https://github.com/VirtusLab/scala-cli/releases/download/v$scalaCliLauncherVersion/scala-cli-x86_64-pc-linux.gz") + ) + + lazy val `dist-linux-aarch64` = project.in(file("dist/linux-aarch64")).asDist(Bootstrapped) + .settings( + republishBinDir := (dist / republishBinDir).value, + packArchiveName := (dist / packArchiveName).value + "-aarch64-pc-linux", + republishBinOverrides += (dist / baseDirectory).value / "bin-native-overrides", + republishFetchCoursier := (dist / republishFetchCoursier).value, + republishLaunchers += + ("scala-cli" -> s"gz+https://github.com/VirtusLab/scala-cli/releases/download/v$scalaCliLauncherVersion/scala-cli-aarch64-pc-linux.gz") ) private def customMimaReportBinaryIssues(issueFilterLocation: String) = mimaReportBinaryIssues := { @@ -2265,8 +2318,7 @@ object Build { settings(scala3PresentationCompilerBuildInfo) def asDist(implicit mode: Mode): Project = project. - enablePlugins(PackPlugin). - enablePlugins(RepublishPlugin). + enablePlugins(PackPlugin, RepublishPlugin). withCommonSettings. settings(commonDistSettings). 
dependsOn( diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index bd1190dfec88..537c82d62cce 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -12,9 +12,45 @@ import sbt.util.CacheImplicits._ import scala.collection.mutable import java.nio.file.Files +import java.nio.file.attribute.PosixFilePermission +import java.nio.file.{Files, Path} + +import scala.jdk.CollectionConverters._ + /** This local plugin provides ways of publishing a project classpath and library dependencies to * .a local repository */ object RepublishPlugin extends AutoPlugin { + + /** copied from github.com/coursier/coursier */ + private object FileUtil { + + def tryMakeExecutable(path: Path): Boolean = + try { + val perms = Files.getPosixFilePermissions(path).asScala.toSet + + var newPerms = perms + if (perms(PosixFilePermission.OWNER_READ)) + newPerms += PosixFilePermission.OWNER_EXECUTE + if (perms(PosixFilePermission.GROUP_READ)) + newPerms += PosixFilePermission.GROUP_EXECUTE + if (perms(PosixFilePermission.OTHERS_READ)) + newPerms += PosixFilePermission.OTHERS_EXECUTE + + if (newPerms != perms) + Files.setPosixFilePermissions( + path, + newPerms.asJava + ) + + true + } + catch { + case _: UnsupportedOperationException => + false + } + + } + override def trigger = allRequirements override def requires = super.requires && PublishBinPlugin && PackPlugin @@ -24,9 +60,17 @@ object RepublishPlugin extends AutoPlugin { val republishAllResolved = taskKey[Seq[ResolvedArtifacts]]("Resolve the dependencies for the distribution") val republishClasspath = taskKey[Set[File]]("cache the dependencies for the distribution") val republishFetchLaunchers = taskKey[Set[File]]("cache the launcher deps for the distribution") + val republishFetchCoursier = taskKey[File]("cache the coursier.jar for resolving the local maven repo.") + val republishPrepareBin = taskKey[File]("prepare the bin directory, including launchers and scripts.") + val republishWriteExtraProps = taskKey[Option[File]]("write extra properties for the launchers.") + val republishBinDir = settingKey[File]("where to find static files for the bin dir.") + val republishCoursierDir = settingKey[File]("where to download the coursier launcher jar.") + val republishBinOverrides = settingKey[Seq[File]]("files to override those in bin-dir.") val republish = taskKey[File]("cache the dependencies and download launchers for the distribution") val republishRepo = settingKey[File]("the location to store the republished artifacts.") - val republishLaunchers = settingKey[Seq[(String, String, URL)]]("launchers to download. Sequence of (name, version, URL).") + val republishLaunchers = settingKey[Seq[(String, String)]]("launchers to download. Sequence of (name, URL).") + val republishCoursier = settingKey[Seq[(String, String)]]("coursier launcher to download. 
Sequence of (name, URL).") + val republishExtraProps = settingKey[Seq[(String, String)]]("extra properties for launchers.") } import autoImport._ @@ -34,11 +78,207 @@ object RepublishPlugin extends AutoPlugin { case class SimpleModuleId(org: String, name: String, revision: String) { override def toString = s"$org:$name:$revision" } - case class ResolvedArtifacts(id: SimpleModuleId, jar: File, pom: File) + case class ResolvedArtifacts(id: SimpleModuleId, jar: Option[File], pom: Option[File]) - val isRelease = sys.env.get("RELEASEBUILD") == Some("yes") + private def republishResolvedArtifacts(resolved: Seq[ResolvedArtifacts], mavenRepo: File, logOpt: Option[Logger]): Set[File] = { + IO.createDirectory(mavenRepo) + resolved.map { ra => + for (log <- logOpt) + log.info(s"[republish] publishing ${ra.id} to $mavenRepo...") + val jarOpt = ra.jar + val pomOpt = ra.pom + + assert(jarOpt.nonEmpty || pomOpt.nonEmpty, s"Neither jar nor pom found for ${ra.id}") + + val pathElems = ra.id.org.split('.').toVector :+ ra.id.name :+ ra.id.revision + val artifactDir = pathElems.foldLeft(mavenRepo)(_ / _) + IO.createDirectory(artifactDir) + for (pom <- pomOpt) IO.copyFile(pom, artifactDir / pom.getName) + for (jar <- jarOpt) IO.copyFile(jar, artifactDir / jar.getName) + artifactDir + }.toSet + } + + private def coursierCmd(jar: File, cache: File, args: Seq[String]): Unit = { + val jar0 = jar.getAbsolutePath.toString + val javaHome = sys.props.get("java.home").getOrElse { + throw new MessageOnlyException("java.home property not set") + } + val javaCmd = { + val cmd = if (scala.util.Properties.isWin) "java.exe" else "java" + (file(javaHome) / "bin" / cmd).getAbsolutePath + } + val env = Map("COURSIER_CACHE" -> cache.getAbsolutePath.toString) + val cmdLine = Seq(javaCmd, "-jar", jar0) ++ args + // invoke cmdLine with env + val p = new ProcessBuilder(cmdLine: _*).inheritIO() + p.environment().putAll(env.asJava) + val proc = p.start() + proc.waitFor() + if (proc.exitValue() != 0) + throw new MessageOnlyException(s"Error running coursier.jar with args ${args.mkString(" ")}") + } + + private def coursierFetch(coursierJar: File, log: Logger, cacheDir: File, localRepo: File, libs: Seq[String]): Unit = { + val localRepoArg = { + val path = localRepo.getAbsolutePath + if (scala.util.Properties.isWin) { + val path0 = path.replace('\\', '/') + s"file:///$path0" // extra root slash for Windows paths + } + else + s"file://$path" + } + + IO.createDirectory(cacheDir) + for (lib <- libs) { + log.info(s"[republish] Fetching $lib with coursier.jar...") + coursierCmd(coursierJar, cacheDir, + Seq( + "fetch", + "--repository", localRepoArg, + lib + ) + ) + } + } + + /**Resolve the transitive library dependencies of `libs` to `csrCacheDir`. 
+ */ + private def resolveLibraryDeps( + coursierJar: File, + log: Logger, + csrCacheDir: File, + localRepo: File, + resolvedLocal: Seq[ResolvedArtifacts]): Seq[ResolvedArtifacts] = { + + // publish the local artifacts to the local repo, so coursier can resolve them + republishResolvedArtifacts(resolvedLocal, localRepo, logOpt = None) + + coursierFetch(coursierJar, log, csrCacheDir, localRepo, resolvedLocal.map(_.id.toString)) + + val maven2Root = java.nio.file.Files.walk(csrCacheDir.toPath) + .filter(_.getFileName.toString == "maven2") + .findFirst() + .orElseThrow(() => new MessageOnlyException(s"Could not find maven2 directory in $csrCacheDir")) + + def pathToArtifact(p: Path): ResolvedArtifacts = { + // relative path from maven2Root + val lastAsString = p.getFileName.toString + val relP = maven2Root.relativize(p) + val parts = relP.iterator().asScala.map(_.toString).toVector + val (orgParts :+ name :+ rev :+ _) = parts + val id = SimpleModuleId(orgParts.mkString("."), name, rev) + if (lastAsString.endsWith(".jar")) { + ResolvedArtifacts(id, Some(p.toFile), None) + } else { + ResolvedArtifacts(id, None, Some(p.toFile)) + } + } + + java.nio.file.Files.walk(maven2Root) + .filter(p => { + val lastAsString = p.getFileName.toString + lastAsString.endsWith(".pom") || lastAsString.endsWith(".jar") + }) + .map[ResolvedArtifacts](pathToArtifact(_)) + .iterator() + .asScala + .toSeq + } + + private def fetchFilesTask( + libexecT: Def.Initialize[Task[File]], + srcs: SettingKey[Seq[(String, String)]], + strict: Boolean) = Def.task[Set[File]] { + val s = streams.value + val log = s.log + val repoDir = republishRepo.value + val launcherVersions = srcs.value + val libexec = libexecT.value + + val dlCache = s.cacheDirectory / "republish-launchers" + + val store = s.cacheStoreFactory / "versions" + + def work(name: String, dest: File, launcher: String): File = { + val (launcherURL, workFile, prefix, subPart) = { + if (launcher.startsWith("gz+")) { + IO.createDirectory(dlCache) + val launcherURL = url(launcher.stripPrefix("gz+")) + (launcherURL, dlCache / s"$name.gz", "gz", "") + } else if (launcher.startsWith("zip+")) { + IO.createDirectory(dlCache) + val (urlPart, subPath) = launcher.split("!/") match { + case Array(urlPart, subPath) => (urlPart, subPath) + case _ => + throw new MessageOnlyException(s"[republish] Invalid zip+ URL, expected ! 
to mark subpath: $launcher") + } + val launcherURL = url(urlPart.stripPrefix("zip+")) + (launcherURL, dlCache / s"$name.zip", "zip", subPath) + } else { + IO.createDirectory(libexec) + (url(launcher), dest, "", "") + } + } + IO.delete(workFile) + Using.urlInputStream(launcherURL) { in => + log.info(s"[republish] Downloading $launcherURL to $workFile...") + IO.transfer(in, workFile) + log.info(s"[republish] Downloaded $launcherURL to $workFile...") + } + if (prefix == "gz") { + IO.delete(dest) + Using.fileInputStream(workFile) { in => + Using.gzipInputStream(in) { gzIn => + IO.transfer(gzIn, dest) + } + } + log.info(s"[republish] uncompressed gz file $workFile to $dest...") + IO.delete(workFile) + } else if (prefix == "zip") { + IO.delete(dest) + val files = IO.unzip(workFile, dlCache, new ExactFilter(subPart)) + val extracted = files.headOption.getOrElse(throw new MessageOnlyException(s"[republish] No files extracted from $workFile matching $subPart")) + log.info(s"[republish] unzipped $workFile to $extracted...") + IO.move(extracted, dest) + log.info(s"[republish] moved $extracted to $dest...") + IO.delete(workFile) + } + FileUtil.tryMakeExecutable(dest.toPath) + dest + } + + val allLaunchers = { + if (strict && launcherVersions.isEmpty) + throw new MessageOnlyException(s"[republish] No launchers to fetch, check the build configuration for ${srcs.key.label}.") + + for ((name, launcher) <- launcherVersions) yield { + val dest = libexec / name + + val id = name.replaceAll("[^a-zA-Z0-9]", "_") + + val fetchAction = Tracked.inputChanged[String, File](store.make(id)) { (inChanged, launcher) => + if (inChanged || !Files.exists(dest.toPath)) { + work(name, dest, launcher) + } else { + log.info(s"[republish] Using cached $name launcher ($launcher).") + dest + } + } + + fetchAction(launcher) + } + } + allLaunchers.toSet + } override val projectSettings: Seq[Def.Setting[_]] = Def.settings( + republishCoursierDir := republishRepo.value / "coursier", + republishLaunchers := Seq.empty, + republishCoursier := Seq.empty, + republishBinOverrides := Seq.empty, + republishExtraProps := Seq.empty, republishLocalResolved / republishProjectRefs := { val proj = thisProjectRef.value val deps = buildDependencies.value @@ -55,7 +295,6 @@ object RepublishPlugin extends AutoPlugin { ids.zip(published).map({ case (id, as) => val simpleId = { - val disabled = CrossVersion.disabled val name0 = id.crossVersion match { case cv: CrossVersion.Binary => // projectID does not add binary suffix @@ -76,122 +315,85 @@ object RepublishPlugin extends AutoPlugin { }) assert(jarOrNull != null, s"Could not find jar for ${id}") assert(pomOrNull != null, s"Could not find pom for ${id}") - ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) + ResolvedArtifacts(simpleId, Some(jarOrNull), Some(pomOrNull)) }) } }.value, republishAllResolved := { - val localResolved = republishLocalResolved.value + val resolvedLocal = republishLocalResolved.value + val coursierJar = republishFetchCoursier.value val report = (thisProjectRef / updateFull).value + val s = streams.value + val lm = (republishAllResolved / dependencyResolution).value + val cacheDir = republishRepo.value - val found = mutable.Map.empty[SimpleModuleId, ResolvedArtifacts] - val evicted = mutable.Set.empty[SimpleModuleId] - - localResolved.foreach({ resolved => - val simpleId = resolved.id - if(isRelease) - evicted += simpleId.copy(revision = simpleId.revision + "-bin-nonbootstrapped") - else - evicted += simpleId.copy(revision = simpleId.revision + "-nonbootstrapped") - 
found(simpleId) = resolved - }) + val log = s.log + val csrCacheDir = s.cacheDirectory / "csr-cache" + val localRepo = s.cacheDirectory / "localRepo" / "maven2" - report.allModuleReports.foreach { mr => - val simpleId = { - val id = mr.module - SimpleModuleId(id.organization, id.name, id.revision) - } + // resolve the transitive dependencies of the local artifacts + val resolvedLibs = resolveLibraryDeps(coursierJar, log, csrCacheDir, localRepo, resolvedLocal) - if (!found.contains(simpleId) && !evicted(simpleId)) { - var jarOrNull: File = null - var pomOrNull: File = null - mr.artifacts.foreach({ case (a, f) => - if (a.`type` == "jar" || a.`type` == "bundle") { - jarOrNull = f - } else if (a.`type` == "pom") { - pomOrNull = f - } - }) - assert(jarOrNull != null, s"Could not find jar for ${simpleId}") - if (pomOrNull == null) { - val jarPath = jarOrNull.toPath - // we found the jar, so assume we can resolve a sibling pom file - val pomPath = jarPath.resolveSibling(jarPath.getFileName.toString.stripSuffix(".jar") + ".pom") - assert(Files.exists(pomPath), s"Could not find pom for ${simpleId}") - pomOrNull = pomPath.toFile - } - found(simpleId) = ResolvedArtifacts(simpleId, jarOrNull, pomOrNull) - } + // the combination of local artifacts and resolved transitive dependencies + val merged = + (resolvedLocal ++ resolvedLibs).groupBy(_.id).values.map(_.reduce { (ra1, ra2) => + val jar = ra1.jar.orElse(ra2.jar) + val pom = ra1.pom.orElse(ra2.pom) + ResolvedArtifacts(ra1.id, jar, pom) + }) - } - found.values.toSeq + merged.toSeq }, republishClasspath := { val s = streams.value val resolved = republishAllResolved.value val cacheDir = republishRepo.value - - val log = s.log - val mavenRepo = cacheDir / "maven2" - IO.createDirectory(mavenRepo) - resolved.map { ra => - log.info(s"[republish] publishing ${ra.id} to $mavenRepo...") - val jar = ra.jar - val pom = ra.pom - - val pathElems = ra.id.org.split('.').toVector :+ ra.id.name :+ ra.id.revision - val artifactDir = pathElems.foldLeft(mavenRepo)(_ / _) - IO.createDirectory(artifactDir) - IO.copyFile(jar, artifactDir / jar.getName) - IO.copyFile(pom, artifactDir / pom.getName) - artifactDir - }.toSet + republishResolvedArtifacts(resolved, cacheDir / "maven2", logOpt = Some(s.log)) }, republishFetchLaunchers := { - val s = streams.value - val log = s.log + fetchFilesTask(republishPrepareBin, republishLaunchers, strict = true).value + }, + republishFetchCoursier := { + fetchFilesTask(republishCoursierDir.toTask, republishCoursier, strict = true).value.head + }, + republishPrepareBin := { + val baseDir = baseDirectory.value + val srcBin = republishBinDir.value + val overrides = republishBinOverrides.value val repoDir = republishRepo.value - val launcherVersions = republishLaunchers.value - - val etc = repoDir / "etc" - val store = s.cacheStoreFactory / "versions" - - def work(dest: File, launcher: URL) = { - IO.delete(dest) - Using.urlInputStream(launcher) { in => - IO.createDirectory(etc) - log.info(s"[republish] Downloading $launcher to $dest...") - IO.transfer(in, dest) - log.info(s"[republish] Downloaded $launcher to $dest...") - } - dest + val targetBin = repoDir / "bin" + IO.copyDirectory(srcBin, targetBin) + overrides.foreach { dir => + IO.copyDirectory(dir, targetBin, overwrite = true) } - - val allLaunchers = { - for ((name, version, launcher) <- launcherVersions) yield { - val dest = etc / name - - val id = name.replaceAll("[^a-zA-Z0-9]", "_") - - val fetchAction = Tracked.inputChanged[String, File](store.make(id)) { (inChanged, version) => - if 
(inChanged || !Files.exists(dest.toPath)) { - work(dest, launcher) - } else { - log.info(s"[republish] Using cached $launcher at $dest...") - dest - } + targetBin + }, + republishWriteExtraProps := { + val s = streams.value + val log = s.log + val extraProps = republishExtraProps.value + if (extraProps.isEmpty) { + log.info("[republish] No extra properties to write.") + None + } + else { + val repoDir = republishRepo.value + val propsFile = repoDir / "etc" / "EXTRA_PROPERTIES" + log.info(s"[republish] Writing extra properties to $propsFile...") + Using.fileWriter()(propsFile) { writer => + extraProps.foreach { case (k, v) => + writer.write(s"$k:=$v\n") } - - fetchAction(version) } + Some(propsFile) } - allLaunchers.toSet }, republish := { val cacheDir = republishRepo.value val artifacts = republishClasspath.value val launchers = republishFetchLaunchers.value + val extraProps = republishWriteExtraProps.value cacheDir } ) diff --git a/project/scripts/bootstrappedOnlyCmdTests b/project/scripts/bootstrappedOnlyCmdTests index f3d730f8f494..11c35a7028cc 100755 --- a/project/scripts/bootstrappedOnlyCmdTests +++ b/project/scripts/bootstrappedOnlyCmdTests @@ -15,13 +15,13 @@ echo "testing scala.quoted.Expr.run from sbt scala" grep -qe "val a: scala.Int = 3" "$tmp" # setup for `scalac`/`scala` script tests -"$SBT" dist/pack +"$SBT" "$DIST_PROJECT/pack" -echo "capturing scala version from dist/target/pack/VERSION" -IFS=':=' read -ra versionProps < "$ROOT/dist/target/pack/VERSION" # temporarily set IFS to ':=' to split versionProps +echo "capturing scala version from $DIST_DIR/target/pack/VERSION" +IFS=':=' read -ra versionProps < "$ROOT/$DIST_DIR/target/pack/VERSION" # temporarily set IFS to ':=' to split versionProps [ ${#versionProps[@]} -eq 3 ] && \ [ ${versionProps[0]} = "version" ] && \ - [ -n ${versionProps[2]} ] || die "Expected non-empty 'version' property in $ROOT/dist/target/pack/VERSION" + [ -n ${versionProps[2]} ] || die "Expected non-empty 'version' property in $ROOT/$DIST_DIR/target/pack/VERSION" scala_version=${versionProps[2]} # check that `scalac` compiles and `scala` runs it @@ -77,7 +77,7 @@ echo "testing sbt scalac with suspension" clear_out "$OUT" "$SBT" "scala3-compiler-bootstrapped/scalac -d $OUT tests/pos-macros/macros-in-same-project-1/Bar.scala tests/pos-macros/macros-in-same-project-1/Foo.scala" > "$tmp" -# echo ":quit" | ./dist/target/pack/bin/scala # not supported by CI +# echo ":quit" | ./$DIST_DIR/target/pack/bin/scala # not supported by CI echo "testing ./bin/scaladoc" clear_out "$OUT1" @@ -101,6 +101,13 @@ grep -qe "See 'scala --help' to read about a specific subcommand." "$t ./bin/scala -d hello.jar tests/run/hello.scala ls hello.jar +clear_cli_dotfiles tests/run + +# check that `scala` runs scripts with args +echo "testing ./bin/scala with arguments" +./bin/scala run project/scripts/echoArgs.sc -- abc true 123 > "$tmp" +test "$EXPECTED_OUTPUT_ARGS" = "$(cat "$tmp")" +clear_cli_dotfiles project/scripts echo "testing i12973" clear_out "$OUT" diff --git a/project/scripts/buildScalaBinary b/project/scripts/buildScalaBinary new file mode 100755 index 000000000000..7fc5275e5d8d --- /dev/null +++ b/project/scripts/buildScalaBinary @@ -0,0 +1,12 @@ +#!/usr/bin/env bash + +set -e + +ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" >& /dev/null && pwd)/../.." 
+SBT="$ROOT/project/scripts/sbt" # if run on CI + +# set the $DIST_PROJECT and $DIST_DIR variables +source "$ROOT/bin/common-platform" + +# build the scala/scalac/scaladoc binary, where scala is native for the current platform. +"$SBT" "$DIST_PROJECT/pack" diff --git a/project/scripts/cmdTestsCommon.inc.sh b/project/scripts/cmdTestsCommon.inc.sh index a37ab757c057..bccb4aa56ac1 100644 --- a/project/scripts/cmdTestsCommon.inc.sh +++ b/project/scripts/cmdTestsCommon.inc.sh @@ -9,11 +9,15 @@ SOURCE="tests/pos/HelloWorld.scala" MAIN="HelloWorld" TASTY="HelloWorld.tasty" EXPECTED_OUTPUT="hello world" +EXPECTED_OUTPUT_ARGS="[0:abc],[1:true],[2:123]" OUT=$(mktemp -d) OUT1=$(mktemp -d) tmp=$(mktemp) +# set the $DIST_PROJECT and $DIST_DIR variables +source "$ROOT/bin/common-platform" + die () { echo >&2 "$@" exit 1 @@ -24,3 +28,16 @@ clear_out() local out="$1" rm -rf "$out"/* } + +clear_cli_dotfiles() +{ + local out="$1" + rm -rf "$out"/.bsp + rm -rf "$out"/.scala-build + + rm -f "$ROOT"/.bsp/scala.json + if [ -z "$(ls -A "$ROOT"/.bsp)" ]; then + rm -rf "$ROOT"/.bsp + fi + rm -rf "$ROOT"/.scala-build +} diff --git a/project/scripts/echoArgs.sc b/project/scripts/echoArgs.sc new file mode 100644 index 000000000000..cb9acbb6ad2e --- /dev/null +++ b/project/scripts/echoArgs.sc @@ -0,0 +1,6 @@ +// This is a Scala CLI script + +val formatted = + (for (arg, i) <- args.zipWithIndex yield + s"[$i:$arg]").mkString(",") +println(formatted) diff --git a/project/scripts/native-integration/bashTests b/project/scripts/native-integration/bashTests new file mode 100755 index 000000000000..5fb77355238c --- /dev/null +++ b/project/scripts/native-integration/bashTests @@ -0,0 +1,84 @@ +#!/usr/bin/env bash + +set -eux + +#/*---------------*\ +# * SETUP VARS *# +# *---------------*/ + +ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" >& /dev/null && pwd)/../../.." 
+ +SBT="$ROOT/project/scripts/sbt" # if run on CI +# SBT="sbt" # if run locally + +# set the $DIST_PROJECT and $DIST_DIR variables +source "$ROOT/bin/common-platform" + +die () { + echo >&2 "$@" + exit 1 +} + +PROG_HOME="$DIST_DIR/target/pack" + +SOURCE="$ROOT/tests/pos/HelloWorld.scala" +SOURCE_VERSION="$ROOT/project/scripts/native-integration/reportScalaVersion.scala" + +clear_cli_dotfiles() +{ + local out="$1" + rm -rf "$out"/.bsp + rm -rf "$out"/.scala-build + + rm -f "$ROOT"/.bsp/scala.json + if [ -z "$(ls -A "$ROOT"/.bsp)" ]; then + rm -rf "$ROOT"/.bsp + fi + rm -rf "$ROOT"/.scala-build +} + +#/*---------------*\ +# * INITIALIZE *# +# *---------------*/ + +# build the distribution +"$SBT" "$DIST_PROJECT/pack" + +SCALA_VERSION="" +# iterate through lines in VERSION_SRC +while IFS= read -r line; do + # if line starts with "version:=" then extract the version + if [[ "$line" == version:=* ]]; then + SCALA_VERSION="${line#version:=}" + break + fi +done < "$PROG_HOME/VERSION" + +if [ -z "$SCALA_VERSION" ]; then + die "Could not find scala version in $PROG_HOME/VERSION" +fi + +#/*-------------------*\ +# * TESTING BEGINS *# +# *-------------------*/ + +echo "assert native launcher matches expected version" +if [ -z "$LAUNCHER_EXPECTED_PROJECT" ]; then + die "LAUNCHER_EXPECTED_PROJECT is not set in the environment" +fi +test "$LAUNCHER_EXPECTED_PROJECT" = "$DIST_PROJECT" + +echo "testing version output (default)" +std_output=$("$PROG_HOME/bin/scala" version --scala-version) +test "$SCALA_VERSION" = "$std_output" + +echo "testing run command" +std_output=$("$PROG_HOME/bin/scala" run "$SOURCE" --power --offline --server=false) +test "hello world" = "$std_output" +clear_cli_dotfiles "$ROOT/tests/pos" + +echo "testing run command (-with-compiler)" +std_output=$("$PROG_HOME/bin/scala" run "$SOURCE_VERSION" -with-compiler --power --offline --server=false) +test "$SCALA_VERSION" = "$std_output" +clear_cli_dotfiles "$ROOT/project/scripts/native-integration" + diff --git a/project/scripts/native-integration/reportScalaVersion.scala b/project/scripts/native-integration/reportScalaVersion.scala new file mode 100644 index 000000000000..dc6e93708a48 --- /dev/null +++ b/project/scripts/native-integration/reportScalaVersion.scala @@ -0,0 +1,4 @@ +// To be ran by Scala CLI (requires -with-compiler command line option) + +@main def reportScalaVersion: Unit = + println(dotty.tools.dotc.config.Properties.versionNumberString) diff --git a/project/scripts/native-integration/winTests.bat b/project/scripts/native-integration/winTests.bat new file mode 100755 index 000000000000..a85b2c8c2531 --- /dev/null +++ b/project/scripts/native-integration/winTests.bat @@ -0,0 +1,19 @@ +@echo off +setlocal + +@rem paths are relative to the root project directory +set "_PREFIX=dist\win-x86_64\target\pack" +set "_SOURCE=tests\pos\HelloWorld.scala" +set "_OUT_DIR=out" + +@rem if-tests mimic the non-existing bash instruction 'set -e'. 
+call "%_PREFIX%\bin\scalac.bat" "@project\scripts\options" "%_SOURCE%" +if not %ERRORLEVEL%==0 endlocal& exit /b 1 + +call "%_PREFIX%\bin\scalac.bat" -d "%_OUT_DIR%" "%_SOURCE%" +if not %ERRORLEVEL%==0 endlocal& exit /b 1 + +call "%_PREFIX%\bin\scala.bat" --power -classpath "%_OUT_DIR%" -M HelloWorld --offline --server=false +if not %ERRORLEVEL%==0 endlocal& exit /b 1 + +endlocal diff --git a/project/scripts/winCmdTests b/project/scripts/winCmdTests index 2dffff5b196a..fe6a43c7f68f 100644 --- a/project/scripts/winCmdTests +++ b/project/scripts/winCmdTests @@ -1,7 +1,7 @@ #!/usr/bin/env bash set -e -PREFIX="dist/target/pack" +PREFIX="dist/win-x86_64/target/pack" SOURCE="tests/pos/HelloWorld.scala" $PREFIX/bin/scalac @project/scripts/options "$SOURCE" $PREFIX/bin/scalac -d out "$SOURCE" diff --git a/project/scripts/winCmdTests.bat b/project/scripts/winCmdTests.bat index d9b594d560ab..903f74d7ab98 100644 --- a/project/scripts/winCmdTests.bat +++ b/project/scripts/winCmdTests.bat @@ -2,7 +2,7 @@ setlocal @rem paths are relative to the root project directory -set "_PREFIX=dist\target\pack" +set "_PREFIX=dist\win-x86_64\target\pack" set "_SOURCE=tests\pos\HelloWorld.scala" set "_OUT_DIR=out" set "_SITE_DIR=_site" diff --git a/tests/cmdTest-sbt-tests/sourcepath-with-inline/src/main/scala/a/zz.scala b/tests/cmdTest-sbt-tests/sourcepath-with-inline/src/main/scala/a/zz.scala new file mode 100644 index 000000000000..17a7488ccb1a --- /dev/null +++ b/tests/cmdTest-sbt-tests/sourcepath-with-inline/src/main/scala/a/zz.scala @@ -0,0 +1,6 @@ +package a + +object Foo: // note that `Foo` is defined in `zz.scala` + class Local + inline def foo(using Local): Nothing = + ??? From 3d18e9841da5fde3417b83014ae0c28e6b0478f2 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Fri, 21 Jun 2024 08:14:09 +0200 Subject: [PATCH 273/371] Revert "Disable windows tests for RC1" This reverts commit 95e53df0b360849efc49f724125094869eaf98b3. 
--- .github/workflows/ci.yaml | 19 +++++++++++++++++-- 1 file changed, 17 insertions(+), 2 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index cad7caec490d..974866930c68 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -205,7 +205,16 @@ jobs: test_windows_fast: runs-on: [self-hosted, Windows] - if: false + if: "( + github.event_name == 'push' + && github.ref != 'refs/heads/main' + ) + || github.event_name == 'merge_group' + || ( + github.event_name == 'pull_request' + && !contains(github.event.pull_request.body, '[skip ci]') + && !contains(github.event.pull_request.body, '[skip test_windows_fast]') + )" steps: - name: Reset existing repo @@ -243,7 +252,13 @@ jobs: test_windows_full: runs-on: [self-hosted, Windows] - if: false + if: "github.event_name == 'schedule' && github.repository == 'scala/scala3' + || github.event_name == 'push' + || ( + github.event_name == 'pull_request' + && !contains(github.event.pull_request.body, '[skip ci]') + && contains(github.event.pull_request.body, '[test_windows_full]') + )" steps: - name: Reset existing repo From df91f071631c4521994a71f9f72a42714c6d3273 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Fri, 21 Jun 2024 12:07:30 +0200 Subject: [PATCH 274/371] Fix incorrect paths to sha256 check sum files in release workflow --- .github/workflows/ci.yaml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index 974866930c68..2747830fb7d6 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -951,8 +951,8 @@ jobs: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} with: upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./dist/linux-aarch64/target/sha256sum-aarch64-pc-linux.txt - asset_name: sha256sum.txt + asset_path: ./dist/linux-aarch64/target/sha256sum.txt + asset_name: sha256sum-aarch64-pc-linux.txt asset_content_type: text/plain - name: Upload SHA256 sum of the release artefacts to GitHub Release (mac x86-64) From 1520e88314bccf9bb42efd47ad3616c6e758548d Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Fri, 21 Jun 2024 09:58:22 +0200 Subject: [PATCH 275/371] Add changelog for 3.5.0-RC2 --- changelogs/3.5.0-RC2.md | 25 +++++++++++++++++++++++++ 1 file changed, 25 insertions(+) create mode 100644 changelogs/3.5.0-RC2.md diff --git a/changelogs/3.5.0-RC2.md b/changelogs/3.5.0-RC2.md new file mode 100644 index 000000000000..f3bb8b52c73c --- /dev/null +++ b/changelogs/3.5.0-RC2.md @@ -0,0 +1,25 @@ +# Backported fixes + +- Bundle scala-cli in scala command [#20351](https://github.com/scala/scala3/pull/20351) +- Avoid stacked thisCall contexts [#20488](https://github.com/scala/scala3/pull/20488) +- Adapt the workflow to release on SDKMAN! 
[#20535](https://github.com/scala/scala3/pull/20535) +- Adapt the release workflow to SIP-46 [#20565](https://github.com/scala/scala3/pull/20565) +- Disable ClasspathTests.unglobClasspathVerifyTest [#20551](https://github.com/scala/scala3/pull/20551) +- Set default source version to 3.5 [#20441](https://github.com/scala/scala3/pull/20441) +- Bring back ambiguity filter when we report an implicit not found error [#20368](https://github.com/scala/scala3/pull/20368) +- Treat 3.5-migration the same as 3.5 for a warning about implicit priority change [#20436](https://github.com/scala/scala3/pull/20436) +- Avoid forcing whole package when using -experimental [#20409](https://github.com/scala/scala3/pull/20409) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.5.0-RC1..3.5.0-RC2` these are: + +``` + 4 Hamza Remmal + 4 Wojciech Mazur + 3 Martin Odersky + 1 Jamie Thompson + 1 Guillaume Martres +``` From 828c03e236bfca6c3bd260eea5fabe6c9dddad5f Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Fri, 21 Jun 2024 09:59:05 +0200 Subject: [PATCH 276/371] Release 3.5.0-RC2 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 99871c4c87e8..6ff07701c06b 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -88,7 +88,7 @@ object Build { val referenceVersion = "3.4.2" - val baseVersion = "3.5.0-RC1" + val baseVersion = "3.5.0-RC2" // LTS or Next val versionLine = "Next" From ecf5a2e365d367804fb9b840ea042cb0128020cf Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Wed, 19 Jun 2024 15:21:30 +0100 Subject: [PATCH 277/371] Release .zip instead of .tar.gz for windows in sdkman --- .github/workflows/publish-sdkman.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/publish-sdkman.yml b/.github/workflows/publish-sdkman.yml index 2126a3237d83..02e00bcbf03d 100644 --- a/.github/workflows/publish-sdkman.yml +++ b/.github/workflows/publish-sdkman.yml @@ -44,7 +44,7 @@ jobs: - platform: MAC_ARM64 archive : 'scala3-${{ inputs.version }}-aarch64-apple-darwin.tar.gz' - platform: WINDOWS_64 - archive : 'scala3-${{ inputs.version }}-x86_64-pc-win32.tar.gz' + archive : 'scala3-${{ inputs.version }}-x86_64-pc-win32.zip' - platform: UNIVERSAL archive : 'scala3-${{ inputs.version }}.zip' steps: From 3ecd98200055c545f0adb68b92f7afd3bdc9f810 Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Wed, 19 Jun 2024 16:49:31 +0100 Subject: [PATCH 278/371] Do not release to the UNIVERSAL platform in sdkman --- .github/workflows/publish-sdkman.yml | 2 -- 1 file changed, 2 deletions(-) diff --git a/.github/workflows/publish-sdkman.yml b/.github/workflows/publish-sdkman.yml index 02e00bcbf03d..d4238b9371e4 100644 --- a/.github/workflows/publish-sdkman.yml +++ b/.github/workflows/publish-sdkman.yml @@ -45,8 +45,6 @@ jobs: archive : 'scala3-${{ inputs.version }}-aarch64-apple-darwin.tar.gz' - platform: WINDOWS_64 archive : 'scala3-${{ inputs.version }}-x86_64-pc-win32.zip' - - platform: UNIVERSAL - archive : 'scala3-${{ inputs.version }}.zip' steps: - uses: hamzaremmal/sdkman-release-action@7e437233a6bd79bc4cb0fa9071b685e94bdfdba6 with: From 0a7b7fe63efd4837b9aad66a0df77b6555cc15b2 Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Fri, 21 Jun 2024 11:12:34 +0100 Subject: [PATCH 279/371] Upload zip files to sdkman instead of .tar.gz --- .github/workflows/publish-sdkman.yml | 8 ++++---- 1 file 
changed, 4 insertions(+), 4 deletions(-) diff --git a/.github/workflows/publish-sdkman.yml b/.github/workflows/publish-sdkman.yml index d4238b9371e4..de12f81426b5 100644 --- a/.github/workflows/publish-sdkman.yml +++ b/.github/workflows/publish-sdkman.yml @@ -36,13 +36,13 @@ jobs: matrix: include: - platform: LINUX_64 - archive : 'scala3-${{ inputs.version }}-x86_64-pc-linux.tar.gz' + archive : 'scala3-${{ inputs.version }}-x86_64-pc-linux.zip' - platform: LINUX_ARM64 - archive : 'scala3-${{ inputs.version }}-aarch64-pc-linux.tar.gz' + archive : 'scala3-${{ inputs.version }}-aarch64-pc-linux.zip' - platform: MAC_OSX - archive : 'scala3-${{ inputs.version }}-x86_64-apple-darwin.tar.gz' + archive : 'scala3-${{ inputs.version }}-x86_64-apple-darwin.zip' - platform: MAC_ARM64 - archive : 'scala3-${{ inputs.version }}-aarch64-apple-darwin.tar.gz' + archive : 'scala3-${{ inputs.version }}-aarch64-apple-darwin.zip' - platform: WINDOWS_64 archive : 'scala3-${{ inputs.version }}-x86_64-pc-win32.zip' steps: From a9af5ccbfd4441c29e0c2b2b5e818bf2b53d875e Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Wed, 19 Jun 2024 12:21:16 +0200 Subject: [PATCH 280/371] replace pack command, do not produce lib directory, write classpath to file --- dist/bin/common | 63 ++--------- dist/bin/common.bat | 16 +-- dist/bin/scalac | 0 dist/bin/scalac.bat | 43 ++++---- dist/bin/scaladoc | 57 +--------- dist/bin/scaladoc.bat | 66 +++--------- project/Build.scala | 7 +- project/RepublishPlugin.scala | 193 ++++++++++++++++++++++++++++------ 8 files changed, 217 insertions(+), 228 deletions(-) mode change 100755 => 100644 dist/bin/common mode change 100644 => 100755 dist/bin/scalac diff --git a/dist/bin/common b/dist/bin/common old mode 100755 new mode 100644 index 4a0152fbc4cb..1ff0ca66274c --- a/dist/bin/common +++ b/dist/bin/common @@ -6,62 +6,21 @@ source "$PROG_HOME/bin/common-shared" # * The code below is for Dotty # *-------------------------------------------------*/ -find_lib () { - for lib in "$PROG_HOME"/lib/$1 ; do - if [[ -f "$lib" ]]; then - if [ -n "$CYGPATHCMD" ]; then - "$CYGPATHCMD" -am "$lib" - elif [[ $mingw || $msys ]]; then - echo "$lib" | sed 's|/|\\\\|g' - else - echo "$lib" - fi - return +load_classpath () { + command="$1" + psep_pattern="$2" + __CLASS_PATH="" + while IFS= read -r line; do + if ! 
[[ ( -n ${conemu-} || -n ${msys-}) && "$line" == "*jna-5*" ]]; then + # jna-5 only appropriate for some combinations + __CLASS_PATH+="$PROG_HOME/maven2/$line$psep_pattern" fi - done + done < "$PROG_HOME/etc/$command.classpath" + echo "$__CLASS_PATH" } -DOTTY_COMP=$(find_lib "*scala3-compiler*") -DOTTY_INTF=$(find_lib "*scala3-interfaces*") -DOTTY_LIB=$(find_lib "*scala3-library*") -DOTTY_STAGING=$(find_lib "*scala3-staging*") -DOTTY_TASTY_INSPECTOR=$(find_lib "*scala3-tasty-inspector*") -TASTY_CORE=$(find_lib "*tasty-core*") -SCALA_ASM=$(find_lib "*scala-asm*") -SCALA_LIB=$(find_lib "*scala-library*") -SBT_INTF=$(find_lib "*compiler-interface*") -JLINE_READER=$(find_lib "*jline-reader-3*") -JLINE_TERMINAL=$(find_lib "*jline-terminal-3*") -JLINE_TERMINAL_JNA=$(find_lib "*jline-terminal-jna-3*") - -# jna-5 only appropriate for some combinations -[[ ${conemu-} && ${msys-} ]] || JNA=$(find_lib "*jna-5*") - compilerJavaClasspathArgs () { - # echo "dotty-compiler: $DOTTY_COMP" - # echo "dotty-interface: $DOTTY_INTF" - # echo "dotty-library: $DOTTY_LIB" - # echo "tasty-core: $TASTY_CORE" - # echo "scala-asm: $SCALA_ASM" - # echo "scala-lib: $SCALA_LIB" - # echo "sbt-intface: $SBT_INTF" - - toolchain="" - toolchain+="$SCALA_LIB$PSEP" - toolchain+="$DOTTY_LIB$PSEP" - toolchain+="$SCALA_ASM$PSEP" - toolchain+="$SBT_INTF$PSEP" - toolchain+="$DOTTY_INTF$PSEP" - toolchain+="$DOTTY_COMP$PSEP" - toolchain+="$TASTY_CORE$PSEP" - toolchain+="$DOTTY_STAGING$PSEP" - toolchain+="$DOTTY_TASTY_INSPECTOR$PSEP" - - # jine - toolchain+="$JLINE_READER$PSEP" - toolchain+="$JLINE_TERMINAL$PSEP" - toolchain+="$JLINE_TERMINAL_JNA$PSEP" - [ -n "${JNA-}" ] && toolchain+="$JNA$PSEP" + toolchain="$(load_classpath "scala" "$PSEP")" if [ -n "${jvm_cp_args-}" ]; then jvm_cp_args="$toolchain$jvm_cp_args" diff --git a/dist/bin/common.bat b/dist/bin/common.bat index 7aef606d5509..f9c35e432b36 100644 --- a/dist/bin/common.bat +++ b/dist/bin/common.bat @@ -38,20 +38,6 @@ if not defined _PROG_HOME ( set _EXITCODE=1 goto :eof ) -set "_LIB_DIR=%_PROG_HOME%\lib" +set "_ETC_DIR=%_PROG_HOME%\etc" set _PSEP=; - -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*scala3-compiler*"') do set "_SCALA3_COMP=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*scala3-interfaces*"') do set "_SCALA3_INTF=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*scala3-library*"') do set "_SCALA3_LIB=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*scala3-staging*"') do set "_SCALA3_STAGING=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*scala3-tasty-inspector*"') do set "_SCALA3_TASTY_INSPECTOR=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*tasty-core*"') do set "_TASTY_CORE=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*scala-asm*"') do set "_SCALA_ASM=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*scala-library*"') do set "_SCALA_LIB=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*compiler-interface*"') do set "_SBT_INTF=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*jline-reader-3*"') do set "_JLINE_READER=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*jline-terminal-3*"') do set "_JLINE_TERMINAL=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*jline-terminal-jna-3*"') do set "_JLINE_TERMINAL_JNA=%_LIB_DIR%\%%f" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*jna-5*"') do set "_JNA=%_LIB_DIR%\%%f" diff --git a/dist/bin/scalac 
b/dist/bin/scalac old mode 100644 new mode 100755 diff --git a/dist/bin/scalac.bat b/dist/bin/scalac.bat index c8cd0babe60b..fe6d7e3fad4d 100644 --- a/dist/bin/scalac.bat +++ b/dist/bin/scalac.bat @@ -88,29 +88,10 @@ goto :eof @rem output parameter: _JVM_CP_ARGS :compilerJavaClasspathArgs -@rem echo scala3-compiler: %_SCALA3_COMP% -@rem echo scala3-interface: %_SCALA3_INTF% -@rem echo scala3-library: %_SCALA3_LIB% -@rem echo tasty-core: %_TASTY_CORE% -@rem echo scala-asm: %_SCALA_ASM% -@rem echo scala-lib: %_SCALA_LIB% -@rem echo sbt-intface: %_SBT_INTF% - -set "__TOOLCHAIN=%_SCALA_LIB%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_LIB%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA_ASM%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SBT_INTF%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_INTF%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_COMP%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_TASTY_CORE%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_STAGING%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_SCALA3_TASTY_INSPECTOR%%_PSEP%" - -@rem # jline -set "__TOOLCHAIN=%__TOOLCHAIN%%_JLINE_READER%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_JLINE_TERMINAL%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_JLINE_TERMINAL_JNA%%_PSEP%" -set "__TOOLCHAIN=%__TOOLCHAIN%%_JNA%%_PSEP%" + +call :loadClasspathFromFile + +set "__TOOLCHAIN=%_CLASS_PATH%" if defined _SCALA_CPATH ( set "_JVM_CP_ARGS=%__TOOLCHAIN%%_SCALA_CPATH%" @@ -119,6 +100,22 @@ if defined _SCALA_CPATH ( ) goto :eof +@REM concatentate every line in "%_ETC_DIR%\scala.classpath" with _PSEP +:loadClasspathFromFile +set _CLASS_PATH= +if exist "%_ETC_DIR%\scala.classpath" ( + for /f "usebackq delims=" %%i in ("%_ETC_DIR%\scala.classpath") do ( + set "_LIB=%_PROG_HOME%\maven2\%%i" + set "_LIB=!_LIB:/=\!" + if not defined _CLASS_PATH ( + set "_CLASS_PATH=!_LIB!" + ) else ( + set "_CLASS_PATH=!_CLASS_PATH!%_PSEP%!_LIB!" 
+ ) + ) +) +goto :eof + @rem ######################################################################### @rem ## Cleanups diff --git a/dist/bin/scaladoc b/dist/bin/scaladoc index 8b9ec41a7f8c..15bc0813f93a 100755 --- a/dist/bin/scaladoc +++ b/dist/bin/scaladoc @@ -53,62 +53,7 @@ addScrip() { } classpathArgs () { - CLASS_PATH="" - CLASS_PATH+="$(find_lib "*scaladoc*")$PSEP" - CLASS_PATH+="$(find_lib "*scala3-compiler*")$PSEP" - CLASS_PATH+="$(find_lib "*scala3-interfaces*")$PSEP" - CLASS_PATH+="$(find_lib "*scala3-library*")$PSEP" - CLASS_PATH+="$(find_lib "*tasty-core*")$PSEP" - CLASS_PATH+="$(find_lib "*scala3-tasty-inspector*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-0*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-anchorlink*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-autolink*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-emoji*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-gfm-strikethrough*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-gfm-tasklist*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-wikilink*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-yaml-front-matter*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-tables*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-ins*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ext-superscript*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-ast*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-data*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-dependency*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-misc*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-format*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-sequence*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-builder*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-collection*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-visitor*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-options*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-util-html*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-formatter*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-ast*")$PSEP" - CLASS_PATH+="$(find_lib "*liqp*")$PSEP" - CLASS_PATH+="$(find_lib "*jsoup*")$PSEP" - CLASS_PATH+="$(find_lib "*jackson-dataformat-yaml*")$PSEP" - CLASS_PATH+="$(find_lib "*jackson-datatype-jsr310*")$PSEP" - CLASS_PATH+="$(find_lib "*strftime4j*")$PSEP" - CLASS_PATH+="$(find_lib "*scala-asm*")$PSEP" - CLASS_PATH+="$(find_lib "*compiler-interface*")$PSEP" - CLASS_PATH+="$(find_lib "*jline-reader*")$PSEP" - CLASS_PATH+="$(find_lib "*jline-terminal-3*")$PSEP" - CLASS_PATH+="$(find_lib "*jline-terminal-jna*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-formatter*")$PSEP" - CLASS_PATH+="$(find_lib "*autolink-0.6*")$PSEP" - CLASS_PATH+="$(find_lib "*flexmark-jira-converter*")$PSEP" - CLASS_PATH+="$(find_lib "*antlr4*")$PSEP" - CLASS_PATH+="$(find_lib "*jackson-annotations*")$PSEP" - CLASS_PATH+="$(find_lib "*jackson-core*")$PSEP" - CLASS_PATH+="$(find_lib "*jackson-databind*")$PSEP" - CLASS_PATH+="$(find_lib "*snakeyaml*")$PSEP" - CLASS_PATH+="$(find_lib "*scala-library*")$PSEP" - CLASS_PATH+="$(find_lib "*protobuf-java*")$PSEP" - CLASS_PATH+="$(find_lib "*util-interface*")$PSEP" - CLASS_PATH+="$(find_lib "*jna-5*")$PSEP" - CLASS_PATH+="$(find_lib "*antlr4-runtime*")$PSEP" + CLASS_PATH="$(load_classpath "scaladoc" "$PSEP")" jvm_cp_args="-classpath \"$CLASS_PATH\"" } diff --git a/dist/bin/scaladoc.bat b/dist/bin/scaladoc.bat index c30a4689244c..16433a83f501 100644 --- a/dist/bin/scaladoc.bat +++ 
b/dist/bin/scaladoc.bat @@ -105,60 +105,24 @@ goto :eof @rem output parameter: _CLASS_PATH :classpathArgs -set "_LIB_DIR=%_PROG_HOME%\lib" -set _CLASS_PATH= +set "_ETC_DIR=%_PROG_HOME%\etc" @rem keep list in sync with bash script `bin\scaladoc` ! -call :updateClasspath "scaladoc" -call :updateClasspath "scala3-compiler" -call :updateClasspath "scala3-interfaces" -call :updateClasspath "scala3-library" -call :updateClasspath "tasty-core" -call :updateClasspath "scala3-tasty-inspector" -call :updateClasspath "flexmark-0" -call :updateClasspath "flexmark-html-parser" -call :updateClasspath "flexmark-ext-anchorlink" -call :updateClasspath "flexmark-ext-autolink" -call :updateClasspath "flexmark-ext-emoji" -call :updateClasspath "flexmark-ext-gfm-strikethrough" -call :updateClasspath "flexmark-ext-gfm-tables" -call :updateClasspath "flexmark-ext-gfm-tasklist" -call :updateClasspath "flexmark-ext-wikilink" -call :updateClasspath "flexmark-ext-yaml-front-matter" -call :updateClasspath "liqp" -call :updateClasspath "jsoup" -call :updateClasspath "jackson-dataformat-yaml" -call :updateClasspath "jackson-datatype-jsr310" -call :updateClasspath "strftime4j" -call :updateClasspath "scala-asm" -call :updateClasspath "compiler-interface" -call :updateClasspath "jline-reader" -call :updateClasspath "jline-terminal-3" -call :updateClasspath "jline-terminal-jna" -call :updateClasspath "flexmark-util" -call :updateClasspath "flexmark-formatter" -call :updateClasspath "autolink-0.6" -call :updateClasspath "flexmark-jira-converter" -call :updateClasspath "antlr4" -call :updateClasspath "jackson-annotations" -call :updateClasspath "jackson-core" -call :updateClasspath "jackson-databind" -call :updateClasspath "snakeyaml" -call :updateClasspath "scala-library" -call :updateClasspath "protobuf-java" -call :updateClasspath "util-interface" -call :updateClasspath "jna-5" -call :updateClasspath "flexmark-ext-tables" -call :updateClasspath "flexmark-ext-ins" -call :updateClasspath "flexmark-ext-superscript" -call :updateClasspath "antlr4-runtime" +call :loadClasspathFromFile goto :eof -@rem input parameter: %1=pattern for library file -@rem output parameter: _CLASS_PATH -:updateClasspath -set "__PATTERN=%~1" -for /f "delims=" %%f in ('dir /a-d /b "%_LIB_DIR%\*%__PATTERN%*" 2^>NUL') do ( - set "_CLASS_PATH=!_CLASS_PATH!%_LIB_DIR%\%%f%_PSEP%" +@REM concatentate every line in "%_ETC_DIR%\scaladoc.classpath" with _PSEP +:loadClasspathFromFile +set _CLASS_PATH= +if exist "%_ETC_DIR%\scaladoc.classpath" ( + for /f "usebackq delims=" %%i in ("%_ETC_DIR%\scaladoc.classpath") do ( + set "_LIB=%_PROG_HOME%\maven2\%%i" + set "_LIB=!_LIB:/=\!" + if not defined _CLASS_PATH ( + set "_CLASS_PATH=!_LIB!" + ) else ( + set "_CLASS_PATH=!_CLASS_PATH!%_PSEP%!_LIB!" 
+ ) + ) ) goto :eof diff --git a/project/Build.scala b/project/Build.scala index 6ff07701c06b..3ce365fac9f1 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -2127,7 +2127,12 @@ object Build { republishRepo := target.value / "republish", packResourceDir += (republishRepo.value / "bin" -> "bin"), packResourceDir += (republishRepo.value / "maven2" -> "maven2"), - Compile / pack := (Compile / pack).dependsOn(republish).value, + packResourceDir += (republishRepo.value / "etc" -> "etc"), + republishCommandLibs += + ("scala" -> List("scala3-interfaces", "scala3-compiler", "scala3-library", "tasty-core", "scala3-staging", "scala3-tasty-inspector")), + republishCommandLibs += + ("scaladoc" -> List("scala3-interfaces", "scala3-compiler", "scala3-library", "tasty-core", "scala3-tasty-inspector", "scaladoc")), + Compile / pack := republishPack.value, ) lazy val dist = project.asDist(Bootstrapped) diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index 537c82d62cce..6ce83c2f0abf 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -2,6 +2,7 @@ package dotty.tools.sbtplugin import sbt._ import xerial.sbt.pack.PackPlugin +import xerial.sbt.pack.PackPlugin.autoImport.{packResourceDir, packDir} import sbt.Keys._ import sbt.AutoPlugin import sbt.PublishBinPlugin @@ -66,7 +67,9 @@ object RepublishPlugin extends AutoPlugin { val republishBinDir = settingKey[File]("where to find static files for the bin dir.") val republishCoursierDir = settingKey[File]("where to download the coursier launcher jar.") val republishBinOverrides = settingKey[Seq[File]]("files to override those in bin-dir.") + val republishCommandLibs = settingKey[Seq[(String, List[String])]]("libraries needed for each command.") val republish = taskKey[File]("cache the dependencies and download launchers for the distribution") + val republishPack = taskKey[File]("do the pack command") val republishRepo = settingKey[File]("the location to store the republished artifacts.") val republishLaunchers = settingKey[Seq[(String, String)]]("launchers to download. Sequence of (name, URL).") val republishCoursier = settingKey[Seq[(String, String)]]("coursier launcher to download. 
Sequence of (name, URL).") @@ -99,7 +102,7 @@ object RepublishPlugin extends AutoPlugin { }.toSet } - private def coursierCmd(jar: File, cache: File, args: Seq[String]): Unit = { + private def coursierCmd(jar: File, cache: File): Seq[String] => List[String] = { val jar0 = jar.getAbsolutePath.toString val javaHome = sys.props.get("java.home").getOrElse { throw new MessageOnlyException("java.home property not set") @@ -108,38 +111,88 @@ object RepublishPlugin extends AutoPlugin { val cmd = if (scala.util.Properties.isWin) "java.exe" else "java" (file(javaHome) / "bin" / cmd).getAbsolutePath } - val env = Map("COURSIER_CACHE" -> cache.getAbsolutePath.toString) - val cmdLine = Seq(javaCmd, "-jar", jar0) ++ args - // invoke cmdLine with env - val p = new ProcessBuilder(cmdLine: _*).inheritIO() - p.environment().putAll(env.asJava) - val proc = p.start() - proc.waitFor() - if (proc.exitValue() != 0) - throw new MessageOnlyException(s"Error running coursier.jar with args ${args.mkString(" ")}") + val env = Map("COURSIER_CACHE" -> cache.getAbsolutePath.toString).asJava + val cmdLine0 = Seq(javaCmd, "-jar", jar0) + args => + val cmdLine = cmdLine0 ++ args + // invoke cmdLine with env, but also capture the output + val p = new ProcessBuilder(cmdLine: _*) + .directory(cache) + .inheritIO() + .redirectOutput(ProcessBuilder.Redirect.PIPE) + p.environment().putAll(env) + + val proc = p.start() + val in = proc.getInputStream + val output = { + try { + val src = scala.io.Source.fromInputStream(in) + try src.getLines().toList + finally src.close() + } finally { + in.close() + } + } + + proc.waitFor() + + if (proc.exitValue() != 0) + throw new MessageOnlyException(s"Error running coursier.jar with args ${args.mkString(" ")}") + + output + } + + private def resolveMaven2(repo: File): Path = { + java.nio.file.Files.walk(repo.toPath) + .filter(_.getFileName.toString == "maven2") + .findFirst() + .orElseThrow(() => new MessageOnlyException(s"Could not find maven2 directory in $repo")) + .toAbsolutePath() } - private def coursierFetch(coursierJar: File, log: Logger, cacheDir: File, localRepo: File, libs: Seq[String]): Unit = { + private def coursierFetch( + coursierJar: File, log: Logger, cacheDir: File, localRepo: File, libs: Seq[String]): Map[String, List[String]] = { + val localRepoPath = localRepo.getAbsolutePath val localRepoArg = { - val path = localRepo.getAbsolutePath - if (scala.util.Properties.isWin) { - val path0 = path.replace('\\', '/') - s"file:///$path0" // extra root slash for Windows paths + val uriPart = { + if (scala.util.Properties.isWin) { + s"/${localRepoPath.replace('\\', '/')}" // extra root slash for Windows paths + } + else { + localRepoPath // no change needed for Unix paths + } } - else - s"file://$path" + s"file://$uriPart" } - IO.createDirectory(cacheDir) - for (lib <- libs) { + val cacheDirPath = cacheDir.getAbsolutePath + lazy val maven2RootLocal = resolveMaven2(localRepo) + lazy val maven2RootCache = resolveMaven2(cacheDir) // lazy because cache dir isn't populated until after fetch + val cmd = coursierCmd(coursierJar, cacheDir) + val resolved = for (lib <- libs) yield { log.info(s"[republish] Fetching $lib with coursier.jar...") - coursierCmd(coursierJar, cacheDir, + val out = cmd( Seq( "fetch", + "--no-default", + "--repository", "central", "--repository", localRepoArg, lib ) ) + lib -> out.collect { + case s if s.startsWith(localRepoPath) => + maven2RootLocal.relativize(java.nio.file.Paths.get(s)).toString().replace('\\', '/') // format as uri + case s if 
s.startsWith(cacheDirPath) => + maven2RootCache.relativize(java.nio.file.Paths.get(s)).toString().replace('\\', '/') // format as uri + } + } + resolved.toMap + } + + private def fuzzyFind[V](map: Map[String, V], key: String): V = { + map.collectFirst({ case (k, v) if k.contains(key) => v }).getOrElse { + throw new MessageOnlyException(s"Could not find key $key in map $map") } } @@ -148,28 +201,34 @@ object RepublishPlugin extends AutoPlugin { private def resolveLibraryDeps( coursierJar: File, log: Logger, + republishDir: File, csrCacheDir: File, localRepo: File, - resolvedLocal: Seq[ResolvedArtifacts]): Seq[ResolvedArtifacts] = { + resolvedLocal: Seq[ResolvedArtifacts], + commandLibs: Seq[(String, List[String])]): Seq[ResolvedArtifacts] = { // publish the local artifacts to the local repo, so coursier can resolve them republishResolvedArtifacts(resolvedLocal, localRepo, logOpt = None) - coursierFetch(coursierJar, log, csrCacheDir, localRepo, resolvedLocal.map(_.id.toString)) + val classpaths = coursierFetch(coursierJar, log, csrCacheDir, localRepo, resolvedLocal.map(_.id.toString)) - val maven2Root = java.nio.file.Files.walk(csrCacheDir.toPath) - .filter(_.getFileName.toString == "maven2") - .findFirst() - .orElseThrow(() => new MessageOnlyException(s"Could not find maven2 directory in $csrCacheDir")) + if (commandLibs.nonEmpty) { + IO.createDirectory(republishDir / "etc") + for ((command, libs) <- commandLibs) { + val entries = libs.map(fuzzyFind(classpaths, _)).reduce(_ ++ _).distinct + IO.write(republishDir / "etc" / s"$command.classpath", entries.mkString("\n")) + } + } + + val maven2Root = resolveMaven2(csrCacheDir) def pathToArtifact(p: Path): ResolvedArtifacts = { // relative path from maven2Root - val lastAsString = p.getFileName.toString val relP = maven2Root.relativize(p) val parts = relP.iterator().asScala.map(_.toString).toVector - val (orgParts :+ name :+ rev :+ _) = parts + val (orgParts :+ name :+ rev :+ artifact) = parts val id = SimpleModuleId(orgParts.mkString("."), name, rev) - if (lastAsString.endsWith(".jar")) { + if (artifact.endsWith(".jar")) { ResolvedArtifacts(id, Some(p.toFile), None) } else { ResolvedArtifacts(id, None, Some(p.toFile)) @@ -279,6 +338,7 @@ object RepublishPlugin extends AutoPlugin { republishCoursier := Seq.empty, republishBinOverrides := Seq.empty, republishExtraProps := Seq.empty, + republishCommandLibs := Seq.empty, republishLocalResolved / republishProjectRefs := { val proj = thisProjectRef.value val deps = buildDependencies.value @@ -326,13 +386,15 @@ object RepublishPlugin extends AutoPlugin { val s = streams.value val lm = (republishAllResolved / dependencyResolution).value val cacheDir = republishRepo.value + val commandLibs = republishCommandLibs.value val log = s.log val csrCacheDir = s.cacheDirectory / "csr-cache" val localRepo = s.cacheDirectory / "localRepo" / "maven2" // resolve the transitive dependencies of the local artifacts - val resolvedLibs = resolveLibraryDeps(coursierJar, log, csrCacheDir, localRepo, resolvedLocal) + val resolvedLibs = resolveLibraryDeps( + coursierJar, log, cacheDir, csrCacheDir, localRepo, resolvedLocal, commandLibs) // the combination of local artifacts and resolved transitive dependencies val merged = @@ -395,6 +457,77 @@ object RepublishPlugin extends AutoPlugin { val launchers = republishFetchLaunchers.value val extraProps = republishWriteExtraProps.value cacheDir + }, + republishPack := { + val cacheDir = republish.value + val s = streams.value + val log = s.log + val distDir = target.value / 
packDir.value + val progVersion = version.value + + IO.createDirectory(distDir) + for ((path, dir) <- packResourceDir.value) { + val target = distDir / dir + IO.copyDirectory(path, target) + } + + locally { + // everything in this block is copied from sbt-pack plugin + import scala.util.Try + import java.time.format.DateTimeFormatterBuilder + import java.time.format.SignStyle + import java.time.temporal.ChronoField.* + import java.time.ZoneId + import java.time.Instant + import java.time.ZonedDateTime + import java.time.ZonedDateTime + import java.util.Locale + import java.util.Date + val base: File = new File(".") // Using the working directory as base for readability + + def write(path: String, content: String) { + val p = distDir / path + IO.write(p, content) + } + + val humanReadableTimestampFormatter = new DateTimeFormatterBuilder() + .parseCaseInsensitive() + .appendValue(YEAR, 4, 10, SignStyle.EXCEEDS_PAD) + .appendLiteral('-') + .appendValue(MONTH_OF_YEAR, 2) + .appendLiteral('-') + .appendValue(DAY_OF_MONTH, 2) + .appendLiteral(' ') + .appendValue(HOUR_OF_DAY, 2) + .appendLiteral(':') + .appendValue(MINUTE_OF_HOUR, 2) + .appendLiteral(':') + .appendValue(SECOND_OF_MINUTE, 2) + .appendOffset("+HHMM", "Z") + .toFormatter(Locale.US) + + // Retrieve build time + val systemZone = ZoneId.systemDefault().normalized() + val timestamp = ZonedDateTime.ofInstant(Instant.ofEpochMilli(new Date().getTime), systemZone) + val buildTime = humanReadableTimestampFormatter.format(timestamp) + + // Check the current Git revision + val gitRevision: String = Try { + if ((base / ".git").exists()) { + log.info("[republish] Checking the git revision of the current project") + sys.process.Process("git rev-parse HEAD").!! + } else { + "unknown" + } + }.getOrElse("unknown").trim + + + // Output the version number and Git revision + write("VERSION", s"version:=${progVersion}\nrevision:=${gitRevision}\nbuildTime:=${buildTime}\n") + } + + + distDir } ) } From f7e72afa143aba492a03ff10bd1666fed7da7e60 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Fri, 21 Jun 2024 14:13:57 +0200 Subject: [PATCH 281/371] add back in copy of mapped sequence --- dist/bin-native-overrides/cli-common-platform.bat | 6 +++++- project/Build.scala | 2 +- project/RepublishPlugin.scala | 14 ++++++++++++-- 3 files changed, 18 insertions(+), 4 deletions(-) diff --git a/dist/bin-native-overrides/cli-common-platform.bat b/dist/bin-native-overrides/cli-common-platform.bat index e0cfa40692b5..d1c4f1c4716b 100644 --- a/dist/bin-native-overrides/cli-common-platform.bat +++ b/dist/bin-native-overrides/cli-common-platform.bat @@ -12,7 +12,11 @@ FOR /F "usebackq delims=" %%G IN ("%_PROG_HOME%\EXTRA_PROPERTIES") DO ( ) ) +@REM we didn't find it, so we should fail +echo "ERROR: cli_version not found in EXTRA_PROPERTIES file" +exit /b 1 + :foundCliVersion endlocal & set "SCALA_CLI_VERSION=%_SCALA_CLI_VERSION%" -set SCALA_CLI_CMD_WIN="%_PROG_HOME%\bin\scala-cli.exe" "--cli-version" "%SCALA_CLI_VERSION%" \ No newline at end of file +set SCALA_CLI_CMD_WIN="%_PROG_HOME%\bin\scala-cli.exe" "--cli-version" "%SCALA_CLI_VERSION%" diff --git a/project/Build.scala b/project/Build.scala index 3ce365fac9f1..b72715970fb1 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -2172,7 +2172,7 @@ object Build { republishBinOverrides += (dist / baseDirectory).value / "bin-native-overrides", republishFetchCoursier := (dist / republishFetchCoursier).value, republishExtraProps += ("cli_version" -> scalaCliLauncherVersion), - mappings += (republishRepo.value / 
"etc" / "EXTRA_PROPERTIES" -> "EXTRA_PROPERTIES"), + mappings += (republishRepo.value / "EXTRA_PROPERTIES" -> "EXTRA_PROPERTIES"), republishLaunchers += ("scala-cli.exe" -> s"zip+https://github.com/VirtusLab/scala-cli/releases/download/v$scalaCliLauncherVersionWindows/scala-cli-x86_64-pc-win32.zip!/scala-cli.exe") ) diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index 6ce83c2f0abf..a0a8ce7dae74 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -114,7 +114,7 @@ object RepublishPlugin extends AutoPlugin { val env = Map("COURSIER_CACHE" -> cache.getAbsolutePath.toString).asJava val cmdLine0 = Seq(javaCmd, "-jar", jar0) args => - val cmdLine = cmdLine0 ++ args + val cmdLine = cmdLine0 ++ args // invoke cmdLine with env, but also capture the output val p = new ProcessBuilder(cmdLine: _*) .directory(cache) @@ -441,7 +441,7 @@ object RepublishPlugin extends AutoPlugin { } else { val repoDir = republishRepo.value - val propsFile = repoDir / "etc" / "EXTRA_PROPERTIES" + val propsFile = repoDir / "EXTRA_PROPERTIES" log.info(s"[republish] Writing extra properties to $propsFile...") Using.fileWriter()(propsFile) { writer => extraProps.foreach { case (k, v) => @@ -485,6 +485,16 @@ object RepublishPlugin extends AutoPlugin { import java.util.Date val base: File = new File(".") // Using the working directory as base for readability + // Copy explicitly added dependencies + val mapped: Seq[(File, String)] = mappings.value + log.info("[republish] Copying explicit dependencies:") + val explicitDepsJars = for ((file, path) <- mapped) yield { + log.info(file.getPath) + val dest = distDir / path + IO.copyFile(file, dest, true) + dest + } + def write(path: String, content: String) { val p = distDir / path IO.write(p, content) From 81e3cc4a3427a68586abd4a4b0bef87851f56938 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Fri, 21 Jun 2024 14:58:42 +0200 Subject: [PATCH 282/371] read last line, split-off with-compiler classpath --- dist/bin/common | 16 ++++++++++++---- dist/bin/scalac.bat | 27 ++++++++++++++++++--------- project/Build.scala | 4 +++- project/RepublishPlugin.scala | 8 +++++++- 4 files changed, 40 insertions(+), 15 deletions(-) diff --git a/dist/bin/common b/dist/bin/common index 1ff0ca66274c..63e598d70d7e 100644 --- a/dist/bin/common +++ b/dist/bin/common @@ -10,10 +10,13 @@ load_classpath () { command="$1" psep_pattern="$2" __CLASS_PATH="" - while IFS= read -r line; do + while IFS= read -r line || [ -n "$line" ]; do + # jna-5 only appropriate for some combinations if ! 
[[ ( -n ${conemu-} || -n ${msys-}) && "$line" == "*jna-5*" ]]; then - # jna-5 only appropriate for some combinations - __CLASS_PATH+="$PROG_HOME/maven2/$line$psep_pattern" + if [ -n "$__CLASS_PATH" ]; then + __CLASS_PATH+="$psep_pattern" + fi + __CLASS_PATH+="$PROG_HOME/maven2/$line" fi done < "$PROG_HOME/etc/$command.classpath" echo "$__CLASS_PATH" @@ -21,11 +24,16 @@ load_classpath () { compilerJavaClasspathArgs () { toolchain="$(load_classpath "scala" "$PSEP")" + toolchain_extra="$(load_classpath "with_compiler" "$PSEP")" + + if [ -n "$toolchain_extra" ]; then + toolchain+="$PSEP$toolchain_extra" + fi if [ -n "${jvm_cp_args-}" ]; then jvm_cp_args="$toolchain$jvm_cp_args" else - jvm_cp_args="$toolchain$PSEP" + jvm_cp_args="$toolchain" fi } diff --git a/dist/bin/scalac.bat b/dist/bin/scalac.bat index fe6d7e3fad4d..dbcbaf11b8e2 100644 --- a/dist/bin/scalac.bat +++ b/dist/bin/scalac.bat @@ -89,9 +89,16 @@ goto :eof @rem output parameter: _JVM_CP_ARGS :compilerJavaClasspathArgs -call :loadClasspathFromFile +set "CP_FILE=%_ETC_DIR%\scala.classpath" +call :loadClasspathFromFile %CP_FILE% +set "__TOOLCHAIN=%_CLASS_PATH_RESULT%" -set "__TOOLCHAIN=%_CLASS_PATH%" +set "CP_FILE=%_ETC_DIR%\with_compiler.classpath" +call :loadClasspathFromFile %CP_FILE% + +if defined _CLASS_PATH_RESULT ( + set "__TOOLCHAIN=%__TOOLCHAIN%%_PSEP%%_CLASS_PATH_RESULT%" +) if defined _SCALA_CPATH ( set "_JVM_CP_ARGS=%__TOOLCHAIN%%_SCALA_CPATH%" @@ -100,17 +107,19 @@ if defined _SCALA_CPATH ( ) goto :eof -@REM concatentate every line in "%_ETC_DIR%\scala.classpath" with _PSEP +@REM concatentate every line in "%_ARG_FILE%" with _PSEP +@REM arg 1 - file to read :loadClasspathFromFile -set _CLASS_PATH= -if exist "%_ETC_DIR%\scala.classpath" ( - for /f "usebackq delims=" %%i in ("%_ETC_DIR%\scala.classpath") do ( +set _ARG_FILE=%1 +set _CLASS_PATH_RESULT= +if exist "%_ARG_FILE%" ( + for /f "usebackq delims=" %%i in ("%_ARG_FILE%") do ( set "_LIB=%_PROG_HOME%\maven2\%%i" set "_LIB=!_LIB:/=\!" - if not defined _CLASS_PATH ( - set "_CLASS_PATH=!_LIB!" + if not defined _CLASS_PATH_RESULT ( + set "_CLASS_PATH_RESULT=!_LIB!" ) else ( - set "_CLASS_PATH=!_CLASS_PATH!%_PSEP%!_LIB!" + set "_CLASS_PATH_RESULT=!_CLASS_PATH_RESULT!%_PSEP%!_LIB!" 
) ) ) diff --git a/project/Build.scala b/project/Build.scala index b72715970fb1..d8f10019c1d7 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -2129,7 +2129,9 @@ object Build { packResourceDir += (republishRepo.value / "maven2" -> "maven2"), packResourceDir += (republishRepo.value / "etc" -> "etc"), republishCommandLibs += - ("scala" -> List("scala3-interfaces", "scala3-compiler", "scala3-library", "tasty-core", "scala3-staging", "scala3-tasty-inspector")), + ("scala" -> List("scala3-interfaces", "scala3-compiler", "scala3-library", "tasty-core")), + republishCommandLibs += + ("with_compiler" -> List("scala3-staging", "scala3-tasty-inspector", "^!scala3-interfaces", "^!scala3-compiler", "^!scala3-library", "^!tasty-core")), republishCommandLibs += ("scaladoc" -> List("scala3-interfaces", "scala3-compiler", "scala3-library", "tasty-core", "scala3-tasty-inspector", "scaladoc")), Compile / pack := republishPack.value, diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index a0a8ce7dae74..e4bf40545a6b 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -215,7 +215,13 @@ object RepublishPlugin extends AutoPlugin { if (commandLibs.nonEmpty) { IO.createDirectory(republishDir / "etc") for ((command, libs) <- commandLibs) { - val entries = libs.map(fuzzyFind(classpaths, _)).reduce(_ ++ _).distinct + val (negated, actual) = libs.partition(_.startsWith("^!")) + val subtractions = negated.map(_.stripPrefix("^!")) + + def compose(libs: List[String]): List[String] = + libs.map(fuzzyFind(classpaths, _)).reduceOption(_ ++ _).map(_.distinct).getOrElse(Nil) + + val entries = compose(actual).diff(compose(subtractions)) IO.write(republishDir / "etc" / s"$command.classpath", entries.mkString("\n")) } } From e74d681ca64c20eff33cf4e34dda6b7f97ebe1e9 Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Mon, 1 Jul 2024 11:55:12 +0200 Subject: [PATCH 283/371] Bump scala-cli to 1.4.0 (#20859) --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 6ff07701c06b..3fa01ca4337d 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -118,9 +118,9 @@ object Build { val mimaPreviousLTSDottyVersion = "3.3.0" /** Version of Scala CLI to download */ - val scalaCliLauncherVersion = "1.3.2" + val scalaCliLauncherVersion = "1.4.0" /** Version of Scala CLI to download (on Windows - last known validated version) */ - val scalaCliLauncherVersionWindows = "1.3.2" + val scalaCliLauncherVersionWindows = "1.4.0" /** Version of Coursier to download for initializing the local maven repo of Scala command */ val coursierJarVersion = "2.1.10" From d470b7782efa759b21c771ab71ab3c5d8941b9f1 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 1 Jul 2024 22:48:33 +0200 Subject: [PATCH 284/371] Fix failing CompletionScalaCliSuite tests due to circe releasing Scala Native 0.5 artifacts --- .../completion/CompletionScalaCliSuite.scala | 16 +++++++++++++--- 1 file changed, 13 insertions(+), 3 deletions(-) diff --git a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala index 0d86922d4e70..0a74aed35f48 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala @@ -28,7 +28,8 @@ class CompletionScalaCliSuite extends 
BaseCompletionSuite: |// //> using lib ??? |//> using lib io.circe::circe-core_native0.4 |package A - |""".stripMargin + |""".stripMargin, + assertSingleItem = false ) @Test def `version-sort` = @@ -51,6 +52,9 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: """|circe-core_native0.4_2.12 |circe-core_native0.4_2.13 |circe-core_native0.4_3 + |circe-core_native0.5_2.12 + |circe-core_native0.5_2.13 + |circe-core_native0.5_3 |""".stripMargin ) @@ -78,7 +82,9 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: """|//> using lib "io.circe::circe-core:0.14.0", "io.circe::circe-core_na@@" |package A |""".stripMargin, - "circe-core_native0.4" + """circe-core_native0.4 + |circe-core_native0.5 + |""".stripMargin ) @Test def `script` = @@ -92,6 +98,9 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: """|circe-core_native0.4_2.12 |circe-core_native0.4_2.13 |circe-core_native0.4_3 + |circe-core_native0.5_2.12 + |circe-core_native0.5_2.13 + |circe-core_native0.5_3 |""".stripMargin, filename = "script.sc.scala", enablePackageWrap = false @@ -138,7 +147,8 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: """|//> using libs "io.circe::circe-core:0.14.0", "io.circe::circe-core_na@@" |package A |""".stripMargin, - "circe-core_native0.4" + """circe-core_native0.4 + |circe-core_native0.5""".stripMargin ) private def scriptWrapper(code: String, filename: String): String = From edc8cbc552e61ef963375174686cae0407b4dd14 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Tue, 2 Jul 2024 12:35:01 +0200 Subject: [PATCH 285/371] Ignore failing tests instead of expecting for completions for both 0.4 and 0.5 SN versins - the outputs seems be non deterministic in the CI --- .../completion/CompletionScalaCliSuite.scala | 18 +++++++----------- 1 file changed, 7 insertions(+), 11 deletions(-) diff --git a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala index 0a74aed35f48..79d35944c84d 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionScalaCliSuite.scala @@ -3,6 +3,7 @@ package dotty.tools.pc.tests.completion import dotty.tools.pc.base.BaseCompletionSuite import org.junit.Test +import org.junit.Ignore class CompletionScalaCliSuite extends BaseCompletionSuite: @@ -44,6 +45,7 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: |""".stripMargin, ) + @Ignore @Test def `single-colon` = check( """|//> using lib "io.circe:circe-core_na@@ @@ -52,9 +54,6 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: """|circe-core_native0.4_2.12 |circe-core_native0.4_2.13 |circe-core_native0.4_3 - |circe-core_native0.5_2.12 - |circe-core_native0.5_2.13 - |circe-core_native0.5_3 |""".stripMargin ) @@ -77,16 +76,16 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: |""".stripMargin, ) + @Ignore @Test def `multiple-libs` = check( """|//> using lib "io.circe::circe-core:0.14.0", "io.circe::circe-core_na@@" |package A |""".stripMargin, - """circe-core_native0.4 - |circe-core_native0.5 - |""".stripMargin + "circe-core_native0.4" ) + @Ignore @Test def `script` = check( scriptWrapper( @@ -98,9 +97,6 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: """|circe-core_native0.4_2.12 |circe-core_native0.4_2.13 |circe-core_native0.4_3 - |circe-core_native0.5_2.12 - |circe-core_native0.5_2.13 - |circe-core_native0.5_3 
|""".stripMargin, filename = "script.sc.scala", enablePackageWrap = false @@ -142,13 +138,13 @@ class CompletionScalaCliSuite extends BaseCompletionSuite: |io.circul""".stripMargin ) + @Ignore @Test def `multiple-deps2` = check( """|//> using libs "io.circe::circe-core:0.14.0", "io.circe::circe-core_na@@" |package A |""".stripMargin, - """circe-core_native0.4 - |circe-core_native0.5""".stripMargin + "circe-core_native0.4" ) private def scriptWrapper(code: String, filename: String): String = From 1591ac9efbfce9efba46b85f2a2385eff94bcdeb Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 1 Jul 2024 17:02:28 +0200 Subject: [PATCH 286/371] fix issue 20901: etaCollapse context bound type --- .../src/dotty/tools/dotc/typer/Typer.scala | 7 +- tests/pos/i20901/Foo.scala | 6 + tests/pos/i20901/Foo.tastycheck | 124 ++++++++++++++++++ 3 files changed, 134 insertions(+), 3 deletions(-) create mode 100644 tests/pos/i20901/Foo.scala create mode 100644 tests/pos/i20901/Foo.tastycheck diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index ae62ebbc4a3f..4cb695a15966 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2366,13 +2366,14 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def typedContextBoundTypeTree(tree: untpd.ContextBoundTypeTree)(using Context): Tree = val tycon = typedType(tree.tycon) - val tyconSplice = untpd.TypedSplice(tycon) + def spliced(tree: Tree) = untpd.TypedSplice(tree) val tparam = untpd.Ident(tree.paramName).withSpan(tree.span) if tycon.tpe.typeParams.nonEmpty then - typed(untpd.AppliedTypeTree(tyconSplice, tparam :: Nil)) + val tycon0 = tycon.withType(tycon.tpe.etaCollapse) + typed(untpd.AppliedTypeTree(spliced(tycon0), tparam :: Nil)) else if Feature.enabled(modularity) && tycon.tpe.member(tpnme.Self).symbol.isAbstractOrParamType then val tparamSplice = untpd.TypedSplice(typedExpr(tparam)) - typed(untpd.RefinedTypeTree(tyconSplice, List(untpd.TypeDef(tpnme.Self, tparamSplice)))) + typed(untpd.RefinedTypeTree(spliced(tycon), List(untpd.TypeDef(tpnme.Self, tparamSplice)))) else def selfNote = if Feature.enabled(modularity) then diff --git a/tests/pos/i20901/Foo.scala b/tests/pos/i20901/Foo.scala new file mode 100644 index 000000000000..c1277781db38 --- /dev/null +++ b/tests/pos/i20901/Foo.scala @@ -0,0 +1,6 @@ +//> using options -Ytest-pickler-check + +import reflect.ClassTag + +class Foo: + def mkArray[T: ClassTag] = ??? diff --git a/tests/pos/i20901/Foo.tastycheck b/tests/pos/i20901/Foo.tastycheck new file mode 100644 index 000000000000..0201bfec2056 --- /dev/null +++ b/tests/pos/i20901/Foo.tastycheck @@ -0,0 +1,124 @@ +Header: + version: + tooling: + UUID: + +Names (276 bytes, starting from ): + 0: ASTs + 1: + 2: scala + 3: reflect + 4: scala[Qualified . reflect] + 5: ClassTag + 6: Foo + 7: + 8: java + 9: lang + 10: java[Qualified . lang] + 11: Object + 12: java[Qualified . lang][Qualified . Object] + 13: [Signed Signature(List(),java.lang.Object) @] + 14: Unit + 15: mkArray + 16: T + 17: Nothing + 18: Any + 19: evidence$ + 20: [Unique evidence$ 1] + 21: ??? + 22: Predef + 23: SourceFile + 24: annotation + 25: scala[Qualified . annotation] + 26: internal + 27: scala[Qualified . annotation][Qualified . internal] + 28: scala[Qualified . annotation][Qualified . internal][Qualified . SourceFile] + 29: String + 30: java[Qualified . lang][Qualified . 
String] + 31: [Signed Signature(List(java.lang.String),scala.annotation.internal.SourceFile) @] + 32: + 33: Positions + 34: Comments + 35: Attributes + +Trees (94 bytes, starting from ): + 0: PACKAGE(92) + 2: TERMREFpkg 1 [] + 4: IMPORT(4) + 6: TERMREFpkg 4 [scala[Qualified . reflect]] + 8: IMPORTED 5 [ClassTag] + 10: TYPEDEF(82) 6 [Foo] + 13: TEMPLATE(61) + 15: APPLY(10) + 17: SELECTin(8) 13 [[Signed Signature(List(),java.lang.Object) @]] + 20: NEW + 21: TYPEREF 11 [Object] + 23: TERMREFpkg 10 [java[Qualified . lang]] + 25: SHAREDtype 21 + 27: DEFDEF(7) 7 [] + 30: EMPTYCLAUSE + 31: TYPEREF 14 [Unit] + 33: TERMREFpkg 2 [scala] + 35: STABLE + 36: DEFDEF(38) 15 [mkArray] + 39: TYPEPARAM(11) 16 [T] + 42: TYPEBOUNDStpt(8) + 44: TYPEREF 17 [Nothing] + 46: SHAREDtype 33 + 48: TYPEREF 18 [Any] + 50: SHAREDtype 33 + 52: PARAM(14) 20 [[Unique evidence$ 1]] + 55: APPLIEDtpt(10) + 57: IDENTtpt 5 [ClassTag] + 59: TYPEREF 5 [ClassTag] + 61: SHAREDtype 6 + 63: IDENTtpt 16 [T] + 65: TYPEREFdirect 39 + 67: IMPLICIT + 68: SHAREDtype 44 + 70: TERMREF 21 [???] + 72: TERMREF 22 [Predef] + 74: SHAREDtype 33 + 76: ANNOTATION(16) + 78: TYPEREF 23 [SourceFile] + 80: TERMREFpkg 27 [scala[Qualified . annotation][Qualified . internal]] + 82: APPLY(10) + 84: SELECTin(6) 31 [[Signed Signature(List(java.lang.String),scala.annotation.internal.SourceFile) @]] + 87: NEW + 88: SHAREDtype 78 + 90: SHAREDtype 78 + 92: STRINGconst 32 [] + 94: + +Positions (72 bytes, starting from ): + lines: 7 + line sizes: + 38, 0, 23, 0, 10, 32, 0 + positions: + 0: 40 .. 108 + 4: 40 .. 63 + 6: 47 .. 54 + 8: 55 .. 63 + 10: 65 .. 108 + 13: 78 .. 108 + 21: 71 .. 71 + 27: 78 .. 78 + 31: 78 .. 78 + 36: 78 .. 108 + 39: 90 .. 101 + 44: 93 .. 93 + 48: 93 .. 93 + 52: 93 .. 101 + 57: 93 .. 101 + 63: 93 .. 101 + 68: 102 .. 102 + 70: 105 .. 108 + 82: 65 .. 108 + 88: 65 .. 65 + 92: 65 .. 
65 + + source paths: + 0: 32 [] + +Attributes (2 bytes, starting from ): + SOURCEFILEattr 32 [] From f8a2e563159c85f312f7d2cb48909ee08ba25f24 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Mon, 1 Jul 2024 21:39:14 +0200 Subject: [PATCH 287/371] update semanticdb test (restore references) --- tests/semanticdb/expect/Methods.expect.scala | 2 +- tests/semanticdb/metac.expect | 3 ++- 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/tests/semanticdb/expect/Methods.expect.scala b/tests/semanticdb/expect/Methods.expect.scala index 4ec723ad584e..e1fcfa6880e1 100644 --- a/tests/semanticdb/expect/Methods.expect.scala +++ b/tests/semanticdb/expect/Methods.expect.scala @@ -15,7 +15,7 @@ class Methods/*<-example::Methods#*/[T/*<-example::Methods#[T]*/] { def m6/*<-example::Methods#m6().*/(x/*<-example::Methods#m6().(x)*/: Int/*->scala::Int#*/) = ???/*->scala::Predef.`???`().*/ def m6/*<-example::Methods#m6(+1).*/(x/*<-example::Methods#m6(+1).(x)*/: List/*->example::Methods#List#*/[T/*->example::Methods#[T]*/]) = ???/*->scala::Predef.`???`().*/ def m6/*<-example::Methods#m6(+2).*/(x/*<-example::Methods#m6(+2).(x)*/: scala.List/*->scala::package.List#*/[T/*->example::Methods#[T]*/]) = ???/*->scala::Predef.`???`().*/ - def m7/*<-example::Methods#m7().*/[U/*<-example::Methods#m7().[U]*/: Ordering/*->example::Methods#m7().[U]*//*<-example::Methods#m7().(evidence$1)*/](c/*<-example::Methods#m7().(c)*/: Methods/*->example::Methods#*/[T/*->example::Methods#[T]*/], l/*<-example::Methods#m7().(l)*/: List/*->example::Methods#List#*/[U/*->example::Methods#m7().[U]*/]) = ???/*->scala::Predef.`???`().*/ + def m7/*<-example::Methods#m7().*/[U/*<-example::Methods#m7().[U]*/: Ordering/*->scala::math::Ordering#*//*->example::Methods#m7().[U]*//*<-example::Methods#m7().(evidence$1)*/](c/*<-example::Methods#m7().(c)*/: Methods/*->example::Methods#*/[T/*->example::Methods#[T]*/], l/*<-example::Methods#m7().(l)*/: List/*->example::Methods#List#*/[U/*->example::Methods#m7().[U]*/]) = ???/*->scala::Predef.`???`().*/ def `m8()./*<-example::Methods#`m8().`().*/`() = ???/*->scala::Predef.`???`().*/ class `m9()./*<-example::Methods#`m9().`#*/` def m9/*<-example::Methods#m9().*/(x/*<-example::Methods#m9().(x)*/: `m9().`/*->example::Methods#`m9().`#*/) = ???/*->scala::Predef.`???`().*/ diff --git a/tests/semanticdb/metac.expect b/tests/semanticdb/metac.expect index 98657f122255..9dc2fd8a44c9 100644 --- a/tests/semanticdb/metac.expect +++ b/tests/semanticdb/metac.expect @@ -2584,7 +2584,7 @@ Uri => Methods.scala Text => empty Language => Scala Symbols => 82 entries -Occurrences => 156 entries +Occurrences => 157 entries Symbols: example/Methods# => class Methods [typeparam T ] extends Object { self: Methods[T] => +44 decls } @@ -2728,6 +2728,7 @@ Occurrences: [16:29..16:32): ??? -> scala/Predef.`???`(). [17:6..17:8): m7 <- example/Methods#m7(). 
[17:9..17:10): U <- example/Methods#m7().[U] +[17:12..17:20): Ordering -> scala/math/Ordering# [17:12..17:20): Ordering -> example/Methods#m7().[U] [17:12..17:12): <- example/Methods#m7().(evidence$1) [17:22..17:23): c <- example/Methods#m7().(c) From a5c74e79e0b34e9597e4e6725b58196d7dcdb1f6 Mon Sep 17 00:00:00 2001 From: Seth Tisue Date: Mon, 26 Feb 2024 14:32:18 -0800 Subject: [PATCH 288/371] use Scala 2.13.13 stdlib (was .12) --- community-build/community-projects/stdLib213 | 2 +- project/Build.scala | 8 ++++---- project/Scala2LibraryBootstrappedMiMaFilters.scala | 3 --- 3 files changed, 5 insertions(+), 8 deletions(-) diff --git a/community-build/community-projects/stdLib213 b/community-build/community-projects/stdLib213 index 6243e902928c..fcc67cd56c67 160000 --- a/community-build/community-projects/stdLib213 +++ b/community-build/community-projects/stdLib213 @@ -1 +1 @@ -Subproject commit 6243e902928c344fb0e82e21120bb257f08a2af2 +Subproject commit fcc67cd56c67851bf31019ec25ccb09d08b9561b diff --git a/project/Build.scala b/project/Build.scala index 8dbc691136d6..0f32d892e51a 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -144,8 +144,8 @@ object Build { * scala-library. */ def stdlibVersion(implicit mode: Mode): String = mode match { - case NonBootstrapped => "2.13.12" - case Bootstrapped => "2.13.12" + case NonBootstrapped => "2.13.13" + case Bootstrapped => "2.13.13" } /** Version of the scala-library for which we will generate TASTy. @@ -155,7 +155,7 @@ object Build { * We can use nightly versions to tests the future compatibility in development. * Nightly versions: https://scala-ci.typesafe.com/ui/native/scala-integration/org/scala-lang */ - val stdlibBootstrappedVersion = "2.13.12" + val stdlibBootstrappedVersion = "2.13.13" val dottyOrganization = "org.scala-lang" val dottyGithubUrl = "https://github.com/scala/scala3" @@ -1358,7 +1358,7 @@ object Build { "io.get-coursier" % "interface" % "1.0.18", "org.scalameta" % "mtags-interfaces" % mtagsVersion, ), - libraryDependencies += ("org.scalameta" % "mtags-shared_2.13.12" % mtagsVersion % SourceDeps), + libraryDependencies += ("org.scalameta" % "mtags-shared_2.13.13" % mtagsVersion % SourceDeps), ivyConfigurations += SourceDeps.hide, transitiveClassifiers := Seq("sources"), scalacOptions ++= Seq("-source", "3.3"), // To avoid fatal migration warnings diff --git a/project/Scala2LibraryBootstrappedMiMaFilters.scala b/project/Scala2LibraryBootstrappedMiMaFilters.scala index bd149d5a910b..0d2b5a7fd945 100644 --- a/project/Scala2LibraryBootstrappedMiMaFilters.scala +++ b/project/Scala2LibraryBootstrappedMiMaFilters.scala @@ -78,9 +78,6 @@ object Scala2LibraryBootstrappedMiMaFilters { "scala.collection.IterableOnceOps#Maximized.this", // New in 2.13.11: private inner class "scala.util.Properties.", "scala.util.Sorting.scala$util$Sorting$$mergeSort$default$5", - // New in 2.13.12 -- can be removed once scala/scala#10549 lands in 2.13.13 - // and we take the upgrade here - "scala.collection.immutable.MapNodeRemoveAllSetNodeIterator.next", ).map(ProblemFilters.exclude[DirectMissingMethodProblem]) } ) From b357bc93e512c93e6b70ed0eed906b4daf7febf6 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 1 Jul 2024 13:03:41 +0200 Subject: [PATCH 289/371] Upgrade Scala 2 to 2.13.14 --- project/Build.scala | 8 ++++---- project/Scala2LibraryBootstrappedMiMaFilters.scala | 4 ++++ 2 files changed, 8 insertions(+), 4 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 0f32d892e51a..9dc75838ba15 100644 --- 
a/project/Build.scala +++ b/project/Build.scala @@ -144,8 +144,8 @@ object Build { * scala-library. */ def stdlibVersion(implicit mode: Mode): String = mode match { - case NonBootstrapped => "2.13.13" - case Bootstrapped => "2.13.13" + case NonBootstrapped => "2.13.14" + case Bootstrapped => "2.13.14" } /** Version of the scala-library for which we will generate TASTy. @@ -155,7 +155,7 @@ object Build { * We can use nightly versions to tests the future compatibility in development. * Nightly versions: https://scala-ci.typesafe.com/ui/native/scala-integration/org/scala-lang */ - val stdlibBootstrappedVersion = "2.13.13" + val stdlibBootstrappedVersion = "2.13.14" val dottyOrganization = "org.scala-lang" val dottyGithubUrl = "https://github.com/scala/scala3" @@ -1358,7 +1358,7 @@ object Build { "io.get-coursier" % "interface" % "1.0.18", "org.scalameta" % "mtags-interfaces" % mtagsVersion, ), - libraryDependencies += ("org.scalameta" % "mtags-shared_2.13.13" % mtagsVersion % SourceDeps), + libraryDependencies += ("org.scalameta" % "mtags-shared_2.13.14" % mtagsVersion % SourceDeps), ivyConfigurations += SourceDeps.hide, transitiveClassifiers := Seq("sources"), scalacOptions ++= Seq("-source", "3.3"), // To avoid fatal migration warnings diff --git a/project/Scala2LibraryBootstrappedMiMaFilters.scala b/project/Scala2LibraryBootstrappedMiMaFilters.scala index 0d2b5a7fd945..102a2a50e9d4 100644 --- a/project/Scala2LibraryBootstrappedMiMaFilters.scala +++ b/project/Scala2LibraryBootstrappedMiMaFilters.scala @@ -172,6 +172,10 @@ object Scala2LibraryBootstrappedMiMaFilters { "scala.collection.mutable.LinkedHashSet.defaultLoadFactor", // private[collection] final def "scala.collection.mutable.LinkedHashSet.defaultinitialSize", // private[collection] final def "scala.collection.mutable.OpenHashMap.nextPositivePowerOfTwo", // private[mutable] def + // New in 2.13.13 + "scala.collection.mutable.ArrayBuffer.resizeUp", // private[mutable] def + // New in 2.13.14 + "scala.util.Properties.consoleIsTerminal", // private[scala] lazy val ).map(ProblemFilters.exclude[DirectMissingMethodProblem]) ++ Seq( // MissingFieldProblem: static field ... in object ... 
does not have a correspondent in other version "scala.Array.UnapplySeqWrapper", From dcf708ca50511ff6dc7a1f4a6ef3dbd1e8b5fd9c Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 1 Jul 2024 13:05:24 +0200 Subject: [PATCH 290/371] Phiscally remove the ignored Scala 2 library-aux files instead of filtering them out in `Compile / sources` (not reliable, for some reasone the AnyRef.scala was still compiled) --- project/Build.scala | 26 +++++++++++++++----------- 1 file changed, 15 insertions(+), 11 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 9dc75838ba15..41e5f3e082f5 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -1124,19 +1124,23 @@ object Build { IO.createDirectory(trgDir) IO.unzip(scalaLibrarySourcesJar, trgDir) - ((trgDir ** "*.scala") +++ (trgDir ** "*.java")).get.toSet + val (ignoredSources, sources) = + ((trgDir ** "*.scala") +++ (trgDir ** "*.java")).get.toSet + .partition{file => + // sources from https://github.com/scala/scala/tree/2.13.x/src/library-aux + val path = file.getPath.replace('\\', '/') + path.endsWith("scala-library-src/scala/Any.scala") || + path.endsWith("scala-library-src/scala/AnyVal.scala") || + path.endsWith("scala-library-src/scala/AnyRef.scala") || + path.endsWith("scala-library-src/scala/Nothing.scala") || + path.endsWith("scala-library-src/scala/Null.scala") || + path.endsWith("scala-library-src/scala/Singleton.scala") + } + // These sources should be never compiled, filtering them out was not working correctly sometimes + ignoredSources.foreach(_.delete()) + sources } (Set(scalaLibrarySourcesJar)).toSeq }.taskValue, - (Compile / sources) ~= (_.filterNot { file => - // sources from https://github.com/scala/scala/tree/2.13.x/src/library-aux - val path = file.getPath.replace('\\', '/') - path.endsWith("scala-library-src/scala/Any.scala") || - path.endsWith("scala-library-src/scala/AnyVal.scala") || - path.endsWith("scala-library-src/scala/AnyRef.scala") || - path.endsWith("scala-library-src/scala/Nothing.scala") || - path.endsWith("scala-library-src/scala/Null.scala") || - path.endsWith("scala-library-src/scala/Singleton.scala") - }), (Compile / sources) := { val files = (Compile / sources).value val overwrittenSourcesDir = (Compile / scalaSource).value From c042e57d2238e87cc8f91f1aef36270d659f5be8 Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Mon, 1 Jul 2024 10:29:16 +0200 Subject: [PATCH 291/371] Add --skip-cli-updates by default to the scala command --- dist/bin/scala | 1 + dist/bin/scala.bat | 5 +++-- 2 files changed, 4 insertions(+), 2 deletions(-) diff --git a/dist/bin/scala b/dist/bin/scala index 71747a8e9e20..35efdfc38d96 100755 --- a/dist/bin/scala +++ b/dist/bin/scala @@ -59,6 +59,7 @@ done # SCALA_CLI_CMD_BASH is an array, set by cli-common-platform eval "${SCALA_CLI_CMD_BASH[@]}" \ "--prog-name scala" \ + "--skip-cli-updates" \ "--cli-default-scala-version \"$SCALA_VERSION\"" \ "-r \"$MVN_REPOSITORY\"" \ "${scala_args[@]}" diff --git a/dist/bin/scala.bat b/dist/bin/scala.bat index d473facbbb1c..7418909da263 100644 --- a/dist/bin/scala.bat +++ b/dist/bin/scala.bat @@ -21,8 +21,9 @@ call :setScalaOpts call "%_PROG_HOME%\bin\cli-common-platform.bat" -@rem SCALA_CLI_CMD_WIN is an array, set in cli-common-platform.bat -call %SCALA_CLI_CMD_WIN% "--prog-name" "scala" "--cli-default-scala-version" "%_SCALA_VERSION%" "-r" "%MVN_REPOSITORY%" %* +@rem SCALA_CLI_CMD_WIN is an array, set in cli-common-platform.bat. 
+@rem WE NEED TO PASS '--skip-cli-updates' for JVM launchers but we actually don't need it for native launchers +call %SCALA_CLI_CMD_WIN% "--prog-name" "scala" "--skip-cli-updates" "--cli-default-scala-version" "%_SCALA_VERSION%" "-r" "%MVN_REPOSITORY%" %* if not %ERRORLEVEL%==0 ( set _EXITCODE=1& goto end ) From 91b8abde7bc555aad0d6c01bb99135771c8b6f86 Mon Sep 17 00:00:00 2001 From: odersky Date: Sat, 25 May 2024 15:34:59 +0200 Subject: [PATCH 292/371] Avoid useless warnings about priority change in implicit search Warn about priority change in implicit search only if one of the participating candidates appears in the final result. It could be that we have an priority change between two ranked candidates that both are superseded by the result of the implicit search. In this case, no warning needs to be reported. --- .../dotty/tools/dotc/typer/Implicits.scala | 32 +++++++++++++++---- 1 file changed, 26 insertions(+), 6 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 74bd59d4992f..e3615ce40592 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -419,6 +419,12 @@ object Implicits: sealed abstract class SearchResult extends Showable { def tree: Tree def toText(printer: Printer): Text = printer.toText(this) + + /** The references that were found, there can be two of them in the case + * of an AmbiguousImplicits failure + */ + def found: List[TermRef] + def recoverWith(other: SearchFailure => SearchResult): SearchResult = this match { case _: SearchSuccess => this case fail: SearchFailure => other(fail) @@ -434,13 +440,17 @@ object Implicits: * @param tstate The typer state to be committed if this alternative is chosen */ case class SearchSuccess(tree: Tree, ref: TermRef, level: Int, isExtension: Boolean = false)(val tstate: TyperState, val gstate: GadtConstraint) - extends SearchResult with RefAndLevel with Showable + extends SearchResult with RefAndLevel with Showable: + final def found = ref :: Nil /** A failed search */ case class SearchFailure(tree: Tree) extends SearchResult { require(tree.tpe.isInstanceOf[SearchFailureType], s"unexpected type for ${tree}") final def isAmbiguous: Boolean = tree.tpe.isInstanceOf[AmbiguousImplicits | TooUnspecific] final def reason: SearchFailureType = tree.tpe.asInstanceOf[SearchFailureType] + final def found = tree.tpe match + case tpe: AmbiguousImplicits => tpe.alt1.ref :: tpe.alt2.ref :: Nil + case _ => Nil } object SearchFailure { @@ -1290,6 +1300,11 @@ trait Implicits: /** Search a list of eligible implicit references */ private def searchImplicit(eligible: List[Candidate], contextual: Boolean): SearchResult = + // A map that associates a priority change warning (between -source 3.4 and 3.6) + // with a candidate ref mentioned in the warning. We report the associated + // message if the candidate ref is part of the result of the implicit search + var priorityChangeWarnings = mutable.ListBuffer[(TermRef, Message)]() + /** Compare `alt1` with `alt2` to determine which one should be chosen. 
* * @return a number > 0 if `alt1` is preferred over `alt2` @@ -1306,6 +1321,8 @@ trait Implicits: */ def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel): Int = def comp(using Context) = explore(compare(alt1.ref, alt2.ref, preferGeneral = true)) + def warn(msg: Message) = + priorityChangeWarnings += (alt1.ref -> msg) += (alt2.ref -> msg) if alt1.ref eq alt2.ref then 0 else if alt1.level != alt2.level then alt1.level - alt2.level else @@ -1319,16 +1336,16 @@ trait Implicits: case 1 => "the first alternative" case _ => "none - it's ambiguous" if sv.stable == SourceVersion.`3.5` then - report.warning( + warn( em"""Given search preference for $pt between alternatives ${alt1.ref} and ${alt2.ref} will change |Current choice : ${choice(prev)} - |New choice from Scala 3.6: ${choice(cmp)}""", srcPos) + |New choice from Scala 3.6: ${choice(cmp)}""") prev else - report.warning( + warn( em"""Change in given search preference for $pt between alternatives ${alt1.ref} and ${alt2.ref} |Previous choice : ${choice(prev)} - |New choice from Scala 3.6: ${choice(cmp)}""", srcPos) + |New choice from Scala 3.6: ${choice(cmp)}""") cmp else cmp else cmp @@ -1578,7 +1595,10 @@ trait Implicits: validateOrdering(ord) throw ex - rank(sort(eligible), NoMatchingImplicitsFailure, Nil) + val result = rank(sort(eligible), NoMatchingImplicitsFailure, Nil) + for (ref, msg) <- priorityChangeWarnings do + if result.found.contains(ref) then report.warning(msg, srcPos) + result end searchImplicit def isUnderSpecifiedArgument(tp: Type): Boolean = From 9354ad5297d8b20bca82159536564d0b88e1820d Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 27 May 2024 17:57:03 +0200 Subject: [PATCH 293/371] Re-enable semanticdb test --- tests/semanticdb/expect/InventedNames.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tests/semanticdb/expect/InventedNames.scala b/tests/semanticdb/expect/InventedNames.scala index 61baae46c832..42c14c90e370 100644 --- a/tests/semanticdb/expect/InventedNames.scala +++ b/tests/semanticdb/expect/InventedNames.scala @@ -32,7 +32,7 @@ given [T]: Z[T] with val a = intValue val b = given_String -//val c = given_Double +val c = given_Double val d = given_List_T[Int] val e = given_Char val f = given_Float From 7ac54178c4ac0478808d1cf3fc95da209601d652 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 27 May 2024 23:38:03 +0200 Subject: [PATCH 294/371] Update semanticDB expect files --- tests/semanticdb/expect/InventedNames.expect.scala | 2 +- tests/semanticdb/metac.expect | 12 ++++++++---- 2 files changed, 9 insertions(+), 5 deletions(-) diff --git a/tests/semanticdb/expect/InventedNames.expect.scala b/tests/semanticdb/expect/InventedNames.expect.scala index b92e9aa940a7..7c5b008209c2 100644 --- a/tests/semanticdb/expect/InventedNames.expect.scala +++ b/tests/semanticdb/expect/InventedNames.expect.scala @@ -32,7 +32,7 @@ given [T/*<-givens::InventedNames$package.given_Z_T#[T]*/]: Z/*->givens::Z#*/[T/ val a/*<-givens::InventedNames$package.a.*/ = intValue/*->givens::InventedNames$package.intValue.*/ val b/*<-givens::InventedNames$package.b.*/ = given_String/*->givens::InventedNames$package.given_String.*/ -//val c = given_Double +val c/*<-givens::InventedNames$package.c.*/ = given_Double/*->givens::InventedNames$package.given_Double().*/ val d/*<-givens::InventedNames$package.d.*/ = given_List_T/*->givens::InventedNames$package.given_List_T().*/[Int/*->scala::Int#*/] val e/*<-givens::InventedNames$package.e.*/ = given_Char/*->givens::InventedNames$package.given_Char.*/ val 
f/*<-givens::InventedNames$package.f.*/ = given_Float/*->givens::InventedNames$package.given_Float.*/ diff --git a/tests/semanticdb/metac.expect b/tests/semanticdb/metac.expect index 98657f122255..84c3e7c6a110 100644 --- a/tests/semanticdb/metac.expect +++ b/tests/semanticdb/metac.expect @@ -2093,15 +2093,16 @@ Schema => SemanticDB v4 Uri => InventedNames.scala Text => empty Language => Scala -Symbols => 44 entries -Occurrences => 64 entries -Synthetics => 2 entries +Symbols => 45 entries +Occurrences => 66 entries +Synthetics => 3 entries Symbols: -givens/InventedNames$package. => final package object givens extends Object { self: givens.type => +23 decls } +givens/InventedNames$package. => final package object givens extends Object { self: givens.type => +24 decls } givens/InventedNames$package.`* *`. => final implicit lazy val given method * * Long givens/InventedNames$package.a. => val method a Int givens/InventedNames$package.b. => val method b String +givens/InventedNames$package.c. => val method c Double givens/InventedNames$package.d. => val method d List[Int] givens/InventedNames$package.e. => val method e Char givens/InventedNames$package.f. => val method f Float @@ -2192,6 +2193,8 @@ Occurrences: [32:8..32:16): intValue -> givens/InventedNames$package.intValue. [33:4..33:5): b <- givens/InventedNames$package.b. [33:8..33:20): given_String -> givens/InventedNames$package.given_String. +[34:4..34:5): c <- givens/InventedNames$package.c. +[34:8..34:20): given_Double -> givens/InventedNames$package.given_Double(). [35:4..35:5): d <- givens/InventedNames$package.d. [35:8..35:20): given_List_T -> givens/InventedNames$package.given_List_T(). [35:21..35:24): Int -> scala/Int# @@ -2211,6 +2214,7 @@ Occurrences: Synthetics: [24:0..24:0): => *(x$1) +[34:8..34:20):given_Double => *(intValue) [40:8..40:15):given_Y => *(given_X) expect/Issue1749.scala From 0b812bde5d8ff18badf4db6e19d950d68c9ecac7 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 28 May 2024 18:00:27 +0200 Subject: [PATCH 295/371] Drop priority change warnings that don't qualify Drop priority change warnings if one the mentioned references does not succeed via tryImplicit. --- .../dotty/tools/dotc/typer/Implicits.scala | 20 ++++++++++++------- 1 file changed, 13 insertions(+), 7 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index e3615ce40592..9c23036fa865 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1301,9 +1301,10 @@ trait Implicits: private def searchImplicit(eligible: List[Candidate], contextual: Boolean): SearchResult = // A map that associates a priority change warning (between -source 3.4 and 3.6) - // with a candidate ref mentioned in the warning. We report the associated - // message if the candidate ref is part of the result of the implicit search - var priorityChangeWarnings = mutable.ListBuffer[(TermRef, Message)]() + // with the candidate refs mentioned in the warning. We report the associated + // message if both candidates qualify in tryImplicit and at least one of the candidates + // is part of the result of the implicit search. + val priorityChangeWarnings = mutable.ListBuffer[(TermRef, TermRef, Message)]() /** Compare `alt1` with `alt2` to determine which one should be chosen. 
* @@ -1322,7 +1323,7 @@ trait Implicits: def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel): Int = def comp(using Context) = explore(compare(alt1.ref, alt2.ref, preferGeneral = true)) def warn(msg: Message) = - priorityChangeWarnings += (alt1.ref -> msg) += (alt2.ref -> msg) + priorityChangeWarnings += ((alt1.ref, alt2.ref, msg)) if alt1.ref eq alt2.ref then 0 else if alt1.level != alt2.level then alt1.level - alt2.level else @@ -1440,7 +1441,11 @@ trait Implicits: // need a candidate better than `cand` healAmbiguous(fail, newCand => compareAlternatives(newCand, cand) > 0) - else rank(remaining, found, fail :: rfailures) + else + // keep only warnings that don't involve the failed candidate reference + priorityChangeWarnings.filterInPlace: (ref1, ref2, _) => + ref1 != cand.ref && ref2 != cand.ref + rank(remaining, found, fail :: rfailures) case best: SearchSuccess => if (ctx.mode.is(Mode.ImplicitExploration) || isCoherent) best @@ -1596,8 +1601,9 @@ trait Implicits: throw ex val result = rank(sort(eligible), NoMatchingImplicitsFailure, Nil) - for (ref, msg) <- priorityChangeWarnings do - if result.found.contains(ref) then report.warning(msg, srcPos) + for (ref1, ref2, msg) <- priorityChangeWarnings do + if result.found.exists(ref => ref == ref1 || ref == ref2) then + report.warning(msg, srcPos) result end searchImplicit From 1d993a7099e102233125b210f54da4c33854f2e7 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 28 May 2024 18:40:55 +0200 Subject: [PATCH 296/371] Add test for #20484 --- tests/pos/i20484.scala | 3 +++ 1 file changed, 3 insertions(+) create mode 100644 tests/pos/i20484.scala diff --git a/tests/pos/i20484.scala b/tests/pos/i20484.scala new file mode 100644 index 000000000000..2f02e6206101 --- /dev/null +++ b/tests/pos/i20484.scala @@ -0,0 +1,3 @@ +given Int = ??? +given Char = ??? +val a = summon[Int] \ No newline at end of file From 3677eaf8d24ecc1b0b95aac63471e42c025bea71 Mon Sep 17 00:00:00 2001 From: Som Snytt Date: Wed, 3 Jul 2024 04:42:33 -0700 Subject: [PATCH 297/371] Use final result type to check selector bound --- compiler/src/dotty/tools/dotc/transform/CheckUnused.scala | 2 +- tests/pos/i20860.scala | 3 +++ 2 files changed, 4 insertions(+), 1 deletion(-) create mode 100644 tests/pos/i20860.scala diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala index bd4ef73d6eea..ba77167de736 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -728,7 +728,7 @@ object CheckUnused: if selector.isGiven then // Further check that the symbol is a given or implicit and conforms to the bound sym.isOneOf(Given | Implicit) - && (selector.bound.isEmpty || sym.info <:< selector.boundTpe) + && (selector.bound.isEmpty || sym.info.finalResultType <:< selector.boundTpe) else // Normal wildcard, check that the symbol is not a given (but can be implicit) !sym.is(Given) diff --git a/tests/pos/i20860.scala b/tests/pos/i20860.scala new file mode 100644 index 000000000000..1e1ddea11b75 --- /dev/null +++ b/tests/pos/i20860.scala @@ -0,0 +1,3 @@ +def `i20860 use result to check selector bound`: Unit = + import Ordering.Implicits.given Ordering[?] 
+ summon[Ordering[Seq[Int]]] From 876b64810cca7b3282643c6bafe1e0ef05b07e46 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 3 Jul 2024 12:06:30 +0200 Subject: [PATCH 298/371] Add changelog for 3.5.0-RC3 --- changelogs/3.5.0-RC3.md | 26 ++++++++++++++++++++++++++ 1 file changed, 26 insertions(+) create mode 100644 changelogs/3.5.0-RC3.md diff --git a/changelogs/3.5.0-RC3.md b/changelogs/3.5.0-RC3.md new file mode 100644 index 000000000000..a7a2d164d5a7 --- /dev/null +++ b/changelogs/3.5.0-RC3.md @@ -0,0 +1,26 @@ +# Backported fixes + +- Release .zip instead of .tar.gz for windows in sdkman [#20630](https://github.com/scala/scala3/pull/20630) +- SIP 46 - read classpath from file, remove lib directory in distribution [#20631](https://github.com/scala/scala3/pull/20631) +- Bump scala-cli to 1.4.0 [#20859](https://github.com/scala/scala3/pull/20859) +- Priority warning fix alternative [#20487](https://github.com/scala/scala3/pull/20487) +- Add --skip-cli-updates by default to the scala command [#20900](https://github.com/scala/scala3/pull/20900) +- Upgrade Scala 2 to 2.13.14 (was 2.13.12) [#20902](https://github.com/scala/scala3/pull/20902) +- fix issue 20901: etaCollapse context bound type [#20910](https://github.com/scala/scala3/pull/20910) +- Use final result type to check selector bound [#20989](https://github.com/scala/scala3/pull/20989) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.5.0-RC2..3.5.0-RC3` these are: + +``` + 6 Wojciech Mazur + 5 Jamie Thompson + 5 Martin Odersky + 4 Hamza Remmal + 1 Hamza REMMAL + 1 Seth Tisue + 1 Som Snytt +``` From 6abb51aca2c028c4e523e7b5a11ce082acf87bd2 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 4 Jul 2024 12:29:47 +0200 Subject: [PATCH 299/371] Release 3.5.0-RC3 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 41e5f3e082f5..f994ae74cb95 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -88,7 +88,7 @@ object Build { val referenceVersion = "3.4.2" - val baseVersion = "3.5.0-RC2" + val baseVersion = "3.5.0-RC3" // LTS or Next val versionLine = "Next" From 7a19b325da0983019e45a46d41203e73e71452c0 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 29 May 2024 18:26:08 +0200 Subject: [PATCH 300/371] Fix symbol reference retrivial of `scala.caps.Caps` - it was changed from opaque type to class in #18463 --- compiler/src/dotty/tools/dotc/core/Definitions.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/core/Definitions.scala b/compiler/src/dotty/tools/dotc/core/Definitions.scala index 11a4a8473e79..6a1332e91afb 100644 --- a/compiler/src/dotty/tools/dotc/core/Definitions.scala +++ b/compiler/src/dotty/tools/dotc/core/Definitions.scala @@ -991,7 +991,7 @@ class Definitions { @tu lazy val CapsModule: Symbol = requiredModule("scala.caps") @tu lazy val captureRoot: TermSymbol = CapsModule.requiredValue("cap") - @tu lazy val Caps_Cap: TypeSymbol = CapsModule.requiredType("Cap") + @tu lazy val Caps_Cap: TypeSymbol = requiredClass("scala.caps.Cap") @tu lazy val Caps_reachCapability: TermSymbol = CapsModule.requiredMethod("reachCapability") @tu lazy val CapsUnsafeModule: Symbol = requiredModule("scala.caps.unsafe") @tu lazy val Caps_unsafeAssumePure: Symbol = CapsUnsafeModule.requiredMethod("unsafeAssumePure") From ec87e7deebeae2f95ce003aba66a23ef15ed962b Mon Sep 
17 00:00:00 2001 From: odersky Date: Fri, 5 Jul 2024 14:13:45 +0200 Subject: [PATCH 301/371] Refine implicit priority change warnings Fixes #21036 Fixes #20572 --- .../dotty/tools/dotc/typer/Implicits.scala | 32 +++++++++++++------ tests/neg/given-triangle.check | 4 +++ tests/{warn => neg}/given-triangle.scala | 4 +-- tests/{warn => pos}/bson/Test.scala | 0 tests/{warn => pos}/bson/bson.scala | 0 tests/pos/i20572.scala | 7 ++++ tests/pos/i21036.scala | 16 ++++++++++ tests/run/given-triangle.scala | 2 +- tests/warn/bson.check | 10 ------ tests/warn/given-triangle.check | 6 ---- 10 files changed, 51 insertions(+), 30 deletions(-) create mode 100644 tests/neg/given-triangle.check rename tests/{warn => neg}/given-triangle.scala (73%) rename tests/{warn => pos}/bson/Test.scala (100%) rename tests/{warn => pos}/bson/bson.scala (100%) create mode 100644 tests/pos/i20572.scala create mode 100644 tests/pos/i21036.scala delete mode 100644 tests/warn/bson.check delete mode 100644 tests/warn/given-triangle.check diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 9c23036fa865..f997ab52fa64 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1302,9 +1302,8 @@ trait Implicits: // A map that associates a priority change warning (between -source 3.4 and 3.6) // with the candidate refs mentioned in the warning. We report the associated - // message if both candidates qualify in tryImplicit and at least one of the candidates - // is part of the result of the implicit search. - val priorityChangeWarnings = mutable.ListBuffer[(TermRef, TermRef, Message)]() + // message if one of the critical candidates is part of the result of the implicit search. + val priorityChangeWarnings = mutable.ListBuffer[(/*critical:*/ List[TermRef], Message)]() /** Compare `alt1` with `alt2` to determine which one should be chosen. * @@ -1319,11 +1318,16 @@ trait Implicits: * return new result with preferGeneral = true * 3.6 and higher: compare with preferGeneral = true * + * @param only2ndCritical If true only the second alternative is critical in case + * of a priority change. 
*/ - def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel): Int = + def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel, only2ndCritical: Boolean = false): Int = def comp(using Context) = explore(compare(alt1.ref, alt2.ref, preferGeneral = true)) def warn(msg: Message) = - priorityChangeWarnings += ((alt1.ref, alt2.ref, msg)) + val critical = + if only2ndCritical then alt2.ref :: Nil + else alt1.ref :: alt2.ref :: Nil + priorityChangeWarnings += ((critical, msg)) if alt1.ref eq alt2.ref then 0 else if alt1.level != alt2.level then alt1.level - alt2.level else @@ -1443,8 +1447,8 @@ trait Implicits: compareAlternatives(newCand, cand) > 0) else // keep only warnings that don't involve the failed candidate reference - priorityChangeWarnings.filterInPlace: (ref1, ref2, _) => - ref1 != cand.ref && ref2 != cand.ref + priorityChangeWarnings.filterInPlace: (critical, _) => + !critical.contains(cand.ref) rank(remaining, found, fail :: rfailures) case best: SearchSuccess => if (ctx.mode.is(Mode.ImplicitExploration) || isCoherent) @@ -1454,7 +1458,15 @@ trait Implicits: val newPending = if (retained eq found) || remaining.isEmpty then remaining else remaining.filterConserve(cand => - compareAlternatives(retained, cand) <= 0) + compareAlternatives(retained, cand, only2ndCritical = true) <= 0) + // Here we drop some pending alternatives but retain in each case + // `retained`. Therefore, it's a priorty change only if the + // second alternative appears in the final search result. Otherwise + // we have the following scenario: + // - 1st alternative, bit not snd appears in final result + // - Hence, snd was eliminated either here, or otherwise by a direct + // comparison later. + // - Hence, no change in resolution. rank(newPending, retained, rfailures) case fail: SearchFailure => // The ambiguity happened in the current search: to recover we @@ -1601,8 +1613,8 @@ trait Implicits: throw ex val result = rank(sort(eligible), NoMatchingImplicitsFailure, Nil) - for (ref1, ref2, msg) <- priorityChangeWarnings do - if result.found.exists(ref => ref == ref1 || ref == ref2) then + for (critical, msg) <- priorityChangeWarnings do + if result.found.exists(critical.contains(_)) then report.warning(msg, srcPos) result end searchImplicit diff --git a/tests/neg/given-triangle.check b/tests/neg/given-triangle.check new file mode 100644 index 000000000000..bf92efac17fd --- /dev/null +++ b/tests/neg/given-triangle.check @@ -0,0 +1,4 @@ +-- [E172] Type Error: tests/neg/given-triangle.scala:14:18 ------------------------------------------------------------- +14 |@main def Test = f // error + | ^ + |Ambiguous given instances: both given instance given_B and given instance given_C match type A of parameter a of method f diff --git a/tests/warn/given-triangle.scala b/tests/neg/given-triangle.scala similarity index 73% rename from tests/warn/given-triangle.scala rename to tests/neg/given-triangle.scala index ee4888ed1e06..9cc23104fcce 100644 --- a/tests/warn/given-triangle.scala +++ b/tests/neg/given-triangle.scala @@ -1,5 +1,3 @@ -//> using options -source 3.6-migration - class A class B extends A class C extends A @@ -13,4 +11,4 @@ def f(using a: A, b: B, c: C) = println(b.getClass) println(c.getClass) -@main def Test = f // warn +@main def Test = f // error diff --git a/tests/warn/bson/Test.scala b/tests/pos/bson/Test.scala similarity index 100% rename from tests/warn/bson/Test.scala rename to tests/pos/bson/Test.scala diff --git a/tests/warn/bson/bson.scala b/tests/pos/bson/bson.scala similarity index 100% 
rename from tests/warn/bson/bson.scala rename to tests/pos/bson/bson.scala diff --git a/tests/pos/i20572.scala b/tests/pos/i20572.scala new file mode 100644 index 000000000000..4ee4490c839c --- /dev/null +++ b/tests/pos/i20572.scala @@ -0,0 +1,7 @@ +//> using options -Werror +trait Writes[T] +trait Format[T] extends Writes[T] +given [T: List]: Writes[T] = null +given [T]: Format[T] = null + +val _ = summon[Writes[Int]] diff --git a/tests/pos/i21036.scala b/tests/pos/i21036.scala new file mode 100644 index 000000000000..1c98346e4ef3 --- /dev/null +++ b/tests/pos/i21036.scala @@ -0,0 +1,16 @@ +//> using options -source 3.5 -Werror +trait SameRuntime[A, B] +trait BSONWriter[T] +trait BSONHandler[T] extends BSONWriter[T] + +opaque type Id = String +object Id: + given SameRuntime[Id, String] = ??? + +given BSONHandler[String] = ??? +given [T: BSONHandler]: BSONHandler[List[T]] = ??? + +given opaqueWriter[T, A](using rs: SameRuntime[T, A], writer: BSONWriter[A]): BSONWriter[T] = ??? + +val x = summon[BSONHandler[List[Id]]] // this doesn't emit warning +val y = summon[BSONWriter[List[Id]]] // this did emit warning diff --git a/tests/run/given-triangle.scala b/tests/run/given-triangle.scala index 5ddba8df8b7b..0b483e87f28c 100644 --- a/tests/run/given-triangle.scala +++ b/tests/run/given-triangle.scala @@ -1,4 +1,4 @@ -import language.future +import language.`3.6` class A class B extends A diff --git a/tests/warn/bson.check b/tests/warn/bson.check deleted file mode 100644 index 258ac4b4ff2c..000000000000 --- a/tests/warn/bson.check +++ /dev/null @@ -1,10 +0,0 @@ --- Warning: tests/warn/bson/Test.scala:5:60 ---------------------------------------------------------------------------- -5 |def typedMapHandler[K, V: BSONHandler] = stringMapHandler[V] // warn - | ^ - |Given search preference for bson.BSONWriter[Map[String, V]] between alternatives (bson.BSONWriter.mapWriter : [V²](using x$1: bson.BSONWriter[V²]): bson.BSONDocumentWriter[Map[String, V²]]) and (bson.BSONWriter.collectionWriter : - | [T, Repr <: Iterable[T]](using x$1: bson.BSONWriter[T], x$2: Repr ¬ Option[T]): bson.BSONWriter[Repr]) will change - |Current choice : the first alternative - |New choice from Scala 3.6: none - it's ambiguous - | - |where: V is a type in method typedMapHandler - | V² is a type variable diff --git a/tests/warn/given-triangle.check b/tests/warn/given-triangle.check deleted file mode 100644 index e849f9d4d642..000000000000 --- a/tests/warn/given-triangle.check +++ /dev/null @@ -1,6 +0,0 @@ --- Warning: tests/warn/given-triangle.scala:16:18 ---------------------------------------------------------------------- -16 |@main def Test = f // warn - | ^ - | Change in given search preference for A between alternatives (given_A : A) and (given_B : B) - | Previous choice : the second alternative - | New choice from Scala 3.6: the first alternative From 3e1ed72f299a6713da26a601d152dd26e671470f Mon Sep 17 00:00:00 2001 From: odersky Date: Fri, 5 Jul 2024 16:21:22 +0200 Subject: [PATCH 302/371] Fix -source for neg test --- tests/neg/given-triangle.check | 4 ++-- tests/neg/given-triangle.scala | 1 + 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/tests/neg/given-triangle.check b/tests/neg/given-triangle.check index bf92efac17fd..f548df0078de 100644 --- a/tests/neg/given-triangle.check +++ b/tests/neg/given-triangle.check @@ -1,4 +1,4 @@ --- [E172] Type Error: tests/neg/given-triangle.scala:14:18 ------------------------------------------------------------- -14 |@main def Test = f // error +-- [E172] Type Error: 
tests/neg/given-triangle.scala:15:18 ------------------------------------------------------------- +15 |@main def Test = f // error | ^ |Ambiguous given instances: both given instance given_B and given instance given_C match type A of parameter a of method f diff --git a/tests/neg/given-triangle.scala b/tests/neg/given-triangle.scala index 9cc23104fcce..61273ef93925 100644 --- a/tests/neg/given-triangle.scala +++ b/tests/neg/given-triangle.scala @@ -1,3 +1,4 @@ +//> using -source 3.5 class A class B extends A class C extends A From 450d233997354986dde7a627a237f3a13edcfb61 Mon Sep 17 00:00:00 2001 From: odersky Date: Fri, 5 Jul 2024 16:53:27 +0200 Subject: [PATCH 303/371] Filter out more false positives in priority change warnings --- .../dotty/tools/dotc/typer/Implicits.scala | 20 +++++++++++-------- 1 file changed, 12 insertions(+), 8 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index f997ab52fa64..86be195fae43 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1318,16 +1318,14 @@ trait Implicits: * return new result with preferGeneral = true * 3.6 and higher: compare with preferGeneral = true * + * @param disambiguate The call is used to disambiguate two successes, not for ranking. + * When ranking, we are always filtering out either > 0 or <= 0 results. + * In each case a priority change from 0 to -1 or vice versa makes no difference. * @param only2ndCritical If true only the second alternative is critical in case * of a priority change. */ - def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel, only2ndCritical: Boolean = false): Int = + def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel, disambiguate: Boolean = false, only2ndCritical: Boolean = false): Int = def comp(using Context) = explore(compare(alt1.ref, alt2.ref, preferGeneral = true)) - def warn(msg: Message) = - val critical = - if only2ndCritical then alt2.ref :: Nil - else alt1.ref :: alt2.ref :: Nil - priorityChangeWarnings += ((critical, msg)) if alt1.ref eq alt2.ref then 0 else if alt1.level != alt2.level then alt1.level - alt2.level else @@ -1336,6 +1334,12 @@ trait Implicits: if sv.stable == SourceVersion.`3.5` || sv == SourceVersion.`3.6-migration` then val prev = comp(using searchContext().addMode(Mode.OldImplicitResolution)) if cmp != prev then + def warn(msg: Message) = + if disambiguate || cmp > 0 || prev > 0 then + val critical = + if only2ndCritical then alt2.ref :: Nil + else alt1.ref :: alt2.ref :: Nil + priorityChangeWarnings += ((critical, msg)) def choice(c: Int) = c match case -1 => "the second alternative" case 1 => "the first alternative" @@ -1362,7 +1366,7 @@ trait Implicits: */ def disambiguate(alt1: SearchResult, alt2: SearchSuccess) = alt1 match case alt1: SearchSuccess => - var diff = compareAlternatives(alt1, alt2) + var diff = compareAlternatives(alt1, alt2, disambiguate = true) assert(diff <= 0) // diff > 0 candidates should already have been eliminated in `rank` if diff == 0 && alt1.ref =:= alt2.ref then diff = 1 // See i12951 for a test where this happens @@ -1463,7 +1467,7 @@ trait Implicits: // `retained`. Therefore, it's a priorty change only if the // second alternative appears in the final search result. 
Otherwise // we have the following scenario: - // - 1st alternative, bit not snd appears in final result + // - 1st alternative, but not snd appears in final result // - Hence, snd was eliminated either here, or otherwise by a direct // comparison later. // - Hence, no change in resolution. From acffad65f474deab08cf32b90e99fd87e3d2c18c Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 8 Jul 2024 19:36:27 +0200 Subject: [PATCH 304/371] Fix priority change logic for ranking As worked out in collaboration with @EugeneFlesselle --- .../dotty/tools/dotc/typer/Implicits.scala | 37 ++++++++----------- tests/warn/i21036a.check | 6 +++ tests/warn/i21036a.scala | 7 ++++ tests/warn/i21036b.check | 6 +++ tests/warn/i21036b.scala | 7 ++++ 5 files changed, 41 insertions(+), 22 deletions(-) create mode 100644 tests/warn/i21036a.check create mode 100644 tests/warn/i21036a.scala create mode 100644 tests/warn/i21036b.check create mode 100644 tests/warn/i21036b.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 86be195fae43..45c8731c553e 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1305,6 +1305,9 @@ trait Implicits: // message if one of the critical candidates is part of the result of the implicit search. val priorityChangeWarnings = mutable.ListBuffer[(/*critical:*/ List[TermRef], Message)]() + def isWarnPriorityChangeVersion(sv: SourceVersion): Boolean = + sv.stable == SourceVersion.`3.5` || sv == SourceVersion.`3.6-migration` + /** Compare `alt1` with `alt2` to determine which one should be chosen. * * @return a number > 0 if `alt1` is preferred over `alt2` @@ -1321,25 +1324,21 @@ trait Implicits: * @param disambiguate The call is used to disambiguate two successes, not for ranking. * When ranking, we are always filtering out either > 0 or <= 0 results. * In each case a priority change from 0 to -1 or vice versa makes no difference. - * @param only2ndCritical If true only the second alternative is critical in case - * of a priority change. 
*/ - def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel, disambiguate: Boolean = false, only2ndCritical: Boolean = false): Int = + def compareAlternatives(alt1: RefAndLevel, alt2: RefAndLevel, disambiguate: Boolean = false): Int = def comp(using Context) = explore(compare(alt1.ref, alt2.ref, preferGeneral = true)) if alt1.ref eq alt2.ref then 0 else if alt1.level != alt2.level then alt1.level - alt2.level else var cmp = comp(using searchContext()) val sv = Feature.sourceVersion - if sv.stable == SourceVersion.`3.5` || sv == SourceVersion.`3.6-migration` then + if isWarnPriorityChangeVersion(sv) then val prev = comp(using searchContext().addMode(Mode.OldImplicitResolution)) - if cmp != prev then + if disambiguate && cmp != prev then def warn(msg: Message) = - if disambiguate || cmp > 0 || prev > 0 then - val critical = - if only2ndCritical then alt2.ref :: Nil - else alt1.ref :: alt2.ref :: Nil - priorityChangeWarnings += ((critical, msg)) + val critical = alt1.ref :: alt2.ref :: Nil + priorityChangeWarnings += ((critical, msg)) + implicits.println(i"PRIORITY CHANGE ${alt1.ref}, ${alt2.ref}, $disambiguate") def choice(c: Int) = c match case -1 => "the second alternative" case 1 => "the first alternative" @@ -1356,7 +1355,9 @@ trait Implicits: |Previous choice : ${choice(prev)} |New choice from Scala 3.6: ${choice(cmp)}""") cmp - else cmp + else cmp max prev + // When ranking, we keep the better of cmp and prev, which ends up retaining a candidate + // if it is retained in either version. else cmp end compareAlternatives @@ -1367,7 +1368,8 @@ trait Implicits: def disambiguate(alt1: SearchResult, alt2: SearchSuccess) = alt1 match case alt1: SearchSuccess => var diff = compareAlternatives(alt1, alt2, disambiguate = true) - assert(diff <= 0) // diff > 0 candidates should already have been eliminated in `rank` + assert(diff <= 0 || isWarnPriorityChangeVersion(Feature.sourceVersion)) + // diff > 0 candidates should already have been eliminated in `rank` if diff == 0 && alt1.ref =:= alt2.ref then diff = 1 // See i12951 for a test where this happens else if diff == 0 && alt2.isExtension then @@ -1461,16 +1463,7 @@ trait Implicits: case retained: SearchSuccess => val newPending = if (retained eq found) || remaining.isEmpty then remaining - else remaining.filterConserve(cand => - compareAlternatives(retained, cand, only2ndCritical = true) <= 0) - // Here we drop some pending alternatives but retain in each case - // `retained`. Therefore, it's a priorty change only if the - // second alternative appears in the final search result. Otherwise - // we have the following scenario: - // - 1st alternative, but not snd appears in final result - // - Hence, snd was eliminated either here, or otherwise by a direct - // comparison later. - // - Hence, no change in resolution. 
+ else remaining.filterConserve(newCand => compareAlternatives(newCand, retained) >= 0) rank(newPending, retained, rfailures) case fail: SearchFailure => // The ambiguity happened in the current search: to recover we diff --git a/tests/warn/i21036a.check b/tests/warn/i21036a.check new file mode 100644 index 000000000000..673c01374ef3 --- /dev/null +++ b/tests/warn/i21036a.check @@ -0,0 +1,6 @@ +-- Warning: tests/warn/i21036a.scala:7:17 ------------------------------------------------------------------------------ +7 |val y = summon[A] // warn + | ^ + | Given search preference for A between alternatives (b : B) and (a : A) will change + | Current choice : the first alternative + | New choice from Scala 3.6: the second alternative diff --git a/tests/warn/i21036a.scala b/tests/warn/i21036a.scala new file mode 100644 index 000000000000..ab97429852d6 --- /dev/null +++ b/tests/warn/i21036a.scala @@ -0,0 +1,7 @@ +//> using options -source 3.5 +trait A +trait B extends A +given b: B = ??? +given a: A = ??? + +val y = summon[A] // warn \ No newline at end of file diff --git a/tests/warn/i21036b.check b/tests/warn/i21036b.check new file mode 100644 index 000000000000..ff7fdfd7a87c --- /dev/null +++ b/tests/warn/i21036b.check @@ -0,0 +1,6 @@ +-- Warning: tests/warn/i21036b.scala:7:17 ------------------------------------------------------------------------------ +7 |val y = summon[A] // warn + | ^ + | Change in given search preference for A between alternatives (b : B) and (a : A) + | Previous choice : the first alternative + | New choice from Scala 3.6: the second alternative diff --git a/tests/warn/i21036b.scala b/tests/warn/i21036b.scala new file mode 100644 index 000000000000..16dd72266613 --- /dev/null +++ b/tests/warn/i21036b.scala @@ -0,0 +1,7 @@ +//> using options -source 3.6-migration +trait A +trait B extends A +given b: B = ??? +given a: A = ??? + +val y = summon[A] // warn \ No newline at end of file From dc9246aa12bd317fb678eabdf6c6c4df859ecf83 Mon Sep 17 00:00:00 2001 From: odersky Date: Fri, 5 Jul 2024 18:22:45 +0200 Subject: [PATCH 305/371] Fix -source for neg test (2) --- tests/neg/given-triangle.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tests/neg/given-triangle.scala b/tests/neg/given-triangle.scala index 61273ef93925..16aca7c44dee 100644 --- a/tests/neg/given-triangle.scala +++ b/tests/neg/given-triangle.scala @@ -1,4 +1,4 @@ -//> using -source 3.5 +//> using options -source 3.5 class A class B extends A class C extends A From 22d9df094ca5d385289d11225ac6183900f20bdd Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Mon, 8 Jul 2024 15:22:57 +0200 Subject: [PATCH 306/371] Use pathing jars in cli commands --- dist/bin/common | 20 ++------------------ dist/bin/common.bat | 2 +- dist/bin/scalac.bat | 31 ++----------------------------- dist/bin/scaladoc | 11 ++--------- dist/bin/scaladoc.bat | 27 +-------------------------- project/Build.scala | 2 +- project/RepublishPlugin.scala | 14 +++++++++++--- 7 files changed, 20 insertions(+), 87 deletions(-) diff --git a/dist/bin/common b/dist/bin/common index 63e598d70d7e..2de8bdf9f99a 100644 --- a/dist/bin/common +++ b/dist/bin/common @@ -6,25 +6,9 @@ source "$PROG_HOME/bin/common-shared" # * The code below is for Dotty # *-------------------------------------------------*/ -load_classpath () { - command="$1" - psep_pattern="$2" - __CLASS_PATH="" - while IFS= read -r line || [ -n "$line" ]; do - # jna-5 only appropriate for some combinations - if ! 
[[ ( -n ${conemu-} || -n ${msys-}) && "$line" == "*jna-5*" ]]; then - if [ -n "$__CLASS_PATH" ]; then - __CLASS_PATH+="$psep_pattern" - fi - __CLASS_PATH+="$PROG_HOME/maven2/$line" - fi - done < "$PROG_HOME/etc/$command.classpath" - echo "$__CLASS_PATH" -} - compilerJavaClasspathArgs () { - toolchain="$(load_classpath "scala" "$PSEP")" - toolchain_extra="$(load_classpath "with_compiler" "$PSEP")" + toolchain="$PROG_HOME/lib/scala.jar" + toolchain_extra="$PROG_HOME/lib/with_compiler.jar" if [ -n "$toolchain_extra" ]; then toolchain+="$PSEP$toolchain_extra" diff --git a/dist/bin/common.bat b/dist/bin/common.bat index f9c35e432b36..510771d43b6e 100644 --- a/dist/bin/common.bat +++ b/dist/bin/common.bat @@ -38,6 +38,6 @@ if not defined _PROG_HOME ( set _EXITCODE=1 goto :eof ) -set "_ETC_DIR=%_PROG_HOME%\etc" +set "_LIB_DIR=%_PROG_HOME%\lib" set _PSEP=; diff --git a/dist/bin/scalac.bat b/dist/bin/scalac.bat index dbcbaf11b8e2..7ad368582127 100644 --- a/dist/bin/scalac.bat +++ b/dist/bin/scalac.bat @@ -88,17 +88,8 @@ goto :eof @rem output parameter: _JVM_CP_ARGS :compilerJavaClasspathArgs - -set "CP_FILE=%_ETC_DIR%\scala.classpath" -call :loadClasspathFromFile %CP_FILE% -set "__TOOLCHAIN=%_CLASS_PATH_RESULT%" - -set "CP_FILE=%_ETC_DIR%\with_compiler.classpath" -call :loadClasspathFromFile %CP_FILE% - -if defined _CLASS_PATH_RESULT ( - set "__TOOLCHAIN=%__TOOLCHAIN%%_PSEP%%_CLASS_PATH_RESULT%" -) +set "__TOOLCHAIN=%_LIB_DIR%\scala.jar" +set "__TOOLCHAIN=%__TOOLCHAIN%%_PSEP%%_LIB_DIR%\with_compiler.jar%" if defined _SCALA_CPATH ( set "_JVM_CP_ARGS=%__TOOLCHAIN%%_SCALA_CPATH%" @@ -107,24 +98,6 @@ if defined _SCALA_CPATH ( ) goto :eof -@REM concatentate every line in "%_ARG_FILE%" with _PSEP -@REM arg 1 - file to read -:loadClasspathFromFile -set _ARG_FILE=%1 -set _CLASS_PATH_RESULT= -if exist "%_ARG_FILE%" ( - for /f "usebackq delims=" %%i in ("%_ARG_FILE%") do ( - set "_LIB=%_PROG_HOME%\maven2\%%i" - set "_LIB=!_LIB:/=\!" - if not defined _CLASS_PATH_RESULT ( - set "_CLASS_PATH_RESULT=!_LIB!" - ) else ( - set "_CLASS_PATH_RESULT=!_CLASS_PATH_RESULT!%_PSEP%!_LIB!" 
- ) - ) -) -goto :eof - @rem ######################################################################### @rem ## Cleanups diff --git a/dist/bin/scaladoc b/dist/bin/scaladoc index 15bc0813f93a..f4ef37af00ee 100755 --- a/dist/bin/scaladoc +++ b/dist/bin/scaladoc @@ -36,6 +36,7 @@ CompilerMain=dotty.tools.dotc.Main DecompilerMain=dotty.tools.dotc.decompiler.Main ReplMain=dotty.tools.repl.Main ScriptingMain=dotty.tools.scripting.Main +JVM_CP_ARGS="$PROG_HOME/lib/scaladoc.jar" PROG_NAME=$CompilerMain @@ -52,12 +53,6 @@ addScrip() { script_args+=("'$1'") } -classpathArgs () { - CLASS_PATH="$(load_classpath "scaladoc" "$PSEP")" - - jvm_cp_args="-classpath \"$CLASS_PATH\"" -} - #for A in "$@" ; do echo "A[$A]" ; done ; exit 2 while [[ $# -gt 0 ]]; do @@ -79,12 +74,10 @@ case "$1" in esac done -classpathArgs - eval "\"$JAVACMD\"" \ ${JAVA_OPTS:-$default_java_opts} \ "${java_args[@]}" \ - "${jvm_cp_args-}" \ + -classpath "${JVM_CP_ARGS}" \ -Dscala.usejavacp=true \ "dotty.tools.scaladoc.Main" \ "${scala_args[@]}" \ diff --git a/dist/bin/scaladoc.bat b/dist/bin/scaladoc.bat index 16433a83f501..fe4055633e02 100644 --- a/dist/bin/scaladoc.bat +++ b/dist/bin/scaladoc.bat @@ -21,8 +21,6 @@ call :args %* @rem ######################################################################### @rem ## Main -call :classpathArgs - if defined JAVA_OPTS ( set _JAVA_OPTS=%JAVA_OPTS% ) else ( set _JAVA_OPTS=%_DEFAULT_JAVA_OPTS% ) @@ -31,7 +29,7 @@ if defined JAVA_OPTS ( set _JAVA_OPTS=%JAVA_OPTS% set "_JAVACMD=!_JAVACMD:%%=%%%%!" call "%_JAVACMD%" %_JAVA_OPTS% %_JAVA_DEBUG% %_JAVA_ARGS% ^ --classpath "%_CLASS_PATH%" ^ +-classpath "%_LIB_DIR%\scaladoc.jar" ^ -Dscala.usejavacp=true ^ dotty.tools.scaladoc.Main %_SCALA_ARGS% %_RESIDUAL_ARGS% if not %ERRORLEVEL%==0 ( @@ -103,29 +101,6 @@ goto :eof set _RESIDUAL_ARGS=%_RESIDUAL_ARGS% %~1 goto :eof -@rem output parameter: _CLASS_PATH -:classpathArgs -set "_ETC_DIR=%_PROG_HOME%\etc" -@rem keep list in sync with bash script `bin\scaladoc` ! -call :loadClasspathFromFile -goto :eof - -@REM concatentate every line in "%_ETC_DIR%\scaladoc.classpath" with _PSEP -:loadClasspathFromFile -set _CLASS_PATH= -if exist "%_ETC_DIR%\scaladoc.classpath" ( - for /f "usebackq delims=" %%i in ("%_ETC_DIR%\scaladoc.classpath") do ( - set "_LIB=%_PROG_HOME%\maven2\%%i" - set "_LIB=!_LIB:/=\!" - if not defined _CLASS_PATH ( - set "_CLASS_PATH=!_LIB!" - ) else ( - set "_CLASS_PATH=!_CLASS_PATH!%_PSEP%!_LIB!" 
- ) - ) -) -goto :eof - @rem ######################################################################### @rem ## Cleanups diff --git a/project/Build.scala b/project/Build.scala index f994ae74cb95..7a3154477f21 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -2131,7 +2131,7 @@ object Build { republishRepo := target.value / "republish", packResourceDir += (republishRepo.value / "bin" -> "bin"), packResourceDir += (republishRepo.value / "maven2" -> "maven2"), - packResourceDir += (republishRepo.value / "etc" -> "etc"), + packResourceDir += (republishRepo.value / "lib" -> "lib"), republishCommandLibs += ("scala" -> List("scala3-interfaces", "scala3-compiler", "scala3-library", "tasty-core")), republishCommandLibs += diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index e4bf40545a6b..8b95c6423e68 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -213,16 +213,24 @@ object RepublishPlugin extends AutoPlugin { val classpaths = coursierFetch(coursierJar, log, csrCacheDir, localRepo, resolvedLocal.map(_.id.toString)) if (commandLibs.nonEmpty) { - IO.createDirectory(republishDir / "etc") + IO.createDirectory(republishDir / "lib") for ((command, libs) <- commandLibs) { val (negated, actual) = libs.partition(_.startsWith("^!")) val subtractions = negated.map(_.stripPrefix("^!")) def compose(libs: List[String]): List[String] = libs.map(fuzzyFind(classpaths, _)).reduceOption(_ ++ _).map(_.distinct).getOrElse(Nil) - + + // Compute the classpath entries val entries = compose(actual).diff(compose(subtractions)) - IO.write(republishDir / "etc" / s"$command.classpath", entries.mkString("\n")) + // Generate the MANIFEST for the pathing jar + val manifest = new java.util.jar.Manifest(); + manifest.getMainAttributes().put(java.util.jar.Attributes.Name.MANIFEST_VERSION, "1.0"); + manifest.getMainAttributes().put(java.util.jar.Attributes.Name.CLASS_PATH, entries.map(e => s"../maven2/$e").mkString(" ")) + // Write the pathing jar to the Disk + val file = republishDir / "lib" / s"$command.jar" + val jar = new java.util.jar.JarOutputStream(new java.io.FileOutputStream(file), manifest) + jar.close() } } From 1910ea91d0dc8af5047f5a845ce09476cfa86183 Mon Sep 17 00:00:00 2001 From: Hamza REMMAL Date: Mon, 8 Jul 2024 19:18:51 +0200 Subject: [PATCH 307/371] Add support for Class-Path entries in Manifest --- .../dotc/classpath/ClassPathFactory.scala | 21 +++++++++++++++++-- compiler/src/dotty/tools/io/ClassPath.scala | 13 ++++++++---- dist/bin/scalac | 1 + dist/bin/scalac.bat | 2 +- dist/bin/scaladoc | 1 + dist/bin/scaladoc.bat | 1 + project/RepublishPlugin.scala | 2 +- 7 files changed, 33 insertions(+), 8 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/classpath/ClassPathFactory.scala b/compiler/src/dotty/tools/dotc/classpath/ClassPathFactory.scala index 0b66f339bf53..080f8d4e63d2 100644 --- a/compiler/src/dotty/tools/dotc/classpath/ClassPathFactory.scala +++ b/compiler/src/dotty/tools/dotc/classpath/ClassPathFactory.scala @@ -7,6 +7,7 @@ import dotty.tools.io.{AbstractFile, VirtualDirectory} import FileUtils.* import dotty.tools.io.ClassPath import dotty.tools.dotc.core.Contexts.* +import java.nio.file.Files /** * Provides factory methods for classpath. 
When creating classpath instances for a given path, @@ -52,14 +53,30 @@ class ClassPathFactory { // Internal protected def classesInPathImpl(path: String, expand: Boolean)(using Context): List[ClassPath] = - for { + val files = for { file <- expandPath(path, expand) dir <- { def asImage = if (file.endsWith(".jimage")) Some(AbstractFile.getFile(file)) else None Option(AbstractFile.getDirectory(file)).orElse(asImage) } } - yield newClassPath(dir) + yield dir + + val expanded = + if scala.util.Properties.propOrFalse("scala.expandjavacp") then + for + file <- files + a <- ClassPath.expandManifestPath(file.absolutePath) + path = java.nio.file.Paths.get(a.toURI()).nn + if Files.exists(path) + yield + newClassPath(AbstractFile.getFile(path)) + else + Seq.empty + + files.map(newClassPath) ++ expanded + + end classesInPathImpl private def createSourcePath(file: AbstractFile)(using Context): ClassPath = if (file.isJarOrZip) diff --git a/compiler/src/dotty/tools/io/ClassPath.scala b/compiler/src/dotty/tools/io/ClassPath.scala index f77bc1efca91..01a3f2cc1870 100644 --- a/compiler/src/dotty/tools/io/ClassPath.scala +++ b/compiler/src/dotty/tools/io/ClassPath.scala @@ -152,13 +152,18 @@ object ClassPath { val baseDir = file.parent new Jar(file).classPathElements map (elem => - specToURL(elem) getOrElse (baseDir / elem).toURL + specToURL(elem, baseDir) getOrElse (baseDir / elem).toURL ) } - def specToURL(spec: String): Option[URL] = - try Some(new URI(spec).toURL) - catch case _: MalformedURLException | _: URISyntaxException => None + def specToURL(spec: String, basedir: Directory): Option[URL] = + try + val uri = new URI(spec) + if uri.isAbsolute() then Some(uri.toURL()) + else + Some(basedir.resolve(Path(spec)).toURL) + catch + case _: MalformedURLException | _: URISyntaxException => None def manifests: List[java.net.URL] = { import scala.jdk.CollectionConverters.EnumerationHasAsScala diff --git a/dist/bin/scalac b/dist/bin/scalac index d9bd21ca425b..a527d9767749 100755 --- a/dist/bin/scalac +++ b/dist/bin/scalac @@ -86,6 +86,7 @@ eval "\"$JAVACMD\"" \ ${JAVA_OPTS:-$default_java_opts} \ "${java_args[@]}" \ "-classpath \"$jvm_cp_args\"" \ + "-Dscala.expandjavacp=true" \ "-Dscala.usejavacp=true" \ "-Dscala.home=\"$PROG_HOME\"" \ "dotty.tools.MainGenericCompiler" \ diff --git a/dist/bin/scalac.bat b/dist/bin/scalac.bat index 7ad368582127..e2898bdc2890 100644 --- a/dist/bin/scalac.bat +++ b/dist/bin/scalac.bat @@ -24,7 +24,7 @@ call :compilerJavaClasspathArgs @rem we need to escape % in the java command path, for some reason this doesnt work in common.bat set "_JAVACMD=!_JAVACMD:%%=%%%%!" 
-call "%_JAVACMD%" %_JAVA_ARGS% -classpath "%_JVM_CP_ARGS%" "-Dscala.usejavacp=true" "-Dscala.home=%_PROG_HOME%" dotty.tools.MainGenericCompiler %_SCALA_ARGS% +call "%_JAVACMD%" %_JAVA_ARGS% -classpath "%_JVM_CP_ARGS%" "-Dscala.usejavacp=true" "-Dscala.expandjavacp=true" "-Dscala.home=%_PROG_HOME%" dotty.tools.MainGenericCompiler %_SCALA_ARGS% if not %ERRORLEVEL%==0 ( set _EXITCODE=1 goto end diff --git a/dist/bin/scaladoc b/dist/bin/scaladoc index f4ef37af00ee..0af5a2b55acb 100755 --- a/dist/bin/scaladoc +++ b/dist/bin/scaladoc @@ -78,6 +78,7 @@ eval "\"$JAVACMD\"" \ ${JAVA_OPTS:-$default_java_opts} \ "${java_args[@]}" \ -classpath "${JVM_CP_ARGS}" \ + -Dscala.expandjavacp=true \ -Dscala.usejavacp=true \ "dotty.tools.scaladoc.Main" \ "${scala_args[@]}" \ diff --git a/dist/bin/scaladoc.bat b/dist/bin/scaladoc.bat index fe4055633e02..b9e4820b006d 100644 --- a/dist/bin/scaladoc.bat +++ b/dist/bin/scaladoc.bat @@ -30,6 +30,7 @@ set "_JAVACMD=!_JAVACMD:%%=%%%%!" call "%_JAVACMD%" %_JAVA_OPTS% %_JAVA_DEBUG% %_JAVA_ARGS% ^ -classpath "%_LIB_DIR%\scaladoc.jar" ^ +-Dscala.expandjavacp=true ^ -Dscala.usejavacp=true ^ dotty.tools.scaladoc.Main %_SCALA_ARGS% %_RESIDUAL_ARGS% if not %ERRORLEVEL%==0 ( diff --git a/project/RepublishPlugin.scala b/project/RepublishPlugin.scala index 8b95c6423e68..5611af798b33 100644 --- a/project/RepublishPlugin.scala +++ b/project/RepublishPlugin.scala @@ -220,7 +220,7 @@ object RepublishPlugin extends AutoPlugin { def compose(libs: List[String]): List[String] = libs.map(fuzzyFind(classpaths, _)).reduceOption(_ ++ _).map(_.distinct).getOrElse(Nil) - + // Compute the classpath entries val entries = compose(actual).diff(compose(subtractions)) // Generate the MANIFEST for the pathing jar From 1a1a77fcf925a82e098a854e0668b3f75eef048c Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Wed, 10 Jul 2024 22:10:51 +0200 Subject: [PATCH 308/371] expand classpath in scala_legacy --- dist/bin/scala_legacy | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/dist/bin/scala_legacy b/dist/bin/scala_legacy index bd69d40c2b97..18fc6d874e34 100755 --- a/dist/bin/scala_legacy +++ b/dist/bin/scala_legacy @@ -65,7 +65,7 @@ done # exec here would prevent onExit from being called, leaving terminal in unusable state compilerJavaClasspathArgs [ -z "${ConEmuPID-}" -o -n "${cygwin-}" ] && export MSYSTEM= PWD= # workaround for #12405 -eval "\"$JAVACMD\"" "${java_args[@]}" "-Dscala.home=\"$PROG_HOME\"" "-classpath \"$jvm_cp_args\"" "dotty.tools.MainGenericRunner" "-classpath \"$jvm_cp_args\"" "${scala_args[@]}" +eval "\"$JAVACMD\"" "${java_args[@]}" "-Dscala.home=\"$PROG_HOME\"" "-classpath \"$jvm_cp_args\"" "-Dscala.expandjavacp=true" "dotty.tools.MainGenericRunner" "-classpath \"$jvm_cp_args\"" "${scala_args[@]}" scala_exit_status=$? 
From fad86e392dc45f72c326aa57461d272d95dd59f9 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 11 Jul 2024 10:23:09 +0200 Subject: [PATCH 309/371] Add changelog for 3.5.0-RC4 --- changelogs/3.5.0-RC4.md | 19 +++++++++++++++++++ 1 file changed, 19 insertions(+) create mode 100644 changelogs/3.5.0-RC4.md diff --git a/changelogs/3.5.0-RC4.md b/changelogs/3.5.0-RC4.md new file mode 100644 index 000000000000..75e72870d6f4 --- /dev/null +++ b/changelogs/3.5.0-RC4.md @@ -0,0 +1,19 @@ +# Backported fixes + +- Refine implicit priority change warnings [#21045](https://github.com/scala/scala3/pull/21045) +- Use pathing jars in cli commands [#21121](https://github.com/scala/scala3/pull/21121) +- expand classpath of pathing jars in scala_legacy command [#21160](https://github.com/scala/scala3/pull/21160) +- Fix symbol reference retrivial of `scala.caps.Caps` [#20493](https://github.com/scala/scala3/pull/20493) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.5.0-RC3..3.5.0-RC4` these are: + +``` + 5 Martin Odersky + 3 Wojciech Mazur + 2 Hamza REMMAL + 1 Jamie Thompson +``` From 97fc22c3331a8cd1aca0cd563e99328815a2a9e6 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 11 Jul 2024 10:24:08 +0200 Subject: [PATCH 310/371] Release 3.5.0-RC4 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 7a3154477f21..45402aebc9c4 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -88,7 +88,7 @@ object Build { val referenceVersion = "3.4.2" - val baseVersion = "3.5.0-RC3" + val baseVersion = "3.5.0-RC4" // LTS or Next val versionLine = "Next" From a5514c58c830a79fc8e7c62f8a18299bf3fe119e Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Fri, 12 Jul 2024 17:44:11 +0200 Subject: [PATCH 311/371] emit generatedNonLocalClass in backend when callback is not enabled --- compiler/src/dotty/tools/backend/jvm/CodeGen.scala | 11 +++++++++-- 1 file changed, 9 insertions(+), 2 deletions(-) diff --git a/compiler/src/dotty/tools/backend/jvm/CodeGen.scala b/compiler/src/dotty/tools/backend/jvm/CodeGen.scala index 2286ad6c2c25..c5b0ec0929b8 100644 --- a/compiler/src/dotty/tools/backend/jvm/CodeGen.scala +++ b/compiler/src/dotty/tools/backend/jvm/CodeGen.scala @@ -133,8 +133,15 @@ class CodeGen(val int: DottyBackendInterface, val primitives: DottyPrimitives)( if (ctx.compilerCallback != null) ctx.compilerCallback.onClassGenerated(sourceFile, convertAbstractFile(clsFile), className) - if isLocal then - ctx.withIncCallback(_.generatedLocalClass(sourceFile, clsFile.jpath)) + ctx.withIncCallback: cb => + if isLocal then + cb.generatedLocalClass(sourceFile, clsFile.jpath) + else if !cb.enabled() then + // callback is not enabled, so nonLocalClasses were not reported in ExtractAPI + val fullClassName = atPhase(sbtExtractDependenciesPhase) { + ExtractDependencies.classNameAsString(claszSymbol) + } + cb.generatedNonLocalClass(sourceFile, clsFile.jpath, className, fullClassName) } } From 73428165d3c525e3ea3aca039cf1a75eeea99c76 Mon Sep 17 00:00:00 2001 From: Jamie Thompson Date: Tue, 16 Jul 2024 19:46:43 +0200 Subject: [PATCH 312/371] add test to assert classes are still reported --- .../xsbt/ExtractUsedNamesSpecification.scala | 5 +- .../test/xsbt/ProductsSpecification.scala | 41 ++++++++ .../xsbt/ScalaCompilerForUnitTesting.scala | 95 +++++++++++-------- sbt-bridge/test/xsbti/TestCallback.scala | 4 + 4 files changed, 103 
insertions(+), 42 deletions(-) create mode 100644 sbt-bridge/test/xsbt/ProductsSpecification.scala diff --git a/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala b/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala index e47371175de6..0abefe2985c3 100644 --- a/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala +++ b/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala @@ -1,7 +1,6 @@ package xsbt import xsbti.UseScope -import ScalaCompilerForUnitTesting.Callbacks import org.junit.{ Test, Ignore } import org.junit.Assert._ @@ -227,9 +226,9 @@ class ExtractUsedNamesSpecification { def findPatMatUsages(in: String): Set[String] = { val compilerForTesting = new ScalaCompilerForUnitTesting - val (_, Callbacks(callback, _)) = + val output = compilerForTesting.compileSrcs(List(List(sealedClass, in))) - val clientNames = callback.usedNamesAndScopes.view.filterKeys(!_.startsWith("base.")) + val clientNames = output.analysis.usedNamesAndScopes.view.filterKeys(!_.startsWith("base.")) val names: Set[String] = clientNames.flatMap { case (_, usages) => diff --git a/sbt-bridge/test/xsbt/ProductsSpecification.scala b/sbt-bridge/test/xsbt/ProductsSpecification.scala new file mode 100644 index 000000000000..adee351b5289 --- /dev/null +++ b/sbt-bridge/test/xsbt/ProductsSpecification.scala @@ -0,0 +1,41 @@ +package xsbt + +import org.junit.Assert.* +import org.junit.Ignore +import org.junit.Test + +import java.io.File +import java.nio.file.Path +import java.nio.file.Paths + +class ProductsSpecification { + + @Test + def extractNonLocalClassesNoInc = { + val src = + """package example + | + |class A { + | class B + | def foo = + | class C + |}""".stripMargin + val output = compiler.compileSrcsNoInc(src) + val srcFile = output.srcFiles.head + val (srcNames, binaryNames) = output.analysis.classNames(srcFile).unzip // non local class names + + assertFalse(output.analysis.enabled()) // inc phases are disabled + assertTrue(output.analysis.apis.isEmpty) // extract-api did not run + assertTrue(output.analysis.usedNamesAndScopes.isEmpty) // extract-dependencies did not run + + // note that local class C is not included, classNames only records non local classes + val expectedBinary = Set("example.A", "example.A$B") + assertEquals(expectedBinary, binaryNames.toSet) + + // note that local class C is not included, classNames only records non local classes + val expectedSrc = Set("example.A", "example.A.B") + assertEquals(expectedSrc, srcNames.toSet) + } + + private def compiler = new ScalaCompilerForUnitTesting +} diff --git a/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala b/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala index f17be692ee50..a5a969ee48b9 100644 --- a/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala +++ b/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala @@ -3,6 +3,7 @@ package xsbt import xsbti.compile.{CompileProgress, SingleOutput} import java.io.File +import java.nio.file.Path import xsbti._ import sbt.io.IO import xsbti.api.{ ClassLike, Def, DependencyContext } @@ -15,6 +16,8 @@ import dotty.tools.xsbt.CompilerBridge import TestCallback.ExtractedClassDependencies import ScalaCompilerForUnitTesting.Callbacks +case class CompileOutput(srcFiles: Seq[VirtualFileRef], classesOutput: Path, analysis: TestCallback, progress: TestCompileProgress) + object ScalaCompilerForUnitTesting: case class Callbacks(analysis: TestCallback, progress: TestCompileProgress) @@ -25,29 +28,24 @@ object ScalaCompilerForUnitTesting: class ScalaCompilerForUnitTesting { def 
extractEnteredPhases(srcs: String*): Seq[List[String]] = { - val (tempSrcFiles, Callbacks(_, testProgress)) = compileSrcs(srcs*) - val run = testProgress.runs.head - tempSrcFiles.map(src => run.unitPhases(src.id)) + val output = compileSrcs(srcs*) + val run = output.progress.runs.head + output.srcFiles.map(src => run.unitPhases(src.id)) } - def extractTotal(srcs: String*)(extraSourcePath: String*): Int = { - val (tempSrcFiles, Callbacks(_, testProgress)) = compileSrcs(List(srcs.toList), extraSourcePath.toList) - val run = testProgress.runs.head - run.total - } + def extractTotal(srcs: String*)(extraSourcePath: String*): Int = + compileSrcs(List(srcs.toList), extraSourcePath.toList).progress.runs.head.total - def extractProgressPhases(srcs: String*): List[String] = { - val (_, Callbacks(_, testProgress)) = compileSrcs(srcs*) - testProgress.runs.head.phases - } + def extractProgressPhases(srcs: String*): List[String] = + compileSrcs(srcs*).progress.runs.head.phases /** * Compiles given source code using Scala compiler and returns API representation * extracted by ExtractAPI class. */ def extractApiFromSrc(src: String): Seq[ClassLike] = { - val (Seq(tempSrcFile), Callbacks(analysisCallback, _)) = compileSrcs(src) - analysisCallback.apis(tempSrcFile) + val output = compileSrcs(src) + output.analysis.apis(output.srcFiles.head) } /** @@ -55,8 +53,8 @@ class ScalaCompilerForUnitTesting { * extracted by ExtractAPI class. */ def extractApisFromSrcs(srcs: List[String]*): Seq[Seq[ClassLike]] = { - val (tempSrcFiles, Callbacks(analysisCallback, _)) = compileSrcs(srcs.toList) - tempSrcFiles.map(analysisCallback.apis) + val output = compileSrcs(srcs.toList) + output.srcFiles.map(output.analysis.apis) } /** @@ -73,15 +71,16 @@ class ScalaCompilerForUnitTesting { assertDefaultScope: Boolean = true ): Map[String, Set[String]] = { // we drop temp src file corresponding to the definition src file - val (Seq(_, tempSrcFile), Callbacks(analysisCallback, _)) = compileSrcs(definitionSrc, actualSrc) + val output = compileSrcs(definitionSrc, actualSrc) + val analysis = output.analysis if (assertDefaultScope) for { - (className, used) <- analysisCallback.usedNamesAndScopes - analysisCallback.TestUsedName(name, scopes) <- used + (className, used) <- analysis.usedNamesAndScopes + analysis.TestUsedName(name, scopes) <- used } assert(scopes.size() == 1 && scopes.contains(UseScope.Default), s"$className uses $name in $scopes") - val classesInActualSrc = analysisCallback.classNames(tempSrcFile).map(_._1) - classesInActualSrc.map(className => className -> analysisCallback.usedNames(className)).toMap + val classesInActualSrc = analysis.classNames(output.srcFiles.head).map(_._1) + classesInActualSrc.map(className => className -> analysis.usedNames(className)).toMap } /** @@ -91,11 +90,11 @@ class ScalaCompilerForUnitTesting { * Only the names used in the last src file are returned. 
*/ def extractUsedNamesFromSrc(sources: String*): Map[String, Set[String]] = { - val (srcFiles, Callbacks(analysisCallback, _)) = compileSrcs(sources*) - srcFiles + val output = compileSrcs(sources*) + output.srcFiles .map { srcFile => - val classesInSrc = analysisCallback.classNames(srcFile).map(_._1) - classesInSrc.map(className => className -> analysisCallback.usedNames(className)).toMap + val classesInSrc = output.analysis.classNames(srcFile).map(_._1) + classesInSrc.map(className => className -> output.analysis.usedNames(className)).toMap } .reduce(_ ++ _) } @@ -113,15 +112,15 @@ class ScalaCompilerForUnitTesting { * file system-independent way of testing dependencies between source code "files". */ def extractDependenciesFromSrcs(srcs: List[List[String]]): ExtractedClassDependencies = { - val (_, Callbacks(testCallback, _)) = compileSrcs(srcs) + val analysis = compileSrcs(srcs).analysis - val memberRefDeps = testCallback.classDependencies collect { + val memberRefDeps = analysis.classDependencies collect { case (target, src, DependencyByMemberRef) => (src, target) } - val inheritanceDeps = testCallback.classDependencies collect { + val inheritanceDeps = analysis.classDependencies collect { case (target, src, DependencyByInheritance) => (src, target) } - val localInheritanceDeps = testCallback.classDependencies collect { + val localInheritanceDeps = analysis.classDependencies collect { case (target, src, LocalDependencyByInheritance) => (src, target) } ExtractedClassDependencies.fromPairs(memberRefDeps, inheritanceDeps, localInheritanceDeps) @@ -142,12 +141,24 @@ class ScalaCompilerForUnitTesting { * The sequence of temporary files corresponding to passed snippets and analysis * callback is returned as a result. */ - def compileSrcs(groupedSrcs: List[List[String]], sourcePath: List[String] = Nil): (Seq[VirtualFile], Callbacks) = { + def compileSrcs(groupedSrcs: List[List[String]], sourcePath: List[String] = Nil, compileToJar: Boolean = false, incEnabled: Boolean = true): CompileOutput = { val temp = IO.createTemporaryDirectory - val analysisCallback = new TestCallback + val (forceSbtArgs, analysisCallback) = + if (incEnabled) + (Seq("-Yforce-sbt-phases"), new TestCallback) + else + (Seq.empty, new TestCallbackNoInc) val testProgress = new TestCompileProgress - val classesDir = new File(temp, "classes") - classesDir.mkdir() + val classesOutput = + if (compileToJar) { + val jar = new File(temp, "classes.jar") + jar.createNewFile() + jar + } else { + val dir = new File(temp, "classes") + dir.mkdir() + dir + } val bridge = new CompilerBridge @@ -164,16 +175,16 @@ class ScalaCompilerForUnitTesting { } val virtualSrcFiles = srcFiles.toArray - val classesDirPath = classesDir.getAbsolutePath.toString + val classesOutputPath = classesOutput.getAbsolutePath() val output = new SingleOutput: - def getOutputDirectory() = classesDir + def getOutputDirectory() = classesOutput val maybeSourcePath = if extraFiles.isEmpty then Nil else List("-sourcepath", temp.getAbsolutePath.toString) bridge.run( virtualSrcFiles, new TestDependencyChanges, - Array("-Yforce-sbt-phases", "-classpath", classesDirPath, "-usejavacp", "-d", classesDirPath) ++ maybeSourcePath, + (forceSbtArgs ++: Array("-classpath", classesOutputPath, "-usejavacp", "-d", classesOutputPath)) ++ maybeSourcePath, output, analysisCallback, new TestReporter, @@ -185,17 +196,23 @@ class ScalaCompilerForUnitTesting { srcFiles } - (files.flatten.toSeq, Callbacks(analysisCallback, testProgress)) + CompileOutput(files.flatten.toSeq, 
classesOutput.toPath, analysisCallback, testProgress) } - def compileSrcs(srcs: String*): (Seq[VirtualFile], Callbacks) = { + def compileSrcs(srcs: String*): CompileOutput = { compileSrcs(List(srcs.toList)) } + def compileSrcsNoInc(srcs: String*): CompileOutput = { + compileSrcs(List(srcs.toList), incEnabled = false) + } + + def compileSrcsToJar(srcs: String*): CompileOutput = + compileSrcs(List(srcs.toList), compileToJar = true) + private def prepareSrcFile(baseDir: File, fileName: String, src: String): VirtualFile = { val srcFile = new File(baseDir, fileName) IO.write(srcFile, src) new TestVirtualFile(srcFile.toPath) } } - diff --git a/sbt-bridge/test/xsbti/TestCallback.scala b/sbt-bridge/test/xsbti/TestCallback.scala index 3398590b169a..9f6df75d84f0 100644 --- a/sbt-bridge/test/xsbti/TestCallback.scala +++ b/sbt-bridge/test/xsbti/TestCallback.scala @@ -11,6 +11,10 @@ import DependencyContext._ import java.{util => ju} import ju.Optional +class TestCallbackNoInc extends TestCallback { + override def enabled(): Boolean = false +} + class TestCallback extends AnalysisCallback2 { case class TestUsedName(name: String, scopes: ju.EnumSet[UseScope]) From 1e20d47d09a3b8e2f0e045a3428d89170632da35 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 17 Jul 2024 14:33:14 +0200 Subject: [PATCH 313/371] Add changelog for 3.5.0-RC5 --- changelogs/3.5.0-RC5.md | 14 ++++++++++++++ 1 file changed, 14 insertions(+) create mode 100644 changelogs/3.5.0-RC5.md diff --git a/changelogs/3.5.0-RC5.md b/changelogs/3.5.0-RC5.md new file mode 100644 index 000000000000..405396223eb7 --- /dev/null +++ b/changelogs/3.5.0-RC5.md @@ -0,0 +1,14 @@ +# Backported fixes + +- emit generatedNonLocalClass in backend when callback is not enabled [#21186](https://github.com/scala/scala3/pull/21186) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.5.0-RC4..3.5.0-RC5` these are: + +``` + 2 Jamie Thompson + 2 Wojciech Mazur +``` From 8e6b582e17c428452eabbaa649695a07c1f541cc Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 17 Jul 2024 14:33:49 +0200 Subject: [PATCH 314/371] Release 3.5.0-RC5 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 45402aebc9c4..cbf1b354b073 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -88,7 +88,7 @@ object Build { val referenceVersion = "3.4.2" - val baseVersion = "3.5.0-RC4" + val baseVersion = "3.5.0-RC5" // LTS or Next val versionLine = "Next" From 318054e614f02d2d70ac3e3ec6bf6c99db39b4ba Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 24 Jul 2024 17:58:29 +0200 Subject: [PATCH 315/371] Revert "Approximate MatchTypes with lub of case bodies, if non-recursive" --- compiler/src/dotty/tools/dotc/core/TypeComparer.scala | 7 ------- compiler/src/dotty/tools/dotc/typer/Typer.scala | 10 +--------- tests/pos/13633.scala | 2 +- tests/pos/Tuple.Drop.scala | 7 ------- tests/pos/Tuple.Elem.scala | 7 ------- tests/pos/i19710.scala | 11 ----------- 6 files changed, 2 insertions(+), 42 deletions(-) delete mode 100644 tests/pos/Tuple.Drop.scala delete mode 100644 tests/pos/Tuple.Elem.scala delete mode 100644 tests/pos/i19710.scala diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index c2c502a984c4..93ed6e7d03a5 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ 
b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -2904,13 +2904,6 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling tp case tp: HKTypeLambda => tp - case tp: ParamRef => - val st = tp.superTypeNormalized - if st.exists then - disjointnessBoundary(st) - else - // workaround for when ParamRef#underlying returns NoType - defn.AnyType case tp: TypeProxy => disjointnessBoundary(tp.superTypeNormalized) case tp: WildcardType => diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 4cb695a15966..2a877a45b550 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -2569,15 +2569,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer report.error(MatchTypeScrutineeCannotBeHigherKinded(sel1Tpe), sel1.srcPos) val pt1 = if (bound1.isEmpty) pt else bound1.tpe val cases1 = tree.cases.mapconserve(typedTypeCase(_, sel1Tpe, pt1)) - val bound2 = if tree.bound.isEmpty then - val lub = cases1.foldLeft(defn.NothingType: Type): (acc, case1) => - if !acc.exists then NoType - else if case1.body.tpe.isProvisional then NoType - else acc | case1.body.tpe - if lub.exists then TypeTree(lub, inferred = true) - else bound1 - else bound1 - assignType(cpy.MatchTypeTree(tree)(bound2, sel1, cases1), bound2, sel1, cases1) + assignType(cpy.MatchTypeTree(tree)(bound1, sel1, cases1), bound1, sel1, cases1) } def typedByNameTypeTree(tree: untpd.ByNameTypeTree)(using Context): ByNameTypeTree = tree.result match diff --git a/tests/pos/13633.scala b/tests/pos/13633.scala index 8883ef98d0be..ca0f7e68e81e 100644 --- a/tests/pos/13633.scala +++ b/tests/pos/13633.scala @@ -21,7 +21,7 @@ object Sums extends App: type Reverse[A] = ReverseLoop[A, EmptyTuple] - type PlusTri[A, B, C] <: Tuple = (A, B, C) match + type PlusTri[A, B, C] = (A, B, C) match case (false, false, false) => (false, false) case (true, false, false) | (false, true, false) | (false, false, true) => (false, true) case (true, true, false) | (true, false, true) | (false, true, true) => (true, false) diff --git a/tests/pos/Tuple.Drop.scala b/tests/pos/Tuple.Drop.scala deleted file mode 100644 index 9b88cc227966..000000000000 --- a/tests/pos/Tuple.Drop.scala +++ /dev/null @@ -1,7 +0,0 @@ -import compiletime.ops.int.* - -type Drop[T <: Tuple, N <: Int] <: Tuple = N match - case 0 => T - case S[n1] => T match - case EmptyTuple => EmptyTuple - case x *: xs => Drop[xs, n1] diff --git a/tests/pos/Tuple.Elem.scala b/tests/pos/Tuple.Elem.scala deleted file mode 100644 index 81494485c321..000000000000 --- a/tests/pos/Tuple.Elem.scala +++ /dev/null @@ -1,7 +0,0 @@ -import compiletime.ops.int.* - -type Elem[T <: Tuple, I <: Int] = T match - case h *: tail => - I match - case 0 => h - case S[j] => Elem[tail, j] diff --git a/tests/pos/i19710.scala b/tests/pos/i19710.scala deleted file mode 100644 index 03fd1e2d80b3..000000000000 --- a/tests/pos/i19710.scala +++ /dev/null @@ -1,11 +0,0 @@ -import scala.util.NotGiven - -type HasName1 = [n] =>> [x] =>> x match { - case n => true - case _ => false - } -@main def Test = { - summon[HasName1["foo"]["foo"] =:= true] - summon[NotGiven[HasName1["foo"]["bar"] =:= true]] - summon[Tuple.Filter[(1, "foo", 2, "bar"), HasName1["foo"]] =:= Tuple1["foo"]] // error -} From 51629a24ba5f59738600f46a45a9455dff9946e0 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 24 Jul 2024 19:53:08 +0200 Subject: [PATCH 316/371] Fix failing run-macros/type-show test --- 
tests/run-macros/type-show/Test_2.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tests/run-macros/type-show/Test_2.scala b/tests/run-macros/type-show/Test_2.scala index de845f3e84dd..3bc9da043885 100644 --- a/tests/run-macros/type-show/Test_2.scala +++ b/tests/run-macros/type-show/Test_2.scala @@ -23,7 +23,7 @@ object Test { """TypeRef(ThisType(TypeRef(NoPrefix(), "scala")), "Nothing"), """+ """TypeRef(ThisType(TypeRef(NoPrefix(), "scala")), "Any"))), """+ """MatchType("""+ - """TypeRef(TermRef(ThisType(TypeRef(NoPrefix(), "")), "scala"), "Int"), """+ // match type bound + """TypeRef(ThisType(TypeRef(NoPrefix(), "scala")), "Any"), """+ // match type bound """ParamRef(binder, 0), """+ """List("""+ """MatchCase("""+ From 6a5e6e67ae639e905cae4480a5a0ec114ca07c3e Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 29 Jul 2024 13:42:07 +0200 Subject: [PATCH 317/371] Add changelog for 3.5.0-RC6 --- changelogs/3.5.0-RC6.md | 13 +++++++++++++ 1 file changed, 13 insertions(+) create mode 100644 changelogs/3.5.0-RC6.md diff --git a/changelogs/3.5.0-RC6.md b/changelogs/3.5.0-RC6.md new file mode 100644 index 000000000000..77731f346750 --- /dev/null +++ b/changelogs/3.5.0-RC6.md @@ -0,0 +1,13 @@ +# Backported fixes + +- Revert "Approximate MatchTypes with lub of case bodies, if non-recursive" in 3.5.0 [#21266](https://github.com/scala/scala3/pull/21266) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.5.0-RC5..3.5.0-RC6` these are: + +``` + 4 Wojciech Mazur +``` From 1fb613f9ecb938d7cdc9270393cb2d0a48a3a81e Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 29 Jul 2024 13:42:28 +0200 Subject: [PATCH 318/371] Release 3.5.0-RC6 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index cbf1b354b073..e1a61d82aca7 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -88,7 +88,7 @@ object Build { val referenceVersion = "3.4.2" - val baseVersion = "3.5.0-RC5" + val baseVersion = "3.5.0-RC6" // LTS or Next val versionLine = "Next" From b079b115cd7161850c01b93a6ac0d07b0c4bf0d7 Mon Sep 17 00:00:00 2001 From: Eugene Flesselle Date: Thu, 18 Jul 2024 18:35:43 +0200 Subject: [PATCH 319/371] Prefer extensions over conversions and implicits for member selection Before the changes, if `isAsGoodValueType` was called with an extension and a given conversion, it would prefer the conversion over the extension, because only the former yielded true in `isGiven`. Which contradicted the logic from searchImplicit which preferred extension over conversions for member selection. 
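Editorial note: the scenario described in the commit message above can be pictured with a small, self-contained sketch. All names below are made up for illustration (the actual regression test is `tests/pos/i19715.scala`, touched further down in this patch); the point is that both an extension method and a given `Conversion` could supply the selected member, and the intent of this change is that the extension method wins.

```scala
import scala.language.implicitConversions

class Payload:
  def describe: String = "via conversion"

class Wrapper(p: Payload):
  def toPayload: Payload = p

object Wrapper:
  // Either of these could make `w.describe` typecheck:
  extension (w: Wrapper) def describe: String = "via extension"
  given Conversion[Wrapper, Payload] = _.toPayload

@main def demoSelection =
  val w = Wrapper(Payload())
  // With the change described above, member selection is expected to pick the
  // extension method rather than going through the given Conversion,
  // so this should print "via extension".
  println(w.describe)
```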
--- .../src/dotty/tools/dotc/typer/Applications.scala | 14 ++++++-------- tests/pos/i19715.scala | 3 ++- 2 files changed, 8 insertions(+), 9 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index c3369ac58e31..114372f1fa59 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -1816,10 +1816,8 @@ trait Applications extends Compatibility { isAsGood(alt1, tp1.instantiate(tparams.map(_.typeRef)), alt2, tp2) } case _ => // (3) - def isGiven(alt: TermRef) = - alt1.symbol.is(Given) && alt.symbol != defn.NotGivenClass def compareValues(tp1: Type, tp2: Type)(using Context) = - isAsGoodValueType(tp1, tp2, isGiven(alt1), isGiven(alt2)) + isAsGoodValueType(tp1, tp2, alt1.symbol.is(Implicit), alt2.symbol.is(Implicit)) tp2 match case tp2: MethodType => true // (3a) case tp2: PolyType if tp2.resultType.isInstanceOf[MethodType] => true // (3a) @@ -1856,7 +1854,7 @@ trait Applications extends Compatibility { * for overloading resolution (when `preferGeneral is false), and the opposite relation * `U <: T` or `U convertible to `T` for implicit disambiguation between givens * (when `preferGeneral` is true). For old-style implicit values, the 3.4 behavior is kept. - * If one of the alternatives is a given and the other is an implicit, the given wins. + * If one of the alternatives is an implicit and the other is a given (or an extension), the implicit loses. * * - In Scala 3.5 and Scala 3.6-migration, we issue a warning if the result under * Scala 3.6 differ wrt to the old behavior up to 3.5. @@ -1864,7 +1862,7 @@ trait Applications extends Compatibility { * Also and only for given resolution: If a compared type refers to a given or its module class, use * the intersection of its parent classes instead. 
*/ - def isAsGoodValueType(tp1: Type, tp2: Type, alt1isGiven: Boolean, alt2isGiven: Boolean)(using Context): Boolean = + def isAsGoodValueType(tp1: Type, tp2: Type, alt1IsImplicit: Boolean, alt2IsImplicit: Boolean)(using Context): Boolean = val oldResolution = ctx.mode.is(Mode.OldImplicitResolution) if !preferGeneral || Feature.migrateTo3 && oldResolution then // Normal specificity test for overloading resolution (where `preferGeneral` is false) @@ -1882,7 +1880,7 @@ trait Applications extends Compatibility { if Feature.sourceVersion.isAtMost(SourceVersion.`3.4`) || oldResolution - || !alt1isGiven && !alt2isGiven + || alt1IsImplicit && alt2IsImplicit then // Intermediate rules: better means specialize, but map all type arguments downwards // These are enabled for 3.0-3.5, and for all comparisons between old-style implicits, @@ -1897,8 +1895,8 @@ trait Applications extends Compatibility { case _ => mapOver(t) (flip(tp1p) relaxed_<:< flip(tp2p)) || viewExists(tp1, tp2) else - // New rules: better means generalize, givens always beat implicits - if alt1isGiven != alt2isGiven then alt1isGiven + // New rules: better means generalize, givens (and extensions) always beat implicits + if alt1IsImplicit != alt2IsImplicit then alt2IsImplicit else (tp2p relaxed_<:< tp1p) || viewExists(tp2, tp1) end isAsGoodValueType diff --git a/tests/pos/i19715.scala b/tests/pos/i19715.scala index 91aeda5c1698..be5471ffa9b3 100644 --- a/tests/pos/i19715.scala +++ b/tests/pos/i19715.scala @@ -6,7 +6,8 @@ class NT(t: Tup): object NT: extension (x: NT) def app(n: Int): Boolean = true - given Conversion[NT, Tup] = _.toTup + given c1: Conversion[NT, Tup] = _.toTup + implicit def c2(t: NT): Tup = c1(t) def test = val nt = new NT(Tup()) From 07ccc8d9582183e0fd058e3860a7c1b8315b37ca Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 5 Aug 2024 11:28:18 +0200 Subject: [PATCH 320/371] A left-biased variant for implicit/given pairs We now use a left-biased scheme, as follows. From 3.6 on: - A given x: X is better than a given or implicit y: Y if y can be instantiated/widened to X. - An implicit x: X is better than a given or implicit y: Y if y can be instantiated to a supertype of X. - Use owner score for givens as a tie breaker if after all other tests we still have an ambiguity. This is not transitive, so we need a separate scheme to work around that. Other change: - Drop special handling of NotGiven in prioritization. The previous logic pretended to do so, but was ineffective. 
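Editorial note: a rough sketch of the behavioral effect of the rules listed in the commit message above. The names are illustrative; the expected outcomes mirror the test expectations touched by this patch (`tests/warn/i21036a.check` for the given case, `tests/pos/i2974.scala` for the implicit case).

```scala
class General
class Specific extends General

object WithGivens:
  given g: General = General()
  given s: Specific = Specific()

object WithImplicits:
  implicit val g: General = General()
  implicit val s: Specific = Specific()

def demo: Unit =
  locally:
    import WithGivens.given
    // From 3.6 on, between givens the more general candidate is preferred,
    // so this is expected to resolve to `g: General`.
    val x = summon[General]
  locally:
    import WithImplicits.*
    // Between two old-style implicits the old specificity rule still applies,
    // so the more specific `s: Specific` is expected to win.
    val y = summon[General]
```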
--- .../tools/dotc/printing/Formatting.scala | 17 ++--- .../dotty/tools/dotc/reporting/messages.scala | 2 +- .../dotty/tools/dotc/typer/Applications.scala | 65 +++++++++++------ .../dotty/tools/dotc/typer/Implicits.scala | 39 ++++++++-- .../tools/dotc/StringFormatterTest.scala | 1 + tests/neg/given-triangle.check | 8 ++ tests/neg/i21212.check | 4 + tests/neg/i21212.scala | 11 +++ tests/neg/i21303/JavaEnum.java | 1 + tests/neg/i21303/Test.scala | 33 +++++++++ tests/neg/i2974.scala | 16 ++++ tests/neg/scala-uri.check | 14 ++++ tests/neg/scala-uri.scala | 30 ++++++++ tests/pos/given-priority.scala | 24 ++++++ tests/pos/i21212.scala | 22 ++++++ tests/pos/i21303/JavaEnum.java | 1 + tests/pos/i21303/Test.scala | 32 ++++++++ tests/pos/i21303a/JavaEnum.java | 1 + tests/pos/i21303a/Test.scala | 35 +++++++++ tests/pos/i21320.scala | 73 +++++++++++++++++++ tests/pos/i2974.scala | 3 +- tests/pos/scala-uri.scala | 22 ++++++ tests/pos/slick-migration-api-example.scala | 23 ++++++ tests/warn/i21036a.check | 6 +- tests/warn/i21036b.check | 6 +- 25 files changed, 445 insertions(+), 44 deletions(-) create mode 100644 tests/neg/i21212.check create mode 100644 tests/neg/i21212.scala create mode 100644 tests/neg/i21303/JavaEnum.java create mode 100644 tests/neg/i21303/Test.scala create mode 100644 tests/neg/i2974.scala create mode 100644 tests/neg/scala-uri.check create mode 100644 tests/neg/scala-uri.scala create mode 100644 tests/pos/given-priority.scala create mode 100644 tests/pos/i21212.scala create mode 100644 tests/pos/i21303/JavaEnum.java create mode 100644 tests/pos/i21303/Test.scala create mode 100644 tests/pos/i21303a/JavaEnum.java create mode 100644 tests/pos/i21303a/Test.scala create mode 100644 tests/pos/i21320.scala create mode 100644 tests/pos/scala-uri.scala create mode 100644 tests/pos/slick-migration-api-example.scala diff --git a/compiler/src/dotty/tools/dotc/printing/Formatting.scala b/compiler/src/dotty/tools/dotc/printing/Formatting.scala index 6f1c32beb822..43cac17e6318 100644 --- a/compiler/src/dotty/tools/dotc/printing/Formatting.scala +++ b/compiler/src/dotty/tools/dotc/printing/Formatting.scala @@ -2,8 +2,6 @@ package dotty.tools package dotc package printing -import scala.language.unsafeNulls - import scala.collection.mutable import core.* @@ -52,7 +50,11 @@ object Formatting { object ShowAny extends Show[Any]: def show(x: Any): Shown = x - class ShowImplicits3: + class ShowImplicits4: + given [X: Show]: Show[X | Null] with + def show(x: X | Null) = if x == null then "null" else CtxShow(toStr(x.nn)) + + class ShowImplicits3 extends ShowImplicits4: given Show[Product] = ShowAny class ShowImplicits2 extends ShowImplicits3: @@ -77,15 +79,10 @@ object Formatting { given [K: Show, V: Show]: Show[Map[K, V]] with def show(x: Map[K, V]) = CtxShow(x.map((k, v) => s"${toStr(k)} => ${toStr(v)}")) - end given given [H: Show, T <: Tuple: Show]: Show[H *: T] with def show(x: H *: T) = CtxShow(toStr(x.head) *: toShown(x.tail).asInstanceOf[Tuple]) - end given - - given [X: Show]: Show[X | Null] with - def show(x: X | Null) = if x == null then "null" else CtxShow(toStr(x.nn)) given Show[FlagSet] with def show(x: FlagSet) = x.flagsString @@ -148,8 +145,8 @@ object Formatting { private def treatArg(arg: Shown, suffix: String)(using Context): (String, String) = arg.runCtxShow match { case arg: Seq[?] 
if suffix.indexOf('%') == 0 && suffix.indexOf('%', 1) != -1 => val end = suffix.indexOf('%', 1) - val sep = StringContext.processEscapes(suffix.substring(1, end)) - (arg.mkString(sep), suffix.substring(end + 1)) + val sep = StringContext.processEscapes(suffix.substring(1, end).nn) + (arg.mkString(sep), suffix.substring(end + 1).nn) case arg: Seq[?] => (arg.map(showArg).mkString("[", ", ", "]"), suffix) case arg => diff --git a/compiler/src/dotty/tools/dotc/reporting/messages.scala b/compiler/src/dotty/tools/dotc/reporting/messages.scala index ceb8ecbc8e03..9a20f149a6d1 100644 --- a/compiler/src/dotty/tools/dotc/reporting/messages.scala +++ b/compiler/src/dotty/tools/dotc/reporting/messages.scala @@ -2955,7 +2955,7 @@ class MissingImplicitArgument( /** Default error message for non-nested ambiguous implicits. */ def defaultAmbiguousImplicitMsg(ambi: AmbiguousImplicits) = - s"Ambiguous given instances: ${ambi.explanation}${location("of")}" + s"Ambiguous given instances: ${ambi.explanation}${location("of")}${ambi.priorityChangeWarningNote}" /** Default error messages for non-ambiguous implicits, or nested ambiguous * implicits. diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index 114372f1fa59..2efe5282f025 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -1748,6 +1748,17 @@ trait Applications extends Compatibility { else if sym2.is(Module) then compareOwner(sym1, cls2) else 0 + enum CompareScheme: + case Old // Normal specificity test for overloading resolution (where `preferGeneral` is false) + // and in mode Scala3-migration when we compare with the old Scala 2 rules. + + case Intermediate // Intermediate rules: better means specialize, but map all type arguments downwards + // These are enabled for 3.0-3.4, or if OldImplicitResolution + // is specified, and also for all comparisons between old-style implicits, + + case New // New rules: better means generalize, givens (and extensions) always beat implicits + end CompareScheme + /** Compare two alternatives of an overloaded call or an implicit search. * * @param alt1, alt2 Non-overloaded references indicating the two choices @@ -1774,6 +1785,15 @@ trait Applications extends Compatibility { */ def compare(alt1: TermRef, alt2: TermRef, preferGeneral: Boolean = false)(using Context): Int = trace(i"compare($alt1, $alt2)", overload) { record("resolveOverloaded.compare") + val scheme = + val oldResolution = ctx.mode.is(Mode.OldImplicitResolution) + if !preferGeneral || Feature.migrateTo3 && oldResolution then + CompareScheme.Old + else if Feature.sourceVersion.isAtMost(SourceVersion.`3.4`) + || oldResolution + || alt1.symbol.is(Implicit) && alt2.symbol.is(Implicit) + then CompareScheme.Intermediate + else CompareScheme.New /** Is alternative `alt1` with type `tp1` as good as alternative * `alt2` with type `tp2` ? 
@@ -1816,15 +1836,15 @@ trait Applications extends Compatibility { isAsGood(alt1, tp1.instantiate(tparams.map(_.typeRef)), alt2, tp2) } case _ => // (3) - def compareValues(tp1: Type, tp2: Type)(using Context) = - isAsGoodValueType(tp1, tp2, alt1.symbol.is(Implicit), alt2.symbol.is(Implicit)) + def compareValues(tp2: Type)(using Context) = + isAsGoodValueType(tp1, tp2, alt1.symbol.is(Implicit)) tp2 match case tp2: MethodType => true // (3a) case tp2: PolyType if tp2.resultType.isInstanceOf[MethodType] => true // (3a) case tp2: PolyType => // (3b) - explore(compareValues(tp1, instantiateWithTypeVars(tp2))) + explore(compareValues(instantiateWithTypeVars(tp2))) case _ => // 3b) - compareValues(tp1, tp2) + compareValues(tp2) } /** Test whether value type `tp1` is as good as value type `tp2`. @@ -1862,9 +1882,8 @@ trait Applications extends Compatibility { * Also and only for given resolution: If a compared type refers to a given or its module class, use * the intersection of its parent classes instead. */ - def isAsGoodValueType(tp1: Type, tp2: Type, alt1IsImplicit: Boolean, alt2IsImplicit: Boolean)(using Context): Boolean = - val oldResolution = ctx.mode.is(Mode.OldImplicitResolution) - if !preferGeneral || Feature.migrateTo3 && oldResolution then + def isAsGoodValueType(tp1: Type, tp2: Type, alt1IsImplicit: Boolean)(using Context): Boolean = + if scheme == CompareScheme.Old then // Normal specificity test for overloading resolution (where `preferGeneral` is false) // and in mode Scala3-migration when we compare with the old Scala 2 rules. isCompatible(tp1, tp2) @@ -1878,13 +1897,7 @@ trait Applications extends Compatibility { val tp1p = prepare(tp1) val tp2p = prepare(tp2) - if Feature.sourceVersion.isAtMost(SourceVersion.`3.4`) - || oldResolution - || alt1IsImplicit && alt2IsImplicit - then - // Intermediate rules: better means specialize, but map all type arguments downwards - // These are enabled for 3.0-3.5, and for all comparisons between old-style implicits, - // and in 3.5 and 3.6-migration when we compare with previous rules. + if scheme == CompareScheme.Intermediate || alt1IsImplicit then val flip = new TypeMap: def apply(t: Type) = t match case t @ AppliedType(tycon, args) => @@ -1895,9 +1908,7 @@ trait Applications extends Compatibility { case _ => mapOver(t) (flip(tp1p) relaxed_<:< flip(tp2p)) || viewExists(tp1, tp2) else - // New rules: better means generalize, givens (and extensions) always beat implicits - if alt1IsImplicit != alt2IsImplicit then alt2IsImplicit - else (tp2p relaxed_<:< tp1p) || viewExists(tp2, tp1) + (tp2p relaxed_<:< tp1p) || viewExists(tp2, tp1) end isAsGoodValueType /** Widen the result type of synthetic given methods from the implementation class to the @@ -1968,13 +1979,19 @@ trait Applications extends Compatibility { // alternatives are the same after following ExprTypes, pick one of them // (prefer the one that is not a method, but that's arbitrary). if alt1.widenExpr =:= alt2 then -1 else 1 - else ownerScore match - case 1 => if winsType1 || !winsType2 then 1 else 0 - case -1 => if winsType2 || !winsType1 then -1 else 0 - case 0 => - if winsType1 != winsType2 then if winsType1 then 1 else -1 - else if alt1.symbol == alt2.symbol then comparePrefixes - else 0 + else + // For new implicit resolution, take ownerscore as more significant than type resolution + // Reason: People use owner hierarchies to explicitly prioritize, we should not + // break that by changing implicit priority of types. 
+ def drawOrOwner = + if scheme == CompareScheme.New then ownerScore else 0 + ownerScore match + case 1 => if winsType1 || !winsType2 then 1 else drawOrOwner + case -1 => if winsType2 || !winsType1 then -1 else drawOrOwner + case 0 => + if winsType1 != winsType2 then if winsType1 then 1 else -1 + else if alt1.symbol == alt2.symbol then comparePrefixes + else 0 end compareWithTypes if alt1.symbol.is(ConstructorProxy) && !alt2.symbol.is(ConstructorProxy) then -1 diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 45c8731c553e..d98fc87655bf 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -549,6 +549,11 @@ object Implicits: /** An ambiguous implicits failure */ class AmbiguousImplicits(val alt1: SearchSuccess, val alt2: SearchSuccess, val expectedType: Type, val argument: Tree, val nested: Boolean = false) extends SearchFailureType: + private[Implicits] var priorityChangeWarnings: List[Message] = Nil + + def priorityChangeWarningNote(using Context): String = + priorityChangeWarnings.map(msg => s"\n\nNote: $msg").mkString + def msg(using Context): Message = var str1 = err.refStr(alt1.ref) var str2 = err.refStr(alt2.ref) @@ -1330,7 +1335,7 @@ trait Implicits: if alt1.ref eq alt2.ref then 0 else if alt1.level != alt2.level then alt1.level - alt2.level else - var cmp = comp(using searchContext()) + val cmp = comp(using searchContext()) val sv = Feature.sourceVersion if isWarnPriorityChangeVersion(sv) then val prev = comp(using searchContext().addMode(Mode.OldImplicitResolution)) @@ -1345,13 +1350,21 @@ trait Implicits: case _ => "none - it's ambiguous" if sv.stable == SourceVersion.`3.5` then warn( - em"""Given search preference for $pt between alternatives ${alt1.ref} and ${alt2.ref} will change + em"""Given search preference for $pt between alternatives + | ${alt1.ref} + |and + | ${alt2.ref} + |will change. |Current choice : ${choice(prev)} |New choice from Scala 3.6: ${choice(cmp)}""") prev else warn( - em"""Change in given search preference for $pt between alternatives ${alt1.ref} and ${alt2.ref} + em"""Given search preference for $pt between alternatives + | ${alt1.ref} + |and + | ${alt2.ref} + |has changed. |Previous choice : ${choice(prev)} |New choice from Scala 3.6: ${choice(cmp)}""") cmp @@ -1610,9 +1623,23 @@ trait Implicits: throw ex val result = rank(sort(eligible), NoMatchingImplicitsFailure, Nil) - for (critical, msg) <- priorityChangeWarnings do - if result.found.exists(critical.contains(_)) then - report.warning(msg, srcPos) + + // Issue all priority change warnings that can affect the result + val shownWarnings = priorityChangeWarnings.toList.collect: + case (critical, msg) if result.found.exists(critical.contains(_)) => + msg + result match + case result: SearchFailure => + result.reason match + case ambi: AmbiguousImplicits => + // Make warnings part of error message because otherwise they are suppressed when + // the error is emitted. 
+ ambi.priorityChangeWarnings = shownWarnings + case _ => + case _ => + for msg <- shownWarnings do + report.warning(msg, srcPos) + result end searchImplicit diff --git a/compiler/test/dotty/tools/dotc/StringFormatterTest.scala b/compiler/test/dotty/tools/dotc/StringFormatterTest.scala index 4dfc08cc7e9b..b0ff8b8fc03e 100644 --- a/compiler/test/dotty/tools/dotc/StringFormatterTest.scala +++ b/compiler/test/dotty/tools/dotc/StringFormatterTest.scala @@ -23,6 +23,7 @@ class StringFormatterTest extends AbstractStringFormatterTest: @Test def flagsTup = check("(,final)", i"${(JavaStatic, Final)}") @Test def seqOfTup2 = check("(final,given), (private,lazy)", i"${Seq((Final, Given), (Private, Lazy))}%, %") @Test def seqOfTup3 = check("(Foo,given, (right is approximated))", i"${Seq((Foo, Given, TypeComparer.ApproxState.None.addHigh))}%, %") + @Test def tupleNull = check("(1,null)", i"${(1, null: String | Null)}") class StorePrinter extends Printer: var string: String = "" diff --git a/tests/neg/given-triangle.check b/tests/neg/given-triangle.check index f548df0078de..73d5aea12dc4 100644 --- a/tests/neg/given-triangle.check +++ b/tests/neg/given-triangle.check @@ -2,3 +2,11 @@ 15 |@main def Test = f // error | ^ |Ambiguous given instances: both given instance given_B and given instance given_C match type A of parameter a of method f + | + |Note: Given search preference for A between alternatives + | (given_A : A) + |and + | (given_B : B) + |will change. + |Current choice : the second alternative + |New choice from Scala 3.6: the first alternative diff --git a/tests/neg/i21212.check b/tests/neg/i21212.check new file mode 100644 index 000000000000..5d9fe7728cbc --- /dev/null +++ b/tests/neg/i21212.check @@ -0,0 +1,4 @@ +-- [E172] Type Error: tests/neg/i21212.scala:8:52 ---------------------------------------------------------------------- +8 | def test2(using a2: A)(implicit b2: B) = summon[A] // error: ambiguous + | ^ + |Ambiguous given instances: both parameter b2 and parameter a2 match type Minimization.A of parameter x of method summon in object Predef diff --git a/tests/neg/i21212.scala b/tests/neg/i21212.scala new file mode 100644 index 000000000000..389a82b19f1f --- /dev/null +++ b/tests/neg/i21212.scala @@ -0,0 +1,11 @@ +//> using options -source:3.6 +object Minimization: + + trait A + trait B extends A + + def test1(using a1: A)(using b1: B) = summon[A] // picks (most general) a1 + def test2(using a2: A)(implicit b2: B) = summon[A] // error: ambiguous + def test3(implicit a3: A, b3: B) = summon[A] // picks (most specific) b3 + +end Minimization diff --git a/tests/neg/i21303/JavaEnum.java b/tests/neg/i21303/JavaEnum.java new file mode 100644 index 000000000000..e626d5070626 --- /dev/null +++ b/tests/neg/i21303/JavaEnum.java @@ -0,0 +1 @@ +public enum JavaEnum { ABC, DEF, GHI } diff --git a/tests/neg/i21303/Test.scala b/tests/neg/i21303/Test.scala new file mode 100644 index 000000000000..fa8058140067 --- /dev/null +++ b/tests/neg/i21303/Test.scala @@ -0,0 +1,33 @@ +//> using options -source 3.6-migration +import scala.deriving.Mirror +import scala.compiletime.* +import scala.reflect.ClassTag +import scala.annotation.implicitNotFound + + +trait TSType[T] +object TSType extends DefaultTSTypes with TSTypeMacros + +trait TSNamedType[T] extends TSType[T] + +trait DefaultTSTypes extends JavaTSTypes +trait JavaTSTypes { + given javaEnumTSType[E <: java.lang.Enum[E]: ClassTag]: TSNamedType[E] = ??? 
+} +object DefaultTSTypes extends DefaultTSTypes +trait TSTypeMacros { + inline given [T: Mirror.Of]: TSType[T] = derived[T] + inline def derived[T](using m: Mirror.Of[T]): TSType[T] = { + val elemInstances = summonAll[m.MirroredElemTypes] + ??? + } + + private inline def summonAll[T <: Tuple]: List[TSType[_]] = { + inline erasedValue[T] match { + case _: EmptyTuple => Nil + case _: (t *: ts) => summonInline[TSType[t]] :: summonAll[ts] + } + } +} + +@main def Test = summon[TSType[JavaEnum]] // error \ No newline at end of file diff --git a/tests/neg/i2974.scala b/tests/neg/i2974.scala new file mode 100644 index 000000000000..0bff2da1f3ba --- /dev/null +++ b/tests/neg/i2974.scala @@ -0,0 +1,16 @@ + +trait Foo[-T] +trait Bar[-T] extends Foo[T] + +object Test { + + locally: + implicit val fa: Foo[Int] = ??? + implicit val ba: Bar[Int] = ??? + summon[Foo[Int]] // ok + + locally: + implicit val fa: Foo[Int] = ??? + implicit val ba: Bar[Any] = ??? + summon[Foo[Int]] // error: ambiguous +} diff --git a/tests/neg/scala-uri.check b/tests/neg/scala-uri.check new file mode 100644 index 000000000000..91bcd7ab6a6c --- /dev/null +++ b/tests/neg/scala-uri.check @@ -0,0 +1,14 @@ +-- [E172] Type Error: tests/neg/scala-uri.scala:30:59 ------------------------------------------------------------------ +30 |@main def Test = summon[QueryKeyValue[(String, None.type)]] // error + | ^ + |No best given instance of type QueryKeyValue[(String, None.type)] was found for parameter x of method summon in object Predef. + |I found: + | + | QueryKeyValue.tuple2QueryKeyValue[String, None.type](QueryKey.stringQueryKey, + | QueryValue.optionQueryValue[A]( + | /* ambiguous: both given instance stringQueryValue in trait QueryValueInstances1 and given instance noneQueryValue in trait QueryValueInstances1 match type QueryValue[A] */ + | summon[QueryValue[A]] + | ) + | ) + | + |But both given instance stringQueryValue in trait QueryValueInstances1 and given instance noneQueryValue in trait QueryValueInstances1 match type QueryValue[A]. diff --git a/tests/neg/scala-uri.scala b/tests/neg/scala-uri.scala new file mode 100644 index 000000000000..3820f8cf5613 --- /dev/null +++ b/tests/neg/scala-uri.scala @@ -0,0 +1,30 @@ +import scala.language.implicitConversions + +trait QueryKey[A] +object QueryKey extends QueryKeyInstances +sealed trait QueryKeyInstances: + given stringQueryKey: QueryKey[String] = ??? + +trait QueryValue[-A] +object QueryValue extends QueryValueInstances +sealed trait QueryValueInstances1: + given stringQueryValue: QueryValue[String] = ??? + given noneQueryValue: QueryValue[None.type] = ??? + // The noneQueryValue makes no sense at this priority. Since QueryValue + // is contravariant, QueryValue[None.type] is always better than QueryValue[Option[A]] + // no matter whether it's old or new resolution. So taking both owner and type + // score into account, it's always a draw. With the new disambiguation, we prefer + // the optionQueryValue[A], which gives an ambiguity down the road, because we don't + // know what the wrapped type A is. Previously, we preferred QueryValue[None.type] + // because it is unconditional. The solution is to put QueryValue[None.type] in the + // same trait as QueryValue[Option[A]], as is shown in pos/scala-uri.scala. + +sealed trait QueryValueInstances extends QueryValueInstances1: + given optionQueryValue[A: QueryValue]: QueryValue[Option[A]] = ??? + +trait QueryKeyValue[A] +object QueryKeyValue: + given tuple2QueryKeyValue[K: QueryKey, V: QueryValue]: QueryKeyValue[(K, V)] = ??? 
+ + +@main def Test = summon[QueryKeyValue[(String, None.type)]] // error diff --git a/tests/pos/given-priority.scala b/tests/pos/given-priority.scala new file mode 100644 index 000000000000..048e063eff35 --- /dev/null +++ b/tests/pos/given-priority.scala @@ -0,0 +1,24 @@ +/* These tests show various mechanisms available for implicit prioritization. + */ +import language.`3.6` + +class A // The type for which we infer terms below +class B extends A + +/* First, two schemes that require a pre-planned architecture for how and + * where given instances are defined. + * + * Traditional scheme: prioritize with location in class hierarchy + */ +class LowPriorityImplicits: + given g1: A() + +object NormalImplicits extends LowPriorityImplicits: + given g2: B() + +def test1 = + import NormalImplicits.given + val x = summon[A] + val _: B = x + val y = summon[B] + val _: B = y diff --git a/tests/pos/i21212.scala b/tests/pos/i21212.scala new file mode 100644 index 000000000000..1a1f2e35819a --- /dev/null +++ b/tests/pos/i21212.scala @@ -0,0 +1,22 @@ + +trait Functor[F[_]]: + def map[A, B](fa: F[A])(f: A => B): F[B] = ??? +trait Monad[F[_]] extends Functor[F] +trait MonadError[F[_], E] extends Monad[F]: + def raiseError[A](e: E): F[A] +trait Temporal[F[_]] extends MonadError[F, Throwable] + +trait FunctorOps[F[_], A]: + def map[B](f: A => B): F[B] = ??? +implicit def toFunctorOps[F[_], A](target: F[A])(implicit tc: Functor[F]): FunctorOps[F, A] = ??? + +class ContextBounds[F[_]: Temporal](using err: MonadError[F, Throwable]): + def useCase = err.raiseError(new RuntimeException()) + val bool: F[Boolean] = ??? + def fails = toFunctorOps(bool).map(_ => ()) // warns under -source:3.5, // error under -source:3.6 + +class UsingArguments[F[_]](using Temporal[F])(using err: MonadError[F, Throwable]): + def useCase = err.raiseError(new RuntimeException()) + val bool: F[Boolean] = ??? + def works = toFunctorOps(bool).map(_ => ()) // warns under -source:3.5 + diff --git a/tests/pos/i21303/JavaEnum.java b/tests/pos/i21303/JavaEnum.java new file mode 100644 index 000000000000..e626d5070626 --- /dev/null +++ b/tests/pos/i21303/JavaEnum.java @@ -0,0 +1 @@ +public enum JavaEnum { ABC, DEF, GHI } diff --git a/tests/pos/i21303/Test.scala b/tests/pos/i21303/Test.scala new file mode 100644 index 000000000000..fe3efa6e38f3 --- /dev/null +++ b/tests/pos/i21303/Test.scala @@ -0,0 +1,32 @@ +import scala.deriving.Mirror +import scala.compiletime.* +import scala.reflect.ClassTag +import scala.annotation.implicitNotFound + + +trait TSType[T] +object TSType extends DefaultTSTypes with TSTypeMacros + +trait TSNamedType[T] extends TSType[T] + +trait DefaultTSTypes extends JavaTSTypes +trait JavaTSTypes { + given javaEnumTSType[E <: java.lang.Enum[E]: ClassTag]: TSType[E] = ??? +} +object DefaultTSTypes extends DefaultTSTypes +trait TSTypeMacros { + inline given [T: Mirror.Of]: TSType[T] = derived[T] + inline def derived[T](using m: Mirror.Of[T]): TSType[T] = { + val elemInstances = summonAll[m.MirroredElemTypes] + ??? 
+ } + + private inline def summonAll[T <: Tuple]: List[TSType[_]] = { + inline erasedValue[T] match { + case _: EmptyTuple => Nil + case _: (t *: ts) => summonInline[TSType[t]] :: summonAll[ts] + } + } +} + +@main def Test = summon[TSType[JavaEnum]] \ No newline at end of file diff --git a/tests/pos/i21303a/JavaEnum.java b/tests/pos/i21303a/JavaEnum.java new file mode 100644 index 000000000000..e626d5070626 --- /dev/null +++ b/tests/pos/i21303a/JavaEnum.java @@ -0,0 +1 @@ +public enum JavaEnum { ABC, DEF, GHI } diff --git a/tests/pos/i21303a/Test.scala b/tests/pos/i21303a/Test.scala new file mode 100644 index 000000000000..83a598b5f17f --- /dev/null +++ b/tests/pos/i21303a/Test.scala @@ -0,0 +1,35 @@ +import scala.deriving.Mirror +import scala.compiletime.* +import scala.reflect.ClassTag +import scala.annotation.implicitNotFound + + +trait TSType[T] +object TSType extends DefaultTSTypes with TSTypeMacros + +trait TSNamedType[T] extends TSType[T] + +trait DefaultTSTypes extends JavaTSTypes +trait JavaTSTypes { + given javaEnumTSType[E <: java.lang.Enum[E]: ClassTag]: TSType[E] = ??? + given javaEnumTSNamedType[E <: java.lang.Enum[E]: ClassTag]: TSNamedType[E] = ??? +} +object DefaultTSTypes extends DefaultTSTypes +trait TSTypeMacros { + inline given [T: Mirror.Of]: TSType[T] = derived[T] + inline def derived[T](using m: Mirror.Of[T]): TSType[T] = { + val elemInstances = summonAll[m.MirroredElemTypes] + ??? + } + + private inline def summonAll[T <: Tuple]: List[TSType[_]] = { + inline erasedValue[T] match { + case _: EmptyTuple => Nil + case _: (t *: ts) => summonInline[TSType[t]] :: summonAll[ts] + } + } +} + +@main def Test = + summon[TSType[JavaEnum]] + summon[TSNamedType[JavaEnum]] diff --git a/tests/pos/i21320.scala b/tests/pos/i21320.scala new file mode 100644 index 000000000000..0a7e0d1941d1 --- /dev/null +++ b/tests/pos/i21320.scala @@ -0,0 +1,73 @@ +import scala.deriving.* +import scala.compiletime.* + +trait ConfigMonoid[T]: + def zero: T + def orElse(main: T, defaults: T): T + +object ConfigMonoid: + given option[T]: ConfigMonoid[Option[T]] = ??? 
+ + inline def zeroTuple[C <: Tuple]: Tuple = + inline erasedValue[C] match + case _: EmptyTuple => EmptyTuple + case _: (t *: ts) => + summonInline[ConfigMonoid[t]].zero *: zeroTuple[ts] + + inline def valueTuple[C <: Tuple, T](index: Int, main: T, defaults: T): Tuple = + inline erasedValue[C] match + case _: EmptyTuple => EmptyTuple + case _: (t *: ts) => + def get(v: T) = v.asInstanceOf[Product].productElement(index).asInstanceOf[t] + summonInline[ConfigMonoid[t]].orElse(get(main), get(defaults)) *: valueTuple[ts, T]( + index + 1, + main, + defaults + ) + + inline given derive[T](using m: Mirror.ProductOf[T]): ConfigMonoid[T] = + new ConfigMonoid[T]: + def zero: T = m.fromProduct(zeroTuple[m.MirroredElemTypes]) + def orElse(main: T, defaults: T): T = m.fromProduct(valueTuple[m.MirroredElemTypes, T](0, main, defaults)) + + + +final case class PublishOptions( + v1: Option[String] = None, + v2: Option[String] = None, + v3: Option[String] = None, + v4: Option[String] = None, + v5: Option[String] = None, + v6: Option[String] = None, + v7: Option[String] = None, + v8: Option[String] = None, + v9: Option[String] = None, + ci: PublishContextualOptions = PublishContextualOptions(), +) +object PublishOptions: + implicit val monoid: ConfigMonoid[PublishOptions] = ConfigMonoid.derive + +final case class PublishContextualOptions( + v1: Option[String] = None, + v2: Option[String] = None, + v3: Option[String] = None, + v4: Option[String] = None, + v5: Option[String] = None, + v6: Option[String] = None, + v7: Option[String] = None, + v8: Option[String] = None, + v9: Option[String] = None, + v10: Option[String] = None, + v11: Option[String] = None, + v12: Option[String] = None, + v13: Option[String] = None, + v14: Option[String] = None, + v15: Option[String] = None, + v16: Option[String] = None, + v17: Option[String] = None, + v18: Option[String] = None, + v19: Option[String] = None, + v20: Option[String] = None +) +object PublishContextualOptions: + given monoid: ConfigMonoid[PublishContextualOptions] = ConfigMonoid.derive \ No newline at end of file diff --git a/tests/pos/i2974.scala b/tests/pos/i2974.scala index 75c6a24a41bb..8f1c2e2d6d2f 100644 --- a/tests/pos/i2974.scala +++ b/tests/pos/i2974.scala @@ -7,6 +7,7 @@ object Test { implicit val ba: Bar[Int] = ??? def test: Unit = { - implicitly[Foo[Int]] + val x = summon[Foo[Int]] + val _: Bar[Int] = x } } diff --git a/tests/pos/scala-uri.scala b/tests/pos/scala-uri.scala new file mode 100644 index 000000000000..75ea2fc70d8a --- /dev/null +++ b/tests/pos/scala-uri.scala @@ -0,0 +1,22 @@ +// This works for implicit/implicit pairs but not for givens, see neg version. +import scala.language.implicitConversions + +trait QueryKey[A] +object QueryKey extends QueryKeyInstances +sealed trait QueryKeyInstances: + implicit val stringQueryKey: QueryKey[String] = ??? + +trait QueryValue[-A] +object QueryValue extends QueryValueInstances +sealed trait QueryValueInstances1: + implicit final val stringQueryValue: QueryValue[String] = ??? + implicit final val noneQueryValue: QueryValue[None.type] = ??? + +sealed trait QueryValueInstances extends QueryValueInstances1: + implicit final def optionQueryValue[A: QueryValue]: QueryValue[Option[A]] = ??? + +trait QueryKeyValue[A] +object QueryKeyValue: + implicit def tuple2QueryKeyValue[K: QueryKey, V: QueryValue]: QueryKeyValue[(K, V)] = ??? 
+ +@main def Test = summon[QueryKeyValue[(String, None.type)]] diff --git a/tests/pos/slick-migration-api-example.scala b/tests/pos/slick-migration-api-example.scala new file mode 100644 index 000000000000..3b6f1b4a82f4 --- /dev/null +++ b/tests/pos/slick-migration-api-example.scala @@ -0,0 +1,23 @@ +trait Migration +object Migration: + implicit class MigrationConcat[M <: Migration](m: M): + def &[N <: Migration, O](n: N)(implicit ccm: CanConcatMigrations[M, N, O]): O = ??? + +trait ReversibleMigration extends Migration +trait MigrationSeq extends Migration +trait ReversibleMigrationSeq extends MigrationSeq with ReversibleMigration + +trait ToReversible[-A <: Migration] +object ToReversible: + implicit val reversible: ToReversible[ReversibleMigration] = ??? +class CanConcatMigrations[-A, -B, +C] +trait CanConcatMigrationsLow: + implicit def default[A <: Migration, B <: Migration]: CanConcatMigrations[A, B, MigrationSeq] = ??? +object CanConcatMigrations extends CanConcatMigrationsLow: + implicit def reversible[A <: Migration, B <: Migration](implicit reverseA: ToReversible[A], + reverseB: ToReversible[B]): CanConcatMigrations[A, B, ReversibleMigrationSeq] = ??? + +@main def Test = + val rm: ReversibleMigration = ??? + val rms = rm & rm & rm + summon[rms.type <:< ReversibleMigrationSeq] // error Cannot prove that (rms : slick.migration.api.MigrationSeq) <:< slick.migration.api.ReversibleMigrationSeq. \ No newline at end of file diff --git a/tests/warn/i21036a.check b/tests/warn/i21036a.check index 673c01374ef3..876a81ad8a83 100644 --- a/tests/warn/i21036a.check +++ b/tests/warn/i21036a.check @@ -1,6 +1,10 @@ -- Warning: tests/warn/i21036a.scala:7:17 ------------------------------------------------------------------------------ 7 |val y = summon[A] // warn | ^ - | Given search preference for A between alternatives (b : B) and (a : A) will change + | Given search preference for A between alternatives + | (b : B) + | and + | (a : A) + | will change. | Current choice : the first alternative | New choice from Scala 3.6: the second alternative diff --git a/tests/warn/i21036b.check b/tests/warn/i21036b.check index ff7fdfd7a87c..11bb38727d77 100644 --- a/tests/warn/i21036b.check +++ b/tests/warn/i21036b.check @@ -1,6 +1,10 @@ -- Warning: tests/warn/i21036b.scala:7:17 ------------------------------------------------------------------------------ 7 |val y = summon[A] // warn | ^ - | Change in given search preference for A between alternatives (b : B) and (a : A) + | Given search preference for A between alternatives + | (b : B) + | and + | (a : A) + | has changed. | Previous choice : the first alternative | New choice from Scala 3.6: the second alternative From 8a41389dd4ea6c15f2089519ac5883c4c72a0c56 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 5 Aug 2024 13:48:25 +0200 Subject: [PATCH 321/371] Compensate loss of transitivity We only have transitivity between givens or between implicits. To cope with that - We first rank all implicits, giving a best implicit search result. - Then we rank all givens, starting with the implicit result. If there is a given that is better than the best implicit, the best given will be chosen. Otherwise we will stick with the best implicit.
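In outline, the strategy amounts to partitioning the sorted candidates and running the ranking twice. The stand-alone sketch below uses simplified stand-ins (`Candidate`, `rank`, `search`) rather than the compiler's actual candidate and search-result types; the real change is in the `Implicits.scala` hunk that follows, which partitions the sorted eligible candidates on the `Implicit` flag and chains the two `rank` calls in the same way.

    // Simplified sketch of the two-pass ranking described above.
    case class Candidate(name: String, isImplicit: Boolean)

    // Stub: picks the best alternative from `cands`, starting from `best`.
    def rank(cands: List[Candidate], best: Option[Candidate]): Option[Candidate] = ???

    def search(sorted: List[Candidate]): Option[Candidate] =
      val (implicits, givens) = sorted.partition(_.isImplicit)
      if implicits.nonEmpty && givens.nonEmpty then
        // 1. Rank all old-style implicits to get the best implicit result.
        val bestImplicit = rank(implicits, None)
        // 2. Rank the givens starting from that result: a given replaces it
        //    only if it is strictly better, otherwise the implicit is kept.
        rank(givens, bestImplicit)
      else
        rank(sorted, None)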
--- .../src/dotty/tools/dotc/typer/Implicits.scala | 18 +++++++++++++++--- tests/pos/given-owner-disambiguate.scala | 13 +++++++++++++ 2 files changed, 28 insertions(+), 3 deletions(-) create mode 100644 tests/pos/given-owner-disambiguate.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index d98fc87655bf..8a4ec986e23a 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1381,8 +1381,6 @@ trait Implicits: def disambiguate(alt1: SearchResult, alt2: SearchSuccess) = alt1 match case alt1: SearchSuccess => var diff = compareAlternatives(alt1, alt2, disambiguate = true) - assert(diff <= 0 || isWarnPriorityChangeVersion(Feature.sourceVersion)) - // diff > 0 candidates should already have been eliminated in `rank` if diff == 0 && alt1.ref =:= alt2.ref then diff = 1 // See i12951 for a test where this happens else if diff == 0 && alt2.isExtension then @@ -1622,7 +1620,21 @@ trait Implicits: validateOrdering(ord) throw ex - val result = rank(sort(eligible), NoMatchingImplicitsFailure, Nil) + val sorted = sort(eligible) + val result = sorted match + case first :: rest => + val firstIsImplicit = first.ref.symbol.is(Implicit) + if rest.exists(_.ref.symbol.is(Implicit) != firstIsImplicit) then + // Mixture of implicits and givens + // Rank implicits first, then, if there is a given that it better than the best implicit(s) + // switch over to givens. + val (sortedImplicits, sortedGivens) = sorted.partition(_.ref.symbol.is(Implicit)) + val implicitResult = rank(sortedImplicits, NoMatchingImplicitsFailure, Nil) + rank(sortedGivens, implicitResult, Nil) + else + rank(sorted, NoMatchingImplicitsFailure, Nil) + case _ => + NoMatchingImplicitsFailure // Issue all priority change warnings that can affect the result val shownWarnings = priorityChangeWarnings.toList.collect: diff --git a/tests/pos/given-owner-disambiguate.scala b/tests/pos/given-owner-disambiguate.scala new file mode 100644 index 000000000000..f0a44ecc441a --- /dev/null +++ b/tests/pos/given-owner-disambiguate.scala @@ -0,0 +1,13 @@ +class General +class Specific extends General + +class LowPriority: + given a:General() + +object NormalPriority extends LowPriority: + given b:Specific() + +def run = + import NormalPriority.given + val x = summon[General] + val _: Specific = x // <- b was picked \ No newline at end of file From 0f0c20d759d008769e5210876c113aeb4569a2c9 Mon Sep 17 00:00:00 2001 From: odersky Date: Mon, 5 Aug 2024 18:14:37 +0200 Subject: [PATCH 322/371] Delay priority change until 3.7 Warnings from 3.6, change in 3.7. This is one version later than originally planned. 
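For illustration, the user-facing effect of this gating can be seen in a snippet shaped like `tests/warn/i21036a.scala`, whose updated `.check` output appears earlier in this series. Here the source version is selected with a language import instead of the test's `-source` option, and the snippet is only meant to be compiled, not run:

    import language.`3.6`    // last version keeping the old preference; emits the migration warning
    // import language.`3.7` // would apply the new preference silently

    trait A
    trait B extends A
    given b: B = ???
    given a: A = ???

    val y = summon[A] // 3.6: still resolves to `b` but warns that the choice
                      // will change; from 3.7 the search prefers `a` instead

The same pair of versions drives the test updates in the diff below, where for example `tests/warn/i21036a.scala` moves from `-source 3.5` to `-source 3.6` and its expected output now refers to Scala 3.7.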
--- .../tools/dotc/config/SourceVersion.scala | 1 + .../dotty/tools/dotc/typer/Applications.scala | 14 +++++++------- .../src/dotty/tools/dotc/typer/Implicits.scala | 18 +++++++++--------- .../scala/runtime/stdLibPatches/language.scala | 15 +++++++++++++++ tests/neg/given-triangle.check | 2 +- tests/neg/given-triangle.scala | 2 +- tests/neg/i15264.scala | 2 +- tests/neg/i21212.check | 4 ++-- tests/neg/i21212.scala | 3 ++- tests/neg/i21303/Test.scala | 2 +- tests/pos/source-import-3-7-migration.scala | 1 + tests/pos/source-import-3-7.scala | 1 + tests/run/given-triangle.scala | 2 +- tests/run/implicit-specifity.scala | 2 +- tests/run/implied-priority.scala | 2 +- tests/warn/i20420.scala | 2 +- tests/warn/i21036a.check | 2 +- tests/warn/i21036a.scala | 2 +- tests/warn/i21036b.check | 2 +- tests/warn/i21036b.scala | 2 +- 20 files changed, 50 insertions(+), 31 deletions(-) create mode 100644 tests/pos/source-import-3-7-migration.scala create mode 100644 tests/pos/source-import-3-7.scala diff --git a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala index 935b95003729..02140c3f4e3b 100644 --- a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala @@ -12,6 +12,7 @@ enum SourceVersion: case `3.4-migration`, `3.4` case `3.5-migration`, `3.5` case `3.6-migration`, `3.6` + case `3.7-migration`, `3.7` // !!! Keep in sync with scala.runtime.stdlibPatches.language !!! case `future-migration`, `future` diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index 2efe5282f025..9a5db44b15ca 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -1753,7 +1753,7 @@ trait Applications extends Compatibility { // and in mode Scala3-migration when we compare with the old Scala 2 rules. case Intermediate // Intermediate rules: better means specialize, but map all type arguments downwards - // These are enabled for 3.0-3.4, or if OldImplicitResolution + // These are enabled for 3.0-3.5, or if OldImplicitResolution // is specified, and also for all comparisons between old-style implicits, case New // New rules: better means generalize, givens (and extensions) always beat implicits @@ -1789,7 +1789,7 @@ trait Applications extends Compatibility { val oldResolution = ctx.mode.is(Mode.OldImplicitResolution) if !preferGeneral || Feature.migrateTo3 && oldResolution then CompareScheme.Old - else if Feature.sourceVersion.isAtMost(SourceVersion.`3.4`) + else if Feature.sourceVersion.isAtMost(SourceVersion.`3.5`) || oldResolution || alt1.symbol.is(Implicit) && alt2.symbol.is(Implicit) then CompareScheme.Intermediate @@ -1855,7 +1855,7 @@ trait Applications extends Compatibility { * available in 3.0-migration if mode `Mode.OldImplicitResolution` is turned on as well. * It is used to highlight differences between Scala 2 and 3 behavior. * - * - In Scala 3.0-3.5, the behavior is as follows: `T <:p U` iff there is an impliit conversion + * - In Scala 3.0-3.6, the behavior is as follows: `T <:p U` iff there is an implicit conversion * from `T` to `U`, or * * flip(T) <: flip(U) @@ -1870,14 +1870,14 @@ trait Applications extends Compatibility { * of parameters are not affected. So `T <: U` would imply `Set[Cmp[U]] <:p Set[Cmp[T]]`, * as usual, because `Set` is non-variant. 
* - * - From Scala 3.6, `T <:p U` means `T <: U` or `T` convertible to `U` + * - From Scala 3.7, `T <:p U` means `T <: U` or `T` convertible to `U` * for overloading resolution (when `preferGeneral is false), and the opposite relation * `U <: T` or `U convertible to `T` for implicit disambiguation between givens - * (when `preferGeneral` is true). For old-style implicit values, the 3.4 behavior is kept. + * (when `preferGeneral` is true). For old-style implicit values, the 3.5 behavior is kept. * If one of the alternatives is an implicit and the other is a given (or an extension), the implicit loses. * - * - In Scala 3.5 and Scala 3.6-migration, we issue a warning if the result under - * Scala 3.6 differ wrt to the old behavior up to 3.5. + * - In Scala 3.6 and Scala 3.7-migration, we issue a warning if the result under + * Scala 3.7 differs wrt to the old behavior up to 3.6. * * Also and only for given resolution: If a compared type refers to a given or its module class, use * the intersection of its parent classes instead. diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 8a4ec986e23a..056356db6947 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1305,13 +1305,13 @@ trait Implicits: /** Search a list of eligible implicit references */ private def searchImplicit(eligible: List[Candidate], contextual: Boolean): SearchResult = - // A map that associates a priority change warning (between -source 3.4 and 3.6) + // A map that associates a priority change warning (between -source 3.6 and 3.7) // with the candidate refs mentioned in the warning. We report the associated // message if one of the critical candidates is part of the result of the implicit search. val priorityChangeWarnings = mutable.ListBuffer[(/*critical:*/ List[TermRef], Message)]() def isWarnPriorityChangeVersion(sv: SourceVersion): Boolean = - sv.stable == SourceVersion.`3.5` || sv == SourceVersion.`3.6-migration` + sv.stable == SourceVersion.`3.6` || sv == SourceVersion.`3.7-migration` /** Compare `alt1` with `alt2` to determine which one should be chosen. * @@ -1319,12 +1319,12 @@ trait Implicits: * a number < 0 if `alt2` is preferred over `alt1` * 0 if neither alternative is preferred over the other * The behavior depends on the source version - * before 3.5: compare with preferGeneral = false - * 3.5: compare twice with preferGeneral = false and true, warning if result is different, + * before 3.6: compare with preferGeneral = false + * 3.6: compare twice with preferGeneral = false and true, warning if result is different, * return old result with preferGeneral = false - * 3.6-migration: compare twice with preferGeneral = false and true, warning if result is different, + * 3.7-migration: compare twice with preferGeneral = false and true, warning if result is different, * return new result with preferGeneral = true - * 3.6 and higher: compare with preferGeneral = true + * 3.7 and higher: compare with preferGeneral = true * * @param disambiguate The call is used to disambiguate two successes, not for ranking. * When ranking, we are always filtering out either > 0 or <= 0 results. 
@@ -1348,7 +1348,7 @@ trait Implicits: case -1 => "the second alternative" case 1 => "the first alternative" case _ => "none - it's ambiguous" - if sv.stable == SourceVersion.`3.5` then + if sv.stable == SourceVersion.`3.6` then warn( em"""Given search preference for $pt between alternatives | ${alt1.ref} @@ -1356,7 +1356,7 @@ trait Implicits: | ${alt2.ref} |will change. |Current choice : ${choice(prev)} - |New choice from Scala 3.6: ${choice(cmp)}""") + |New choice from Scala 3.7: ${choice(cmp)}""") prev else warn( @@ -1366,7 +1366,7 @@ trait Implicits: | ${alt2.ref} |has changed. |Previous choice : ${choice(prev)} - |New choice from Scala 3.6: ${choice(cmp)}""") + |New choice from Scala 3.7: ${choice(cmp)}""") cmp else cmp max prev // When ranking, we keep the better of cmp and prev, which ends up retaining a candidate diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index 1171c62602fb..b9f9d47bb0b1 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -300,6 +300,21 @@ object language: @compileTimeOnly("`3.6` can only be used at compile time in import statements") object `3.6` + /** Set source version to 3.7-migration. + * + * @see [[https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html]] + */ + @compileTimeOnly("`3.7-migration` can only be used at compile time in import statements") + object `3.7-migration` + + /** Set source version to 3.7 + * + * @see [[https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html]] + */ + @compileTimeOnly("`3.7` can only be used at compile time in import statements") + object `3.7` + + // !!! Keep in sync with dotty.tools.dotc.config.SourceVersion !!! // Also add tests in `tests/pos/source-import-3-x.scala` and `tests/pos/source-import-3-x-migration.scala` diff --git a/tests/neg/given-triangle.check b/tests/neg/given-triangle.check index 73d5aea12dc4..147c54270afb 100644 --- a/tests/neg/given-triangle.check +++ b/tests/neg/given-triangle.check @@ -9,4 +9,4 @@ | (given_B : B) |will change. 
|Current choice : the second alternative - |New choice from Scala 3.6: the first alternative + |New choice from Scala 3.7: the first alternative diff --git a/tests/neg/given-triangle.scala b/tests/neg/given-triangle.scala index 16aca7c44dee..4842c5314f51 100644 --- a/tests/neg/given-triangle.scala +++ b/tests/neg/given-triangle.scala @@ -1,4 +1,4 @@ -//> using options -source 3.5 +//> using options -source 3.6 class A class B extends A class C extends A diff --git a/tests/neg/i15264.scala b/tests/neg/i15264.scala index 825e74701f73..d690eccf23f3 100644 --- a/tests/neg/i15264.scala +++ b/tests/neg/i15264.scala @@ -1,4 +1,4 @@ -import language.`3.6` +import language.`3.7` object priority: // lower number = higher priority class Prio0 extends Prio1 diff --git a/tests/neg/i21212.check b/tests/neg/i21212.check index 5d9fe7728cbc..06740af36d77 100644 --- a/tests/neg/i21212.check +++ b/tests/neg/i21212.check @@ -1,4 +1,4 @@ --- [E172] Type Error: tests/neg/i21212.scala:8:52 ---------------------------------------------------------------------- -8 | def test2(using a2: A)(implicit b2: B) = summon[A] // error: ambiguous +-- [E172] Type Error: tests/neg/i21212.scala:9:52 ---------------------------------------------------------------------- +9 | def test2(using a2: A)(implicit b2: B) = summon[A] // error: ambiguous | ^ |Ambiguous given instances: both parameter b2 and parameter a2 match type Minimization.A of parameter x of method summon in object Predef diff --git a/tests/neg/i21212.scala b/tests/neg/i21212.scala index 389a82b19f1f..3b030cefcdc7 100644 --- a/tests/neg/i21212.scala +++ b/tests/neg/i21212.scala @@ -1,4 +1,5 @@ -//> using options -source:3.6 +//> using options -source 3.7 + object Minimization: trait A diff --git a/tests/neg/i21303/Test.scala b/tests/neg/i21303/Test.scala index fa8058140067..25d43dac344e 100644 --- a/tests/neg/i21303/Test.scala +++ b/tests/neg/i21303/Test.scala @@ -1,4 +1,4 @@ -//> using options -source 3.6-migration +//> using options -source 3.7-migration import scala.deriving.Mirror import scala.compiletime.* import scala.reflect.ClassTag diff --git a/tests/pos/source-import-3-7-migration.scala b/tests/pos/source-import-3-7-migration.scala new file mode 100644 index 000000000000..2e80fcb0bab2 --- /dev/null +++ b/tests/pos/source-import-3-7-migration.scala @@ -0,0 +1 @@ +import language.`3.7-migration` \ No newline at end of file diff --git a/tests/pos/source-import-3-7.scala b/tests/pos/source-import-3-7.scala new file mode 100644 index 000000000000..7fa68fd496f6 --- /dev/null +++ b/tests/pos/source-import-3-7.scala @@ -0,0 +1 @@ +import language.`3.7` \ No newline at end of file diff --git a/tests/run/given-triangle.scala b/tests/run/given-triangle.scala index 0b483e87f28c..66339f44e43c 100644 --- a/tests/run/given-triangle.scala +++ b/tests/run/given-triangle.scala @@ -1,4 +1,4 @@ -import language.`3.6` +import language.`3.7` class A class B extends A diff --git a/tests/run/implicit-specifity.scala b/tests/run/implicit-specifity.scala index da90110c9866..9e59cf5f1869 100644 --- a/tests/run/implicit-specifity.scala +++ b/tests/run/implicit-specifity.scala @@ -1,4 +1,4 @@ -import language.`3.6` +import language.`3.7` case class Show[T](val i: Int) object Show { diff --git a/tests/run/implied-priority.scala b/tests/run/implied-priority.scala index 15f6a40a27ef..a9380e117875 100644 --- a/tests/run/implied-priority.scala +++ b/tests/run/implied-priority.scala @@ -1,6 +1,6 @@ /* These tests show various mechanisms available for implicit prioritization. 
*/ -import language.`3.6` +import language.`3.7` class E[T](val str: String) // The type for which we infer terms below diff --git a/tests/warn/i20420.scala b/tests/warn/i20420.scala index d28270509f91..4c7585e32f48 100644 --- a/tests/warn/i20420.scala +++ b/tests/warn/i20420.scala @@ -1,4 +1,4 @@ -//> using options -source 3.5-migration +//> using options -source 3.6-migration final class StrictEqual[V] final class Less[V] diff --git a/tests/warn/i21036a.check b/tests/warn/i21036a.check index 876a81ad8a83..63d611a6e246 100644 --- a/tests/warn/i21036a.check +++ b/tests/warn/i21036a.check @@ -7,4 +7,4 @@ | (a : A) | will change. | Current choice : the first alternative - | New choice from Scala 3.6: the second alternative + | New choice from Scala 3.7: the second alternative diff --git a/tests/warn/i21036a.scala b/tests/warn/i21036a.scala index ab97429852d6..b7aba27ca95e 100644 --- a/tests/warn/i21036a.scala +++ b/tests/warn/i21036a.scala @@ -1,4 +1,4 @@ -//> using options -source 3.5 +//> using options -source 3.6 trait A trait B extends A given b: B = ??? diff --git a/tests/warn/i21036b.check b/tests/warn/i21036b.check index 11bb38727d77..dfa19a0e9bb1 100644 --- a/tests/warn/i21036b.check +++ b/tests/warn/i21036b.check @@ -7,4 +7,4 @@ | (a : A) | has changed. | Previous choice : the first alternative - | New choice from Scala 3.6: the second alternative + | New choice from Scala 3.7: the second alternative diff --git a/tests/warn/i21036b.scala b/tests/warn/i21036b.scala index 16dd72266613..c440f5d3c06d 100644 --- a/tests/warn/i21036b.scala +++ b/tests/warn/i21036b.scala @@ -1,4 +1,4 @@ -//> using options -source 3.6-migration +//> using options -source 3.7-migration trait A trait B extends A given b: B = ??? From f68345811a3353c96131f3f73c9f73a01abd7254 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 6 Aug 2024 19:57:01 +0200 Subject: [PATCH 323/371] Fix ranking logic --- .../dotty/tools/dotc/typer/Implicits.scala | 31 +++++++--- tests/pos/i15264.scala | 1 + tests/warn/i15264.scala | 56 +++++++++++++++++++ 3 files changed, 79 insertions(+), 9 deletions(-) create mode 100644 tests/warn/i15264.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 056356db6947..14491184b7a2 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1369,8 +1369,13 @@ trait Implicits: |New choice from Scala 3.7: ${choice(cmp)}""") cmp else cmp max prev - // When ranking, we keep the better of cmp and prev, which ends up retaining a candidate - // if it is retained in either version. + // When ranking, alt1 is always the new candidate and alt2 is the + // solution found previously. We keep the candidate if the outcome is 0 + // (ambiguous) or 1 (first wins). Or, when ranking in healImplicit we keep the + // candidate only if the outcome is 1. In both cases, keeping the better + // of `cmp` and `prev` means we keep candidates that could match + // in either scheme. This means that subsequent disambiguation + // comparisons will record a warning if cmp != prev. 
else cmp end compareAlternatives @@ -1416,7 +1421,15 @@ trait Implicits: if diff < 0 then alt2 else if diff > 0 then alt1 else SearchFailure(new AmbiguousImplicits(alt1, alt2, pt, argument), span) - case _: SearchFailure => alt2 + case fail: SearchFailure => + fail.reason match + case ambi: AmbiguousImplicits => + if compareAlternatives(ambi.alt1, alt2) < 0 && + compareAlternatives(ambi.alt2, alt2) < 0 + then alt2 + else alt1 + case _ => + alt2 /** Try to find a best matching implicit term among all the candidates in `pending`. * @param pending The list of candidates that remain to be tested @@ -1621,7 +1634,7 @@ trait Implicits: throw ex val sorted = sort(eligible) - val result = sorted match + val res = sorted match case first :: rest => val firstIsImplicit = first.ref.symbol.is(Implicit) if rest.exists(_.ref.symbol.is(Implicit) != firstIsImplicit) then @@ -1638,11 +1651,11 @@ trait Implicits: // Issue all priority change warnings that can affect the result val shownWarnings = priorityChangeWarnings.toList.collect: - case (critical, msg) if result.found.exists(critical.contains(_)) => + case (critical, msg) if res.found.exists(critical.contains(_)) => msg - result match - case result: SearchFailure => - result.reason match + res match + case res: SearchFailure => + res.reason match case ambi: AmbiguousImplicits => // Make warnings part of error message because otherwise they are suppressed when // the error is emitted. @@ -1652,7 +1665,7 @@ trait Implicits: for msg <- shownWarnings do report.warning(msg, srcPos) - result + res end searchImplicit def isUnderSpecifiedArgument(tp: Type): Boolean = diff --git a/tests/pos/i15264.scala b/tests/pos/i15264.scala index 5be8436c12ba..18ca92df6cb1 100644 --- a/tests/pos/i15264.scala +++ b/tests/pos/i15264.scala @@ -1,3 +1,4 @@ +import language.`3.7` object priority: // lower number = higher priority class Prio0 extends Prio1 diff --git a/tests/warn/i15264.scala b/tests/warn/i15264.scala new file mode 100644 index 000000000000..9435c6364c08 --- /dev/null +++ b/tests/warn/i15264.scala @@ -0,0 +1,56 @@ +// Note: No check file for this test since the precise warning messages are non-deterministic +import language.`3.7-migration` +object priority: + // lower number = higher priority + class Prio0 extends Prio1 + object Prio0 { given Prio0() } + + class Prio1 extends Prio2 + object Prio1 { given Prio1() } + + class Prio2 + object Prio2 { given Prio2() } + +object repro: + // analogous to cats Eq, Hash, Order: + class A[V] + class B[V] extends A[V] + class C[V] extends A[V] + + class Q[V] + + object context: + // prios work here, which is cool + given[V](using priority.Prio0): C[V] = new C[V] + given[V](using priority.Prio1): B[V] = new B[V] + given[V](using priority.Prio2): A[V] = new A[V] + + object exports: + // so will these exports + export context.given + + // if you import these don't import from 'context' above + object qcontext: + // base defs, like what you would get from cats + given ga: A[Int] = new B[Int] // added so that we don't get an ambiguity in test2 + given gb: B[Int] = new B[Int] + given gc: C[Int] = new C[Int] + + // these seem like they should work but don't + given gcq[V](using p0: priority.Prio0)(using c: C[V]): C[Q[V]] = new C[Q[V]] + given gbq[V](using p1: priority.Prio1)(using b: B[V]): B[Q[V]] = new B[Q[V]] + given gaq[V](using p2: priority.Prio2)(using a: A[V]): A[Q[V]] = new A[Q[V]] + +object test1: + import repro.* + import repro.exports.given + + // these will work + val a = summon[A[Int]] // warn + + +object test2: + 
import repro.* + import repro.qcontext.given + + val a = summon[A[Q[Int]]] // warn From 33d7da88bc63f6f163adf4ef919fb0374ae9cf76 Mon Sep 17 00:00:00 2001 From: odersky Date: Tue, 6 Aug 2024 20:05:59 +0200 Subject: [PATCH 324/371] Make priority change warning messages stable Make the wording of a priority change warning message stable under different orders of eligibles. We now always report the previously chosen alternative first and the new one second. Note: We can still get ambiguities by flagging different pairs of alternatives depending on initial order. --- .../dotty/tools/dotc/typer/Implicits.scala | 66 +++++++++---------- tests/neg/given-triangle.check | 8 +-- 2 files changed, 36 insertions(+), 38 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 14491184b7a2..e6b2d16eace2 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1310,9 +1310,6 @@ trait Implicits: // message if one of the critical candidates is part of the result of the implicit search. val priorityChangeWarnings = mutable.ListBuffer[(/*critical:*/ List[TermRef], Message)]() - def isWarnPriorityChangeVersion(sv: SourceVersion): Boolean = - sv.stable == SourceVersion.`3.6` || sv == SourceVersion.`3.7-migration` - /** Compare `alt1` with `alt2` to determine which one should be chosen. * * @return a number > 0 if `alt1` is preferred over `alt2` @@ -1337,37 +1334,38 @@ trait Implicits: else val cmp = comp(using searchContext()) val sv = Feature.sourceVersion - if isWarnPriorityChangeVersion(sv) then + val isLastOldVersion = sv.stable == SourceVersion.`3.6` + val isMigratingVersion = sv == SourceVersion.`3.7-migration` + if isLastOldVersion || isMigratingVersion then val prev = comp(using searchContext().addMode(Mode.OldImplicitResolution)) if disambiguate && cmp != prev then - def warn(msg: Message) = - val critical = alt1.ref :: alt2.ref :: Nil - priorityChangeWarnings += ((critical, msg)) - implicits.println(i"PRIORITY CHANGE ${alt1.ref}, ${alt2.ref}, $disambiguate") - def choice(c: Int) = c match - case -1 => "the second alternative" - case 1 => "the first alternative" - case _ => "none - it's ambiguous" - if sv.stable == SourceVersion.`3.6` then - warn( - em"""Given search preference for $pt between alternatives - | ${alt1.ref} - |and - | ${alt2.ref} - |will change. - |Current choice : ${choice(prev)} - |New choice from Scala 3.7: ${choice(cmp)}""") - prev - else - warn( - em"""Given search preference for $pt between alternatives - | ${alt1.ref} - |and - | ${alt2.ref} - |has changed. - |Previous choice : ${choice(prev)} - |New choice from Scala 3.7: ${choice(cmp)}""") - cmp + implicits.println(i"PRIORITY CHANGE ${alt1.ref}, ${alt2.ref}") + val (loser, winner) = + prev match + case 1 => (alt1, alt2) + case -1 => (alt2, alt1) + case 0 => + cmp match + case 1 => (alt2, alt1) + case -1 => (alt1, alt2) + def choice(nth: String, c: Int) = + if c == 0 then "none - it's ambiguous" + else s"the $nth alternative" + val (change, whichChoice) = + if isLastOldVersion + then ("will change", "Current choice ") + else ("has changed", "Previous choice") + val msg = + em"""Given search preference for $pt between alternatives + | ${loser.ref} + |and + | ${winner.ref} + |$change.
+ |$whichChoice : ${choice("first", prev)} + |New choice from Scala 3.7: ${choice("second", cmp)}""" + val critical = alt1.ref :: alt2.ref :: Nil + priorityChangeWarnings += ((critical, msg)) + if isLastOldVersion then prev else cmp else cmp max prev // When ranking, alt1 is always the new candidate and alt2 is the // solution found previously. We keep the candidate if the outcome is 0 @@ -1424,8 +1422,8 @@ trait Implicits: case fail: SearchFailure => fail.reason match case ambi: AmbiguousImplicits => - if compareAlternatives(ambi.alt1, alt2) < 0 && - compareAlternatives(ambi.alt2, alt2) < 0 + if compareAlternatives(ambi.alt1, alt2, disambiguate = true) < 0 + && compareAlternatives(ambi.alt2, alt2, disambiguate = true) < 0 then alt2 else alt1 case _ => diff --git a/tests/neg/given-triangle.check b/tests/neg/given-triangle.check index 147c54270afb..f366c18e78f0 100644 --- a/tests/neg/given-triangle.check +++ b/tests/neg/given-triangle.check @@ -4,9 +4,9 @@ |Ambiguous given instances: both given instance given_B and given instance given_C match type A of parameter a of method f | |Note: Given search preference for A between alternatives - | (given_A : A) - |and | (given_B : B) + |and + | (given_A : A) |will change. - |Current choice : the second alternative - |New choice from Scala 3.7: the first alternative + |Current choice : the first alternative + |New choice from Scala 3.7: the second alternative From d439b58bb09380453830db9c5ee11aa721a27ad5 Mon Sep 17 00:00:00 2001 From: Eugene Flesselle Date: Tue, 6 Aug 2024 23:42:07 +0200 Subject: [PATCH 325/371] Fix `healAmbiguous` to `compareAlternatives` with `disambiguate = true` On the final result, compared with all the ambiguous candidates we are trying to recover from. We should still use `disambiguate = false` when filtering the `pending` candidates for the purpose of warnings, as in the other cases. Before the changes, it was possible for an ambiguous SearchFailure to be healed by a candidate which was considered better (possibly only) under a prioritization scheme different from the current one. As an optimization, we can avoid redoing compareAlternatives in versions which could have only used the new prioritization scheme to begin with. Also restores behaviour avoiding false positive warnings. Specifically, in cases where we could report a change in prioritization, despite having not yet done `tryImplicit` on the alternative, i.e. it was only compared as part of an early filtering See #21045 for related changes --- .../dotty/tools/dotc/typer/Implicits.scala | 49 ++++++++++--------- 1 file changed, 27 insertions(+), 22 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index e6b2d16eace2..90e8c832dd87 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1310,6 +1310,10 @@ trait Implicits: // message if one of the critical candidates is part of the result of the implicit search. val priorityChangeWarnings = mutable.ListBuffer[(/*critical:*/ List[TermRef], Message)]() + val sv = Feature.sourceVersion + val isLastOldVersion = sv.stable == SourceVersion.`3.6` + val isWarnPriorityChangeVersion = isLastOldVersion || sv == SourceVersion.`3.7-migration` + /** Compare `alt1` with `alt2` to determine which one should be chosen. 
* * @return a number > 0 if `alt1` is preferred over `alt2` @@ -1333,10 +1337,7 @@ trait Implicits: else if alt1.level != alt2.level then alt1.level - alt2.level else val cmp = comp(using searchContext()) - val sv = Feature.sourceVersion - val isLastOldVersion = sv.stable == SourceVersion.`3.6` - val isMigratingVersion = sv == SourceVersion.`3.7-migration` - if isLastOldVersion || isMigratingVersion then + if isWarnPriorityChangeVersion then val prev = comp(using searchContext().addMode(Mode.OldImplicitResolution)) if disambiguate && cmp != prev then implicits.println(i"PRIORITY CHANGE ${alt1.ref}, ${alt2.ref}") @@ -1419,15 +1420,7 @@ trait Implicits: if diff < 0 then alt2 else if diff > 0 then alt1 else SearchFailure(new AmbiguousImplicits(alt1, alt2, pt, argument), span) - case fail: SearchFailure => - fail.reason match - case ambi: AmbiguousImplicits => - if compareAlternatives(ambi.alt1, alt2, disambiguate = true) < 0 - && compareAlternatives(ambi.alt2, alt2, disambiguate = true) < 0 - then alt2 - else alt1 - case _ => - alt2 + case _: SearchFailure => alt2 /** Try to find a best matching implicit term among all the candidates in `pending`. * @param pending The list of candidates that remain to be tested @@ -1451,12 +1444,27 @@ trait Implicits: pending match { case cand :: remaining => /** To recover from an ambiguous implicit failure, we need to find a pending - * candidate that is strictly better than the failed candidate(s). + * candidate that is strictly better than the failed `ambiguous` candidate(s). * If no such candidate is found, we propagate the ambiguity. */ - def healAmbiguous(fail: SearchFailure, betterThanFailed: Candidate => Boolean) = - val newPending = remaining.filter(betterThanFailed) - rank(newPending, fail, Nil).recoverWith(_ => fail) + def healAmbiguous(fail: SearchFailure, ambiguous: List[RefAndLevel]) = + def betterThanAmbiguous(newCand: RefAndLevel, disambiguate: Boolean): Boolean = + ambiguous.forall(compareAlternatives(newCand, _, disambiguate) > 0) + + inline def betterByCurrentScheme(newCand: RefAndLevel): Boolean = + if isWarnPriorityChangeVersion then + // newCand may have only been kept in pending because it was better in the other priotization scheme. + // If that candidate produces a SearchSuccess, disambiguate will return it as the found SearchResult. + // We must now recheck it was really better than the ambigous candidates we are recovering from, + // under the rules of the current scheme, which are applied when disambiguate = true. + betterThanAmbiguous(newCand, disambiguate = true) + else true + + val newPending = remaining.filter(betterThanAmbiguous(_, disambiguate = false)) + rank(newPending, fail, Nil) match + case found: SearchSuccess if betterByCurrentScheme(found) => found + case _ => fail + end healAmbiguous negateIfNot(tryImplicit(cand, contextual)) match { case fail: SearchFailure => @@ -1471,8 +1479,7 @@ trait Implicits: else // The ambiguity happened in a nested search: to recover we // need a candidate better than `cand` - healAmbiguous(fail, newCand => - compareAlternatives(newCand, cand) > 0) + healAmbiguous(fail, cand :: Nil) else // keep only warnings that don't involve the failed candidate reference priorityChangeWarnings.filterInPlace: (critical, _) => @@ -1491,9 +1498,7 @@ trait Implicits: // The ambiguity happened in the current search: to recover we // need a candidate better than the two ambiguous alternatives. 
val ambi = fail.reason.asInstanceOf[AmbiguousImplicits] - healAmbiguous(fail, newCand => - compareAlternatives(newCand, ambi.alt1) > 0 && - compareAlternatives(newCand, ambi.alt2) > 0) + healAmbiguous(fail, ambi.alt1 :: ambi.alt2 :: Nil) } } case nil => From 73c6e883318324d46b978154ff0213b8e6eed76d Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 7 Aug 2024 13:57:38 +0200 Subject: [PATCH 326/371] Adjust compilation tests to backported changes --- tests/neg/scala-uri.check | 4 ++-- tests/neg/scala-uri.scala | 1 + tests/pos/i13044.scala | 2 +- 3 files changed, 4 insertions(+), 3 deletions(-) diff --git a/tests/neg/scala-uri.check b/tests/neg/scala-uri.check index 91bcd7ab6a6c..b6d52d6fffd0 100644 --- a/tests/neg/scala-uri.check +++ b/tests/neg/scala-uri.check @@ -1,5 +1,5 @@ --- [E172] Type Error: tests/neg/scala-uri.scala:30:59 ------------------------------------------------------------------ -30 |@main def Test = summon[QueryKeyValue[(String, None.type)]] // error +-- [E172] Type Error: tests/neg/scala-uri.scala:31:59 ------------------------------------------------------------------ +31 |@main def Test = summon[QueryKeyValue[(String, None.type)]] // error | ^ |No best given instance of type QueryKeyValue[(String, None.type)] was found for parameter x of method summon in object Predef. |I found: diff --git a/tests/neg/scala-uri.scala b/tests/neg/scala-uri.scala index 3820f8cf5613..f3bff269234f 100644 --- a/tests/neg/scala-uri.scala +++ b/tests/neg/scala-uri.scala @@ -1,3 +1,4 @@ +//> using options -source:3.6 import scala.language.implicitConversions trait QueryKey[A] diff --git a/tests/pos/i13044.scala b/tests/pos/i13044.scala index 4c9b8b914062..36299d9e8366 100644 --- a/tests/pos/i13044.scala +++ b/tests/pos/i13044.scala @@ -1,4 +1,4 @@ -//> using options -Xmax-inlines:33 +//> using options -Xmax-inlines:35 import scala.deriving.Mirror import scala.compiletime._ From a1882e1edc5e04a7e16200354ff161a4533b1009 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 7 Aug 2024 21:38:02 +0200 Subject: [PATCH 327/371] Revert "Compensate loss of transitivity" This reverts commit 8a41389dd4ea6c15f2089519ac5883c4c72a0c56. --- .../src/dotty/tools/dotc/typer/Implicits.scala | 17 ++--------------- tests/pos/given-owner-disambiguate.scala | 13 ------------- 2 files changed, 2 insertions(+), 28 deletions(-) delete mode 100644 tests/pos/given-owner-disambiguate.scala diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 90e8c832dd87..5ca5ac5bb59d 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -1385,6 +1385,7 @@ trait Implicits: def disambiguate(alt1: SearchResult, alt2: SearchSuccess) = alt1 match case alt1: SearchSuccess => var diff = compareAlternatives(alt1, alt2, disambiguate = true) + // diff > 0 candidates should already have been eliminated in `rank` if diff == 0 && alt1.ref =:= alt2.ref then diff = 1 // See i12951 for a test where this happens else if diff == 0 && alt2.isExtension then @@ -1636,21 +1637,7 @@ trait Implicits: validateOrdering(ord) throw ex - val sorted = sort(eligible) - val res = sorted match - case first :: rest => - val firstIsImplicit = first.ref.symbol.is(Implicit) - if rest.exists(_.ref.symbol.is(Implicit) != firstIsImplicit) then - // Mixture of implicits and givens - // Rank implicits first, then, if there is a given that it better than the best implicit(s) - // switch over to givens. 
- val (sortedImplicits, sortedGivens) = sorted.partition(_.ref.symbol.is(Implicit)) - val implicitResult = rank(sortedImplicits, NoMatchingImplicitsFailure, Nil) - rank(sortedGivens, implicitResult, Nil) - else - rank(sorted, NoMatchingImplicitsFailure, Nil) - case _ => - NoMatchingImplicitsFailure + val res = rank(sort(eligible), NoMatchingImplicitsFailure, Nil) // Issue all priority change warnings that can affect the result val shownWarnings = priorityChangeWarnings.toList.collect: diff --git a/tests/pos/given-owner-disambiguate.scala b/tests/pos/given-owner-disambiguate.scala deleted file mode 100644 index f0a44ecc441a..000000000000 --- a/tests/pos/given-owner-disambiguate.scala +++ /dev/null @@ -1,13 +0,0 @@ -class General -class Specific extends General - -class LowPriority: - given a:General() - -object NormalPriority extends LowPriority: - given b:Specific() - -def run = - import NormalPriority.given - val x = summon[General] - val _: Specific = x // <- b was picked \ No newline at end of file From d72e8e0421524389b209fcd65eb656cf3fc0d385 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 8 Aug 2024 10:20:09 +0200 Subject: [PATCH 328/371] Add changelog for 3.5.0-RC7 --- changelogs/3.5.0-RC7.md | 15 +++++++++++++++ 1 file changed, 15 insertions(+) create mode 100644 changelogs/3.5.0-RC7.md diff --git a/changelogs/3.5.0-RC7.md b/changelogs/3.5.0-RC7.md new file mode 100644 index 000000000000..dab10f60b1ee --- /dev/null +++ b/changelogs/3.5.0-RC7.md @@ -0,0 +1,15 @@ +# Backported fixes + +- Backport "Fix healAmbiguous to compareAlternatives with disambiguate = true" to 3.5.0 [#21344](https://github.com/scala/scala3/pull/21344) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.5.0-RC6..3.5.0-RC7` these are: + +``` + 5 Martin Odersky + 4 Wojciech Mazur + 2 Eugene Flesselle +``` From 19534dbf7252d003ae5e9044f981773ab653a955 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 8 Aug 2024 10:20:36 +0200 Subject: [PATCH 329/371] Release 3.5.0-RC7 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index e1a61d82aca7..3ab5aa77bf15 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -88,7 +88,7 @@ object Build { val referenceVersion = "3.4.2" - val baseVersion = "3.5.0-RC6" + val baseVersion = "3.5.0-RC7" // LTS or Next val versionLine = "Next" From 180deab26a55b1e8936b2fc6fb0e48eb4af21bf4 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 12 Aug 2024 15:04:35 +0200 Subject: [PATCH 330/371] Add changelog for 3.5.0 --- changelogs/3.5.0.md | 278 ++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 278 insertions(+) create mode 100644 changelogs/3.5.0.md diff --git a/changelogs/3.5.0.md b/changelogs/3.5.0.md new file mode 100644 index 000000000000..654a19b169a8 --- /dev/null +++ b/changelogs/3.5.0.md @@ -0,0 +1,278 @@ +# Highlights of the release + +- Bundle scala-cli in scala command (For RC1 requires JVM 17, further RCs will use native launchers) +- Introduce Best Effort compilation options [#17582](https://github.com/lampepfl/dotty/pull/17582) +- Add support for Pipelined builds [#18880](https://github.com/lampepfl/dotty/pull/18880) +- Add support for `var` in refinements [#19982](https://github.com/lampepfl/dotty/pull/19982) +- Implement SIP-42 - Support for binary integer literals [#19405](https://github.com/lampepfl/dotty/pull/19405) + +# Other changes 
and fixes + +## Backend + +- Fix Closure span assignment in makeClosure [#15841](https://github.com/lampepfl/dotty/pull/15841) + +## Default parameters + +- Fix default args lookup for given classes [#20256](https://github.com/lampepfl/dotty/pull/20256) +- Fix implicit search failure reporting [#20261](https://github.com/lampepfl/dotty/pull/20261) + +## Derivation + +- Fix infinite loop in Mirror synthesis of unreducible match type [#20133](https://github.com/lampepfl/dotty/pull/20133) + +## Desugaring + +- Add explanation to checkCaseClassInheritanceInvariant error msg [#20141](https://github.com/lampepfl/dotty/pull/20141) + +## Exports + +- Add annotations in parameters for exports [#20140](https://github.com/lampepfl/dotty/pull/20140) +- Fix isAliasType [#20195](https://github.com/lampepfl/dotty/pull/20195) + +## Implicits + +- Fix implicitNotFound message for type aliases [#19343](https://github.com/lampepfl/dotty/pull/19343) +- Normalize types before collecting parts determining implicit scope [#20077](https://github.com/lampepfl/dotty/pull/20077) +- Better error diagnostics under -explain-cyclic [#20251](https://github.com/lampepfl/dotty/pull/20251) +- Update unreducible match types error reporting [#19954](https://github.com/lampepfl/dotty/pull/19954) +- Improve ConstraintHandling of SkolemTypes [#20175](https://github.com/lampepfl/dotty/pull/20175) + +## Incremental Compilation + +- Retain default parameters with `export` [#20167](https://github.com/lampepfl/dotty/pull/20167) + +## Inline + +- Fix by-name parameter in beta-reduction [#20096](https://github.com/lampepfl/dotty/pull/20096) +- Add warning for anonymous inline classes (#16723) [#20291](https://github.com/lampepfl/dotty/pull/20291) +- Avoid conversion of `Unit` type into `()` term [#20295](https://github.com/lampepfl/dotty/pull/20295) +- Type desugared `transparent inline def unapply` call in the correct mode [#20108](https://github.com/lampepfl/dotty/pull/20108) +- Regression: fix compilation performance on Windows [#20193](https://github.com/lampepfl/dotty/pull/20193) +- Fix inline match on blocks with multiple statements [#20125](https://github.com/lampepfl/dotty/pull/20125) +- Inline `unapply`s in the inlining phase [#19382](https://github.com/lampepfl/dotty/pull/19382) +- Fix outerSelect in Inliner [#20313](https://github.com/lampepfl/dotty/pull/20313) + +## Linting + +- Fix #20146: attach the original name if there is an import selection for an indent [#20163](https://github.com/lampepfl/dotty/pull/20163) +- Add regression test for issue 18632 [#20308](https://github.com/lampepfl/dotty/pull/20308) + +## Match Types + +- Make aliases of `MatchAlias`es normal `TypeAlias`es [#19871](https://github.com/lampepfl/dotty/pull/19871) +- Fix #19746: Do not follow param term refs in `isConcrete`. 
[#20015](https://github.com/lampepfl/dotty/pull/20015) +- Do match type reduction atPhaseNoLater than ElimOpaque [#20017](https://github.com/lampepfl/dotty/pull/20017) +- Do not flag match types as `Deferred` and amend #20077 [#20147](https://github.com/lampepfl/dotty/pull/20147) +- Always use baseType when constraining patternTp with scrutineeTp [#20032](https://github.com/lampepfl/dotty/pull/20032) +- Use `MirrorSource.reduce` result for `companionPath` [#20207](https://github.com/lampepfl/dotty/pull/20207) +- Regression: Fix match type extraction of a MatchAlias [#20111](https://github.com/lampepfl/dotty/pull/20111) +- Revert "Approximate MatchTypes with lub of case bodies, if non-recursive" in 3.5.0 [#21266](https://github.com/scala/scala3/pull/21266) + +## Polyfunctions + +- Discard poly-functions when trying to resolve overloading [#20181](https://github.com/lampepfl/dotty/pull/20181) + +## Presentation Compiler + +- Stabilise returned completions by improving deduplication + extra completions for constructors [#19976](https://github.com/lampepfl/dotty/pull/19976) +- Fix active param index for empty param lists [#20142](https://github.com/lampepfl/dotty/pull/20142) +- Delias type members in hover [#20173](https://github.com/lampepfl/dotty/pull/20173) +- Interactive: handle context bounds in extension construct workaround [#20201](https://github.com/lampepfl/dotty/pull/20201) +- Fix: prefer non-export definition locations [#20252](https://github.com/lampepfl/dotty/pull/20252) +- Don't show enum completions in new keyword context [#20304](https://github.com/lampepfl/dotty/pull/20304) +- Chore: Backport changes for presentation compiler [#20345](https://github.com/lampepfl/dotty/pull/20345) +- Add custom matchers for completions (fuzzy search for presentation compiler) [#19850](https://github.com/lampepfl/dotty/pull/19850) + +## Quotes + +- Fix TermRef prefixes not having their type healed [#20102](https://github.com/lampepfl/dotty/pull/20102) +- Improve reporting in staging about the possible use of an incorrect class loader [#20137](https://github.com/lampepfl/dotty/pull/20137) +- Introduce MethodTypeKind to quotes reflection API [#20249](https://github.com/lampepfl/dotty/pull/20249) +- Add quote ASTs to TASTy [#20165](https://github.com/lampepfl/dotty/pull/20165) + +## Reflection + +- Allow to beta reduce curried function applications in quotes reflect [#18121](https://github.com/lampepfl/dotty/pull/18121) +- Set the inlining phase in the Context used for checking macro trees [#20087](https://github.com/lampepfl/dotty/pull/20087) +- Add Symbol.isSuperAccessor to reflection API [#13388](https://github.com/lampepfl/dotty/pull/13388) +- Stabilize reflect `SymbolMethods.isSuperAccessor` [#20198](https://github.com/lampepfl/dotty/pull/20198) + +## Repl + +- Fix validity period of derived SingleDenotations [#19983](https://github.com/lampepfl/dotty/pull/19983) +- Fix #18383: Never consider top-level `import`s as unused in the repl. 
[#20310](https://github.com/lampepfl/dotty/pull/20310) + +## Reporting + +- Warn if extension receiver already has member [#17543](https://github.com/lampepfl/dotty/pull/17543) +- Deprecation of case class elements [#17911](https://github.com/lampepfl/dotty/pull/17911) +- Support src filter in -WConf (Closes #17635) [#18783](https://github.com/lampepfl/dotty/pull/18783) +- Add note about type mismatch in automatically inserted apply argument [#20023](https://github.com/lampepfl/dotty/pull/20023) +- Make error reporting resilient to exception thrown while reporting [#20158](https://github.com/lampepfl/dotty/pull/20158) +- Remove duplicate comma from Matchable selector warning [#20159](https://github.com/lampepfl/dotty/pull/20159) +- Generalize warnings for top-level calls to Any or AnyRef methods [#20312](https://github.com/lampepfl/dotty/pull/20312) +- Make CheckUnused not slow. [#20321](https://github.com/lampepfl/dotty/pull/20321) +- Bring back ambiguity filter when we report an implicit not found error [#20368](https://github.com/scala/scala3/pull/20368) +- Treat 3.5-migration the same as 3.5 for a warning about implicit priority change [#20436](https://github.com/scala/scala3/pull/20436) +- Priority warning fix alternative [#20487](https://github.com/scala/scala3/pull/20487) +- Use final result type to check selector bound [#20989](https://github.com/scala/scala3/pull/20989) +- Refine implicit priority change warnings [#21045](https://github.com/scala/scala3/pull/21045) +- Backport "Fix healAmbiguous to compareAlternatives with disambiguate = true" to 3.5.0 [#21344](https://github.com/scala/scala3/pull/21344) + +## Rewrites + +- Patch indentation when removing braces (and other bug fixes in `-indent -rewrite`) [#17522](https://github.com/lampepfl/dotty/pull/17522) +- Extra check to avoid converting block expressions on the rhs of an in… [#20043](https://github.com/lampepfl/dotty/pull/20043) + +## Scaladoc + +- Fix scaladoc crash on Windows - illegal path character [#20311](https://github.com/lampepfl/dotty/pull/20311) +- Scaladoc: improve refined function types rendering [#20333](https://github.com/lampepfl/dotty/pull/20333) +- Relax font-weight reset [#20348](https://github.com/lampepfl/dotty/pull/20348) + +## Scala JS + +- Optimize main.js [#20093](https://github.com/lampepfl/dotty/pull/20093) + +## Settings + +- Lift Scala Settings from experimental to stabilized [#20199](https://github.com/lampepfl/dotty/pull/20199) + +## Tooling + +- Detect macro dependencies that are missing from the classloader [#20139](https://github.com/lampepfl/dotty/pull/20139) +- Write pipelined tasty in parallel. [#20153](https://github.com/lampepfl/dotty/pull/20153) +- ConsoleReporter sends INFO to stdout [#20328](https://github.com/lampepfl/dotty/pull/20328) +- Bundle scala-cli in scala command [#20351](https://github.com/scala/scala3/pull/20351) +- Adapt the workflow to release on SDKMAN! 
[#20535](https://github.com/scala/scala3/pull/20535) +- Adapt the release workflow to SIP-46 [#20565](https://github.com/scala/scala3/pull/20565) +- Release .zip instead of .tar.gz for windows in sdkman [#20630](https://github.com/scala/scala3/pull/20630) +- SIP 46 - read classpath from file, remove lib directory in distribution [#20631](https://github.com/scala/scala3/pull/20631) +.gz for windows in sdkman [#20630](https://github.com/scala/scala3/pull/20630) +- Bump scala-cli to 1.4.0 [#20859](https://github.com/scala/scala3/pull/20859) +- Add --skip-cli-updates by default to the scala command [#20900](https://github.com/scala/scala3/pull/20900) +- Use pathing jars in cli commands [#21121](https://github.com/scala/scala3/pull/21121) +- expand classpath of pathing jars in scala_legacy command [#21160](https://github.com/scala/scala3/pull/21160) +- emit generatedNonLocalClass in backend when callback is not enabled [#21186](https://github.com/scala/scala3/pull/21186) + +## Transform + +- Fix overloaded default methods test in RefChecks [#20218](https://github.com/lampepfl/dotty/pull/20218) +- Fix handling of AppliedType aliases in outerPrefix [#20190](https://github.com/lampepfl/dotty/pull/20190) +- Elide unit binding when beta-reducing [#20085](https://github.com/lampepfl/dotty/pull/20085) + +## Typer + +- Reduce projections of type aliases with class type prefixes [#19931](https://github.com/lampepfl/dotty/pull/19931) +- Re-lub also hard union types in simplify [#20027](https://github.com/lampepfl/dotty/pull/20027) +- Fix #19789: Merge same TypeParamRef in orDominator [#20090](https://github.com/lampepfl/dotty/pull/20090) +- Allow SAM types to contain match alias refinements [#20092](https://github.com/lampepfl/dotty/pull/20092) +- Don't dealias when deciding which arguments to defer [#20116](https://github.com/lampepfl/dotty/pull/20116) +- Avoid the TypeVar.inst trap [#20160](https://github.com/lampepfl/dotty/pull/20160) +- Avoid crash when superType does not exist after erasure [#20188](https://github.com/lampepfl/dotty/pull/20188) +- Refine overloading and implicit disambiguation [#20084](https://github.com/lampepfl/dotty/pull/20084) +- Refactor constant folding of applications [#20099](https://github.com/lampepfl/dotty/pull/20099) +- Rollback constraints if `isSameType` failed second direction [#20109](https://github.com/lampepfl/dotty/pull/20109) +- Suppress "extension method will never be selected" for overrides [#20164](https://github.com/lampepfl/dotty/pull/20164) +- Allow SAM types to contain multiple refinements [#20172](https://github.com/lampepfl/dotty/pull/20172) +- Normalize when verifying if TypeTestCasts are unchecked [#20258](https://github.com/lampepfl/dotty/pull/20258) +- Avoid stacked thisCall contexts [#20488](https://github.com/scala/scala3/pull/20488) +- fix issue 20901: etaCollapse context bound type [#20910](https://github.com/scala/scala3/pull/20910) +- Fix symbol reference retrivial of `scala.caps.Caps` [#20493](https://github.com/scala/scala3/pull/20493) + +# Experimental Changes + +- Named tuples second implementation [#19174](https://github.com/lampepfl/dotty/pull/19174) +- Change rules for given prioritization [#19300](https://github.com/lampepfl/dotty/pull/19300) +- Enable experimental mode when experimental feature is imported [#19807](https://github.com/lampepfl/dotty/pull/19807) +- Add message 
parameter to `@experimental` annotation [#19935](https://github.com/lampepfl/dotty/pull/19935) +- Implement match type amendment: extractors follow aliases and singletons [#20161](https://github.com/lampepfl/dotty/pull/20161) +- Avoid forcing whole package when using -experimental [#20409](https://github.com/scala/scala3/pull/20409) + +## Capture Checking + +- Carry and check universal capability from parents correctly [#20004](https://github.com/lampepfl/dotty/pull/20004) +- Make parameter types of context functions inferred type trees [#20155](https://github.com/lampepfl/dotty/pull/20155) +- Handle reach capabilities correctly in depedent functions [#20203](https://github.com/lampepfl/dotty/pull/20203) +- Fix the visibility check in `markFree` [#20221](https://github.com/lampepfl/dotty/pull/20221) +- Make inline proxy vals have inferred types [#20241](https://github.com/lampepfl/dotty/pull/20241) +- CC: Give more info when context function parameters leak [#20244](https://github.com/lampepfl/dotty/pull/20244) +- Plug soundness hole for reach capabilities [#20051](https://github.com/lampepfl/dotty/pull/20051) +- Tighten the screws a bit more to seal the soundness hole for reach capabilities [#20056](https://github.com/lampepfl/dotty/pull/20056) +- Drop retains annotations in inferred type trees [#20057](https://github.com/lampepfl/dotty/pull/20057) +- Allow @retains arguments to be context functions [#20232](https://github.com/lampepfl/dotty/pull/20232) +- Fix conversion of this.fld capture refs under separate compilation [#20238](https://github.com/lampepfl/dotty/pull/20238) + +## Erased definitions + +- Fix "Compiler crash when using CanThrow" [#20210](https://github.com/lampepfl/dotty/pull/20210) +- Only allow erased parameters in erased definitions [#19686](https://github.com/lampepfl/dotty/pull/19686) + +## Initialization + +- Deprecate `StandardPlugin.init` in favor of `initialize` method taking implicit Context [#20330](https://github.com/lampepfl/dotty/pull/20330) +- Fix missing changesParents in PostTyper [#20062](https://github.com/lampepfl/dotty/pull/20062) +- Special case for next field of colon colon in global init checker [#20281](https://github.com/lampepfl/dotty/pull/20281) +- Extend whitelist in global initialization checker [#20290](https://github.com/lampepfl/dotty/pull/20290) + +## Macro Annotations + +- Allow macro annotation to transform companion [#19677](https://github.com/lampepfl/dotty/pull/19677) +- Remove experimental `MainAnnotation`/`newMain` (replaced with `MacroAnnotation`) [#19937](https://github.com/lampepfl/dotty/pull/19937) + +## Nullability + +- Add flexible types to deal with Java-defined signatures under -Yexplicit-nulls [#18112](https://github.com/lampepfl/dotty/pull/18112) +- Fix #20287: Add flexible types to Quotes library [#20293](https://github.com/lampepfl/dotty/pull/20293) +- Add fromNullable to Predef for explicit nulls [#20222](https://github.com/lampepfl/dotty/pull/20222) + + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.4.2..3.5.0` these are: + +``` + 153 Martin Odersky + 53 Eugene Flesselle + 41 Jamie Thompson + 29 Wojciech Mazur + 25 Nicolas Stucki + 22 Sébastien Doeraene + 18 noti0na1 + 16 Matt Bovel + 13 Guillaume Martres + 11 Paweł Marks + 10 Hamza REMMAL + 9 Yichen Xu + 8 Jan Chyb + 7 Hamza Remmal + 7 Som Snytt + 6 Jędrzej 
Rochala + 5 Fengyun Liu + 5 dependabot[bot] + 3 Mikołaj Fornal + 2 Aviv Keller + 2 EnzeXing + 1 Chris Pado + 1 Filip Zybała + 1 Georgi Krastev + 1 Jisoo Park + 1 Katarzyna Marek + 1 Lucas Nouguier + 1 Lucy Martin + 1 Ola Flisbäck + 1 Pascal Weisenburger + 1 Quentin Bernet + 1 Raphael Jolly + 1 Seth Tisue + 1 Stephane Bersier + 1 Tomasz Godzik + 1 Yoonjae Jeon + 1 aherlihy + 1 rochala + 1 willerf + +``` From 834c973b61848dfdd9c8a817a372e319526d7fdd Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Pawe=C5=82=20Marks?= Date: Mon, 12 Aug 2024 15:07:18 +0200 Subject: [PATCH 331/371] Release 3.5.0 --- project/Build.scala | 2 +- tasty/src/dotty/tools/tasty/TastyFormat.scala | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 3ab5aa77bf15..047f2c0c22ea 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -88,7 +88,7 @@ object Build { val referenceVersion = "3.4.2" - val baseVersion = "3.5.0-RC7" + val baseVersion = "3.5.0" // LTS or Next val versionLine = "Next" diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index c29ea99bcd8d..1e075efcf857 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -340,7 +340,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. * From 7590f91da3f42854ba7abcc707d5487153c2b20c Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Wed, 21 Aug 2024 11:39:41 +0100 Subject: [PATCH 332/371] Update hamzaremmal/sdkman-release-action action --- .github/workflows/publish-sdkman.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/publish-sdkman.yml b/.github/workflows/publish-sdkman.yml index de12f81426b5..77bbebf3f846 100644 --- a/.github/workflows/publish-sdkman.yml +++ b/.github/workflows/publish-sdkman.yml @@ -46,7 +46,7 @@ jobs: - platform: WINDOWS_64 archive : 'scala3-${{ inputs.version }}-x86_64-pc-win32.zip' steps: - - uses: hamzaremmal/sdkman-release-action@7e437233a6bd79bc4cb0fa9071b685e94bdfdba6 + - uses: hamzaremmal/sdkman-release-action@978b8cdb5f9c3b83ebdc45e0a1bf97bf17cc6280 with: CONSUMER-KEY : ${{ secrets.CONSUMER-KEY }} CONSUMER-TOKEN : ${{ secrets.CONSUMER-TOKEN }} From 9da1ae80e4536a1b987f862eef634be8974ab996 Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Wed, 21 Aug 2024 12:10:54 +0100 Subject: [PATCH 333/371] Update hamzaremmal/sdkman-release-action & hamzaremmal/sdkman-default-action action --- .github/workflows/publish-sdkman.yml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/publish-sdkman.yml b/.github/workflows/publish-sdkman.yml index 77bbebf3f846..6f10ac128b6e 100644 --- a/.github/workflows/publish-sdkman.yml +++ b/.github/workflows/publish-sdkman.yml @@ -46,7 +46,7 @@ jobs: - platform: WINDOWS_64 archive : 'scala3-${{ inputs.version }}-x86_64-pc-win32.zip' steps: - - uses: hamzaremmal/sdkman-release-action@978b8cdb5f9c3b83ebdc45e0a1bf97bf17cc6280 + - uses: hamzaremmal/sdkman-release-action@4cb6c8cf99cfdf0ed5de586d6b38500558737e65 with: CONSUMER-KEY : ${{ secrets.CONSUMER-KEY }} CONSUMER-TOKEN : ${{ secrets.CONSUMER-TOKEN }} @@ -59,7 +59,7 @@ jobs: runs-on: ubuntu-latest needs: publish steps: - - uses: 
hamzaremmal/sdkman-default-action@866bc79fc5bd397eeb48f9cedda2f15221c8515d + - uses: hamzaremmal/sdkman-default-action@f312ff69dec7c4f83b060c3df90df7ed19e2d70e with: CONSUMER-KEY : ${{ secrets.CONSUMER-KEY }} CONSUMER-TOKEN : ${{ secrets.CONSUMER-TOKEN }} From 8dcaed1b4dca9539ec6ffd900b0b6acee7c1ee6e Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 21 Aug 2024 14:20:06 +0200 Subject: [PATCH 334/371] Sync with 3.5.0 release --- tasty/src/dotty/tools/tasty/TastyFormat.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index 67beb1ea1d56..1e075efcf857 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -324,7 +324,7 @@ object TastyFormat { * compatibility, but remains backwards compatible, with all * preceding `MinorVersion`. */ - final val MinorVersion: Int = 4 + final val MinorVersion: Int = 5 /** Natural Number. The `ExperimentalVersion` allows for * experimentation with changes to TASTy without committing From 80e09f71434b73b00768795416984777d61d7733 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 21 Aug 2024 14:40:06 +0200 Subject: [PATCH 335/371] Regenerate reference-expected-links after 3.5.0 merge --- project/scripts/expected-links/reference-expected-links.txt | 1 + 1 file changed, 1 insertion(+) diff --git a/project/scripts/expected-links/reference-expected-links.txt b/project/scripts/expected-links/reference-expected-links.txt index 0fbb84831e37..59add1da0153 100644 --- a/project/scripts/expected-links/reference-expected-links.txt +++ b/project/scripts/expected-links/reference-expected-links.txt @@ -104,6 +104,7 @@ ./new-types/union-types-spec.html ./new-types/union-types.html ./other-new-features.html +./other-new-features/binary-literals.html ./other-new-features/control-syntax.html ./other-new-features/creator-applications.html ./other-new-features/experimental-defs.html From b43f630d8024bc984034c328db053df6f7e5d330 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 9 Dec 2024 19:35:26 +0100 Subject: [PATCH 336/371] Backport "Make context bounds for poly functions a standard feature" to 3.6 (#22172) Backports #22019 to the 3.6.3. PR submitted by the release tooling. 
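A minimal sketch of the source-level effect of this backport (illustrative only, assuming a compiler at source level 3.6; the `Ord` type class mirrors the patched test files, while the `Ord[Int]` given, `less`, and `demoLess` are names assumed here for demonstration and are not part of the patch):

```scala
trait Ord[X]:
  def compare(x: X, y: X): Int

given Ord[Int] with
  def compare(x: Int, y: Int): Int = Integer.compare(x, y)

// A context bound on the type parameter of a polymorphic function value,
// accepted without any experimental language import once 3.6 is the source level.
val less = [X: Ord] => (x: X, y: X) => summon[Ord[X]].compare(x, y) < 0

@main def demoLess(): Unit =
  val r: Boolean = less(2, 3) // the Ord[Int] given is supplied at the call site
  println(r)                  // prints: true
```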
[skip ci] Co-authored-by: Kacper Korban --- compiler/src/dotty/tools/dotc/parsing/Parsers.scala | 2 +- tests/pos/contextbounds-for-poly-functions.scala | 3 --- tests/run/contextbounds-for-poly-functions.scala | 3 --- 3 files changed, 1 insertion(+), 7 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index 2e441553689c..ca8ebaf79b09 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -3491,7 +3491,7 @@ object Parsers { val hkparams = typeParamClauseOpt(ParamOwner.Hk) val bounds = if paramOwner.acceptsCtxBounds then typeAndCtxBounds(name) - else if in.featureEnabled(Feature.modularity) && paramOwner == ParamOwner.Type then typeAndCtxBounds(name) + else if sourceVersion.isAtLeast(`3.6`) && paramOwner == ParamOwner.Type then typeAndCtxBounds(name) else typeBounds() TypeDef(name, lambdaAbstract(hkparams, bounds)).withMods(mods) } diff --git a/tests/pos/contextbounds-for-poly-functions.scala b/tests/pos/contextbounds-for-poly-functions.scala index 13411a3ad769..49975cf8591d 100644 --- a/tests/pos/contextbounds-for-poly-functions.scala +++ b/tests/pos/contextbounds-for-poly-functions.scala @@ -1,6 +1,3 @@ -import scala.language.experimental.modularity -import scala.language.future - trait Ord[X]: def compare(x: X, y: X): Int type T diff --git a/tests/run/contextbounds-for-poly-functions.scala b/tests/run/contextbounds-for-poly-functions.scala index dcc974fce198..72eed8939fcf 100644 --- a/tests/run/contextbounds-for-poly-functions.scala +++ b/tests/run/contextbounds-for-poly-functions.scala @@ -1,6 +1,3 @@ -import scala.language.experimental.modularity -import scala.language.future - trait Show[X]: def show(x: X): String From e07cab55572f674ba337e302975fa9a37abb8332 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 9 Dec 2024 19:35:43 +0100 Subject: [PATCH 337/371] Backport "Update Scala CLI to 1.5.4 (was 1.5.1) & `coursier` to 2.1.18 (was 2.1.13)" to 3.6 (#22173) Backports #22021 to the 3.6.3. PR submitted by the release tooling. [skip ci] --------- Co-authored-by: Kacper Korban Co-authored-by: Piotr Chabelski --- project/Build.scala | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index d742ab9ad3f4..cf582762054a 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -162,9 +162,9 @@ object Build { val mimaPreviousLTSDottyVersion = "3.3.0" /** Version of Scala CLI to download */ - val scalaCliLauncherVersion = "1.5.1" + val scalaCliLauncherVersion = "1.5.4" /** Version of Coursier to download for initializing the local maven repo of Scala command */ - val coursierJarVersion = "2.1.13" + val coursierJarVersion = "2.1.18" object CompatMode { final val BinaryCompatible = 0 From 9fd972d5fa950f564dceb4f350c034a3ded4c4dc Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 9 Dec 2024 19:35:57 +0100 Subject: [PATCH 338/371] Backport "Make named tuples an experimental feature again" to 3.6 (#22174) Backports #22045 to the 3.6.3. PR submitted by the release tooling. 
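A minimal sketch of what this backport changes at the source level (illustrative only, mirroring the updated tests in the patch; the `Person` alias and `demoPerson` entry point are assumed names for demonstration):

```scala
// Named tuple syntax compiles only under the experimental import again.
import scala.language.experimental.namedTuples

type Person = (name: String, age: Int)

@main def demoPerson(): Unit =
  val bob: Person = (name = "Bob", age = 33)
  println(bob.name) // named selection, prints: Bob
  println(bob.age)  // prints: 33
```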
--------- Co-authored-by: Kacper Korban Co-authored-by: Piotr Chabelski --- .../src/dotty/tools/dotc/config/Feature.scala | 2 + .../tools/dotc/config/MigrationVersion.scala | 2 +- .../dotty/tools/dotc/parsing/Parsers.scala | 8 +- .../tools/dotc/reporting/ErrorMessageID.scala | 4 +- .../dotty/tools/dotc/reporting/messages.scala | 19 ++-- .../src/dotty/tools/dotc/typer/Typer.scala | 11 ++- .../dotty/tools/dotc/CompilationTests.scala | 1 + .../named-tuples.md | 4 +- docs/sidebar.yml | 2 +- library/src/scala/NamedTuple.scala | 3 + .../runtime/stdLibPatches/language.scala | 1 - .../pc/tests/completion/CompletionSuite.scala | 6 +- tests/neg/i20517.check | 14 +-- tests/neg/i20517.scala | 1 + tests/neg/infix-named-args.check | 40 ++++---- tests/neg/infix-named-args.scala | 2 + tests/neg/named-tuple-selectable.scala | 2 + tests/neg/named-tuples-2.check | 8 +- tests/neg/named-tuples-2.scala | 1 + tests/neg/named-tuples-3.check | 4 +- tests/neg/named-tuples-3.scala | 2 + tests/neg/named-tuples.check | 92 +++++++++---------- tests/neg/named-tuples.scala | 3 +- tests/new/test.scala | 2 + tests/pos/fieldsOf.scala | 2 + tests/pos/i20377.scala | 1 + tests/pos/i21300.scala | 6 +- tests/pos/i21413.scala | 2 + tests/pos/named-tuple-combinators.scala | 1 + tests/pos/named-tuple-selectable.scala | 1 + tests/pos/named-tuple-selections.scala | 1 + tests/pos/named-tuple-unstable.scala | 1 + tests/pos/named-tuple-widen.scala | 1 + tests/pos/named-tuples-ops-mirror.scala | 1 + tests/pos/named-tuples1.scala | 1 + tests/pos/namedtuple-src-incompat.scala | 1 + tests/pos/tuple-ops.scala | 1 + .../ambigious-named-tuple-assignment.check | 19 ++++ .../ambigious-named-tuple-assignment.scala | 19 ++++ tests/rewrites/infix-named-args.check | 4 +- tests/rewrites/infix-named-args.scala | 2 + .../stdlibExperimentalDefinitions.scala | 6 ++ tests/run/named-patmatch.scala | 1 + tests/run/named-patterns.scala | 1 + tests/run/named-tuple-ops.scala | 1 + tests/run/named-tuples-xxl.scala | 1 + tests/run/named-tuples.scala | 1 + tests/run/tyql.scala | 1 + tests/warn/21681.check | 7 +- tests/warn/21681.scala | 2 + tests/warn/21681b.check | 7 +- tests/warn/21681b.scala | 2 + tests/warn/21681c.check | 7 +- tests/warn/21681c.scala | 2 + tests/warn/21770.check | 7 +- tests/warn/21770.scala | 4 +- tests/warn/infix-named-args-migration.scala | 2 + 57 files changed, 225 insertions(+), 125 deletions(-) rename docs/_docs/reference/{other-new-features => experimental}/named-tuples.md (98%) create mode 100644 tests/rewrites/ambigious-named-tuple-assignment.check create mode 100644 tests/rewrites/ambigious-named-tuple-assignment.scala diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index ad20bab46c1e..8b9a64924ace 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -34,6 +34,7 @@ object Feature: val pureFunctions = experimental("pureFunctions") val captureChecking = experimental("captureChecking") val into = experimental("into") + val namedTuples = experimental("namedTuples") val modularity = experimental("modularity") val betterMatchTypeExtractors = experimental("betterMatchTypeExtractors") val quotedPatternsWithPolymorphicFunctions = experimental("quotedPatternsWithPolymorphicFunctions") @@ -65,6 +66,7 @@ object Feature: (pureFunctions, "Enable pure functions for capture checking"), (captureChecking, "Enable experimental capture checking"), (into, "Allow into modifier on parameter types"), + (namedTuples, 
"Allow named tuples"), (modularity, "Enable experimental modularity features"), (betterMatchTypeExtractors, "Enable better match type extractors"), (betterFors, "Enable improvements in `for` comprehensions") diff --git a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala index 247e3f62a98d..1d99caa789d3 100644 --- a/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/MigrationVersion.scala @@ -26,7 +26,7 @@ enum MigrationVersion(val warnFrom: SourceVersion, val errorFrom: SourceVersion) case WithOperator extends MigrationVersion(`3.4`, future) case FunctionUnderscore extends MigrationVersion(`3.4`, future) case NonNamedArgumentInJavaAnnotation extends MigrationVersion(`3.6`, `3.6`) - case AmbiguousNamedTupleInfixApply extends MigrationVersion(`3.6`, never) + case AmbiguousNamedTupleSyntax extends MigrationVersion(`3.6`, future) case ImportWildcard extends MigrationVersion(future, future) case ImportRename extends MigrationVersion(future, future) case ParameterEnclosedByParenthesis extends MigrationVersion(future, future) diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index ca8ebaf79b09..220053e277a5 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -667,7 +667,7 @@ object Parsers { else leading :: Nil def maybeNamed(op: () => Tree): () => Tree = () => - if isIdent && in.lookahead.token == EQUALS && sourceVersion.isAtLeast(`3.6`) then + if isIdent && in.lookahead.token == EQUALS && in.featureEnabled(Feature.namedTuples) then atSpan(in.offset): val name = ident() in.nextToken() @@ -1149,8 +1149,8 @@ object Parsers { if isType then infixOp else infixOp.right match case Tuple(args) if args.exists(_.isInstanceOf[NamedArg]) && !isNamedTupleOperator => - report.errorOrMigrationWarning(AmbiguousNamedTupleInfixApply(), infixOp.right.srcPos, MigrationVersion.AmbiguousNamedTupleInfixApply) - if MigrationVersion.AmbiguousNamedTupleInfixApply.needsPatch then + report.errorOrMigrationWarning(DeprecatedInfixNamedArgumentSyntax(), infixOp.right.srcPos, MigrationVersion.AmbiguousNamedTupleSyntax) + if MigrationVersion.AmbiguousNamedTupleSyntax.needsPatch then val asApply = cpy.Apply(infixOp)(Select(opInfo.operand, opInfo.operator.name), args) patch(source, infixOp.span, asApply.show(using ctx.withoutColors)) asApply // allow to use pre-3.6 syntax in migration mode @@ -2172,7 +2172,7 @@ object Parsers { if namedOK && isIdent && in.lookahead.token == EQUALS then commaSeparated(() => namedArgType()) - else if tupleOK && isIdent && in.lookahead.isColon && sourceVersion.isAtLeast(`3.6`) then + else if tupleOK && isIdent && in.lookahead.isColon && in.featureEnabled(Feature.namedTuples) then commaSeparated(() => namedElem()) else commaSeparated(() => argType()) diff --git a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala index 35c170858bbf..2c3774b59a9a 100644 --- a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala +++ b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala @@ -216,8 +216,8 @@ enum ErrorMessageID(val isActive: Boolean = true) extends java.lang.Enum[ErrorMe case FinalLocalDefID // errorNumber: 200 case NonNamedArgumentInJavaAnnotationID // errorNumber: 201 case QuotedTypeMissingID // errorNumber: 202 - case AmbiguousNamedTupleAssignmentID 
// errorNumber: 203 - case AmbiguousNamedTupleInfixApplyID // errorNumber: 204 + case DeprecatedAssignmentSyntaxID // errorNumber: 203 + case DeprecatedInfixNamedArgumentSyntaxID // errorNumber: 204 def errorNumber = ordinal - 1 diff --git a/compiler/src/dotty/tools/dotc/reporting/messages.scala b/compiler/src/dotty/tools/dotc/reporting/messages.scala index 328ec122f848..3721c18fd294 100644 --- a/compiler/src/dotty/tools/dotc/reporting/messages.scala +++ b/compiler/src/dotty/tools/dotc/reporting/messages.scala @@ -3344,21 +3344,20 @@ final class QuotedTypeMissing(tpe: Type)(using Context) extends StagingMessage(Q end QuotedTypeMissing -final class AmbiguousNamedTupleAssignment(key: Name, value: untpd.Tree)(using Context) extends SyntaxMsg(AmbiguousNamedTupleAssignmentID): +final class DeprecatedAssignmentSyntax(key: Name, value: untpd.Tree)(using Context) extends SyntaxMsg(DeprecatedAssignmentSyntaxID): override protected def msg(using Context): String = - i"""Ambiguous syntax: this is interpreted as a named tuple with one element, + i"""Deprecated syntax: in the future it would be interpreted as a named tuple with one element, |not as an assignment. | |To assign a value, use curly braces: `{${key} = ${value}}`.""" - + + Message.rewriteNotice("This", version = SourceVersion.`3.6-migration`) + override protected def explain(using Context): String = "" -class AmbiguousNamedTupleInfixApply()(using Context) extends SyntaxMsg(AmbiguousNamedTupleInfixApplyID): +class DeprecatedInfixNamedArgumentSyntax()(using Context) extends SyntaxMsg(DeprecatedInfixNamedArgumentSyntaxID): def msg(using Context) = - "Ambigious syntax: this infix call argument list is interpreted as single named tuple argument, not as an named arguments list." - + Message.rewriteNotice("This", version = SourceVersion.`3.6-migration`) + i"""Deprecated syntax: infix named arguments lists are deprecated; in the future it would be interpreted as a single name tuple argument. + |To avoid this warning, either remove the argument names or use dotted selection.""" + + Message.rewriteNotice("This", version = SourceVersion.`3.6-migration`) - def explain(using Context) = - i"""Starting with Scala 3.6 infix named arguments are interpretted as Named Tuple. 
- | - |To avoid this warning, either remove the argument names or use dotted selection.""" + def explain(using Context) = "" diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 76f0e8a3ecf6..5c5ca8af46c6 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -792,7 +792,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def tryNamedTupleSelection() = val namedTupleElems = qual.tpe.widenDealias.namedTupleElementTypes val nameIdx = namedTupleElems.indexWhere(_._1 == selName) - if nameIdx >= 0 && sourceVersion.isAtLeast(`3.6`) then + if nameIdx >= 0 && Feature.enabled(Feature.namedTuples) then typed( untpd.Apply( untpd.Select(untpd.TypedSplice(qual), nme.apply), @@ -3398,7 +3398,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer /** Translate tuples of all arities */ def typedTuple(tree: untpd.Tuple, pt: Type)(using Context): Tree = val tree1 = desugar.tuple(tree, pt) - checkAmbiguousNamedTupleAssignment(tree) + checkDeprecatedAssignmentSyntax(tree) if tree1 ne tree then typed(tree1, pt) else val arity = tree.trees.length @@ -3427,7 +3427,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer /** Checks if `tree` is a named tuple with one element that could be * interpreted as an assignment, such as `(x = 1)`. If so, issues a warning. */ - def checkAmbiguousNamedTupleAssignment(tree: untpd.Tuple)(using Context): Unit = + def checkDeprecatedAssignmentSyntax(tree: untpd.Tuple)(using Context): Unit = tree.trees match case List(NamedArg(name, value)) => val tmpCtx = ctx.fresh.setNewTyperState() @@ -3435,7 +3435,10 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if !tmpCtx.reporter.hasErrors then // If there are no errors typing the above, then the named tuple is // ambiguous and we issue a warning. 
- report.migrationWarning(AmbiguousNamedTupleAssignment(name, value), tree.srcPos) + report.migrationWarning(DeprecatedAssignmentSyntax(name, value), tree.srcPos) + if MigrationVersion.AmbiguousNamedTupleSyntax.needsPatch then + patch(tree.source, Span(tree.span.start, tree.span.start + 1), "{") + patch(tree.source, Span(tree.span.end - 1, tree.span.end), "}") case _ => () /** Retrieve symbol attached to given tree */ diff --git a/compiler/test/dotty/tools/dotc/CompilationTests.scala b/compiler/test/dotty/tools/dotc/CompilationTests.scala index 3bd3b5138fad..9f72db6fc390 100644 --- a/compiler/test/dotty/tools/dotc/CompilationTests.scala +++ b/compiler/test/dotty/tools/dotc/CompilationTests.scala @@ -80,6 +80,7 @@ class CompilationTests { compileDir("tests/rewrites/annotation-named-pararamters", defaultOptions.and("-rewrite", "-source:3.6-migration")), compileFile("tests/rewrites/i21418.scala", unindentOptions.and("-rewrite", "-source:3.5-migration")), compileFile("tests/rewrites/infix-named-args.scala", defaultOptions.and("-rewrite", "-source:3.6-migration")), + compileFile("tests/rewrites/ambigious-named-tuple-assignment.scala", defaultOptions.and("-rewrite", "-source:3.6-migration")), ).checkRewrites() } diff --git a/docs/_docs/reference/other-new-features/named-tuples.md b/docs/_docs/reference/experimental/named-tuples.md similarity index 98% rename from docs/_docs/reference/other-new-features/named-tuples.md rename to docs/_docs/reference/experimental/named-tuples.md index 5483c5cc255b..27d74259725d 100644 --- a/docs/_docs/reference/other-new-features/named-tuples.md +++ b/docs/_docs/reference/experimental/named-tuples.md @@ -1,10 +1,10 @@ --- layout: doc-page title: "Named Tuples" -nightlyOf: https://docs.scala-lang.org/scala3/reference/other-new-features/named-tuples.html +nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/named-tuples.html --- -Starting in Scala 3.6, the elements of a tuple can be named. Example: +The elements of a tuple can now be named. Example: ```scala type Person = (name: String, age: Int) val Bob: Person = (name = "Bob", age = 33) diff --git a/docs/sidebar.yml b/docs/sidebar.yml index 74aee3dfc668..a306d8bdf274 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -72,7 +72,6 @@ subsection: - page: reference/other-new-features/export.md - page: reference/other-new-features/opaques.md - page: reference/other-new-features/opaques-details.md - - page: reference/other-new-features/named-tuples.md - page: reference/other-new-features/open-classes.md - page: reference/other-new-features/parameter-untupling.md - page: reference/other-new-features/parameter-untupling-spec.md @@ -159,6 +158,7 @@ subsection: - page: reference/experimental/cc.md - page: reference/experimental/purefuns.md - page: reference/experimental/tupled-function.md + - page: reference/experimental/named-tuples.md - page: reference/experimental/modularity.md - page: reference/experimental/typeclasses.md - page: reference/experimental/runtimeChecked.md diff --git a/library/src/scala/NamedTuple.scala b/library/src/scala/NamedTuple.scala index d105cf042f37..6da7f940dc47 100644 --- a/library/src/scala/NamedTuple.scala +++ b/library/src/scala/NamedTuple.scala @@ -1,6 +1,8 @@ package scala +import annotation.experimental import compiletime.ops.boolean.* +@experimental object NamedTuple: /** The type to which named tuples get mapped to. 
For instance, @@ -131,6 +133,7 @@ object NamedTuple: end NamedTuple /** Separate from NamedTuple object so that we can match on the opaque type NamedTuple. */ +@experimental object NamedTupleDecomposition: import NamedTuple.* extension [N <: Tuple, V <: Tuple](x: NamedTuple[N, V]) diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index b8d990cf56f5..547710d55293 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -97,7 +97,6 @@ object language: * @see [[https://dotty.epfl.ch/docs/reference/experimental/named-tuples]] */ @compileTimeOnly("`namedTuples` can only be used at compile time in import statements") - @deprecated("The experimental.namedTuples language import is no longer needed since the feature is now standard", since = "3.6") object namedTuples /** Experimental support for new features for better modularity, including diff --git a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala index 7be2ea6181ef..ab28baea994b 100644 --- a/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala +++ b/presentation-compiler/test/dotty/tools/pc/tests/completion/CompletionSuite.scala @@ -1988,7 +1988,8 @@ class CompletionSuite extends BaseCompletionSuite: @Test def `namedTuple completions` = check( - """|import scala.NamedTuple.* + """|import scala.language.experimental.namedTuples + |import scala.NamedTuple.* | |val person = (name = "Jamie", city = "Lausanne") | @@ -1999,7 +2000,8 @@ class CompletionSuite extends BaseCompletionSuite: @Test def `Selectable with namedTuple Fields member` = check( - """|import scala.NamedTuple.* + """|import scala.language.experimental.namedTuples + |import scala.NamedTuple.* | |class NamedTupleSelectable extends Selectable { | type Fields <: AnyNamedTuple diff --git a/tests/neg/i20517.check b/tests/neg/i20517.check index 119c34025ee0..55aeff46572b 100644 --- a/tests/neg/i20517.check +++ b/tests/neg/i20517.check @@ -1,7 +1,7 @@ --- [E007] Type Mismatch Error: tests/neg/i20517.scala:9:43 ------------------------------------------------------------- -9 | def dep(foo: Foo[Any]): From[foo.type] = (elem = "") // error - | ^^^^^^^^^^^ - | Found: (elem : String) - | Required: NamedTuple.From[(foo : Foo[Any])] - | - | longer explanation available when compiling with `-explain` +-- [E007] Type Mismatch Error: tests/neg/i20517.scala:10:43 ------------------------------------------------------------ +10 | def dep(foo: Foo[Any]): From[foo.type] = (elem = "") // error + | ^^^^^^^^^^^ + | Found: (elem : String) + | Required: NamedTuple.From[(foo : Foo[Any])] + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i20517.scala b/tests/neg/i20517.scala index 342a7d86ca7e..11c4432434dd 100644 --- a/tests/neg/i20517.scala +++ b/tests/neg/i20517.scala @@ -1,3 +1,4 @@ +import scala.language.experimental.namedTuples import NamedTuple.From case class Foo[+T](elem: T) diff --git a/tests/neg/infix-named-args.check b/tests/neg/infix-named-args.check index 0cfbbaef73a3..d960892a9624 100644 --- a/tests/neg/infix-named-args.check +++ b/tests/neg/infix-named-args.check @@ -1,5 +1,5 @@ --- [E134] Type Error: tests/neg/infix-named-args.scala:2:13 ------------------------------------------------------------ -2 | def f = 42 + (x = 1) // error // werror +-- [E134] Type Error: 
tests/neg/infix-named-args.scala:4:13 ------------------------------------------------------------ +4 | def f = 42 + (x = 1) // error // werror | ^^^^ | None of the overloaded alternatives of method + in class Int with types | (x: Double): Double @@ -11,31 +11,27 @@ | (x: Byte): Int | (x: String): String | match arguments ((x : Int)) (a named tuple) --- [E204] Syntax Warning: tests/neg/infix-named-args.scala:2:15 -------------------------------------------------------- -2 | def f = 42 + (x = 1) // error // werror +-- [E204] Syntax Warning: tests/neg/infix-named-args.scala:4:15 -------------------------------------------------------- +4 | def f = 42 + (x = 1) // error // werror | ^^^^^^^ - |Ambigious syntax: this infix call argument list is interpreted as single named tuple argument, not as an named arguments list. + |Deprecated syntax: infix named arguments lists are deprecated; in the future it would be interpreted as a single name tuple argument. + |To avoid this warning, either remove the argument names or use dotted selection. |This can be rewritten automatically under -rewrite -source 3.6-migration. - | - | longer explanation available when compiling with `-explain` --- [E204] Syntax Warning: tests/neg/infix-named-args.scala:5:26 -------------------------------------------------------- -5 | def g = new C() `multi` (x = 42, y = 27) // werror +-- [E204] Syntax Warning: tests/neg/infix-named-args.scala:7:26 -------------------------------------------------------- +7 | def g = new C() `multi` (x = 42, y = 27) // werror | ^^^^^^^^^^^^^^^^ - |Ambigious syntax: this infix call argument list is interpreted as single named tuple argument, not as an named arguments list. + |Deprecated syntax: infix named arguments lists are deprecated; in the future it would be interpreted as a single name tuple argument. + |To avoid this warning, either remove the argument names or use dotted selection. |This can be rewritten automatically under -rewrite -source 3.6-migration. - | - | longer explanation available when compiling with `-explain` --- [E204] Syntax Warning: tests/neg/infix-named-args.scala:6:21 -------------------------------------------------------- -6 | def h = new C() ** (x = 42, y = 27) // werror +-- [E204] Syntax Warning: tests/neg/infix-named-args.scala:8:21 -------------------------------------------------------- +8 | def h = new C() ** (x = 42, y = 27) // werror | ^^^^^^^^^^^^^^^^ - |Ambigious syntax: this infix call argument list is interpreted as single named tuple argument, not as an named arguments list. + |Deprecated syntax: infix named arguments lists are deprecated; in the future it would be interpreted as a single name tuple argument. + |To avoid this warning, either remove the argument names or use dotted selection. |This can be rewritten automatically under -rewrite -source 3.6-migration. - | - | longer explanation available when compiling with `-explain` --- [E204] Syntax Warning: tests/neg/infix-named-args.scala:13:18 ------------------------------------------------------- -13 | def f = this ** (x = 2) // werror +-- [E204] Syntax Warning: tests/neg/infix-named-args.scala:15:18 ------------------------------------------------------- +15 | def f = this ** (x = 2) // werror | ^^^^^^^ - |Ambigious syntax: this infix call argument list is interpreted as single named tuple argument, not as an named arguments list. + |Deprecated syntax: infix named arguments lists are deprecated; in the future it would be interpreted as a single name tuple argument. 
+ |To avoid this warning, either remove the argument names or use dotted selection. |This can be rewritten automatically under -rewrite -source 3.6-migration. - | - | longer explanation available when compiling with `-explain` diff --git a/tests/neg/infix-named-args.scala b/tests/neg/infix-named-args.scala index d8616899540c..b0ef555cf965 100644 --- a/tests/neg/infix-named-args.scala +++ b/tests/neg/infix-named-args.scala @@ -1,3 +1,5 @@ +import scala.language.experimental.namedTuples + class C: def f = 42 + (x = 1) // error // werror def multi(x: Int, y: Int): Int = x + y diff --git a/tests/neg/named-tuple-selectable.scala b/tests/neg/named-tuple-selectable.scala index c81eba1237ff..5cf7e68654ef 100644 --- a/tests/neg/named-tuple-selectable.scala +++ b/tests/neg/named-tuple-selectable.scala @@ -1,3 +1,5 @@ +import scala.language.experimental.namedTuples + class FromFields extends Selectable: type Fields = (i: Int) def selectDynamic(key: String) = diff --git a/tests/neg/named-tuples-2.check b/tests/neg/named-tuples-2.check index daa1c0d69069..0a52d5f3989b 100644 --- a/tests/neg/named-tuples-2.check +++ b/tests/neg/named-tuples-2.check @@ -1,8 +1,8 @@ --- Error: tests/neg/named-tuples-2.scala:4:9 --------------------------------------------------------------------------- -4 | case (name, age) => () // error +-- Error: tests/neg/named-tuples-2.scala:5:9 --------------------------------------------------------------------------- +5 | case (name, age) => () // error | ^ | this case is unreachable since type (String, Int, Boolean) is not a subclass of class Tuple2 --- Error: tests/neg/named-tuples-2.scala:5:9 --------------------------------------------------------------------------- -5 | case (n, a, m, x) => () // error +-- Error: tests/neg/named-tuples-2.scala:6:9 --------------------------------------------------------------------------- +6 | case (n, a, m, x) => () // error | ^ | this case is unreachable since type (String, Int, Boolean) is not a subclass of class Tuple4 diff --git a/tests/neg/named-tuples-2.scala b/tests/neg/named-tuples-2.scala index b3917d9ad57c..0507891e0549 100644 --- a/tests/neg/named-tuples-2.scala +++ b/tests/neg/named-tuples-2.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples def Test = val person = (name = "Bob", age = 33, married = true) person match diff --git a/tests/neg/named-tuples-3.check b/tests/neg/named-tuples-3.check index 2809836b4803..2091c36191c0 100644 --- a/tests/neg/named-tuples-3.check +++ b/tests/neg/named-tuples-3.check @@ -1,5 +1,5 @@ --- [E007] Type Mismatch Error: tests/neg/named-tuples-3.scala:5:16 ----------------------------------------------------- -5 |val p: Person = f // error +-- [E007] Type Mismatch Error: tests/neg/named-tuples-3.scala:7:16 ----------------------------------------------------- +7 |val p: Person = f // error | ^ | Found: NamedTuple.NamedTuple[(Int, Any), (Int, String)] | Required: Person diff --git a/tests/neg/named-tuples-3.scala b/tests/neg/named-tuples-3.scala index 21e6ed9b3741..0f1215338b0a 100644 --- a/tests/neg/named-tuples-3.scala +++ b/tests/neg/named-tuples-3.scala @@ -1,3 +1,5 @@ +import language.experimental.namedTuples + def f: NamedTuple.NamedTuple[(Int, Any), (Int, String)] = ??? 
type Person = (name: Int, age: String) diff --git a/tests/neg/named-tuples.check b/tests/neg/named-tuples.check index 8ec958b6a75d..db3cc703722f 100644 --- a/tests/neg/named-tuples.check +++ b/tests/neg/named-tuples.check @@ -1,101 +1,101 @@ --- Error: tests/neg/named-tuples.scala:8:19 ---------------------------------------------------------------------------- -8 | val illformed = (_2 = 2) // error +-- Error: tests/neg/named-tuples.scala:9:19 ---------------------------------------------------------------------------- +9 | val illformed = (_2 = 2) // error | ^^^^^^ | _2 cannot be used as the name of a tuple element because it is a regular tuple selector --- Error: tests/neg/named-tuples.scala:9:20 ---------------------------------------------------------------------------- -9 | type Illformed = (_1: Int) // error - | ^^^^^^^ - | _1 cannot be used as the name of a tuple element because it is a regular tuple selector --- Error: tests/neg/named-tuples.scala:10:40 --------------------------------------------------------------------------- -10 | val illformed2 = (name = "", age = 0, name = true) // error +-- Error: tests/neg/named-tuples.scala:10:20 --------------------------------------------------------------------------- +10 | type Illformed = (_1: Int) // error + | ^^^^^^^ + | _1 cannot be used as the name of a tuple element because it is a regular tuple selector +-- Error: tests/neg/named-tuples.scala:11:40 --------------------------------------------------------------------------- +11 | val illformed2 = (name = "", age = 0, name = true) // error | ^^^^^^^^^^^ | Duplicate tuple element name --- Error: tests/neg/named-tuples.scala:11:45 --------------------------------------------------------------------------- -11 | type Illformed2 = (name: String, age: Int, name: Boolean) // error +-- Error: tests/neg/named-tuples.scala:12:45 --------------------------------------------------------------------------- +12 | type Illformed2 = (name: String, age: Int, name: Boolean) // error | ^^^^^^^^^^^^^ | Duplicate tuple element name --- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:19:20 ------------------------------------------------------ -19 | val _: NameOnly = person // error +-- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:20:20 ------------------------------------------------------ +20 | val _: NameOnly = person // error | ^^^^^^ | Found: (Test.person : (name : String, age : Int)) | Required: Test.NameOnly | | longer explanation available when compiling with `-explain` --- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:20:18 ------------------------------------------------------ -20 | val _: Person = nameOnly // error +-- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:21:18 ------------------------------------------------------ +21 | val _: Person = nameOnly // error | ^^^^^^^^ | Found: (Test.nameOnly : (name : String)) | Required: Test.Person | | longer explanation available when compiling with `-explain` --- [E172] Type Error: tests/neg/named-tuples.scala:21:41 --------------------------------------------------------------- -21 | val _: Person = (name = "") ++ nameOnly // error +-- [E172] Type Error: tests/neg/named-tuples.scala:22:41 --------------------------------------------------------------- +22 | val _: Person = (name = "") ++ nameOnly // error | ^ | Cannot prove that Tuple.Disjoint[Tuple1[("name" : String)], Tuple1[("name" : String)]] =:= (true : Boolean). 
--- [E008] Not Found Error: tests/neg/named-tuples.scala:22:9 ----------------------------------------------------------- -22 | person._1 // error +-- [E008] Not Found Error: tests/neg/named-tuples.scala:23:9 ----------------------------------------------------------- +23 | person._1 // error | ^^^^^^^^^ | value _1 is not a member of (name : String, age : Int) --- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:24:36 ------------------------------------------------------ -24 | val _: (age: Int, name: String) = person // error +-- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:25:36 ------------------------------------------------------ +25 | val _: (age: Int, name: String) = person // error | ^^^^^^ | Found: (Test.person : (name : String, age : Int)) | Required: (age : Int, name : String) | | longer explanation available when compiling with `-explain` --- Error: tests/neg/named-tuples.scala:26:17 --------------------------------------------------------------------------- -26 | val (name = x, agee = y) = person // error +-- Error: tests/neg/named-tuples.scala:27:17 --------------------------------------------------------------------------- +27 | val (name = x, agee = y) = person // error | ^^^^^^^^ | No element named `agee` is defined in selector type (name : String, age : Int) --- Error: tests/neg/named-tuples.scala:29:10 --------------------------------------------------------------------------- -29 | case (name = n, age = a) => () // error // error +-- Error: tests/neg/named-tuples.scala:30:10 --------------------------------------------------------------------------- +30 | case (name = n, age = a) => () // error // error | ^^^^^^^^ | No element named `name` is defined in selector type (String, Int) --- Error: tests/neg/named-tuples.scala:29:20 --------------------------------------------------------------------------- -29 | case (name = n, age = a) => () // error // error +-- Error: tests/neg/named-tuples.scala:30:20 --------------------------------------------------------------------------- +30 | case (name = n, age = a) => () // error // error | ^^^^^^^ | No element named `age` is defined in selector type (String, Int) --- [E172] Type Error: tests/neg/named-tuples.scala:31:27 --------------------------------------------------------------- -31 | val pp = person ++ (1, 2) // error +-- [E172] Type Error: tests/neg/named-tuples.scala:32:27 --------------------------------------------------------------- +32 | val pp = person ++ (1, 2) // error | ^ | Cannot prove that Tuple.Disjoint[(("name" : String), ("age" : String)), Tuple] =:= (true : Boolean). --- [E172] Type Error: tests/neg/named-tuples.scala:34:18 --------------------------------------------------------------- -34 | person ++ (1, 2) match // error +-- [E172] Type Error: tests/neg/named-tuples.scala:35:18 --------------------------------------------------------------- +35 | person ++ (1, 2) match // error | ^ | Cannot prove that Tuple.Disjoint[(("name" : String), ("age" : String)), Tuple] =:= (true : Boolean). 
--- Error: tests/neg/named-tuples.scala:37:17 --------------------------------------------------------------------------- -37 | val bad = ("", age = 10) // error +-- Error: tests/neg/named-tuples.scala:38:17 --------------------------------------------------------------------------- +38 | val bad = ("", age = 10) // error | ^^^^^^^^ | Illegal combination of named and unnamed tuple elements --- Error: tests/neg/named-tuples.scala:40:20 --------------------------------------------------------------------------- -40 | case (name = n, age) => () // error +-- Error: tests/neg/named-tuples.scala:41:20 --------------------------------------------------------------------------- +41 | case (name = n, age) => () // error | ^^^ | Illegal combination of named and unnamed tuple elements --- Error: tests/neg/named-tuples.scala:41:16 --------------------------------------------------------------------------- -41 | case (name, age = a) => () // error +-- Error: tests/neg/named-tuples.scala:42:16 --------------------------------------------------------------------------- +42 | case (name, age = a) => () // error | ^^^^^^^ | Illegal combination of named and unnamed tuple elements --- Error: tests/neg/named-tuples.scala:44:10 --------------------------------------------------------------------------- -44 | case (age = x) => // error +-- Error: tests/neg/named-tuples.scala:45:10 --------------------------------------------------------------------------- +45 | case (age = x) => // error | ^^^^^^^ | No element named `age` is defined in selector type Tuple --- [E172] Type Error: tests/neg/named-tuples.scala:46:27 --------------------------------------------------------------- -46 | val p2 = person ++ person // error +-- [E172] Type Error: tests/neg/named-tuples.scala:47:27 --------------------------------------------------------------- +47 | val p2 = person ++ person // error | ^ |Cannot prove that Tuple.Disjoint[(("name" : String), ("age" : String)), (("name" : String), ("age" : String))] =:= (true : Boolean). --- [E172] Type Error: tests/neg/named-tuples.scala:47:43 --------------------------------------------------------------- -47 | val p3 = person ++ (first = 11, age = 33) // error +-- [E172] Type Error: tests/neg/named-tuples.scala:48:43 --------------------------------------------------------------- +48 | val p3 = person ++ (first = 11, age = 33) // error | ^ |Cannot prove that Tuple.Disjoint[(("name" : String), ("age" : String)), (("first" : String), ("age" : String))] =:= (true : Boolean). --- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:49:22 ------------------------------------------------------ -49 | val p5 = person.zip((first = 11, age = 33)) // error +-- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:50:22 ------------------------------------------------------ +50 | val p5 = person.zip((first = 11, age = 33)) // error | ^^^^^^^^^^^^^^^^^^^^^^ | Found: (first : Int, age : Int) | Required: NamedTuple.NamedTuple[(("name" : String), ("age" : String)), Tuple] | | longer explanation available when compiling with `-explain` --- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:60:32 ------------------------------------------------------ -60 | val typo: (name: ?, age: ?) = (name = "he", ag = 1) // error +-- [E007] Type Mismatch Error: tests/neg/named-tuples.scala:61:32 ------------------------------------------------------ +61 | val typo: (name: ?, age: ?) = (name = "he", ag = 1) // error | ^^^^^^^^^^^^^^^^^^^^^ | Found: (name : String, ag : Int) | Required: (name : ?, age : ?) 
diff --git a/tests/neg/named-tuples.scala b/tests/neg/named-tuples.scala index daae6e26bac2..8f78f7915206 100644 --- a/tests/neg/named-tuples.scala +++ b/tests/neg/named-tuples.scala @@ -1,6 +1,7 @@ import annotation.experimental +import language.experimental.namedTuples -object Test: +@experimental object Test: type Person = (name: String, age: Int) val person = (name = "Bob", age = 33): (name: String, age: Int) diff --git a/tests/new/test.scala b/tests/new/test.scala index dc1891f3525c..18644422ab06 100644 --- a/tests/new/test.scala +++ b/tests/new/test.scala @@ -1,3 +1,5 @@ +import language.experimental.namedTuples + type Person = (name: String, age: Int) trait A: diff --git a/tests/pos/fieldsOf.scala b/tests/pos/fieldsOf.scala index 08f20a1f7e8e..2594dae2cbf7 100644 --- a/tests/pos/fieldsOf.scala +++ b/tests/pos/fieldsOf.scala @@ -1,3 +1,5 @@ +import language.experimental.namedTuples + case class Person(name: String, age: Int) type PF = NamedTuple.From[Person] diff --git a/tests/pos/i20377.scala b/tests/pos/i20377.scala index a555e01867ab..661fa7adfca9 100644 --- a/tests/pos/i20377.scala +++ b/tests/pos/i20377.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples import NamedTuple.{NamedTuple, AnyNamedTuple} // Repros for bugs or questions diff --git a/tests/pos/i21300.scala b/tests/pos/i21300.scala index e7c7965b0e9a..22859482ef98 100644 --- a/tests/pos/i21300.scala +++ b/tests/pos/i21300.scala @@ -1,15 +1,17 @@ +import scala.language.experimental.namedTuples + class Test[S <: String & Singleton](name: S): type NT = NamedTuple.NamedTuple[(S, "foo"), (Int, Long)] def nt: NT = ??? type Name = S - + type NT2 = NamedTuple.NamedTuple[(Name, "foo"), (Int, Long)] def nt2: NT2 = ??? def test = val foo = new Test("bar") - + foo.nt.bar foo.nt2.bar diff --git a/tests/pos/i21413.scala b/tests/pos/i21413.scala index d2dc52e34630..72b5c6d59d8d 100644 --- a/tests/pos/i21413.scala +++ b/tests/pos/i21413.scala @@ -1,2 +1,4 @@ +import scala.language.experimental.namedTuples + val x = (aaa = 1).aaa //val y = x.aaa \ No newline at end of file diff --git a/tests/pos/named-tuple-combinators.scala b/tests/pos/named-tuple-combinators.scala index c027ba688d02..a5134b2e7d26 100644 --- a/tests/pos/named-tuple-combinators.scala +++ b/tests/pos/named-tuple-combinators.scala @@ -1,3 +1,4 @@ +import scala.language.experimental.namedTuples object Test: // original code from issue https://github.com/scala/scala3/issues/20427 diff --git a/tests/pos/named-tuple-selectable.scala b/tests/pos/named-tuple-selectable.scala index 0e1324f70ae6..be5f0400e58c 100644 --- a/tests/pos/named-tuple-selectable.scala +++ b/tests/pos/named-tuple-selectable.scala @@ -1,3 +1,4 @@ +import scala.language.experimental.namedTuples class FromFields extends Selectable: type Fields = (xs: List[Int], poly: [T] => (x: List[T]) => Option[T]) diff --git a/tests/pos/named-tuple-selections.scala b/tests/pos/named-tuple-selections.scala index 7b73daad2e72..c3569f21b323 100644 --- a/tests/pos/named-tuple-selections.scala +++ b/tests/pos/named-tuple-selections.scala @@ -1,3 +1,4 @@ +import scala.language.experimental.namedTuples object Test1: // original code from issue https://github.com/scala/scala3/issues/20439 diff --git a/tests/pos/named-tuple-unstable.scala b/tests/pos/named-tuple-unstable.scala index d15bdc578a3a..6a6a36732a14 100644 --- a/tests/pos/named-tuple-unstable.scala +++ b/tests/pos/named-tuple-unstable.scala @@ -1,3 +1,4 @@ +import scala.language.experimental.namedTuples import NamedTuple.{AnyNamedTuple, NamedTuple} 
trait Foo extends Selectable: diff --git a/tests/pos/named-tuple-widen.scala b/tests/pos/named-tuple-widen.scala index cc12a5f09b16..410832e04c17 100644 --- a/tests/pos/named-tuple-widen.scala +++ b/tests/pos/named-tuple-widen.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples class A class B diff --git a/tests/pos/named-tuples-ops-mirror.scala b/tests/pos/named-tuples-ops-mirror.scala index b8745cf785d5..f66eb89534fb 100644 --- a/tests/pos/named-tuples-ops-mirror.scala +++ b/tests/pos/named-tuples-ops-mirror.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples import NamedTuple.* @FailsWith[HttpError] diff --git a/tests/pos/named-tuples1.scala b/tests/pos/named-tuples1.scala index 532f1df7efd4..58e3fc065e61 100644 --- a/tests/pos/named-tuples1.scala +++ b/tests/pos/named-tuples1.scala @@ -1,4 +1,5 @@ import annotation.experimental +import language.experimental.namedTuples @main def Test = val bob = (name = "Bob", age = 33): (name: String, age: Int) diff --git a/tests/pos/namedtuple-src-incompat.scala b/tests/pos/namedtuple-src-incompat.scala index 76eb5e4aa850..57451a4321b7 100644 --- a/tests/pos/namedtuple-src-incompat.scala +++ b/tests/pos/namedtuple-src-incompat.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples var age = 22 val x = (age = 1) val _: (age: Int) = x diff --git a/tests/pos/tuple-ops.scala b/tests/pos/tuple-ops.scala index e89c0e8e51aa..739b1ebeeb02 100644 --- a/tests/pos/tuple-ops.scala +++ b/tests/pos/tuple-ops.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples import Tuple.* def test = diff --git a/tests/rewrites/ambigious-named-tuple-assignment.check b/tests/rewrites/ambigious-named-tuple-assignment.check new file mode 100644 index 000000000000..00e6cc4112f1 --- /dev/null +++ b/tests/rewrites/ambigious-named-tuple-assignment.check @@ -0,0 +1,19 @@ +import scala.language.experimental.namedTuples + +object i21770: + def f(g: Int => Unit) = g(0) + var cache: Option[Int] = None + f(i => {cache = Some(i)}) + +object i21861: + var age: Int = 28 + { + age = 29 + } + + +object i21861c: + def age: Int = ??? + def age_=(x: Int): Unit = () + age = 29 + { age = 29 } diff --git a/tests/rewrites/ambigious-named-tuple-assignment.scala b/tests/rewrites/ambigious-named-tuple-assignment.scala new file mode 100644 index 000000000000..e9685b7b58cf --- /dev/null +++ b/tests/rewrites/ambigious-named-tuple-assignment.scala @@ -0,0 +1,19 @@ +import scala.language.experimental.namedTuples + +object i21770: + def f(g: Int => Unit) = g(0) + var cache: Option[Int] = None + f(i => (cache = Some(i))) + +object i21861: + var age: Int = 28 + ( + age = 29 + ) + + +object i21861c: + def age: Int = ??? 
+ def age_=(x: Int): Unit = () + age = 29 + ( age = 29 ) diff --git a/tests/rewrites/infix-named-args.check b/tests/rewrites/infix-named-args.check index a50593ef18a8..5f59cf272ba1 100644 --- a/tests/rewrites/infix-named-args.check +++ b/tests/rewrites/infix-named-args.check @@ -1,3 +1,5 @@ +import scala.language.experimental.namedTuples + class C: def multi(x: Int, y: Int): Int = x + y def **(x: Int, y: Int): Int = x + y @@ -12,4 +14,4 @@ class D(d: Int): def f = this.**(x = 2) def g = this ** 2 def h = this ** ((x = 2)) - def i = this.**(x = (1 + 1)) \ No newline at end of file + def i = this.**(x = (1 + 1)) diff --git a/tests/rewrites/infix-named-args.scala b/tests/rewrites/infix-named-args.scala index bcdf4a21a9d2..a954776a9104 100644 --- a/tests/rewrites/infix-named-args.scala +++ b/tests/rewrites/infix-named-args.scala @@ -1,3 +1,5 @@ +import scala.language.experimental.namedTuples + class C: def multi(x: Int, y: Int): Int = x + y def **(x: Int, y: Int): Int = x + y diff --git a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala index 7a8dcb9bd2df..65e3a730ee7e 100644 --- a/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala +++ b/tests/run-tasty-inspector/stdlibExperimentalDefinitions.scala @@ -77,6 +77,12 @@ val experimentalDefinitionInLibrary = Set( // New feature: fromNullable for explicit nulls "scala.Predef$.fromNullable", + // New feature: named tuples + "scala.NamedTuple", + "scala.NamedTuple$", + "scala.NamedTupleDecomposition", + "scala.NamedTupleDecomposition$", + // New feature: modularity "scala.Precise", "scala.annotation.internal.WitnessNames", diff --git a/tests/run/named-patmatch.scala b/tests/run/named-patmatch.scala index 6fe1934f008e..e62497e4aa8f 100644 --- a/tests/run/named-patmatch.scala +++ b/tests/run/named-patmatch.scala @@ -1,4 +1,5 @@ import annotation.experimental +import language.experimental.namedTuples @main def Test = locally: diff --git a/tests/run/named-patterns.scala b/tests/run/named-patterns.scala index e92bbf751c22..7c24dc8d683a 100644 --- a/tests/run/named-patterns.scala +++ b/tests/run/named-patterns.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples object Test1: class Person(val name: String, val age: Int) diff --git a/tests/run/named-tuple-ops.scala b/tests/run/named-tuple-ops.scala index 8c6db6f2fa1c..076ab5028c6c 100644 --- a/tests/run/named-tuple-ops.scala +++ b/tests/run/named-tuple-ops.scala @@ -1,4 +1,5 @@ //> using options -source future +import language.experimental.namedTuples import scala.compiletime.asMatchable type City = (name: String, zip: Int, pop: Int) diff --git a/tests/run/named-tuples-xxl.scala b/tests/run/named-tuples-xxl.scala index 8c831fb1d223..3a0a1e5e1294 100644 --- a/tests/run/named-tuples-xxl.scala +++ b/tests/run/named-tuples-xxl.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples import NamedTuple.toTuple type Person = ( diff --git a/tests/run/named-tuples.scala b/tests/run/named-tuples.scala index c99393a403b3..406c6195cf0f 100644 --- a/tests/run/named-tuples.scala +++ b/tests/run/named-tuples.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples import NamedTuple.* type Person = (name: String, age: Int) diff --git a/tests/run/tyql.scala b/tests/run/tyql.scala index ee3fd1138265..8fe253b559ac 100644 --- a/tests/run/tyql.scala +++ b/tests/run/tyql.scala @@ -1,3 +1,4 @@ +import language.experimental.namedTuples import NamedTuple.{NamedTuple, AnyNamedTuple} /* This is a demonstrator that shows 
how to map regular for expressions to diff --git a/tests/warn/21681.check b/tests/warn/21681.check index e86ce4e36134..adf3586e6e0b 100644 --- a/tests/warn/21681.check +++ b/tests/warn/21681.check @@ -1,7 +1,8 @@ --- [E203] Syntax Migration Warning: tests/warn/21681.scala:3:2 --------------------------------------------------------- -3 | (age = 29) // warn +-- [E203] Syntax Migration Warning: tests/warn/21681.scala:5:2 --------------------------------------------------------- +5 | (age = 29) // warn | ^^^^^^^^^^ - | Ambiguous syntax: this is interpreted as a named tuple with one element, + | Deprecated syntax: in the future it would be interpreted as a named tuple with one element, | not as an assignment. | | To assign a value, use curly braces: `{age = 29}`. + | This can be rewritten automatically under -rewrite -source 3.6-migration. diff --git a/tests/warn/21681.scala b/tests/warn/21681.scala index 76a19c96e1cb..67f45571ecf6 100644 --- a/tests/warn/21681.scala +++ b/tests/warn/21681.scala @@ -1,3 +1,5 @@ +import scala.language.experimental.namedTuples + def main() = var age: Int = 28 (age = 29) // warn diff --git a/tests/warn/21681b.check b/tests/warn/21681b.check index 32760e00ebb6..09c007f351b4 100644 --- a/tests/warn/21681b.check +++ b/tests/warn/21681b.check @@ -1,7 +1,8 @@ --- [E203] Syntax Migration Warning: tests/warn/21681b.scala:3:2 -------------------------------------------------------- -3 | (age = 29) // warn +-- [E203] Syntax Migration Warning: tests/warn/21681b.scala:5:2 -------------------------------------------------------- +5 | (age = 29) // warn | ^^^^^^^^^^ - | Ambiguous syntax: this is interpreted as a named tuple with one element, + | Deprecated syntax: in the future it would be interpreted as a named tuple with one element, | not as an assignment. | | To assign a value, use curly braces: `{age = 29}`. + | This can be rewritten automatically under -rewrite -source 3.6-migration. diff --git a/tests/warn/21681b.scala b/tests/warn/21681b.scala index 710d69b0dd23..44d04fc98aad 100644 --- a/tests/warn/21681b.scala +++ b/tests/warn/21681b.scala @@ -1,3 +1,5 @@ +import scala.language.experimental.namedTuples + object Test: var age: Int = 28 (age = 29) // warn diff --git a/tests/warn/21681c.check b/tests/warn/21681c.check index 11c427f87cfe..20273f723384 100644 --- a/tests/warn/21681c.check +++ b/tests/warn/21681c.check @@ -1,7 +1,8 @@ --- [E203] Syntax Migration Warning: tests/warn/21681c.scala:5:2 -------------------------------------------------------- -5 | (age = 29) // warn +-- [E203] Syntax Migration Warning: tests/warn/21681c.scala:7:2 -------------------------------------------------------- +7 | (age = 29) // warn | ^^^^^^^^^^ - | Ambiguous syntax: this is interpreted as a named tuple with one element, + | Deprecated syntax: in the future it would be interpreted as a named tuple with one element, | not as an assignment. | | To assign a value, use curly braces: `{age = 29}`. + | This can be rewritten automatically under -rewrite -source 3.6-migration. diff --git a/tests/warn/21681c.scala b/tests/warn/21681c.scala index 5e2eae11708c..a0c361382a54 100644 --- a/tests/warn/21681c.scala +++ b/tests/warn/21681c.scala @@ -1,3 +1,5 @@ +import scala.language.experimental.namedTuples + object Test: def age: Int = ??? 
def age_=(x: Int): Unit = () diff --git a/tests/warn/21770.check b/tests/warn/21770.check index 0899f11d6ca5..7853d77a423c 100644 --- a/tests/warn/21770.check +++ b/tests/warn/21770.check @@ -1,7 +1,8 @@ --- [E203] Syntax Migration Warning: tests/warn/21770.scala:5:9 --------------------------------------------------------- -5 | f(i => (cache = Some(i))) // warn +-- [E203] Syntax Migration Warning: tests/warn/21770.scala:7:9 --------------------------------------------------------- +7 | f(i => (cache = Some(i))) // warn | ^^^^^^^^^^^^^^^^^ - | Ambiguous syntax: this is interpreted as a named tuple with one element, + | Deprecated syntax: in the future it would be interpreted as a named tuple with one element, | not as an assignment. | | To assign a value, use curly braces: `{cache = Some(i)}`. + | This can be rewritten automatically under -rewrite -source 3.6-migration. diff --git a/tests/warn/21770.scala b/tests/warn/21770.scala index 9696a31d6ba8..8ee5b52e7b3f 100644 --- a/tests/warn/21770.scala +++ b/tests/warn/21770.scala @@ -1,5 +1,7 @@ +import scala.language.experimental.namedTuples + def f(g: Int => Unit) = g(0) -def test = +def test = var cache: Option[Int] = None f(i => (cache = Some(i))) // warn diff --git a/tests/warn/infix-named-args-migration.scala b/tests/warn/infix-named-args-migration.scala index df4bfb50271c..361004f08f13 100644 --- a/tests/warn/infix-named-args-migration.scala +++ b/tests/warn/infix-named-args-migration.scala @@ -1,4 +1,6 @@ //> using options -source:3.6-migration +import scala.language.experimental.namedTuples + class C: def f = 42 + (x = 1) // warn // interpreted as 42.+(x = 1) under migration, x is a valid synthetic parameter name def multi(x: Int, y: Int): Int = x + y From de83e5616fe0731e77abfff2f89ddb36045f0074 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 9 Dec 2024 19:40:12 +0100 Subject: [PATCH 339/371] Backport "Fix CLA checks after domain change of CLA check server" to 3.6 (#22175) Backports #22148 to the 3.6.3. PR submitted by the release tooling. 
--- project/scripts/check-cla.sh | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/project/scripts/check-cla.sh b/project/scripts/check-cla.sh index e4e489830f11..dbb148d3c652 100755 --- a/project/scripts/check-cla.sh +++ b/project/scripts/check-cla.sh @@ -5,16 +5,16 @@ echo "Pull request submitted by $AUTHOR"; if [[ "$AUTHOR" == "github-actions[bot]" || "$AUTHOR" == "dependabot[bot]" ]] ; then echo "CLA check for $AUTHOR successful"; else - signed=$(curl -s "https://www.lightbend.com/contribute/cla/scala/check/$AUTHOR" | jq -r ".signed"); + signed=$(curl -L -s "https://contribute.akka.io/contribute/cla/scala/check/$AUTHOR" | jq -r ".signed"); if [ "$signed" = "true" ] ; then echo "CLA check for $AUTHOR successful"; else echo "CLA check for $AUTHOR failed"; echo "Please sign the Scala CLA to contribute to the Scala compiler."; - echo "Go to https://www.lightbend.com/contribute/cla/scala and then"; + echo "Go to https://contribute.akka.io/contribute/cla/scala and then"; echo "comment on the pull request to ask for a new check."; echo ""; - echo "Check if CLA is signed: https://www.lightbend.com/contribute/cla/scala/check/$AUTHOR"; + echo "Check if CLA is signed: https://contribute.akka.io/contribute/cla/scala/check/$AUTHOR"; exit 1; fi; fi; From 954fd3401cfdfd09251b601ecdfb0ee393fe2121 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 9 Dec 2024 19:41:55 +0100 Subject: [PATCH 340/371] Replace deprecated actions/create-release and `actions/upload-release-asset` with `gh release create` command --- .github/workflows/ci.yaml | 323 +++----------------------------------- 1 file changed, 21 insertions(+), 302 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index 303922719b5b..e842aeab229b 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -728,7 +728,7 @@ jobs: publish_release: permissions: - contents: write # for actions/create-release to create a release + contents: write # for GH CLI to create a release runs-on: [self-hosted, Linux] container: image: lampepfl/dotty:2024-10-18 @@ -770,6 +770,7 @@ jobs: - name: Add SBT proxy repositories run: cp -vf .github/workflows/repositories /root/.sbt/ ; true + # Extract the release tag - name: Extract the release tag run : echo "RELEASE_TAG=${GITHUB_REF#*refs/tags/}" >> $GITHUB_ENV @@ -828,311 +829,29 @@ jobs: mv scala.msi "${msiInstaller}" sha256sum "${msiInstaller}" > "${msiInstaller}.sha256" + - name: Install GH CLI + uses: dev-hanz-ops/install-gh-cli-action@v0.2.0 + with: + gh-cli-version: 2.59.0 + # Create the GitHub release - name: Create GitHub Release - id: create_gh_release - uses: actions/create-release@latest env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token - with: - tag_name: ${{ github.ref }} - release_name: ${{ github.ref }} - body_path: ./changelogs/${{ env.RELEASE_TAG }}.md - draft: true - prerelease: ${{ contains(env.RELEASE_TAG, '-') }} - - # The following upload steps are generated using template: - # val baseFileName = "scala3-${{ env.RELEASE_TAG }}" - # def upload(kind: String, path: String, contentType: String, distribution: String) = - # s"""- name: Upload $kind to GitHub Release ($distribution) - # uses: actions/upload-release-asset@v1 - # env: - # GITHUB_TOKEN: $${{ secrets.GITHUB_TOKEN }} - # with: - # upload_url: $${{ steps.create_gh_release.outputs.upload_url }} - # asset_path: ./${path} - # asset_name: ${path} - # asset_content_type: ${contentType}""" - # def 
uploadSDK(distribution: String, suffix: String) = - # val filename = s"${baseFileName}${suffix}" - # s""" - # # $distribution - # ${upload("zip archive", s"$filename.zip", "application/zip", distribution)} - # ${upload("zip archive SHA", s"$filename.zip.sha256", "text/plain", distribution)} - # ${upload("tar.gz archive", s"$filename.tar.gz", "application/gzip", distribution)} - # ${upload("tar.gz archive SHA", s"$filename.tar.gz.sha256", "text/plain", distribution)} - # """ - # def uploadMSI() = - # val distribution = "Windows x86_64 MSI" - # s""" - # # $distribution - # ${upload(".msi file", s"${baseFileName}.msi", "application/x-msi", distribution)} - # ${upload(".msi file SHA", s"${baseFileName}.msi.sha256", "text/plain", distribution)} - # """ - # @main def gen = - # Seq( - # uploadSDK("Universal", ""), - # uploadSDK("Linux x86-64", "-x86_64-pc-linux"), - # uploadSDK("Linux aarch64", "-aarch64-pc-linux"), - # uploadSDK("Mac x86-64", "-x86_64-apple-darwin"), - # uploadSDK("Mac aarch64", "-aarch64-apple-darwin"), - # uploadSDK("Windows x86_64", "-x86_64-pc-win32"), - # uploadMSI() - # ).foreach(println) - - # Universal - - name: Upload zip archive to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.zip - asset_name: scala3-${{ env.RELEASE_TAG }}.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}.tar.gz.sha256 - asset_content_type: text/plain - - - # Linux x86-64 - - name: Upload zip archive to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - env: - 
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz.sha256 - asset_content_type: text/plain - - - # Linux aarch64 - - name: Upload zip archive to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz.sha256 - asset_content_type: text/plain - - - # Mac x86-64 - - name: Upload zip archive to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz 
- asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz.sha256 - asset_content_type: text/plain - - - # Mac aarch64 - - name: Upload zip archive to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz.sha256 - asset_content_type: text/plain - - - # Windows x86_64 - - name: Upload zip archive to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ 
steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz.sha256 - asset_content_type: text/plain - - - # Windows x86_64 MSI - - name: Upload .msi file to GitHub Release (Windows x86_64 MSI) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.msi - asset_name: scala3-${{ env.RELEASE_TAG }}.msi - asset_content_type: application/x-msi - - name: Upload .msi file SHA to GitHub Release (Windows x86_64 MSI) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.msi.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}.msi.sha256 - asset_content_type: text/plain + shell: bash + run: | + git config --global --add safe.directory /__w/scala3/scala3 + gh release create \ + --draft \ + --title "${{ env.RELEASE_TAG }}" \ + --notes-file ./changelogs/${{ env.RELEASE_TAG }}.md \ + --latest=${{ !contains(env.RELEASE_TAG, '-RC') }} \ + --prerelease=${{ contains(env.RELEASE_TAG, '-RC') }} \ + --verify-tag ${{ env.RELEASE_TAG }} \ + scala3-${{ env.RELEASE_TAG }}*.zip \ + scala3-${{ env.RELEASE_TAG }}*.tar.gz \ + scala3-${{ env.RELEASE_TAG }}*.sha256 \ + scala3-${{ env.RELEASE_TAG }}.msi - name: Publish Release run: ./project/scripts/sbtPublish ";project scala3-bootstrapped ;publishSigned ;sonatypeBundleUpload" From 6b8d86f9734d29f450a8b7dc7ff0ae8e1bc5cc42 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 9 Dec 2024 21:51:06 +0100 Subject: [PATCH 341/371] Add changelog for 3.6.3-RC1 --- changelogs/3.6.3-RC1.md | 179 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 179 insertions(+) create mode 100644 changelogs/3.6.3-RC1.md diff --git a/changelogs/3.6.3-RC1.md b/changelogs/3.6.3-RC1.md new file mode 100644 index 000000000000..201201fbf1bc --- /dev/null +++ b/changelogs/3.6.3-RC1.md @@ -0,0 +1,179 @@ +# Highlights of the release + +- Scala 2 forwardport: `-Yprofile-trace` [#19897](https://github.com/scala/scala3/pull/19897) + +# Other changes and fixes + +## Annotations + +- Fix Java parsing of annotations on qualified types [#21867](https://github.com/scala/scala3/pull/21867) +- Consider all arguments in Annotations.refersToParamOf [#22001](https://github.com/scala/scala3/pull/22001) + +## Backend + +- Flag class file collision as error [#19332](https://github.com/scala/scala3/pull/19332) + +## Compiler Phases + +- Fix #21939: Update names and descriptions for cc and setup phases [#21942](https://github.com/scala/scala3/pull/21942) + +## Experimental: Explicit Nulls + +- Improve warning for wildcard matching only null under the explicit nulls flag (scala#21577) [#21623](https://github.com/scala/scala3/pull/21623) +- Fix warning message for matching on redundant nulls [#21850](https://github.com/scala/scala3/pull/21850) + +## Experimental: Capture Checking + +- Fix #21868, #21869, and #21870: handle CapsOf in more places [#21875](https://github.com/scala/scala3/pull/21875) +- Consolidate CC [#21863](https://github.com/scala/scala3/pull/21863) +- Add path support for capture checking [#21445](https://github.com/scala/scala3/pull/21445) + +## Experimentals + +- Replace symbol traversal with tree traversal when finding 
top level experimentals [#21827](https://github.com/scala/scala3/pull/21827) + +## Extension Methods + +- Nowarn extension matching nonpublic member [#21825](https://github.com/scala/scala3/pull/21825) + +## Implicits + +- Apply implicit conversion from derived Conversion instance defined as implicit rather than given [#21785](https://github.com/scala/scala3/pull/21785) + +## Imports + +- Allow imports nested in packagings to shadow [#21539](https://github.com/scala/scala3/pull/21539) + +## Inline + +- Avoid using the current denotation in NamedType.disambiguate [#21414](https://github.com/scala/scala3/pull/21414) +- Drop phase.isTyper use in isLegalPrefix/asf [#21954](https://github.com/scala/scala3/pull/21954) +- Fix for macro annotation that resolves macro-based implicit crashing the compiler [#20353](https://github.com/scala/scala3/pull/20353) +- Allow macro annotations to recover from suspension [#21969](https://github.com/scala/scala3/pull/21969) + +## Linting + +- Disallow open modifier on objects [#21922](https://github.com/scala/scala3/pull/21922) +- Allow discarding "Discarded non-Unit" warnings with `: Unit` [#21927](https://github.com/scala/scala3/pull/21927) + +## Opaque Types + +- Fix pkg obj prefix of opaque tp ext meth [#21527](https://github.com/scala/scala3/pull/21527) + +## Parser + +- Fix: don't consider `into` as a soft-modifier [#21924](https://github.com/scala/scala3/pull/21924) + +## Pattern Matching + +- Drop inaccessible subclasses from refineUsingParent [#21799](https://github.com/scala/scala3/pull/21799) +- (Re-)Drop inaccessible subclasses from refineUsingParent [#21930](https://github.com/scala/scala3/pull/21930) +- Fix use of class terms in match analysis [#21848](https://github.com/scala/scala3/pull/21848) +- Don't project nested wildcard patterns to nullable [#21934](https://github.com/scala/scala3/pull/21934) +- Fix provablyDisjoint handling enum constants with mixins [#21876](https://github.com/scala/scala3/pull/21876) +- Do not consider uninhabited constructors when performing exhaustive match checking [#21750](https://github.com/scala/scala3/pull/21750) + +## Presentation Compiler + +- Update mtags to 1.4.1 and backport remaining changes [#21859](https://github.com/scala/scala3/pull/21859) +- Backport changes for the presentation compiler from Metals [#21756](https://github.com/scala/scala3/pull/21756) + +## Pickling + +- Avoid orphan param from default arg [#21824](https://github.com/scala/scala3/pull/21824) +- Make sure definition tree has the defined symbol [#21851](https://github.com/scala/scala3/pull/21851) + +## REPL + +- Allow top-level opaque type definitions in REPL [#21753](https://github.com/scala/scala3/pull/21753) + +## Scaladoc + +- Fix scaladoc TastyInspector regressions [#21716](https://github.com/scala/scala3/pull/21716) +- Bring back the fix for scaladoc TastyInspector regressions [#21929](https://github.com/scala/scala3/pull/21929) + +## Standard Library + +- Combine cases of `Tuple.Zip` disjoint from `(h1 *: t1, h2 *: t2)` [#21287](https://github.com/scala/scala3/pull/21287) + +## Quotes + +- Fix #20471: owners of top-level symbols in cached quoted code being incorrect [#21945](https://github.com/scala/scala3/pull/21945) + +## Reporting + +- Do not warn about expected missing positions in quotes.reflect.Symbol [#21677](https://github.com/scala/scala3/pull/21677) +- Add missing error messages to 
asserts in QuotesImpl [#21852](https://github.com/scala/scala3/pull/21852) +- Don't point to the compiler backlog when a compiler plugin phase crashes [#21887](https://github.com/scala/scala3/pull/21887) +- Better error message for polytypes wrapping capturing types [#21843](https://github.com/scala/scala3/pull/21843) +- Pretty-print lambdas [#21846](https://github.com/scala/scala3/pull/21846) + +## Scala.js + +- Shade scalajs.ir under dotty.tools [#21765](https://github.com/scala/scala3/pull/21765) + +## Scaladoc + +- Fix scaladoc graph highlight background color in dark mode [#21814](https://github.com/scala/scala3/pull/21814) + +## SemanticDB + +- Extract semanticDB for lifted definitions [#21856](https://github.com/scala/scala3/pull/21856) + +## Transform + +- Fix enclosingClass from returning refinement classes [#21411](https://github.com/scala/scala3/pull/21411) +- Attempt to beta reduce only if parameters and arguments have same shape [#21970](https://github.com/scala/scala3/pull/21970) +- Drop copied parent refinements before generating bytecode [#21733](https://github.com/scala/scala3/pull/21733) + +## Tooling + +- Ensure to escape characters before constructing JSON profile trace [#21872](https://github.com/scala/scala3/pull/21872) + +## Tuples + +- Fix tupleTypeFromSeq for XXL tuples [#21782](https://github.com/scala/scala3/pull/21782) + +## Typer + +- Do not crash when typing a closure with unknown type, since it can occur for erroneous input [#21178](https://github.com/scala/scala3/pull/21178) +- Revert SAM condition to what it was before [#21684](https://github.com/scala/scala3/pull/21684) +- Fix ctx implicits under case unapplySeq [#21748](https://github.com/scala/scala3/pull/21748) +- Avoid erasure/preErasure issues around Any in transformIsInstanceOf [#21647](https://github.com/scala/scala3/pull/21647) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.6.2..3.6.3-RC1` these are: + +``` + 30 Dale Wijnand + 30 Kacper Korban + 27 Wojciech Mazur + 14 noti0na1 + 10 Eugene Flesselle + 10 Hamza Remmal + 10 HarrisL2 + 9 Martin Odersky + 8 Matt Bovel + 7 Jan Chyb + 6 Tomasz Godzik + 4 Jamie Thompson + 2 Friendseeker + 2 Pascal Weisenburger + 2 Seth Tisue + 2 Sébastien Doeraene + 1 Adrien Piquerez + 1 Alden Torres + 1 Alexander + 1 Fengyun Liu + 1 Georgi Krastev + 1 Jentsch + 1 Lunfu Zhong + 1 Michał Pałka + 1 Natsu Kagami + 1 dependabot[bot] + 1 friendseeker + 1 tgodzik +``` From 0bcc325c2f220421fd78b015d63504befa8d95f1 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 9 Dec 2024 22:36:55 +0100 Subject: [PATCH 342/371] Release 3.6.3-RC1 --- project/Build.scala | 4 ++-- tasty/src/dotty/tools/tasty/TastyFormat.scala | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index cf582762054a..5ac59067e4c9 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -97,7 +97,7 @@ object Build { * - In release branch it should be the last stable release * 3.6.0-RC1 was released as 3.6.0 - it's having and experimental TASTy version */ - val referenceVersion = "3.6.0" + val referenceVersion = "3.6.2" /** Version of the Scala compiler targeted in the current release cycle * Contains a version without RC/SNAPSHOT/NIGHTLY specific suffixes @@ -151,7 +151,7 @@ object Build { * - `3.(M-1).0` if `P = 0` * 3.6.1 is an exception from this rule - 3.6.0 was a broken release 
*/ - val mimaPreviousDottyVersion = "3.6.1" + val mimaPreviousDottyVersion = "3.6.2" /** LTS version against which we check binary compatibility. * diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index 8f5f9d57a8a5..8da8879185f5 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -340,7 +340,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. * From beb40d896541ebd51245f90318d60780d05c4a17 Mon Sep 17 00:00:00 2001 From: Seth Tisue Date: Thu, 12 Dec 2024 18:07:29 -0800 Subject: [PATCH 343/371] REPL: JLine: follow recommendation to use JNI, not JNA as per the https://github.com/jline/jline3 readme fixes #22201 [Cherry-picked 0589be3356a274700bf7f69d709eb539c2d75f8b] --- dist/libexec/common-shared | 4 +--- project/Build.scala | 2 +- 2 files changed, 2 insertions(+), 4 deletions(-) diff --git a/dist/libexec/common-shared b/dist/libexec/common-shared index 8c85993a5283..fa1e62c09241 100644 --- a/dist/libexec/common-shared +++ b/dist/libexec/common-shared @@ -28,7 +28,7 @@ function onExit() { # to reenable echo if we are interrupted before completing. trap onExit INT TERM EXIT -unset cygwin mingw msys darwin conemu +unset cygwin mingw msys darwin # COLUMNS is used together with command line option '-pageWidth'. if command -v tput >/dev/null 2>&1; then @@ -57,8 +57,6 @@ esac unset CYGPATHCMD if [[ ${cygwin-} || ${mingw-} || ${msys-} ]]; then - # ConEmu terminal is incompatible with jna-5.*.jar - [[ (${CONEMUANSI-} || ${ConEmuANSI-}) ]] && conemu=true # cygpath is used by various windows shells: cygwin, git-sdk, gitbash, msys, etc. 
CYGPATHCMD=`which cygpath 2>/dev/null` case "$TERM" in diff --git a/project/Build.scala b/project/Build.scala index 5ac59067e4c9..7830af552df9 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -743,7 +743,7 @@ object Build { Dependencies.compilerInterface, "org.jline" % "jline-reader" % "3.27.0", // used by the REPL "org.jline" % "jline-terminal" % "3.27.0", - "org.jline" % "jline-terminal-jna" % "3.27.0", // needed for Windows + "org.jline" % "jline-terminal-jni" % "3.27.0", // needed for Windows ("io.get-coursier" %% "coursier" % "2.0.16" % Test).cross(CrossVersion.for3Use2_13), ), From 484c29ed154d15d18c9a6c349fb4daf925005eaf Mon Sep 17 00:00:00 2001 From: Seth Tisue Date: Thu, 12 Dec 2024 18:32:15 -0800 Subject: [PATCH 344/371] JLine 3.27.1 (was 3.27.0) [Cherry-picked e5e4c4039f7e141209fdb7f845a2a6cfcb77821b] --- project/Build.scala | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index 7830af552df9..16c3e594ea5d 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -741,9 +741,9 @@ object Build { libraryDependencies ++= Seq( "org.scala-lang.modules" % "scala-asm" % "9.7.0-scala-2", // used by the backend Dependencies.compilerInterface, - "org.jline" % "jline-reader" % "3.27.0", // used by the REPL - "org.jline" % "jline-terminal" % "3.27.0", - "org.jline" % "jline-terminal-jni" % "3.27.0", // needed for Windows + "org.jline" % "jline-reader" % "3.27.1", // used by the REPL + "org.jline" % "jline-terminal" % "3.27.1", + "org.jline" % "jline-terminal-jni" % "3.27.1", // needed for Windows ("io.get-coursier" %% "coursier" % "2.0.16" % Test).cross(CrossVersion.for3Use2_13), ), From eeba529eea73db286297e8c4c154fd3825edc19b Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 12 Dec 2024 15:21:33 +0100 Subject: [PATCH 345/371] Fix layout of released SDK archives, restore intermiediete top-level directory (#22199) Fixes #22194 Restores top-level directory `scala3-${version}` that is present in artifacts published before Scala 3.6, removed during hotfix 3.6.1 release. We now follow the [Well formed SDK archives layout](https://github.com/sdkman/sdkman-cli/wiki/Well-formed-SDK-archives). Removing the top-level directory even though at first glance looked like an improvement was in fact introducing problems to multiple package managers and build tools. [Cherry-picked 5b3d82a41aafcaccab99bad95aa5a035a5dacabb] --- .github/workflows/ci.yaml | 12 +++--------- project/Build.scala | 9 ++++++++- 2 files changed, 11 insertions(+), 10 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index e842aeab229b..5931219f472a 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -794,19 +794,13 @@ jobs: distDir="$3" # Build binaries - ./project/scripts/sbt "${sbtProject}/Universal/stage" + ./project/scripts/sbt "all ${sbtProject}/Universal/packageBin ${sbtProject}/Universal/packageZipTarball" - outputPath="${distDir}/target/universal/stage" artifactName="scala3-${{ env.RELEASE_TAG }}${distroSuffix}" - zipArchive="${artifactName}.zip" - tarGzArchive="${artifactName}.tar.gz" - - cwd=$(pwd) - (cd $outputPath && zip -r ${zipArchive} . && mv ${zipArchive} "${cwd}/") - tar -czf ${tarGzArchive} -C "$outputPath" . 
# Caluclate SHA for each of archive files - for file in "${zipArchive}" "${tarGzArchive}"; do + for file in "${artifactName}.zip" "${artifactName}.tar.gz"; do + mv ${distDir}/target/universal/$file $file sha256sum "${file}" > "${file}.sha256" done } diff --git a/project/Build.scala b/project/Build.scala index 16c3e594ea5d..743e299767d2 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -2230,7 +2230,14 @@ object Build { // ======== Universal / stage := (Universal / stage).dependsOn(republish).value, Universal / packageBin := (Universal / packageBin).dependsOn(republish).value, - Universal / packageZipTarball := (Universal / packageZipTarball).dependsOn(republish).value, + Universal / packageZipTarball := (Universal / packageZipTarball).dependsOn(republish) + .map { archiveFile => + // Rename .tgz to .tar.gz for consistency with previous versions + val renamedFile = archiveFile.getParentFile() / archiveFile.getName.replaceAll("\\.tgz$", ".tar.gz") + IO.move(archiveFile, renamedFile) + renamedFile + } + .value, // ======== Universal / mappings ++= directory(dist.base / "bin"), Universal / mappings ++= directory(republishRepo.value / "maven2"), From ddef7997ceba5ad0f5cf43f65fd12dcf56b4de23 Mon Sep 17 00:00:00 2001 From: Eugene Yokota Date: Wed, 11 Dec 2024 01:32:40 -0500 Subject: [PATCH 346/371] refactor: improve Given search preference warning **Problem** It wasn't clear what action users was suppose to take to suppress the new-from-3.6 Given search preference warning. **Solution** 1. This refactors the code to give the warning an error code E205. 2. In case of warnings, tell the user to choose -source 3.5 vs 3.7, or use nowarn annotation. [Cherry-picked 004cfc5ed76ea34245ca30c9cc3872e86f9e6d5e] --- .../tools/dotc/reporting/ErrorMessageID.scala | 1 + .../dotty/tools/dotc/reporting/messages.scala | 38 +++++++++++++++++++ .../dotty/tools/dotc/typer/Implicits.scala | 22 ++--------- tests/neg/given-triangle.check | 6 +-- tests/warn/i21036a.check | 11 ++++-- tests/warn/i21036b.check | 9 +++-- tests/warn/i21036c.scala | 7 ++++ 7 files changed, 66 insertions(+), 28 deletions(-) create mode 100644 tests/warn/i21036c.scala diff --git a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala index 2c3774b59a9a..d3467fe70c52 100644 --- a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala +++ b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala @@ -218,6 +218,7 @@ enum ErrorMessageID(val isActive: Boolean = true) extends java.lang.Enum[ErrorMe case QuotedTypeMissingID // errorNumber: 202 case DeprecatedAssignmentSyntaxID // errorNumber: 203 case DeprecatedInfixNamedArgumentSyntaxID // errorNumber: 204 + case GivenSearchPriorityID // errorNumber: 205 def errorNumber = ordinal - 1 diff --git a/compiler/src/dotty/tools/dotc/reporting/messages.scala b/compiler/src/dotty/tools/dotc/reporting/messages.scala index 3721c18fd294..bb3194558cae 100644 --- a/compiler/src/dotty/tools/dotc/reporting/messages.scala +++ b/compiler/src/dotty/tools/dotc/reporting/messages.scala @@ -3361,3 +3361,41 @@ class DeprecatedInfixNamedArgumentSyntax()(using Context) extends SyntaxMsg(Depr + Message.rewriteNotice("This", version = SourceVersion.`3.6-migration`) def explain(using Context) = "" + +class GivenSearchPriorityWarning( + pt: Type, + cmp: Int, + prev: Int, + winner: TermRef, + loser: TermRef, + isLastOldVersion: Boolean +)(using Context) extends Message(GivenSearchPriorityID): + def kind = MessageKind.PotentialIssue + def 
choice(nth: String, c: Int) = + if c == 0 then "none - it's ambiguous" + else s"the $nth alternative" + val (change, whichChoice) = + if isLastOldVersion + then ("will change in the future release", "Current choice ") + else ("has changed", "Previous choice") + def warningMessage: String = + i"""Given search preference for $pt between alternatives + | ${loser} + |and + | ${winner} + |$change. + |$whichChoice : ${choice("first", prev)} + |Choice from Scala 3.7 : ${choice("second", cmp)}""" + def migrationHints: String = + i"""Suppress this warning by choosing -source 3.5, -source 3.7, or + |by using @annotation.nowarn("id=205")""" + def ambiguousNote: String = + i""" + | + |Note: $warningMessage""" + def msg(using Context) = + i"""$warningMessage + | + |$migrationHints""" + + def explain(using Context) = "" diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 0727c83d8469..9d273ebca866 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -549,10 +549,10 @@ object Implicits: /** An ambiguous implicits failure */ class AmbiguousImplicits(val alt1: SearchSuccess, val alt2: SearchSuccess, val expectedType: Type, val argument: Tree, val nested: Boolean = false) extends SearchFailureType: - private[Implicits] var priorityChangeWarnings: List[Message] = Nil + private[Implicits] var priorityChangeWarnings: List[GivenSearchPriorityWarning] = Nil def priorityChangeWarningNote(using Context): String = - priorityChangeWarnings.map(msg => s"\n\nNote: $msg").mkString + priorityChangeWarnings.map(_.ambiguousNote).mkString def msg(using Context): Message = var str1 = err.refStr(alt1.ref) @@ -1312,7 +1312,7 @@ trait Implicits: // A map that associates a priority change warning (between -source 3.6 and 3.7) // with the candidate refs mentioned in the warning. We report the associated // message if one of the critical candidates is part of the result of the implicit search. - val priorityChangeWarnings = mutable.ListBuffer[(/*critical:*/ List[TermRef], Message)]() + val priorityChangeWarnings = mutable.ListBuffer[(/*critical:*/ List[TermRef], GivenSearchPriorityWarning)]() val sv = Feature.sourceVersion val isLastOldVersion = sv.stable == SourceVersion.`3.6` @@ -1353,21 +1353,7 @@ trait Implicits: cmp match case 1 => (alt2, alt1) case -1 => (alt1, alt2) - def choice(nth: String, c: Int) = - if c == 0 then "none - it's ambiguous" - else s"the $nth alternative" - val (change, whichChoice) = - if isLastOldVersion - then ("will change", "Current choice ") - else ("has changed", "Previous choice") - val msg = - em"""Given search preference for $pt between alternatives - | ${loser.ref} - |and - | ${winner.ref} - |$change. - |$whichChoice : ${choice("first", prev)} - |New choice from Scala 3.7: ${choice("second", cmp)}""" + val msg = GivenSearchPriorityWarning(pt, cmp, prev, winner.ref, loser.ref, isLastOldVersion) val critical = alt1.ref :: alt2.ref :: Nil priorityChangeWarnings += ((critical, msg)) if isLastOldVersion then prev else cmp diff --git a/tests/neg/given-triangle.check b/tests/neg/given-triangle.check index f366c18e78f0..8a05ed4b3129 100644 --- a/tests/neg/given-triangle.check +++ b/tests/neg/given-triangle.check @@ -7,6 +7,6 @@ | (given_B : B) |and | (given_A : A) - |will change. - |Current choice : the first alternative - |New choice from Scala 3.7: the second alternative + |will change in the future release. 
+ |Current choice : the first alternative + |Choice from Scala 3.7 : the second alternative diff --git a/tests/warn/i21036a.check b/tests/warn/i21036a.check index 63d611a6e246..6ce5b94d123f 100644 --- a/tests/warn/i21036a.check +++ b/tests/warn/i21036a.check @@ -1,10 +1,13 @@ --- Warning: tests/warn/i21036a.scala:7:17 ------------------------------------------------------------------------------ +-- [E205] Potential Issue Warning: tests/warn/i21036a.scala:7:17 ------------------------------------------------------- 7 |val y = summon[A] // warn | ^ | Given search preference for A between alternatives | (b : B) | and | (a : A) - | will change. - | Current choice : the first alternative - | New choice from Scala 3.7: the second alternative + | will change in the future release. + | Current choice : the first alternative + | Choice from Scala 3.7 : the second alternative + | + | Suppress this warning by choosing -source 3.5, -source 3.7, or + | by using @annotation.nowarn("id=205") diff --git a/tests/warn/i21036b.check b/tests/warn/i21036b.check index dfa19a0e9bb1..da0639438c86 100644 --- a/tests/warn/i21036b.check +++ b/tests/warn/i21036b.check @@ -1,4 +1,4 @@ --- Warning: tests/warn/i21036b.scala:7:17 ------------------------------------------------------------------------------ +-- [E205] Potential Issue Warning: tests/warn/i21036b.scala:7:17 ------------------------------------------------------- 7 |val y = summon[A] // warn | ^ | Given search preference for A between alternatives @@ -6,5 +6,8 @@ | and | (a : A) | has changed. - | Previous choice : the first alternative - | New choice from Scala 3.7: the second alternative + | Previous choice : the first alternative + | Choice from Scala 3.7 : the second alternative + | + | Suppress this warning by choosing -source 3.5, -source 3.7, or + | by using @annotation.nowarn("id=205") diff --git a/tests/warn/i21036c.scala b/tests/warn/i21036c.scala new file mode 100644 index 000000000000..4015cc8a84bb --- /dev/null +++ b/tests/warn/i21036c.scala @@ -0,0 +1,7 @@ +trait A +trait B extends A +given b: B = ??? +given a: A = ??? + +@annotation.nowarn("id=205") +val y = summon[A] // don't warn \ No newline at end of file From 42101109eae04a91d8d9e72f6cbe93cebf594c75 Mon Sep 17 00:00:00 2001 From: Rui Chen Date: Tue, 10 Dec 2024 15:38:45 -0500 Subject: [PATCH 347/371] fix: update `scala-cli.jar` path Signed-off-by: Rui Chen [Cherry-picked 70cc1a19da85f502fc58c8f0ed4fbe6ff9444e7d] --- dist/libexec/cli-common-platform | 2 +- dist/libexec/cli-common-platform.bat | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/dist/libexec/cli-common-platform b/dist/libexec/cli-common-platform index a5906e882bb4..e56f5221dbf2 100644 --- a/dist/libexec/cli-common-platform +++ b/dist/libexec/cli-common-platform @@ -1,3 +1,3 @@ #!/usr/bin/env bash -SCALA_CLI_CMD_BASH=("\"$JAVACMD\"" "-jar \"$PROG_HOME/bin/scala-cli.jar\"") +SCALA_CLI_CMD_BASH=("\"$JAVACMD\"" "-jar \"$PROG_HOME/libexec/scala-cli.jar\"") diff --git a/dist/libexec/cli-common-platform.bat b/dist/libexec/cli-common-platform.bat index 99103266c1d9..45b09f3460e6 100644 --- a/dist/libexec/cli-common-platform.bat +++ b/dist/libexec/cli-common-platform.bat @@ -2,4 +2,4 @@ @rem we need to escape % in the java command path, for some reason this doesnt work in common.bat set "_JAVACMD=!_JAVACMD:%%=%%%%!" 
-set SCALA_CLI_CMD_WIN="%_JAVACMD%" "-jar" "%_PROG_HOME%\bin\scala-cli.jar" \ No newline at end of file +set SCALA_CLI_CMD_WIN="%_JAVACMD%" "-jar" "%_PROG_HOME%\libexec\scala-cli.jar" From ff78d080612b0f816d68e2b88d481cb6825e5daa Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jo=C3=A3o=20Ferreira?= Date: Tue, 10 Dec 2024 11:24:02 +0000 Subject: [PATCH 348/371] Limit exposure to ConcurrentModificationException when sys props are replaced or mutated port of https://github.com/scala/scala/commit/f6859f28bb49193fde83e6020a6a89ce926a91e8 [Cherry-picked 705c33ca9ca4ed01e8b11c7928468fbd2a267aa7] --- .../src/dotty/tools/dotc/config/PathResolver.scala | 13 ++++++++++--- 1 file changed, 10 insertions(+), 3 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/config/PathResolver.scala b/compiler/src/dotty/tools/dotc/config/PathResolver.scala index 29e6e35855c8..67be0e3587cb 100644 --- a/compiler/src/dotty/tools/dotc/config/PathResolver.scala +++ b/compiler/src/dotty/tools/dotc/config/PathResolver.scala @@ -36,9 +36,16 @@ object PathResolver { /** Values found solely by inspecting environment or property variables. */ object Environment { - private def searchForBootClasspath = ( - systemProperties find (_._1 endsWith ".boot.class.path") map (_._2) getOrElse "" - ) + private def searchForBootClasspath = { + import scala.jdk.CollectionConverters.* + val props = System.getProperties + // This formulation should be immune to ConcurrentModificationExceptions when system properties + // we're unlucky enough to witness a partially published result of System.setProperty or direct + // mutation of the System property map. stringPropertyNames internally uses the Enumeration interface, + // rather than Iterator, and this disables the fail-fast ConcurrentModificationException. + val propNames = props.stringPropertyNames() + propNames.asScala collectFirst { case k if k endsWith ".boot.class.path" => props.getProperty(k) } getOrElse "" + } /** Environment variables which java pays attention to so it * seems we do as well. 
From 87d6ed9e30e0e94eac97758ed47a36a72a86c554 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Jo=C3=A3o=20Ferreira?= Date: Tue, 10 Dec 2024 13:32:39 +0000 Subject: [PATCH 349/371] improve javaBootClassPath lazy evaluation [Cherry-picked 31690d45237fb6aab7e0474ee115d7bdfe8a0892] --- compiler/src/dotty/tools/dotc/config/PathResolver.scala | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/compiler/src/dotty/tools/dotc/config/PathResolver.scala b/compiler/src/dotty/tools/dotc/config/PathResolver.scala index 67be0e3587cb..f60727e6bba2 100644 --- a/compiler/src/dotty/tools/dotc/config/PathResolver.scala +++ b/compiler/src/dotty/tools/dotc/config/PathResolver.scala @@ -53,7 +53,8 @@ object PathResolver { def classPathEnv: String = envOrElse("CLASSPATH", "") def sourcePathEnv: String = envOrElse("SOURCEPATH", "") - def javaBootClassPath: String = propOrElse("sun.boot.class.path", searchForBootClasspath) + //using propOrNone/getOrElse instead of propOrElse so that searchForBootClasspath is lazy evaluated + def javaBootClassPath: String = propOrNone("sun.boot.class.path") getOrElse searchForBootClasspath def javaExtDirs: String = propOrEmpty("java.ext.dirs") def scalaHome: String = propOrEmpty("scala.home") From 09a8ff1bb3e59ed32b0bf9bad4f6ba9bd7ca7268 Mon Sep 17 00:00:00 2001 From: Som Snytt Date: Tue, 22 Oct 2024 03:00:58 -0700 Subject: [PATCH 350/371] Nowarn extension matching nonpublic member [Cherry-picked d0fdbfb4cace1bba61fcffc95fba4de1c236e3eb] --- .../dotty/tools/dotc/typer/RefChecks.scala | 23 +++++++++++-------- tests/warn/i21816.scala | 17 ++++++++++++++ 2 files changed, 30 insertions(+), 10 deletions(-) create mode 100644 tests/warn/i21816.scala diff --git a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala index 0a0356707048..6945dffbe2f2 100644 --- a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala @@ -1169,16 +1169,19 @@ object RefChecks { target.nonPrivateMember(sym.name) .filterWithPredicate: member => - val memberIsImplicit = member.info.hasImplicitParams - val paramTps = - if memberIsImplicit then methTp.stripPoly.firstParamTypes - else methTp.firstExplicitParamTypes - - paramTps.isEmpty || memberIsImplicit && !methTp.hasImplicitParams || { - val memberParamTps = member.info.stripPoly.firstParamTypes - !memberParamTps.isEmpty - && memberParamTps.lengthCompare(paramTps) == 0 - && memberParamTps.lazyZip(paramTps).forall((m, x) => x frozen_<:< m) + val memberIsPublic = (member.symbol.flags & AccessFlags).isEmpty && !member.symbol.privateWithin.exists + memberIsPublic && { + val memberIsImplicit = member.info.hasImplicitParams + val paramTps = + if memberIsImplicit then methTp.stripPoly.firstParamTypes + else methTp.firstExplicitParamTypes + + paramTps.isEmpty || memberIsImplicit && !methTp.hasImplicitParams || { + val memberParamTps = member.info.stripPoly.firstParamTypes + !memberParamTps.isEmpty + && memberParamTps.lengthCompare(paramTps) == 0 + && memberParamTps.lazyZip(paramTps).forall((m, x) => x frozen_<:< m) + } } .exists if !target.typeSymbol.denot.isAliasType && !target.typeSymbol.denot.isOpaqueAlias && hidden diff --git a/tests/warn/i21816.scala b/tests/warn/i21816.scala new file mode 100644 index 000000000000..9153b8b0ee2f --- /dev/null +++ b/tests/warn/i21816.scala @@ -0,0 +1,17 @@ + +case class CC(a: String, b: String) extends Iterable[String] { + override def iterator: Iterator[String] = Iterator(a, b) +} + +trait T { + extension (cc: CC) def 
className: String = "foo" +} + +object O extends T { + def foo = { + val cc = CC("a", "b") + println(cc.className) + } +} + +@main def main() = O.foo From b6bc62c8acbd1f38a2ff1799be32ca3c311a4f0c Mon Sep 17 00:00:00 2001 From: Som Snytt Date: Tue, 22 Oct 2024 08:04:10 -0700 Subject: [PATCH 351/371] Prefer isPublic in RefChecks [Cherry-picked ae1b583325a160ed980808be7915c1adb66ac22a] --- compiler/src/dotty/tools/dotc/typer/RefChecks.scala | 6 ++---- 1 file changed, 2 insertions(+), 4 deletions(-) diff --git a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala index 6945dffbe2f2..0ec9458cac5c 100644 --- a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala @@ -525,7 +525,6 @@ object RefChecks { // todo: align accessibility implication checking with isAccessible in Contexts def isOverrideAccessOK = - val memberIsPublic = (member.flags & AccessFlags).isEmpty && !member.privateWithin.exists def protectedOK = !other.is(Protected) || member.is(Protected) // if o is protected, so is m def accessBoundaryOK = val ob = other.accessBoundary(member.owner) @@ -534,7 +533,7 @@ object RefChecks { def companionBoundaryOK = ob.isClass && !ob.isLocalToBlock && mb.is(Module) && (ob.companionModule eq mb.companionModule) ob.isContainedIn(mb) || companionBoundaryOK // m relaxes o's access boundary, def otherIsJavaProtected = other.isAllOf(JavaProtected) // or o is Java defined and protected (see #3946) - memberIsPublic || protectedOK && (accessBoundaryOK || otherIsJavaProtected) + member.isPublic || protectedOK && (accessBoundaryOK || otherIsJavaProtected) end isOverrideAccessOK if !member.hasTargetName(other.targetName) then @@ -1169,8 +1168,7 @@ object RefChecks { target.nonPrivateMember(sym.name) .filterWithPredicate: member => - val memberIsPublic = (member.symbol.flags & AccessFlags).isEmpty && !member.symbol.privateWithin.exists - memberIsPublic && { + member.symbol.isPublic && { val memberIsImplicit = member.info.hasImplicitParams val paramTps = if memberIsImplicit then methTp.stripPoly.firstParamTypes From 9530960b5ac58a79591107f2329dce55a5a7aa36 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 30 Dec 2024 13:55:09 +0100 Subject: [PATCH 352/371] chore: use sbt/setup-sbt when using ubuntu-latest image [Cherry-picked 4961d1eaf219cbf920206ba1e15101a7cdc94eef][modified] --- .github/workflows/build-sdk.yml | 1 + .github/workflows/language-reference.yaml | 1 + .github/workflows/launchers.yml | 14 +++++--------- .github/workflows/scaladoc.yaml | 1 + 4 files changed, 8 insertions(+), 9 deletions(-) diff --git a/.github/workflows/build-sdk.yml b/.github/workflows/build-sdk.yml index b2af623d731a..cd111df1a083 100644 --- a/.github/workflows/build-sdk.yml +++ b/.github/workflows/build-sdk.yml @@ -61,6 +61,7 @@ jobs: distribution: temurin java-version: ${{ inputs.java-version }} cache : sbt + - uses: sbt/setup-sbt@v1 - name: Build and pack the SDK (universal) run : ./project/scripts/sbt dist/Universal/stage - name: Build and pack the SDK (linux x86-64) diff --git a/.github/workflows/language-reference.yaml b/.github/workflows/language-reference.yaml index 7f87b4a453ef..d79f4d029a77 100644 --- a/.github/workflows/language-reference.yaml +++ b/.github/workflows/language-reference.yaml @@ -36,6 +36,7 @@ jobs: distribution: 'temurin' java-version: 17 cache: 'sbt' + - uses: sbt/setup-sbt@v1 - name: Generate reference documentation and test links run: | diff --git a/.github/workflows/launchers.yml 
b/.github/workflows/launchers.yml index ce3aac235224..4ee07e4bfcc9 100644 --- a/.github/workflows/launchers.yml +++ b/.github/workflows/launchers.yml @@ -20,6 +20,7 @@ jobs: java-version: '17' distribution: 'temurin' cache: 'sbt' + - uses: sbt/setup-sbt@v1 - name: Build and test launcher command run: ./project/scripts/native-integration/bashTests env: @@ -37,9 +38,7 @@ jobs: java-version: '17' distribution: 'temurin' cache: 'sbt' - # https://github.com/actions/runner-images/issues/9369 - - name: Install sbt - run: brew install sbt + - uses: sbt/setup-sbt@v1 - name: Build and test launcher command run: ./project/scripts/native-integration/bashTests env: @@ -58,9 +57,7 @@ jobs: java-version: '17' distribution: 'temurin' cache: 'sbt' - # https://github.com/actions/runner-images/issues/9369 - - name: Install sbt - run: brew install sbt + - uses: sbt/setup-sbt@v1 - name: Build and test launcher command run: ./project/scripts/native-integration/bashTests env: @@ -79,9 +76,7 @@ jobs: java-version: '17' distribution: 'temurin' cache: 'sbt' - # https://github.com/actions/runner-images/issues/9369 - - name: Install sbt - run: brew install sbt + - uses: sbt/setup-sbt@v1 - name: Build and test launcher command run: ./project/scripts/native-integration/bashTests env: @@ -100,6 +95,7 @@ jobs: java-version: '17' distribution: 'temurin' cache: 'sbt' + - uses: sbt/setup-sbt@v1 - name: Build the launcher command run: sbt "dist-win-x86_64/Universal/stage" - name: Run the launcher command tests diff --git a/.github/workflows/scaladoc.yaml b/.github/workflows/scaladoc.yaml index 4f6f5bbfe2fb..d2e3071e765b 100644 --- a/.github/workflows/scaladoc.yaml +++ b/.github/workflows/scaladoc.yaml @@ -37,6 +37,7 @@ jobs: java-version: 17 cache: 'sbt' + - uses: sbt/setup-sbt@v1 - name: Compile and test scala3doc-js run: ./project/scripts/sbt scaladoc-js-main/test From 376fc170f8d4d506992408011d563322304a9340 Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Tue, 17 Dec 2024 15:56:30 +0100 Subject: [PATCH 353/371] fix: add sbt/setup-sbt for the dependency graph workflow [Cherry-picked 72848b3576e94a5092a6927ee0ec450d15e6621d] --- .github/workflows/dependency-graph.yml | 1 + 1 file changed, 1 insertion(+) diff --git a/.github/workflows/dependency-graph.yml b/.github/workflows/dependency-graph.yml index 35af4fa0526d..6a3f8174b2d7 100644 --- a/.github/workflows/dependency-graph.yml +++ b/.github/workflows/dependency-graph.yml @@ -9,6 +9,7 @@ jobs: runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 + - uses: sbt/setup-sbt@v1 - uses: scalacenter/sbt-dependency-submission@v3 env: DEVELOCITY_ACCESS_KEY: ${{ secrets.DEVELOCITY_ACCESS_KEY }} From a05de1ca4177a5b24104ce08d5ac03a0a083b2f9 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 30 Dec 2024 18:00:18 +0100 Subject: [PATCH 354/371] Add changelog for 3.6.3-RC2 --- changelogs/3.6.3-RC2.md | 24 ++++++++++++++++++++++++ 1 file changed, 24 insertions(+) create mode 100644 changelogs/3.6.3-RC2.md diff --git a/changelogs/3.6.3-RC2.md b/changelogs/3.6.3-RC2.md new file mode 100644 index 000000000000..0da2783bd6fe --- /dev/null +++ b/changelogs/3.6.3-RC2.md @@ -0,0 +1,24 @@ +# Backported fixes + +- Fix: update `scala-cli.jar` path [#22274](http://github.com/scala/scala3/pull/22274) +- Nowarn extension matching nonpublic member [#22276](http://github.com/scala/scala3/pull/22276) +- Limit exposure to ConcurrentModificationException when sys props are replaced or mutated [#22275](http://github.com/scala/scala3/pull/22275) +- Refactor: Improve 
Given search preference warning [#22273](http://github.com/scala/scala3/pull/22273) +- Fix layout of released SDK archives, restore intermediate top-level directory [#22272](http://github.com/scala/scala3/pull/22272) +- REPL: JLine: follow recommendation to use JNI, not JNA; also JLine 3.27.1 (was 3.27.0) [#22271](http://github.com/scala/scala3/pull/22271) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.6.3-RC1..3.6.3-RC2` these are: + +``` + 4 Wojciech Mazur + 2 João Ferreira + 2 Seth Tisue + 2 Som Snytt + 1 Eugene Yokota + 1 Hamza Remmal + 1 Rui Chen +``` From b89886b88dc6fa3777eb7316b0195abaec478107 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 30 Dec 2024 18:00:43 +0100 Subject: [PATCH 355/371] Release 3.6.3-RC2 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 743e299767d2..9f7a4839c3dc 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -115,7 +115,7 @@ object Build { * During release candidate cycle incremented by the release officer before publishing a subsequent RC version; * During final, stable release is set exactly to `developedVersion`. */ - val baseVersion = s"$developedVersion-RC1" + val baseVersion = s"$developedVersion-RC2" /** Final version of Scala compiler, controlled by environment variables. */ val dottyVersion = { From 7457b168912c88625470f300714f5eddeef2ae92 Mon Sep 17 00:00:00 2001 From: Seth Tisue Date: Wed, 8 Jan 2025 15:25:02 -0800 Subject: [PATCH 356/371] copyright 2025 note that I added "dba Akka" to NOTICE.md but I don't believe it's necessary to pollute the version history adding that to the top of every source file, too. in legal contexts, "Lightbend, Inc." is still the company's legal name --- NOTICE.md | 19 ++++++++++--------- pkgs/chocolatey/scala.nuspec | 2 +- 2 files changed, 11 insertions(+), 10 deletions(-) diff --git a/NOTICE.md b/NOTICE.md index fd931397a500..b3f97913df2f 100644 --- a/NOTICE.md +++ b/NOTICE.md @@ -1,6 +1,6 @@ -Dotty (https://dotty.epfl.ch) -Copyright 2012-2024 EPFL -Copyright 2012-2024 Lightbend, Inc. +Scala 3 (https://www.scala-lang.org) +Copyright 2012-2025 EPFL +Copyright 2012-2025 Lightbend, Inc. dba Akka Licensed under the Apache License, Version 2.0 (the "License"): http://www.apache.org/licenses/LICENSE-2.0 @@ -11,12 +11,13 @@ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. -The dotty compiler frontend has been developed since November 2012 by Martin -Odersky. It is expected and hoped for that the list of contributors to the -codebase will grow quickly. Dotty draws inspiration and code from the original -Scala compiler "nsc", which is developed at scala/scala [1]. +The Scala 3 compiler is also known as Dotty. The Dotty compiler +frontend has been developed since November 2012 by Martin Odersky. It +is expected and hoped for that the list of contributors to the +codebase will grow quickly. Dotty draws inspiration and code from the +original Scala 2 compiler "nsc", which is still developed at scala/scala [1]. -The majority of the dotty codebase is new code, with the exception of the +The majority of the Dotty codebase is new code, with the exception of the components mentioned below. We have for each component tried to come up with a list of the original authors in the scala/scala [1] codebase.
Apologies if some major authors were omitted by oversight. @@ -28,7 +29,7 @@ major authors were omitted by oversight. * dotty.tools.dotc.classpath: The classpath handling is taken mostly as is from scala/scala [1]. The original authors were Grzegorz Kossakowski, - Michał Pociecha, Lukas Rytz, Jason Zaugg and others. + Michał Pociecha, Lukas Rytz, Jason Zaugg and others. * dotty.tools.dotc.config: The configuration components were adapted and extended from scala/scala [1]. The original sources were authored by Paul diff --git a/pkgs/chocolatey/scala.nuspec b/pkgs/chocolatey/scala.nuspec index bb2e0e07ce70..83033fe4b349 100644 --- a/pkgs/chocolatey/scala.nuspec +++ b/pkgs/chocolatey/scala.nuspec @@ -13,7 +13,7 @@ https://github.com/scala/scala3 https://scala-lang.org/ https://github.com/scala/scala3/issues - © 2002-2024, LAMP/EPFL + © 2002-2025, LAMP/EPFL https://cdn.jsdelivr.net/gh/scala/scala3@a046b0014ffd9536144d67a48f8759901b96d12f/pkgs/chocolatey/icon.svg https://github.com/scala/scala3/blob/main/LICENSE true From 47733321f2d537bb8f6cf8b660715dfb47b18cee Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 15 Jan 2025 15:18:25 +0100 Subject: [PATCH 357/371] [CI]: Replace deprecated `actions/create-release` and `actions/upload-release-assert` actions (#22176) Both of these actions were archived in 2021. In the future they're going to fail, due to deprecation of `set-output` command in GitHub actions. We replace their usage with a single `gh release create` command. It also allows us to simplify the workflow [skip ci] --- .github/workflows/ci.yaml | 325 +++----------------------------------- 1 file changed, 23 insertions(+), 302 deletions(-) diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index cc1eb5d40d97..b6bd9c319b1c 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -736,7 +736,7 @@ jobs: publish_release: permissions: - contents: write # for actions/create-release to create a release + contents: write # for GH CLI to create a release runs-on: [self-hosted, Linux] container: image: lampepfl/dotty:2024-10-18 @@ -778,6 +778,7 @@ jobs: - name: Add SBT proxy repositories run: cp -vf .github/workflows/repositories /root/.sbt/ ; true + # Extract the release tag - name: Extract the release tag run : echo "RELEASE_TAG=${GITHUB_REF#*refs/tags/}" >> $GITHUB_ENV @@ -830,311 +831,31 @@ jobs: mv scala.msi "${msiInstaller}" sha256sum "${msiInstaller}" > "${msiInstaller}.sha256" + - name: Install GH CLI + uses: dev-hanz-ops/install-gh-cli-action@v0.2.0 + with: + gh-cli-version: 2.59.0 + # Create the GitHub release - name: Create GitHub Release - id: create_gh_release - uses: actions/create-release@latest env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token - with: - tag_name: ${{ github.ref }} - release_name: ${{ github.ref }} - body_path: ./changelogs/${{ env.RELEASE_TAG }}.md - draft: true - prerelease: ${{ contains(env.RELEASE_TAG, '-') }} - - # The following upload steps are generated using template: - # val baseFileName = "scala3-${{ env.RELEASE_TAG }}" - # def upload(kind: String, path: String, contentType: String, distribution: String) = - # s"""- name: Upload $kind to GitHub Release ($distribution) - # uses: actions/upload-release-asset@v1 - # env: - # GITHUB_TOKEN: $${{ secrets.GITHUB_TOKEN }} - # with: - # upload_url: $${{ steps.create_gh_release.outputs.upload_url }} - # asset_path: ./${path} - # asset_name: ${path} - # asset_content_type: ${contentType}""" 
- # def uploadSDK(distribution: String, suffix: String) = - # val filename = s"${baseFileName}${suffix}" - # s""" - # # $distribution - # ${upload("zip archive", s"$filename.zip", "application/zip", distribution)} - # ${upload("zip archive SHA", s"$filename.zip.sha256", "text/plain", distribution)} - # ${upload("tar.gz archive", s"$filename.tar.gz", "application/gzip", distribution)} - # ${upload("tar.gz archive SHA", s"$filename.tar.gz.sha256", "text/plain", distribution)} - # """ - # def uploadMSI() = - # val distribution = "Windows x86_64 MSI" - # s""" - # # $distribution - # ${upload(".msi file", s"${baseFileName}.msi", "application/x-msi", distribution)} - # ${upload(".msi file SHA", s"${baseFileName}.msi.sha256", "text/plain", distribution)} - # """ - # @main def gen = - # Seq( - # uploadSDK("Universal", ""), - # uploadSDK("Linux x86-64", "-x86_64-pc-linux"), - # uploadSDK("Linux aarch64", "-aarch64-pc-linux"), - # uploadSDK("Mac x86-64", "-x86_64-apple-darwin"), - # uploadSDK("Mac aarch64", "-aarch64-apple-darwin"), - # uploadSDK("Windows x86_64", "-x86_64-pc-win32"), - # uploadMSI() - # ).foreach(println) - - # Universal - - name: Upload zip archive to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.zip - asset_name: scala3-${{ env.RELEASE_TAG }}.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Universal) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}.tar.gz.sha256 - asset_content_type: text/plain - - - # Linux x86-64 - - name: Upload zip archive to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - 
env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Linux x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-linux.tar.gz.sha256 - asset_content_type: text/plain - - - # Linux aarch64 - - name: Upload zip archive to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Linux aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-pc-linux.tar.gz.sha256 - asset_content_type: text/plain - - - # Mac x86-64 - - name: Upload zip archive to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG 
}}-x86_64-apple-darwin.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Mac x86-64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-apple-darwin.tar.gz.sha256 - asset_content_type: text/plain - - - # Mac aarch64 - - name: Upload zip archive to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Mac aarch64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-aarch64-apple-darwin.tar.gz.sha256 - asset_content_type: text/plain - - - # Windows x86_64 - - name: Upload zip archive to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip - asset_content_type: application/zip - - name: Upload zip archive SHA to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.zip.sha256 - asset_content_type: text/plain - - name: Upload tar.gz archive to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz - asset_content_type: application/gzip - - name: Upload tar.gz archive SHA to GitHub Release (Windows x86_64) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN 
}} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}-x86_64-pc-win32.tar.gz.sha256 - asset_content_type: text/plain - - - # Windows x86_64 MSI - - name: Upload .msi file to GitHub Release (Windows x86_64 MSI) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.msi - asset_name: scala3-${{ env.RELEASE_TAG }}.msi - asset_content_type: application/x-msi - - name: Upload .msi file SHA to GitHub Release (Windows x86_64 MSI) - uses: actions/upload-release-asset@v1 - env: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - with: - upload_url: ${{ steps.create_gh_release.outputs.upload_url }} - asset_path: ./scala3-${{ env.RELEASE_TAG }}.msi.sha256 - asset_name: scala3-${{ env.RELEASE_TAG }}.msi.sha256 - asset_content_type: text/plain + shell: bash + run: | + # We need to config safe.directory in every step that might reference git + # It is not persisted between steps + git config --global --add safe.directory /__w/scala3/scala3 + gh release create \ + --draft \ + --title "${{ env.RELEASE_TAG }}" \ + --notes-file ./changelogs/${{ env.RELEASE_TAG }}.md \ + --latest=${{ !contains(env.RELEASE_TAG, '-RC') }} \ + --prerelease=${{ contains(env.RELEASE_TAG, '-RC') }} \ + --verify-tag ${{ env.RELEASE_TAG }} \ + scala3-${{ env.RELEASE_TAG }}*.zip \ + scala3-${{ env.RELEASE_TAG }}*.tar.gz \ + scala3-${{ env.RELEASE_TAG }}*.sha256 \ + scala3-${{ env.RELEASE_TAG }}.msi - name: Publish Release run: ./project/scripts/sbtPublish ";project scala3-bootstrapped ;publishSigned ;sonatypeBundleUpload" From 1f8842fcab00dd626332c5f9355103700df10a2c Mon Sep 17 00:00:00 2001 From: Hamza Remmal Date: Mon, 13 Jan 2025 00:23:20 +0100 Subject: [PATCH 358/371] fix: drop jackson-module-scala from CB --- .../test/scala/dotty/communitybuild/CommunityBuildTest.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala index bf78c8e1a2cf..5c2ea408413c 100644 --- a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala +++ b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala @@ -68,7 +68,7 @@ class CommunityBuildTestC: @Test def fastparse = projects.fastparse.run() @Test def geny = projects.geny.run() @Test def intent = projects.intent.run() - @Test def jacksonModuleScala = projects.jacksonModuleScala.run() + //@Test def jacksonModuleScala = projects.jacksonModuleScala.run() @Test def libretto = projects.libretto.run() @Test def minitest = projects.minitest.run() //@Test def onnxScala = projects.onnxScala.run() From cdd72a03a79eca8831b3ab18f4de5aac95b358ad Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 15 Jan 2025 13:55:56 +0100 Subject: [PATCH 359/371] Add changelog for 3.6.3 --- changelogs/3.6.3.md | 192 ++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 192 insertions(+) create mode 100644 changelogs/3.6.3.md diff --git a/changelogs/3.6.3.md b/changelogs/3.6.3.md new file mode 100644 index 000000000000..2b3d23b75222 --- /dev/null +++ b/changelogs/3.6.3.md @@ -0,0 +1,192 @@ +# Highlights of the release + +- Scala 2 forwardport: `-Yprofile-trace` [#19897](https://github.com/scala/scala3/pull/19897) + +# Other changes and 
fixes + +## Annotations + +- Fix Java parsing of annotations on qualified types [#21867](https://github.com/scala/scala3/pull/21867) +- Consider all arguments in Annotations.refersToParamOf [#22001](https://github.com/scala/scala3/pull/22001) + +## Backend + +- Flag class file collision as error [#19332](https://github.com/scala/scala3/pull/19332) + +## Compiler Phases + +- Fix #21939: Update names and descriptions for cc and setup phases [#21942](https://github.com/scala/scala3/pull/21942) +- Limit exposure to ConcurrentModificationException when sys props are replaced or mutated [#22275](http://github.com/scala/scala3/pull/22275) + +## Experimental: Explicit Nulls + +- Improve warning for wildcard matching only null under the explicit nulls flag (scala#21577) [#21623](https://github.com/scala/scala3/pull/21623) +- Fix warning message for matching on redundant nulls [#21850](https://github.com/scala/scala3/pull/21850) + +## Experimental: Capture Checking + +- Fix #21868, #21869, and #21870: handle CapsOf in more places [#21875](https://github.com/scala/scala3/pull/21875) +- Consolidate CC [#21863](https://github.com/scala/scala3/pull/21863) +- Add path support for capture checking [#21445](https://github.com/scala/scala3/pull/21445) + +## Experimentals + +- Replace symbol traversal with tree traversal when finding top level experimentals [#21827](https://github.com/scala/scala3/pull/21827) + +## Extension Methods + +- Nowarn extension matching nonpublic member [#21825](https://github.com/scala/scala3/pull/21825) + +## Implicits + +- Apply implicit conversion from derived Conversion instance defined as implicit rather than given [#21785](https://github.com/scala/scala3/pull/21785) + +## Imports + +- Allow imports nested in packagings to shadow [#21539](https://github.com/scala/scala3/pull/21539) + +## Inline + +- Avoid using the current denotation in NamedType.disambiguate [#21414](https://github.com/scala/scala3/pull/21414) +- Drop phase.isTyper use in isLegalPrefix/asf [#21954](https://github.com/scala/scala3/pull/21954) +- Fix for macro annotation that resolves macro-based implicit crashing the compiler [#20353](https://github.com/scala/scala3/pull/20353) +- Allow macro annotations to recover from suspension [#21969](https://github.com/scala/scala3/pull/21969) + +## Linting + +- Disallow open modifier on objects [#21922](https://github.com/scala/scala3/pull/21922) +- Allow discarding "Discarded non-Unit" warnings with `: Unit` [#21927](https://github.com/scala/scala3/pull/21927) + +## Opaque Types + +- Fix pkg obj prefix of opaque tp ext meth [#21527](https://github.com/scala/scala3/pull/21527) + +## Parser + +- Fix: don't consider `into` as a soft-modifier [#21924](https://github.com/scala/scala3/pull/21924) + +## Pattern Matching + +- Drop inaccessible subclasses from refineUsingParent [#21799](https://github.com/scala/scala3/pull/21799) +- (Re-)Drop inaccessible subclasses from refineUsingParent [#21930](https://github.com/scala/scala3/pull/21930) +- Fix use of class terms in match analysis [#21848](https://github.com/scala/scala3/pull/21848) +- Don't project nested wildcard patterns to nullable [#21934](https://github.com/scala/scala3/pull/21934) +- Fix provablyDisjoint handling enum constants with mixins [#21876](https://github.com/scala/scala3/pull/21876) +- Do not consider uninhabited constructors when performing exhaustive match checking 
[#21750](https://github.com/scala/scala3/pull/21750) + +## Presentation Compiler + +- Update mtags to 1.4.1 and backport remaining changes [#21859](https://github.com/scala/scala3/pull/21859) +- Backport changes for the presentation compiler from Metals [#21756](https://github.com/scala/scala3/pull/21756) + +## Pickling + +- Avoid orphan param from default arg [#21824](https://github.com/scala/scala3/pull/21824) +- Make sure definition tree has the defined symbol [#21851](https://github.com/scala/scala3/pull/21851) + +## REPL + +- Allow top-level opaque type definitions in REPL [#21753](https://github.com/scala/scala3/pull/21753) +- JLine: follow recommendation to use JNI, not JNA; also JLine 3.27.1 (was 3.27.0) [#22271](http://github.com/scala/scala3/pull/22271) + +## Scaladoc + +- Fix scaladoc TastyInspector regressions [#21716](https://github.com/scala/scala3/pull/21716) +- Bring back the fix for scaladoc TastyInspector regressions [#21929](https://github.com/scala/scala3/pull/21929) +- Fix scaladoc graph highlight background color in dark mode [#21814](https://github.com/scala/scala3/pull/21814) + +## Standard Library + +- Combine cases of `Tuple.Zip` disjoint from `(h1 *: t1, h2 *: t2)` [#21287](https://github.com/scala/scala3/pull/21287) + +## Quotes + +- Fix #20471: owners of top-level symbols in cached quoted code being incorrect [#21945](https://github.com/scala/scala3/pull/21945) + +## Reporting + +- Do not warn about expected missing positions in quotes.reflect.Symbol [#21677](https://github.com/scala/scala3/pull/21677) +- Add missing error messages to asserts in QuotesImpl [#21852](https://github.com/scala/scala3/pull/21852) +- Don't point to the compiler backlog when a compiler plugin phase crashes [#21887](https://github.com/scala/scala3/pull/21887) +- Better error message for polytypes wrapping capturing types [#21843](https://github.com/scala/scala3/pull/21843) +- Pretty-print lambdas [#21846](https://github.com/scala/scala3/pull/21846) +- Nowarn extension matching nonpublic member [#22276](http://github.com/scala/scala3/pull/22276) +- Refactor: Improve Given search preference warning [#22273](http://github.com/scala/scala3/pull/22273) + +## Runner + +- Fix: update `scala-cli.jar` path [#22274](http://github.com/scala/scala3/pull/22274) + +## Releases + +- Fix layout of released SDK archives, restore intermediate top-level directory [#22272](http://github.com/scala/scala3/pull/22272) + +## Scala.js + +- Shade scalajs.ir under dotty.tools [#21765](https://github.com/scala/scala3/pull/21765) + +## SemanticDB + +- Extract semanticDB for lifted definitions [#21856](https://github.com/scala/scala3/pull/21856) + +## Transform + +- Fix enclosingClass from returning refinement classes [#21411](https://github.com/scala/scala3/pull/21411) +- Attempt to beta reduce only if parameters and arguments have same shape [#21970](https://github.com/scala/scala3/pull/21970) +- Drop copied parent refinements before generating bytecode [#21733](https://github.com/scala/scala3/pull/21733) + +## Tooling + +- Ensure to escape characters before constructing JSON profile trace [#21872](https://github.com/scala/scala3/pull/21872) + +## Tuples + +- Fix tupleTypeFromSeq for XXL tuples [#21782](https://github.com/scala/scala3/pull/21782) + +## Typer + +- Do not crash when typing a closure with unknown type, since it can occur for erroneous input
[#21178](https://github.com/scala/scala3/pull/21178) +- Revert SAM condition to what it was before [#21684](https://github.com/scala/scala3/pull/21684) +- Fix ctx implicits under case unapplySeq [#21748](https://github.com/scala/scala3/pull/21748) +- Avoid erasure/preErasure issues around Any in transformIsInstanceOf [#21647](https://github.com/scala/scala3/pull/21647) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.6.2..3.6.3` these are: + +``` + 33 Wojciech Mazur + 30 Dale Wijnand + 30 Kacper Korban + 14 noti0na1 + 11 Hamza Remmal + 10 Eugene Flesselle + 10 HarrisL2 + 9 Martin Odersky + 8 Matt Bovel + 7 Jan Chyb + 6 Tomasz Godzik + 5 Seth Tisue + 4 Jamie Thompson + 2 Friendseeker + 2 João Ferreira + 2 Pascal Weisenburger + 2 Som Snytt + 2 Sébastien Doeraene + 1 Adrien Piquerez + 1 Alden Torres + 1 Alexander + 1 Eugene Yokota + 1 Fengyun Liu + 1 Georgi Krastev + 1 Jentsch + 1 Lunfu Zhong + 1 Michał Pałka + 1 Natsu Kagami + 1 Rui Chen + 1 dependabot[bot] + 1 friendseeker + 1 tgodzik +``` From c33db50d79d3bc64386b2327cb2cae3bcc6b621d Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Wed, 15 Jan 2025 13:57:21 +0100 Subject: [PATCH 360/371] Release 3.6.3 --- project/Build.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/project/Build.scala b/project/Build.scala index 9f7a4839c3dc..7f98e87fcaaa 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -115,7 +115,7 @@ object Build { * During release candidate cycle incremented by the release officer before publishing a subsequent RC version; * During final, stable release is set exactly to `developedVersion`. */ - val baseVersion = s"$developedVersion-RC2" + val baseVersion = developedVersion /** Final version of Scala compiler, controlled by environment variables. 
*/ val dottyVersion = { From b455651049118064b4b7cbc41a0e2a621bdd1830 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 16 Jan 2025 13:05:32 +0100 Subject: [PATCH 361/371] Add changelog for 3.6.4-RC1 --- changelogs/3.6.4-RC1.md | 157 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 157 insertions(+) create mode 100644 changelogs/3.6.4-RC1.md diff --git a/changelogs/3.6.4-RC1.md b/changelogs/3.6.4-RC1.md new file mode 100644 index 000000000000..d2889b89a0ab --- /dev/null +++ b/changelogs/3.6.4-RC1.md @@ -0,0 +1,157 @@ +# Highlights of the release + +- Add REPL init script setting [#22206](https://github.com/scala/scala3/pull/22206) +- Support for JDK 24 [#22250](https://github.com/scala/scala3/pull/22250) +- Merge -Xno-decode-stacktraces with -Xno-enrich-error-messages [#22208](https://github.com/scala/scala3/pull/22208) +- Do not lift annotation arguments [#22035](https://github.com/scala/scala3/pull/22035) + +# Other changes and fixes + +## Annotations + +- Make sure symbols in annotation trees are fresh before pickling [#22002](https://github.com/scala/scala3/pull/22002) +- Consider all arguments in Annotations.refersToParamOf [#22001](https://github.com/scala/scala3/pull/22001) +- Do not lift annotation arguments (bis) [#22046](https://github.com/scala/scala3/pull/22046) + +## Desugaring + +- Fix #22051: only trust the type application part for case class unapplies [#22099](https://github.com/scala/scala3/pull/22099) + +## Documentation + +- Update example code linked to obsolete content in macros-spec.md [#22256](https://github.com/scala/scala3/pull/22256) + +## Experimental: Capture Checking + +- Fix #21868, #21869, and #21870: handle CapsOf in more places [#21875](https://github.com/scala/scala3/pull/21875) +- Refine rules for capture parameters and members [#22000](https://github.com/scala/scala3/pull/22000) +- Add a hint for using CC with REPL [#22220](https://github.com/scala/scala3/pull/22220) +- Consolidate CC [#21863](https://github.com/scala/scala3/pull/21863) + +## Experimental: Global Initialization + +- Fix crash when initializing val in ByName closure [#22354](https://github.com/scala/scala3/pull/22354) + +## Experimental: Named Tuples + +- Handle TypeProxy of Named Tuples in unapply [#22325](https://github.com/scala/scala3/pull/22325) +- Fail more eagerly when trying to adapt named unapply patterns [#22315](https://github.com/scala/scala3/pull/22315) +- Widen singleton types when computing fields from .Fields [#22149](https://github.com/scala/scala3/pull/22149) +- Fix .toTuple insertion [#22028](https://github.com/scala/scala3/pull/22028) + +## Extension Methods + +- Tweak ExtensionNullifiedByMember [#22268](https://github.com/scala/scala3/pull/22268) +- Nowarn extension matching nonpublic member [#21825](https://github.com/scala/scala3/pull/21825) + +## Implicits + +- Rollback constraints in compareAppliedTypeParamRef [#22339](https://github.com/scala/scala3/pull/22339) +- Try implicit searching after finding dynamic select [#22318](https://github.com/scala/scala3/pull/22318) + +## Inline + +- Drop phase.isTyper use in isLegalPrefix/asf [#21954](https://github.com/scala/scala3/pull/21954) + +## Linting + +- Allow discarding "Discarded non-Unit" warnings with `: Unit` [#21927](https://github.com/scala/scala3/pull/21927) + +## Match Types + +- Fix #21841: Check more that an `unapplySeq` on a `NonEmptyTuple` is valid. 
[#22366](https://github.com/scala/scala3/pull/22366) +- Type avoidance in MT bound inference [#22142](https://github.com/scala/scala3/pull/22142) + +## Metaprogramming + +- Rethrow SuspendExceptions caught in CodeGen phase [#22009](https://github.com/scala/scala3/pull/22009) + +## Metaprogramming: Compile-time + +- Extend compiletime.testing.typechecks with certain transform phases [#21185](https://github.com/scala/scala3/pull/21185) + +## Nullability + +- Fix #21619: Refactor NotNullInfo to record every reference which is retracted once. [#21624](https://github.com/scala/scala3/pull/21624) + +## Presentation Compiler + +- Use new infer expected type for singleton complations [#21421](https://github.com/scala/scala3/pull/21421) +- Fix match error in keyword completions [#22138](https://github.com/scala/scala3/pull/22138) + +## Reflection + +- Do not return java outline dummy constructor in `primaryConstructor` [#22104](https://github.com/scala/scala3/pull/22104) + +## Reporting + +- Normalise the types for Type Mismatch Error (E007) [#22337](https://github.com/scala/scala3/pull/22337) +- Improve given search preference warning [#22189](https://github.com/scala/scala3/pull/22189) +- Better error messages when an enum derives from AnyVal [#22236](https://github.com/scala/scala3/pull/22236) +- Correctly print litteral types in the refined printer [#22351](https://github.com/scala/scala3/pull/22351) + +## Rewrites + +- Undo patch of double-block apply [#21982](https://github.com/scala/scala3/pull/21982) + +## Scaladoc + +- Scaladoc: Add support for named tuples [#22263](https://github.com/scala/scala3/pull/22263) + +## Settings + +- Limit exposure to ConcurrentModificationException when sys props are replaced or mutated [#22180](https://github.com/scala/scala3/pull/22180) + +## Specification + +- Align the spec to allow the marker [#22323](https://github.com/scala/scala3/pull/22323) +- Integrate the specification for match types. [#22164](https://github.com/scala/scala3/pull/22164) + +## Transform + +- Fix #22226: Use `classOf[BoxedUnit]` for Unit array in `ArrayConstructors`. 
[#22238](https://github.com/scala/scala3/pull/22238) + +## Typer + +- Fixes for isLegalPrefix change [#22241](https://github.com/scala/scala3/pull/22241) +- Resolve name when named imp is behind wild imps [#21888](https://github.com/scala/scala3/pull/21888) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.6.3..3.6.4-RC1` these are: + +``` + 46 Martin Odersky + 33 noti0na1 + 17 Wojciech Mazur + 14 Dale Wijnand + 13 Matt Bovel + 11 Hamza Remmal + 7 Jan Chyb + 6 aherlihy + 5 Kacper Korban + 5 Seth Tisue + 5 Som Snytt + 4 Oliver Bračevac + 4 Yichen Xu + 3 Sébastien Doeraene + 3 dependabot[bot] + 3 kasiaMarek + 2 João Ferreira + 1 David Hua + 1 Eugene Flesselle + 1 Eugene Yokota + 1 Florian3k + 1 Jędrzej Rochala + 1 Kenji Yoshida + 1 Mathias + 1 Natsu Kagami + 1 Oleg Zenzin + 1 Piotr Chabelski + 1 Rui Chen + 1 philippus + 1 rochala + 1 xiaoshihou +``` From bc3e415a8182c8529f138a781bb60e2ab76bc539 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Thu, 16 Jan 2025 13:09:44 +0100 Subject: [PATCH 362/371] Release 3.6.4-RC1 --- project/Build.scala | 4 ++-- tasty/src/dotty/tools/tasty/TastyFormat.scala | 4 ++-- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/project/Build.scala b/project/Build.scala index b67974f4405d..39893a95633f 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -98,7 +98,7 @@ object Build { * * Warning: Change of this variable needs to be consulted with `expectedTastyVersion` */ - val referenceVersion = "3.6.3-RC2" + val referenceVersion = "3.6.3" /** Version of the Scala compiler targeted in the current release cycle * Contains a version without RC/SNAPSHOT/NIGHTLY specific suffixes @@ -136,7 +136,7 @@ object Build { * - in release candidate branch is experimental if {patch == 0} * - in stable release is always non-experimetnal */ - val expectedTastyVersion = "28.7-experimental-1" + val expectedTastyVersion = "28.6" checkReleasedTastyVersion() /** Final version of Scala compiler, controlled by environment variables. */ diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index 8ff590fefec5..8da8879185f5 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -324,7 +324,7 @@ object TastyFormat { * compatibility, but remains backwards compatible, with all * preceding `MinorVersion`. */ - final val MinorVersion: Int = 7 + final val MinorVersion: Int = 6 /** Natural Number. The `ExperimentalVersion` allows for * experimentation with changes to TASTy without committing @@ -340,7 +340,7 @@ object TastyFormat { * is able to read final TASTy documents if the file's * `MinorVersion` is strictly less than the current value. */ - final val ExperimentalVersion: Int = 1 + final val ExperimentalVersion: Int = 0 /**This method implements a binary relation (`<:<`) between two TASTy versions. 
* From 838add93b64c58e560b4b58dc51382a116165a27 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Fri, 17 Jan 2025 15:00:14 +0100 Subject: [PATCH 363/371] Generate reference expected links --- project/resources/referenceReplacements/sidebar.yml | 3 +++ project/scripts/expected-links/reference-expected-links.txt | 3 +++ 2 files changed, 6 insertions(+) diff --git a/project/resources/referenceReplacements/sidebar.yml b/project/resources/referenceReplacements/sidebar.yml index 240085b681f2..2e84b0b5e433 100644 --- a/project/resources/referenceReplacements/sidebar.yml +++ b/project/resources/referenceReplacements/sidebar.yml @@ -28,6 +28,9 @@ subsection: directory: contextual subsection: - page: reference/contextual/givens.md + - page: reference/contextual/deferred-givens.md + - page: reference/contextual/more-givens.md + - page: reference/contextual/previous-givens.md - page: reference/contextual/using-clauses.md - page: reference/contextual/context-bounds.md - page: reference/contextual/given-imports.md diff --git a/project/scripts/expected-links/reference-expected-links.txt b/project/scripts/expected-links/reference-expected-links.txt index 59add1da0153..8be7dba8d4d0 100644 --- a/project/scripts/expected-links/reference-expected-links.txt +++ b/project/scripts/expected-links/reference-expected-links.txt @@ -27,13 +27,16 @@ ./contextual/context-functions-spec.html ./contextual/context-functions.html ./contextual/conversions.html +./contextual/deferred-givens.html ./contextual/derivation-macro.html ./contextual/derivation.html ./contextual/extension-methods.html ./contextual/given-imports.html ./contextual/givens.html ./contextual/index.html +./contextual/more-givens.html ./contextual/multiversal-equality.html +./contextual/previous-givens.html ./contextual/relationship-implicits.html ./contextual/right-associative-extension-methods.html ./contextual/type-classes.html From 383d19a35ebe3156198fe756dec3b1872126d7ad Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Fri, 17 Jan 2025 18:52:32 +0100 Subject: [PATCH 364/371] Remove and adjust tests based on release-3.6.3 branch --- .../communitybuild/CommunityBuildTest.scala | 2 +- tests/neg/i15474.check | 29 ----------- tests/neg/i15474.scala | 16 ------- tests/neg/looping-givens.check | 48 ------------------- tests/neg/looping-givens.scala | 11 ----- tests/run-macros/type-show/Test_2.scala | 2 +- 6 files changed, 2 insertions(+), 106 deletions(-) delete mode 100644 tests/neg/i15474.check delete mode 100644 tests/neg/i15474.scala delete mode 100644 tests/neg/looping-givens.check delete mode 100644 tests/neg/looping-givens.scala diff --git a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala index ecf80a4bad19..6181d4c3ddec 100644 --- a/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala +++ b/community-build/test/scala/dotty/communitybuild/CommunityBuildTest.scala @@ -95,7 +95,7 @@ class CommunityBuildTestC: @Test def sourcecode = projects.sourcecode.run() @Test def specs2 = projects.specs2.run() - @Test def stdLib213 = projects.stdLib213.run() + // @Test def stdLib213 = projects.stdLib213.run() @Test def ujson = projects.ujson.run() @Test def upickle = projects.upickle.run() @Test def utest = projects.utest.run() diff --git a/tests/neg/i15474.check b/tests/neg/i15474.check deleted file mode 100644 index 9fa8fa6c722a..000000000000 --- a/tests/neg/i15474.check +++ /dev/null @@ -1,29 +0,0 @@ --- Error: tests/neg/i15474.scala:6:39 
---------------------------------------------------------------------------------- -6 | given c: Conversion[ String, Int ] = _.toInt // error - | ^ - | Result of implicit search for ?{ toInt: ? } will change. - | Current result Test2.c will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: augmentString. - | To opt into the new rules, compile with `-source future` or use - | the `scala.language.future` language import. - | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that Test2.c comes earlier, - | - use an explicit conversion, - | - use an import to get extension method into scope. --- Error: tests/neg/i15474.scala:12:56 --------------------------------------------------------------------------------- -12 | given Ordering[Price] = summon[Ordering[BigDecimal]] // error - | ^ - | Result of implicit search for Ordering[BigDecimal] will change. - | Current result Prices.Price.given_Ordering_Price will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: scala.math.Ordering.BigDecimal. - | To opt into the new rules, compile with `-source future` or use - | the `scala.language.future` language import. - | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that Prices.Price.given_Ordering_Price comes earlier, - | - use an explicit argument. diff --git a/tests/neg/i15474.scala b/tests/neg/i15474.scala deleted file mode 100644 index b196d1b400ef..000000000000 --- a/tests/neg/i15474.scala +++ /dev/null @@ -1,16 +0,0 @@ -//> using options -Xfatal-warnings - -import scala.language.implicitConversions - -object Test2: - given c: Conversion[ String, Int ] = _.toInt // error - -object Prices { - opaque type Price = BigDecimal - - object Price{ - given Ordering[Price] = summon[Ordering[BigDecimal]] // error - } -} - - diff --git a/tests/neg/looping-givens.check b/tests/neg/looping-givens.check deleted file mode 100644 index 1e7ee08d79df..000000000000 --- a/tests/neg/looping-givens.check +++ /dev/null @@ -1,48 +0,0 @@ --- Error: tests/neg/looping-givens.scala:9:22 -------------------------------------------------------------------------- -9 | given aa: A = summon // error - | ^ - | Result of implicit search for T will change. - | Current result ab will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: a. - | To opt into the new rules, compile with `-source future` or use - | the `scala.language.future` language import. - | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that ab comes earlier, - | - use an explicit argument. - | - | where: T is a type variable with constraint <: A --- Error: tests/neg/looping-givens.scala:10:22 ------------------------------------------------------------------------- -10 | given bb: B = summon // error - | ^ - | Result of implicit search for T will change. - | Current result ab will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: b. - | To opt into the new rules, compile with `-source future` or use - | the `scala.language.future` language import. 
- | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that ab comes earlier, - | - use an explicit argument. - | - | where: T is a type variable with constraint <: B --- Error: tests/neg/looping-givens.scala:11:28 ------------------------------------------------------------------------- -11 | given ab: (A & B) = summon // error - | ^ - | Result of implicit search for T will change. - | Current result ab will be no longer eligible - | because it is not defined before the search position. - | Result with new rules: Search Failure: joint(ab, ab). - | To opt into the new rules, compile with `-source future` or use - | the `scala.language.future` language import. - | - | To fix the problem without the language import, you could try one of the following: - | - use a `given ... with` clause as the enclosing given, - | - rearrange definitions so that ab comes earlier, - | - use an explicit argument. - | - | where: T is a type variable with constraint <: A & B diff --git a/tests/neg/looping-givens.scala b/tests/neg/looping-givens.scala deleted file mode 100644 index 57dc95f99aab..000000000000 --- a/tests/neg/looping-givens.scala +++ /dev/null @@ -1,11 +0,0 @@ -//> options -source 3.4 - -class A -class B - -given joint(using a: A, b: B): (A & B) = ??? - -def foo(using a: A, b: B) = - given aa: A = summon // error - given bb: B = summon // error - given ab: (A & B) = summon // error diff --git a/tests/run-macros/type-show/Test_2.scala b/tests/run-macros/type-show/Test_2.scala index 3bc9da043885..de845f3e84dd 100644 --- a/tests/run-macros/type-show/Test_2.scala +++ b/tests/run-macros/type-show/Test_2.scala @@ -23,7 +23,7 @@ object Test { """TypeRef(ThisType(TypeRef(NoPrefix(), "scala")), "Nothing"), """+ """TypeRef(ThisType(TypeRef(NoPrefix(), "scala")), "Any"))), """+ """MatchType("""+ - """TypeRef(ThisType(TypeRef(NoPrefix(), "scala")), "Any"), """+ // match type bound + """TypeRef(TermRef(ThisType(TypeRef(NoPrefix(), "")), "scala"), "Int"), """+ // match type bound """ParamRef(binder, 0), """+ """List("""+ """MatchCase("""+ From c4a18969d54d1fd3256efe0407ff4f74727e0446 Mon Sep 17 00:00:00 2001 From: Wojciech Mazur Date: Mon, 24 Feb 2025 14:47:16 +0100 Subject: [PATCH 365/371] Revert "Drop phase.isTyper use in isLegalPrefix/asf" This reverts commit 26ecda540b93fbe1fc7be030559a78dd2db364f2. 
--- .../dotty/tools/dotc/core/TypeComparer.scala | 5 +- .../src/dotty/tools/dotc/core/TypeOps.scala | 2 +- .../src/dotty/tools/dotc/core/Types.scala | 29 ++++---- .../dotty/tools/dotc/transform/Recheck.scala | 4 +- .../test/dotc/pos-test-pickling.blacklist | 1 - tests/neg/6314-6.check | 4 +- tests/neg/i6225.scala | 2 +- tests/pos/i17222.2.scala | 30 -------- tests/pos/i17222.3.scala | 39 ---------- tests/pos/i17222.4.scala | 71 ------------------- tests/pos/i17222.5.scala | 26 ------- tests/pos/i17222.8.scala | 18 ----- tests/pos/i17222.scala | 33 --------- 13 files changed, 24 insertions(+), 240 deletions(-) delete mode 100644 tests/pos/i17222.2.scala delete mode 100644 tests/pos/i17222.3.scala delete mode 100644 tests/pos/i17222.4.scala delete mode 100644 tests/pos/i17222.5.scala delete mode 100644 tests/pos/i17222.8.scala delete mode 100644 tests/pos/i17222.scala diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index bbe157d4a29b..ca3df65625a8 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -369,8 +369,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling } compareWild case tp2: LazyRef => - isBottom(tp1) - || !tp2.evaluating && recur(tp1, tp2.ref) + isBottom(tp1) || !tp2.evaluating && recur(tp1, tp2.ref) case CapturingType(_, _) => secondTry case tp2: AnnotatedType if !tp2.isRefining => @@ -490,7 +489,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling // If `tp1` is in train of being evaluated, don't force it // because that would cause an assertionError. Return false instead. // See i859.scala for an example where we hit this case. 
- tp2.isAny + tp2.isRef(AnyClass, skipRefined = false) || !tp1.evaluating && recur(tp1.ref, tp2) case AndType(tp11, tp12) => if tp11.stripTypeVar eq tp12.stripTypeVar then recur(tp11, tp2) diff --git a/compiler/src/dotty/tools/dotc/core/TypeOps.scala b/compiler/src/dotty/tools/dotc/core/TypeOps.scala index a7f41a71d7ce..e3168ca5a27d 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeOps.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeOps.scala @@ -124,7 +124,7 @@ object TypeOps: } def isLegalPrefix(pre: Type)(using Context): Boolean = - pre.isStable + pre.isStable || !ctx.phase.isTyper /** Implementation of Types#simplified */ def simplify(tp: Type, theMap: SimplifyMap | Null)(using Context): Type = { diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index c5937074f4bc..f77f268d6ee6 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -98,8 +98,12 @@ object Types extends TypeUtils { // ----- Tests ----------------------------------------------------- // // debug only: a unique identifier for a type -// val uniqId = { nextId = nextId + 1; nextId } -// if uniqId == 19555 then trace.dumpStack() +// val uniqId = { +// nextId = nextId + 1 +// if (nextId == 19555) +// println("foo") +// nextId +// } /** A cache indicating whether the type was still provisional, last time we checked */ @sharable private var mightBeProvisional = true @@ -5578,25 +5582,24 @@ object Types extends TypeUtils { } def & (that: TypeBounds)(using Context): TypeBounds = - val lo1 = this.lo.stripLazyRef - val lo2 = that.lo.stripLazyRef - val hi1 = this.hi.stripLazyRef - val hi2 = that.hi.stripLazyRef - // This will try to preserve the FromJavaObjects type in upper bounds. // For example, (? <: FromJavaObjects | Null) & (? <: Any), // we want to get (? <: FromJavaObjects | Null) intead of (? <: Any), // because we may check the result <:< (? <: Object | Null) later. 
- if hi1.containsFromJavaObject && (hi1 frozen_<:< hi2) && (lo2 frozen_<:< lo1) then + if this.hi.containsFromJavaObject + && (this.hi frozen_<:< that.hi) + && (that.lo frozen_<:< this.lo) then // FromJavaObject in tp1.hi guarantees tp2.hi <:< tp1.hi // prefer tp1 if FromJavaObject is in its hi this - else if hi2.containsFromJavaObject && (hi2 frozen_<:< hi1) && (lo1 frozen_<:< lo2) then + else if that.hi.containsFromJavaObject + && (that.hi frozen_<:< this.hi) + && (this.lo frozen_<:< that.lo) then // Similarly, prefer tp2 if FromJavaObject is in its hi that - else if (lo1 frozen_<:< lo2) && (hi2 frozen_<:< hi1) then that - else if (lo2 frozen_<:< lo1) && (hi1 frozen_<:< hi2) then this - else TypeBounds(lo1 | lo2, hi1 & hi2) + else if (this.lo frozen_<:< that.lo) && (that.hi frozen_<:< this.hi) then that + else if (that.lo frozen_<:< this.lo) && (this.hi frozen_<:< that.hi) then this + else TypeBounds(this.lo | that.lo, this.hi & that.hi) def | (that: TypeBounds)(using Context): TypeBounds = if ((this.lo frozen_<:< that.lo) && (that.hi frozen_<:< this.hi)) this @@ -5605,7 +5608,7 @@ object Types extends TypeUtils { override def & (that: Type)(using Context): Type = that match { case that: TypeBounds => this & that - case _ => super.&(that) + case _ => super.& (that) } override def | (that: Type)(using Context): Type = that match { diff --git a/compiler/src/dotty/tools/dotc/transform/Recheck.scala b/compiler/src/dotty/tools/dotc/transform/Recheck.scala index 172ae337d6e6..9631136a1c4e 100644 --- a/compiler/src/dotty/tools/dotc/transform/Recheck.scala +++ b/compiler/src/dotty/tools/dotc/transform/Recheck.scala @@ -219,10 +219,10 @@ abstract class Recheck extends Phase, SymTransformer: sharpen: Denotation => Denotation)(using Context): Type = if name.is(OuterSelectName) then tree.tpe else - val pre = ta.maybeSkolemizePrefix(qualType, name) + //val pre = ta.maybeSkolemizePrefix(qualType, name) val mbr = sharpen( - qualType.findMember(name, pre, + qualType.findMember(name, qualType, excluded = if tree.symbol.is(Private) then EmptyFlags else Private )).suchThat(tree.symbol == _) val newType = tree.tpe match diff --git a/compiler/test/dotc/pos-test-pickling.blacklist b/compiler/test/dotc/pos-test-pickling.blacklist index 23c79affada0..07c157793f5d 100644 --- a/compiler/test/dotc/pos-test-pickling.blacklist +++ b/compiler/test/dotc/pos-test-pickling.blacklist @@ -24,7 +24,6 @@ t5031_2.scala i16997.scala i7414.scala i17588.scala -i8300.scala i9804.scala i13433.scala i16649-irrefutable.scala diff --git a/tests/neg/6314-6.check b/tests/neg/6314-6.check index df988f1db9dd..7d6bd182173d 100644 --- a/tests/neg/6314-6.check +++ b/tests/neg/6314-6.check @@ -4,7 +4,7 @@ |object creation impossible, since def apply(fa: String): Int in trait XX in object Test3 is not defined |(Note that | parameter String in def apply(fa: String): Int in trait XX in object Test3 does not match - | parameter Test3.Bar[X & (X & Y)] in def apply(fa: Test3.Bar[X & YY.this.Foo]): Test3.Bar[Y & YY.this.Foo] in trait YY in object Test3 + | parameter Test3.Bar[X & Object with Test3.YY {...}#Foo] in def apply(fa: Test3.Bar[X & YY.this.Foo]): Test3.Bar[Y & YY.this.Foo] in trait YY in object Test3 | ) -- Error: tests/neg/6314-6.scala:52:3 ---------------------------------------------------------------------------------- 52 | (new YY {}).boom // error: object creation impossible @@ -12,5 +12,5 @@ |object creation impossible, since def apply(fa: String): Int in trait XX in object Test4 is not defined |(Note that | parameter String in 
def apply(fa: String): Int in trait XX in object Test4 does not match - | parameter Test4.Bar[X & (X & Y)] in def apply(fa: Test4.Bar[X & YY.this.FooAlias]): Test4.Bar[Y & YY.this.FooAlias] in trait YY in object Test4 + | parameter Test4.Bar[X & Object with Test4.YY {...}#FooAlias] in def apply(fa: Test4.Bar[X & YY.this.FooAlias]): Test4.Bar[Y & YY.this.FooAlias] in trait YY in object Test4 | ) diff --git a/tests/neg/i6225.scala b/tests/neg/i6225.scala index bb936c9a79b1..148a484fd0f1 100644 --- a/tests/neg/i6225.scala +++ b/tests/neg/i6225.scala @@ -1,4 +1,4 @@ -object O1 { // error: cannot be instantiated +object O1 { type A[X] = X opaque type T = A // error: opaque type alias must be fully applied } diff --git a/tests/pos/i17222.2.scala b/tests/pos/i17222.2.scala deleted file mode 100644 index 34db494750c4..000000000000 --- a/tests/pos/i17222.2.scala +++ /dev/null @@ -1,30 +0,0 @@ -import scala.compiletime.* - -trait Reader[-In, Out] - -trait A: - type T - type F[X] - type Q = F[T] - -object Reader: - - given [X]: Reader[A { type Q = X }, X] with {} - -object Test: - - trait B[X] extends A: - type T = X - - trait C extends A: - type F[X] = X - - trait D[X] extends B[X] with C - - val d = new D[Int] {} - val bc = new B[Int] with C - - summonAll[(Reader[d.type, Int], Reader[d.type, Int])] // works - summonAll[(Reader[bc.type, Int], Reader[bc.type, Int])] // error - summonInline[Reader[d.type, Int]] // works - summonInline[Reader[bc.type, Int]] // works?? diff --git a/tests/pos/i17222.3.scala b/tests/pos/i17222.3.scala deleted file mode 100644 index 7ca85f65278f..000000000000 --- a/tests/pos/i17222.3.scala +++ /dev/null @@ -1,39 +0,0 @@ -import scala.compiletime.* - -trait Reader[-In, Out] - -trait A: - type T - type F[X] - type Q = F[T] - -object Reader: - - given [X]: Reader[A { type Q = X }, X] with {} - -object Test: - - trait B[X] extends A: - type T = X - - trait C extends A: - type F[X] = X - - trait D[X] extends B[X] with C - - val d = new D[Int] {} - val bc = new B[Int] with C - - case class Box[T](value: T) - - /** compiletime.summonAll, but with one case */ - inline def summonOne[T <: Box[?]]: T = - val res = - inline erasedValue[T] match - case _: Box[t] => summonInline[t] - end match - Box(res).asInstanceOf[T] - end summonOne - - summonOne[Box[Reader[d.type, Int]]] // works - summonOne[Box[Reader[bc.type, Int]]] // errors diff --git a/tests/pos/i17222.4.scala b/tests/pos/i17222.4.scala deleted file mode 100644 index 209425d47915..000000000000 --- a/tests/pos/i17222.4.scala +++ /dev/null @@ -1,71 +0,0 @@ -import scala.compiletime.* - -trait Reader[-In, Out] - -trait A: - type T - type F[X] - type Q = F[T] - -given [X]: Reader[A { type Q = X }, X] with {} - -case class Box[T](x: T) - -/** compiletime.summonAll, but with one case */ -inline def summonOne[T]: T = - val res = - inline erasedValue[T] match - case _: Box[t] => summonInline[t] - end match - Box(res).asInstanceOf[T] -end summonOne - - -@main def main = - - - trait B[X] extends A: - type T = X - - trait C extends A: - type F[X] = X - - - val bc = new B[Int] with C - - summonOne[Box[Reader[bc.type, Int]]] // errors - - - val bc2: A { type Q = Int } = new B[Int] with C - - summonOne[Box[Reader[bc2.type, Int]]] // works - - - object BC extends B[Int] with C - - summonOne[Box[Reader[BC.type, Int]]] // works - - - val a = new A: - type T = Int - type F[X] = X - - summonOne[Box[Reader[a.type, Int]]] // works - - - val b = new B[Int]: - type F[X] = X - - summonOne[Box[Reader[b.type, Int]]] // works - - - val ac = new A with C: 
-    type T = Int
-
-  summonOne[Box[Reader[ac.type, Int]]] // works
-
-
-  trait D[X] extends B[X] with C
-  val d = new D[Int] {}
-
-  summonOne[Box[Reader[d.type, Int]]] // works
diff --git a/tests/pos/i17222.5.scala b/tests/pos/i17222.5.scala
deleted file mode 100644
index dc608e94235c..000000000000
--- a/tests/pos/i17222.5.scala
+++ /dev/null
@@ -1,26 +0,0 @@
-import scala.compiletime.*
-
-trait Reader[-In, Out]
-
-trait A:
-  type T
-  type F[X]
-  type Q = F[T]
-
-given [X]: Reader[A { type Q = X }, X] with {}
-
-case class Box[T](x: T)
-
-inline def summonOne[T]: T =
-  summonInline[T]
-end summonOne
-
-@main def main =
-  trait B[X] extends A:
-    type T = X
-  trait C extends A:
-    type F[X] = X
-
-  val bc = new B[Int] with C
-  summonInline[Reader[bc.type, Int]] // (I) Works
-  summonOne[Reader[bc.type, Int]] // (II) Errors
diff --git a/tests/pos/i17222.8.scala b/tests/pos/i17222.8.scala
deleted file mode 100644
index a415a78e0703..000000000000
--- a/tests/pos/i17222.8.scala
+++ /dev/null
@@ -1,18 +0,0 @@
-import scala.compiletime.*
-
-trait A:
-  type F
-  type Q = F
-
-trait Reader[-In, Out]
-object Reader:
-  given [X]: Reader[A { type Q = X }, X] with {}
-
-class Test:
-  //type BC = A { type F = Int } & A // ok
-  type BC = A & A { type F = Int } // fail, also ok when manually de-aliased
-
-  inline def summonOne: Unit = summonInline[Reader[BC, Int]]
-
-  def t1(): Unit = summonInline[Reader[BC, Int]] // ok
-  def t2(): Unit = summonOne // error
diff --git a/tests/pos/i17222.scala b/tests/pos/i17222.scala
deleted file mode 100644
index 2af9fc2861a8..000000000000
--- a/tests/pos/i17222.scala
+++ /dev/null
@@ -1,33 +0,0 @@
-import scala.deriving.Mirror
-import scala.compiletime.*
-
-trait Reader[-In, Out]
-
-trait A:
-  type T
-  type F[X]
-  type Q = F[T]
-
-object Reader:
-
-  given [X]: Reader[A { type Q = X }, X] with {}
-
-  type Map2[Tup1 <: Tuple, Tup2 <: Tuple, F[_, _]] <: Tuple = (Tup1, Tup2) match
-    case (h1 *: t1, h2 *: t2) => F[h1, h2] *: Map2[t1, t2, F]
-    case (EmptyTuple, EmptyTuple) => EmptyTuple
-
-  inline given productReader[In <: Product, Out <: Product](using mi: Mirror.ProductOf[In])(using mo: Mirror.ProductOf[Out]): Reader[In, Out] =
-    summonAll[Map2[mi.MirroredElemTypes, mo.MirroredElemTypes, Reader]]
-    ???
-
-object Test:
-
-  trait B[X] extends A:
-    type T = X
-
-  trait C extends A:
-    type F[X] = X
-
-  val bc = new B[Int] with C
-
-  summon[Reader[(bc.type, bc.type), (Int, Int)]] // fails

From e25316c24ad01bc5629973be13ed314deb4878dd Mon Sep 17 00:00:00 2001
From: Wojciech Mazur
Date: Mon, 24 Feb 2025 14:55:38 +0100
Subject: [PATCH 366/371] Adjust captures/lazylist test

---
 tests/neg-custom-args/captures/lazylist.check | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tests/neg-custom-args/captures/lazylist.check b/tests/neg-custom-args/captures/lazylist.check
index 65fed0c4ec7e..bc95a445f3f4 100644
--- a/tests/neg-custom-args/captures/lazylist.check
+++ b/tests/neg-custom-args/captures/lazylist.check
@@ -29,7 +29,7 @@
 -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazylist.scala:41:42 -------------------------------------
 41 |  val ref4c: LazyList[Int]^{cap1, ref3} = ref4 // error
    |                                          ^^^^
-   |  Found:    (ref4 : lazylists.LazyList[Int]^{cap3, ref1, ref2})
+   |  Found:    (ref4 : lazylists.LazyList[Int]^{cap3, cap2, ref1, cap1})
    |  Required: lazylists.LazyList[Int]^{cap1, ref3}
    |
    |  longer explanation available when compiling with `-explain`

From 27a73e915acd080eba4ccb6d2feac4f605bda70e Mon Sep 17 00:00:00 2001
From: Wojciech Mazur
Date: Mon, 20 Jan 2025 16:17:04 +0100
Subject: [PATCH 367/371] Fix chocolatey-test when used in stable releases

---
 .github/workflows/ci.yaml             | 4 ++--
 .github/workflows/test-chocolatey.yml | 5 ++++-
 2 files changed, 6 insertions(+), 3 deletions(-)

diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml
index b6bd9c319b1c..c70eaaaa0e3e 100644
--- a/.github/workflows/ci.yaml
+++ b/.github/workflows/ci.yaml
@@ -911,14 +911,14 @@ jobs:
     uses: ./.github/workflows/build-chocolatey.yml
     needs: [ build-sdk-package ]
     with:
-      version: 3.6.0-local # TODO: FIX THIS
+      version: 3.6.0-SNAPSHOT # Fake version, used only for choco tests
       url    : https://api.github.com/repos/scala/scala3/actions/artifacts/${{ needs.build-sdk-package.outputs.win-x86_64-id }}/zip
       digest : ${{ needs.build-sdk-package.outputs.win-x86_64-digest }}

   test-chocolatey-package:
     uses: ./.github/workflows/test-chocolatey.yml
     with:
-      version : 3.6.0-local # TODO: FIX THIS
+      version : 3.6.0-SNAPSHOT # Fake version, used only for choco tests
       java-version: 8
     if: github.event_name == 'pull_request' && contains(github.event.pull_request.body, '[test_chocolatey]')
     needs: [ build-chocolatey-package ]
diff --git a/.github/workflows/test-chocolatey.yml b/.github/workflows/test-chocolatey.yml
index b6ca9bf74b12..e302968b9129 100644
--- a/.github/workflows/test-chocolatey.yml
+++ b/.github/workflows/test-chocolatey.yml
@@ -21,7 +21,10 @@ on:

 env:
   CHOCOLATEY-REPOSITORY: chocolatey-pkgs
-  DOTTY_CI_INSTALLATION: ${{ secrets.GITHUB_TOKEN }}
+  # Controls behaviour of chocolatey{Install,Uninstall}.ps1 scripts
+  # During snapshot releases it uses a different layout and requires an access token for GH Actions artifacts
+  # During stable releases it uses publicly available archives
+  DOTTY_CI_INSTALLATION: ${{ endsWith(inputs.version, '-SNAPSHOT') && secrets.GITHUB_TOKEN || '' }}

 jobs:
   test:

From e3b2838214fe528b87341b7f4ca90875e07d4dec Mon Sep 17 00:00:00 2001
From: Wojciech Mazur
Date: Wed, 26 Feb 2025 10:41:12 +0100
Subject: [PATCH 368/371] Add changelog for 3.6.4-RC2

---
 changelogs/3.6.4-RC2.md | 13 +++++++++++++
 1 file changed, 13 insertions(+)
 create mode 100644 changelogs/3.6.4-RC2.md

diff --git a/changelogs/3.6.4-RC2.md b/changelogs/3.6.4-RC2.md
new file mode 100644
index 000000000000..1edfad6321ee
--- /dev/null
+++ b/changelogs/3.6.4-RC2.md
@@ -0,0 +1,13 @@
+# Reverted changes
+
+- Revert "Drop phase.isTyper use in isLegalPrefix/asf" from Scala 3.6.4 [#22653](https://github.com/lampepfl/dotty/pull/22653)
+
+# Contributors
+
+Thank you to all the contributors who made this release possible 🎉
+
+According to `git shortlog -sn --no-merges 3.6.4-RC1..3.6.4-RC2` these are:
+
+```
+     5  Wojciech Mazur
+```

From 9d7f439403feb3c87b98bf26892600a5038c92ac Mon Sep 17 00:00:00 2001
From: Wojciech Mazur
Date: Wed, 26 Feb 2025 10:42:08 +0100
Subject: [PATCH 369/371] Release 3.6.4-RC2

---
 project/Build.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/project/Build.scala b/project/Build.scala
index 39893a95633f..d20bcff08ce6 100644
--- a/project/Build.scala
+++ b/project/Build.scala
@@ -118,7 +118,7 @@ object Build {
    * During release candidate cycle incremented by the release officer before publishing a subsequent RC version;
    * During final, stable release is set exactly to `developedVersion`.
    */
-  val baseVersion = s"$developedVersion-RC1"
+  val baseVersion = s"$developedVersion-RC2"

   /** The version of TASTY that should be emitted, checked in runtime test
    * For defails on how TASTY version should be set see related discussions:

From 4377e99063c4bf7cbe81cac91dd4dff1a476f63c Mon Sep 17 00:00:00 2001
From: Wojciech Mazur
Date: Wed, 5 Mar 2025 23:03:29 +0100
Subject: [PATCH 370/371] Add changelog for 3.6.4

---
 changelogs/3.6.4.md | 168 ++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 168 insertions(+)
 create mode 100644 changelogs/3.6.4.md

diff --git a/changelogs/3.6.4.md b/changelogs/3.6.4.md
new file mode 100644
index 000000000000..43d2502f7218
--- /dev/null
+++ b/changelogs/3.6.4.md
@@ -0,0 +1,168 @@
+
+# Highlights of the release
+
+- Support for JDK 24 [#22250](https://github.com/scala/scala3/pull/22250)
+- REPL `:silent` command to toggle automatic printing of outputs [#22248](https://github.com/scala/scala3/pull/22248)
+- REPL `--repl-init-script:` setting to run code on startup [#22206](https://github.com/scala/scala3/pull/22206)
+- Deprecated setting `-Xno-decode-stacktraces` is now an alias to `-Xno-enrich-error-messages` [#22208](https://github.com/scala/scala3/pull/22208)
+- Annotation arguments are no longer lifted [#22035](https://github.com/scala/scala3/pull/22035)
+- Experimental Capture Checking: Implement tracked members [#21761](https://github.com/scala/scala3/pull/21761)
+
+## Breaking changes
+
+- Align `@implicitNotFound` and `@implicitAmbiguous` with the language specification [#22371](https://github.com/scala/scala3/pull/22371)
+
+  This change may impact users who previously passed variables or interpolated strings to these annotations.
+
+  Previously, a bug in the Scala 3 compiler allowed non-literal strings to be passed as arguments to the `@implicitNotFound` and `@implicitAmbiguous` annotations.
+  This could have affected how failed implicit search results were reported by the compiler.
+
+  Starting from Scala 3.6.4, the arguments for these annotations must be string literals.
+  If a message is too long, it can be concatenated using the `+` operator, allowing for constant folding.
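+
+  For illustration, here is a minimal sketch of the rule described above; the `Show` type class and its message text are hypothetical and not taken from the linked PR. A long message can still be assembled from string literals, since the concatenation is constant-folded into a single literal:
+
+  ```scala
+  import scala.annotation.implicitNotFound
+
+  // Hypothetical type class, used only to illustrate the literal-only rule.
+  @implicitNotFound(
+    "No Show[${A}] instance was found. " +
+    "Define or import a given Show[${A}] to render values of this type."
+  )
+  trait Show[A]:
+    def show(a: A): String
+  ```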
+
+# Other changes and fixes
+
+## Annotations
+
+- Make sure symbols in annotation trees are fresh before pickling [#22002](https://github.com/scala/scala3/pull/22002)
+- Consider all arguments in Annotations.refersToParamOf [#22001](https://github.com/scala/scala3/pull/22001)
+- Do not lift annotation arguments (bis) [#22046](https://github.com/scala/scala3/pull/22046)
+
+## Desugaring
+
+- Fix #22051: only trust the type application part for case class unapplies [#22099](https://github.com/scala/scala3/pull/22099)
+
+## Documentation
+
+- Update example code linked to obsolete content in macros-spec.md [#22256](https://github.com/scala/scala3/pull/22256)
+
+## Experimental: Capture Checking
+
+- Fix #21868, #21869, and #21870: handle CapsOf in more places [#21875](https://github.com/scala/scala3/pull/21875)
+- Refine rules for capture parameters and members [#22000](https://github.com/scala/scala3/pull/22000)
+- Add a hint for using CC with REPL [#22220](https://github.com/scala/scala3/pull/22220)
+- Consolidate CC [#21863](https://github.com/scala/scala3/pull/21863)
+
+## Experimental: Global Initialization
+
+- Fix crash when initializing val in ByName closure [#22354](https://github.com/scala/scala3/pull/22354)
+
+## Experimental: Named Tuples
+
+- Handle TypeProxy of Named Tuples in unapply [#22325](https://github.com/scala/scala3/pull/22325)
+- Fail more eagerly when trying to adapt named unapply patterns [#22315](https://github.com/scala/scala3/pull/22315)
+- Widen singleton types when computing fields from .Fields [#22149](https://github.com/scala/scala3/pull/22149)
+- Fix .toTuple insertion [#22028](https://github.com/scala/scala3/pull/22028)
+
+## Extension Methods
+
+- Tweak ExtensionNullifiedByMember [#22268](https://github.com/scala/scala3/pull/22268)
+- Nowarn extension matching nonpublic member [#21825](https://github.com/scala/scala3/pull/21825)
+
+## Implicits
+
+- Rollback constraints in compareAppliedTypeParamRef [#22339](https://github.com/scala/scala3/pull/22339)
+- Try implicit searching after finding dynamic select [#22318](https://github.com/scala/scala3/pull/22318)
+
+## Linting
+
+- Allow discarding "Discarded non-Unit" warnings with `: Unit` [#21927](https://github.com/scala/scala3/pull/21927)
+
+## Match Types
+
+- Fix #21841: Check more that an `unapplySeq` on a `NonEmptyTuple` is valid. [#22366](https://github.com/scala/scala3/pull/22366)
+- Type avoidance in MT bound inference [#22142](https://github.com/scala/scala3/pull/22142)
+
+## Metaprogramming
+
+- Rethrow SuspendExceptions caught in CodeGen phase [#22009](https://github.com/scala/scala3/pull/22009)
+
+## Metaprogramming: Compile-time
+
+- Extend compiletime.testing.typechecks with certain transform phases [#21185](https://github.com/scala/scala3/pull/21185)
+
+## Nullability
+
+- Fix #21619: Refactor NotNullInfo to record every reference which is retracted once. [#21624](https://github.com/scala/scala3/pull/21624)
+
+## Presentation Compiler
+
+- Use new infer expected type for singleton completions [#21421](https://github.com/scala/scala3/pull/21421)
+- Fix match error in keyword completions [#22138](https://github.com/scala/scala3/pull/22138)
+
+## Reflection
+
+- Do not return java outline dummy constructor in `primaryConstructor` [#22104](https://github.com/scala/scala3/pull/22104)
+
+## Reporting
+
+- Normalise the types for Type Mismatch Error (E007) [#22337](https://github.com/scala/scala3/pull/22337)
+- Improve given search preference warning [#22189](https://github.com/scala/scala3/pull/22189)
+- Better error messages when an enum derives from AnyVal [#22236](https://github.com/scala/scala3/pull/22236)
+- Correctly print literal types in the refined printer [#22351](https://github.com/scala/scala3/pull/22351)
+
+## Rewrites
+
+- Undo patch of double-block apply [#21982](https://github.com/scala/scala3/pull/21982)
+
+## Scaladoc
+
+- Scaladoc: Add support for named tuples [#22263](https://github.com/scala/scala3/pull/22263)
+
+## Settings
+
+- Limit exposure to ConcurrentModificationException when sys props are replaced or mutated [#22180](https://github.com/scala/scala3/pull/22180)
+
+## Specification
+
+- Align the spec to allow the marker [#22323](https://github.com/scala/scala3/pull/22323)
+- Integrate the specification for match types. [#22164](https://github.com/scala/scala3/pull/22164)
+
+## Transform
+
+- Fix #22226: Use `classOf[BoxedUnit]` for Unit array in `ArrayConstructors`. [#22238](https://github.com/scala/scala3/pull/22238)
+
+## Typer
+
+- Fixes for isLegalPrefix change [#22241](https://github.com/scala/scala3/pull/22241)
+- Resolve name when named imp is behind wild imps [#21888](https://github.com/scala/scala3/pull/21888)
+
+# Contributors
+
+Thank you to all the contributors who made this release possible 🎉
+
+According to `git shortlog -sn --no-merges 3.6.3..3.6.4` these are:
+
+```
+    46  Martin Odersky
+    33  noti0na1
+    24  Wojciech Mazur
+    14  Dale Wijnand
+    13  Matt Bovel
+    11  Hamza Remmal
+     7  Jan Chyb
+     6  aherlihy
+     5  Kacper Korban
+     5  Seth Tisue
+     5  Som Snytt
+     4  Oliver Bračevac
+     4  Yichen Xu
+     3  Sébastien Doeraene
+     3  dependabot[bot]
+     3  kasiaMarek
+     2  João Ferreira
+     1  David Hua
+     1  Eugene Flesselle
+     1  Eugene Yokota
+     1  Florian3k
+     1  Jędrzej Rochala
+     1  Kenji Yoshida
+     1  Mathias
+     1  Natsu Kagami
+     1  Oleg Zenzin
+     1  Piotr Chabelski
+     1  Rui Chen
+     1  philippus
+     1  rochala
+     1  xiaoshihou
+```

From 6b2b8819191891f2057a1e11bda45035a4ca03ad Mon Sep 17 00:00:00 2001
From: Wojciech Mazur
Date: Wed, 5 Mar 2025 23:05:41 +0100
Subject: [PATCH 371/371] Release 3.6.4

---
 project/Build.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/project/Build.scala b/project/Build.scala
index d20bcff08ce6..60f8e4d87be1 100644
--- a/project/Build.scala
+++ b/project/Build.scala
@@ -118,7 +118,7 @@ object Build {
    * During release candidate cycle incremented by the release officer before publishing a subsequent RC version;
    * During final, stable release is set exactly to `developedVersion`.
    */
-  val baseVersion = s"$developedVersion-RC2"
+  val baseVersion = developedVersion

   /** The version of TASTY that should be emitted, checked in runtime test
    * For defails on how TASTY version should be set see related discussions: