I have a collection of functions, each taking either 1 or 2 arguments. When iterating over the collection I check whether each element is a Function1 or a Function2. The problem is that I get type-erasure warnings, because I have to specify the parameter types for the check to compile (e.g. Function2[String,Int] -- see the example warnings below).
I understand that the JVM erases generic type parameters at compile time, so the Function2[...] I specify can't actually be checked at runtime. The thing is, I only want to test whether the function is a Function1 or a Function2; from that alone I already know what data types to pass to it. I tried declaring them as e.g. Function2[Any,Any,Any] and Function2[Object,Object,Object], but neither suppressed the warnings. Is there a way to pattern match on functions by parameter count without also asking the compiler to check the parameter types?
```
[warn] /home/ubuntu/aa-2-0/src/main/scala/hw.scala:818: non-variable type argument Any in type Any => Any is unchecked since it is eliminated by erasure
[warn] else if(col_data.isInstanceOf[Function1[Any,Any]]) mapped_data+= col -> col_data.asInstanceOf[Function1[Any,String]].apply(page)
[warn]                                                                                  ^
[warn] /home/ubuntu/aa-2-0/src/main/scala/hw.scala:819: non-variable type argument Any in type (Any, Any) => Any is unchecked since it is eliminated by erasure
[warn] else if(col_data.isInstanceOf[Function2[Any,Any,Any]]) mapped_data+= col -> col_data.asInstanceOf[Function2[Any,Any,String]].apply(result_row,page)
```
I know there is reflection, type tags (`ClassTag`/`TypeTag`), and all that, but those seem like ugly and overly elaborate hacks for my rather trivial problem. Instead, I was planning to resort to wrapping these functions in FunctionOne and FunctionTwo case classes and then just matching on those, but wanted to first see if there is a more elegant solution to be learned.
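For reference, the wrapper fallback I have in mind would look roughly like this. The `FunctionOne`/`FunctionTwo` names are from my plan above; the sealed trait and the sample data are sketch assumptions:

```scala
// Sketch of the fallback plan: wrap each function in a case class that
// records its arity, so the pattern match inspects the wrapper class
// rather than the erased function type -- no erasure warnings.
sealed trait ColFn
case class FunctionOne(f: Any => String) extends ColFn
case class FunctionTwo(f: (Any, Any) => String) extends ColFn

val fns: Map[String, ColFn] = Map(
  "title" -> FunctionOne(p => p.toString.toUpperCase),
  "cell"  -> FunctionTwo((row, p) => s"$row:$p")
)

val page = "home"
val result_row = 42

val mapped_data = fns.map {
  case (col, FunctionOne(f)) => col -> f(page)
  case (col, FunctionTwo(f)) => col -> f(result_row, page)
}
```

It works, but it forces every call site to wrap its functions, which is the boilerplate I was hoping to avoid.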