Hi,
I was given the following question as an assessment, but unfortunately I was unable to answer it. Can someone help?
The question is:
The function given below is supposed to retrieve the substring between and including the ith and jth characters of the input string s. When i or j is out of range, i.e. i, j < 0 or i, j >= s.length(), the function returns the string "<error>". The following table shows the expected and actual return values of the function for various inputs (some i/j cells were illegible in the original and are marked "?"):
Test case | s        | i  | j | Expected return value | Actual return value
----------+----------+----+---+-----------------------+--------------------
1         | "quick"  | 0  | 2 | "qui"                 | "qui"
2         | "brown"  | ?  | ? | "ro"                  | ""
3         | "fox"    | ?  | 9 | "<error>"             | "<error>"
4         | "jumped" | -1 | ? | "<error>"             | "<error>"
5         | "lazy"   | ?  | ? | ?                     | ?
6         | "dog"    | ?  | ? | "o"                   | "o"
Unfortunately, the function returns the wrong value for the 2<sup>nd</sup> and 5<sup>th</sup> test cases.
C# Version:
string SubString(string s, int i, int j)
{
    int k;
    string result;
    try
    {
        k = s.Length;
        if (i < 0 || j < 0 || i >= k || j >= k)
            throw new Exception();
        result = s.Substring(i, j + 1 - i);
    }
    catch (Exception e)
    {
        return "<error>";
    }
    return result;
}
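To probe when this logic goes wrong, here is a rough Java transliteration of the C# function (my own sketch, not the assessment code; Java's substring(begin, end) takes an exclusive end index, so the equivalent call is substring(i, j + 1), matching the C# Substring(i, j + 1 - i)). Since some i/j cells in the table are illegible, the argument values below are my guesses at the failing pattern; they suggest the trouble starts when i > j:

```java
// Hypothetical Java transliteration of the C# function above (not the
// original assessment code), used to probe when it misbehaves.
public class SubStringDemo {
    static String subString(String s, int i, int j) {
        try {
            int k = s.length();
            if (i < 0 || j < 0 || i >= k || j >= k)
                throw new Exception();
            // Java's substring(begin, end) excludes end, so end = j + 1;
            // this mirrors the C# s.Substring(i, j + 1 - i).
            return s.substring(i, j + 1);
        } catch (Exception e) {
            return "<error>";
        }
    }

    public static void main(String[] args) {
        System.out.println(subString("quick", 0, 2)); // "qui", as expected
        System.out.println(subString("brown", 2, 1)); // i = j + 1: zero-length substring, returns ""
        System.out.println(subString("brown", 3, 1)); // i > j + 1: substring throws, returns "<error>"
        System.out.println(subString("fox", 0, 9));   // j out of range: "<error>", as specified
    }
}
```

In other words, whenever i > j this logic returns "" (for i == j + 1) or "<error>" (for i > j + 1) instead of the characters between positions j and i, which would explain the failing "brown" row if its arguments were, say, i = 2 and j = 1.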
a) Under what general conditions does the function return the wrong value?
b) Add some code to fix the function. Can you improve it? If so, explain why your version is better.
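For what it's worth, one way to attempt part b) — assuming the failures really are the i > j cases — is to validate the inputs up front and treat i and j order-independently, instead of throwing a generic Exception for control flow. A sketch in Java (the class name and the swap-based reading of "between and including" are my assumptions, not part of the assessment):

```java
public class SubStringFixed {
    // Sketch of a fix: validate inputs directly and treat i and j
    // order-independently (assumed reading of "between and including
    // the ith and jth characters").
    static String subString(String s, int i, int j) {
        if (s == null || i < 0 || j < 0 || i >= s.length() || j >= s.length())
            return "<error>";
        int lo = Math.min(i, j);
        int hi = Math.max(i, j);
        return s.substring(lo, hi + 1); // end index is exclusive in Java
    }

    public static void main(String[] args) {
        System.out.println(subString("brown", 2, 1));   // "ro", whichever order i and j arrive in
        System.out.println(subString("jumped", -1, 3)); // "<error>"
    }
}
```

This version arguably improves on the original because it never uses an exception for ordinary control flow, has no unused locals (k, result, e), and handles a null s explicitly rather than relying on the catch-all handler.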