Company: Databricks
Difficulty: medium
Codewriting Problem

Description

You are given an array of strings `startUpTimes` representing server startup times in 24-hour format, and an array of strings `shutDownTimes` of the same length, representing server shutdown times in 24-hour format. For the `i`th server (0-based), the startup time is `startUpTimes[i]` and the shutdown time is `shutDownTimes[i]`. You are also given a string `currentTime` representing the current time in 24-hour format. `shutDownTimes[i]` may be equal to `"None"`, meaning the `i`th server has not been shut down as of `currentTime`.

The cost of running each server is $1 per minute. Your task is to calculate the total amount you need to pay (in dollars) to run all servers, from the start of the first-started server up to `currentTime`.

Note: You are not expected to provide the most optimal solution, but a solution with time complexity no worse than `O(startUpTimes.length × MINUTES_IN_DAY)` will fit within the execution time limit.

Example
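One possible reading of the problem can be sketched as follows: each server accrues $1 for every minute between its startup time and its shutdown time (or `currentTime` if it was never shut down), and the answer is the sum over all servers. This sketch assumes all times fall within the same day and that each startup precedes its corresponding shutdown; the function name `total_cost` and the `to_minutes` helper are illustrative, not part of the original statement.

```python
def total_cost(startUpTimes, shutDownTimes, currentTime):
    # Convert an "HH:MM" 24-hour time string to minutes since midnight.
    def to_minutes(t):
        h, m = map(int, t.split(":"))
        return h * 60 + m

    now = to_minutes(currentTime)
    total = 0
    for start, end in zip(startUpTimes, shutDownTimes):
        # A server that was never shut down runs until currentTime.
        stop = now if end == "None" else to_minutes(end)
        total += stop - to_minutes(start)  # $1 per minute of runtime
    return total
```

For instance, a server running from `"12:00"` to `"12:30"` costs $30, and one started at `"13:00"` and still running at `currentTime = "14:00"` costs $60, for a total of $90. This runs in `O(startUpTimes.length)`, well within the stated `O(startUpTimes.length × MINUTES_IN_DAY)` bound.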