A robot moves in a plane, starting from the origin (0,0). The robot can move UP, DOWN, LEFT, or RIGHT by a given number of steps. A trace of the robot's movement looks like the following:
UP 5 DOWN 3 LEFT 3 RIGHT 2
The number after each direction is the step count. Write a program that computes the distance between the robot's final position and the origin after a sequence of movements. If the distance is a float, print the nearest integer.
Example:
If the following sequence of movements is given as input to the program:
UP 5 DOWN 3 LEFT 3 RIGHT 2
Then, the output of the program should be: 2
(The net displacement is 2 steps along one axis and 1 step along the other, so the distance is sqrt(2^2 + 1^2) = sqrt(5) ≈ 2.24, which rounds to 2.)
Hint:
If input data are supplied to the program, assume they are read from console input.
from math import sqrt


def robot_moves(moves):
    # Track the position as [vertical, horizontal], starting at the origin.
    pos = [0, 0]
    for move in moves:
        direction, steps = move.split()
        steps = int(steps)
        if direction.upper() == "UP":
            pos[0] += steps
        elif direction.upper() == "DOWN":
            pos[0] -= steps
        elif direction.upper() == "RIGHT":
            pos[1] += steps
        elif direction.upper() == "LEFT":
            pos[1] -= steps
    # Euclidean distance from the origin, rounded to the nearest integer.
    return round(sqrt(pos[0] ** 2 + pos[1] ** 2))


def main():
    moves = ("UP 5", "DOWN 3", "LEFT 3", "RIGHT 2")
    dist = robot_moves(moves)
    print(dist)


if __name__ == '__main__':
    main()
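The sample solution above hardcodes the movement sequence. Since the hint calls for console input, a minimal sketch that reads the whole trace from stdin is shown below; the assumption that directions and step counts alternate on a single whitespace-separated line (e.g. "UP 5 DOWN 3 LEFT 3 RIGHT 2") is mine, not part of the original exercise.

from math import sqrt


def main():
    # Read one line such as "UP 5 DOWN 3 LEFT 3 RIGHT 2" from the console.
    tokens = input().split()
    x = y = 0
    # Pair each direction with the step count that follows it.
    for direction, steps in zip(tokens[::2], tokens[1::2]):
        steps = int(steps)
        if direction.upper() == "UP":
            y += steps
        elif direction.upper() == "DOWN":
            y -= steps
        elif direction.upper() == "RIGHT":
            x += steps
        elif direction.upper() == "LEFT":
            x -= steps
    # Distance from the origin, rounded to the nearest integer.
    print(round(sqrt(x ** 2 + y ** 2)))


if __name__ == '__main__':
    main()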