Hey!

So, I've been practicing a lot of Graph Theory and related algorithms recently and trying out different interpretations and implementations. I wanted to refresh my memory of the time complexities of the important operations on different data structures. If anything I write below is wrong, please correct me. Where I have a doubt, I've added a (?) at the end of the comment.

-> Sorting using std::sort

Guaranteed to be O(N*log(N)) in the worst case since C++11 (typically implemented as introsort); before C++11 only average-case O(N*log(N)) was required (?)
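
For what it's worth, a minimal sketch of sorting with a custom comparator (the (vertex, weight) pairs are made up for illustration):

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<std::pair<int, int>> edges = {{3, 7}, {1, 4}, {3, 2}};

    // Sort by weight (second element); ties end up in no particular order
    // because std::sort is not stable (std::stable_sort is, at extra cost).
    std::sort(edges.begin(), edges.end(),
              [](const auto& a, const auto& b) { return a.second < b.second; });

    for (const auto& [v, w] : edges)
        std::cout << v << ' ' << w << '\n';
}
```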

-> std::vector, std::deque, std::stack, std::queue

--- Access - O(1) (random access for vector/deque; stack/queue only expose their top/front, also O(1))
--- push_back - amortized O(1). A vector keeps a capacity (returned by vector.capacity()) that is usually larger than vector.size(); the elements are reallocated and moved into a bigger block only when size() would exceed capacity(). Since the capacity grows by a constant factor on each reallocation (implementation-defined, typically 1.5x-2x), the occasional expensive copy averages out to O(1) per push_back (?) (see the sketch after this list)
--- pop_back - O(1)
--- push_front is amortized O(1) for std::deque because a deque is typically implemented as an array of pointers to fixed-size blocks with spare room at both ends, so pushing at the front just fills the current front block (or allocates a new one) instead of shifting every element.
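
A quick sketch of the capacity-vs-size behaviour described above (the growth factor you see depends on your standard library):

```cpp
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    std::size_t last_capacity = 0;

    // capacity() is the allocated space, size() the number of stored elements;
    // reallocation happens only when size() would exceed capacity().
    for (int i = 0; i < 100; ++i) {
        v.push_back(i);
        if (v.capacity() != last_capacity) {
            std::cout << "size " << v.size()
                      << " -> capacity " << v.capacity() << '\n';
            last_capacity = v.capacity();
        }
    }

    // If the final size is known in advance, reserve() avoids the
    // intermediate reallocations entirely.
    v.reserve(1000000);
}
```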

-> std::set, std::multiset and std::map

--- Operations guaranteed to be O(log(n))
--- Implemented as self-balancing binary search trees (red-black trees in libstdc++ and libc++); the standard mandates only the complexity guarantees, not the underlying structure (?)
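
A small sketch of the member lower_bound of std::set, which is handy for "nearest element" queries (the values are arbitrary):

```cpp
#include <iostream>
#include <iterator>
#include <set>

int main() {
    std::set<int> s = {1, 4, 9, 16};

    // First element >= 10, in O(log N).
    auto it = s.lower_bound(10);
    if (it != s.end())
        std::cout << "first >= 10: " << *it << '\n';              // 16

    // Largest element < 10: step back once from the same iterator.
    if (it != s.begin())
        std::cout << "largest < 10: " << *std::prev(it) << '\n';  // 9
}
```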

-> std::unordered_set, std::unordered_multiset and std::unordered_map

--- Operations are O(1) on average (collisions and rehashing make the worst case O(N), so it is not a hard guarantee)
--- Implemented using hashing
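
Incidentally, "load factor" is a hash-table term: it is size() / bucket_count(), and the table rehashes when it would exceed max_load_factor(). A small sketch (exact bucket counts are implementation-defined):

```cpp
#include <iostream>
#include <unordered_set>

int main() {
    std::unordered_set<int> s;
    for (int i = 0; i < 1000; ++i) s.insert(i);  // O(1) on average per insert

    // The occasional rehash is the expensive step that gets amortized away.
    std::cout << "size: "        << s.size()         << '\n'
              << "buckets: "     << s.bucket_count() << '\n'
              << "load factor: " << s.load_factor()  << '\n';

    // Like vector::reserve, reserving up front avoids intermediate rehashes.
    std::unordered_set<int> t;
    t.reserve(1000);
}
```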

-> std::priority_queue

--- Insertion - O(log(N))
--- Removal - O(log(N))
--- Retrieval - O(1)
--- Implemented as a binary heap (in the STL, on top of the underlying container via std::make_heap/push_heap/pop_heap). Asymptotically it matches a balanced binary tree for insertion/removal, but it gives O(1) access to the top, O(N) construction from a range, and smaller constant factors, at the cost of not supporting search or ordered iteration; it is also easier to implement.
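
A minimal sketch of the usual min-heap setup for (distance, vertex) pairs, e.g. in Dijkstra (the values are made up):

```cpp
#include <functional>
#include <iostream>
#include <queue>
#include <utility>
#include <vector>

int main() {
    // std::priority_queue is a max-heap by default; std::greater makes it a
    // min-heap, so top() is always the pair with the smallest distance.
    using State = std::pair<long long, int>;  // (distance, vertex)
    std::priority_queue<State, std::vector<State>, std::greater<State>> pq;

    pq.push({7, 2});    // O(log N) insertion
    pq.push({3, 5});
    pq.push({10, 1});

    while (!pq.empty()) {
        auto [d, v] = pq.top();  // O(1) retrieval
        pq.pop();                // O(log N) removal
        std::cout << "vertex " << v << " at distance " << d << '\n';
    }
}
```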

-> __gnu_pbds::tree

--- find_by_order - O(log(N))
--- order_of_key - O(log(N))
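
The usual GCC-only typedef, in case it helps (null_type needs a reasonably recent GCC):

```cpp
#include <ext/pb_ds/assoc_container.hpp>
#include <ext/pb_ds/tree_policy.hpp>
#include <iostream>

using namespace __gnu_pbds;

// An order-statistics tree: behaves like std::set, but additionally supports
// find_by_order (k-th smallest) and order_of_key (rank of a value).
typedef tree<int, null_type, std::less<int>, rb_tree_tag,
             tree_order_statistics_node_update>
    ordered_set;

int main() {
    ordered_set s;
    for (int x : {10, 20, 30, 40}) s.insert(x);

    std::cout << *s.find_by_order(2) << '\n';  // 3rd smallest (0-indexed): 30
    std::cout << s.order_of_key(25)  << '\n';  // elements strictly < 25: 2
}
```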

-> std::reverse, std::copy, std::fill, std::is_sorted, etc...

--- O(N)

-> std::binary_search, std::lower/upper_bound, etc...

--- O(log(N)) on random-access ranges such as a sorted vector. With non-random-access iterators the number of comparisons is still O(log(N)), but advancing the iterators is linear, so for std::set/std::map use the member lower_bound/upper_bound instead (see the sketch below).
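
A small sketch of the random-access vs. node-based distinction (values arbitrary):

```cpp
#include <algorithm>
#include <iostream>
#include <set>
#include <vector>

int main() {
    std::vector<int> v = {1, 3, 5, 7, 9};  // must already be sorted
    std::set<int> s(v.begin(), v.end());

    // O(log N): random-access iterators let std::lower_bound jump halfway.
    auto vit = std::lower_bound(v.begin(), v.end(), 6);
    std::cout << *vit << '\n';  // 7

    // On a set, prefer the member function; the free std::lower_bound would
    // still do O(log N) comparisons but O(N) iterator steps.
    auto sit = s.lower_bound(6);
    std::cout << *sit << '\n';  // 7
}
```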