Why object-oriented instead of class-oriented?
I understand that in object-oriented programming, an object is an instance of a class.
If it's an instance, I don't understand why the term "object" is needed at all (we could just say "instance").
Wouldn't it be more practical to say that this programming paradigm is class-oriented and that classes can be instantiated or not?
4 answers
As with anything computer-science-related that dates back to the 1960s and 70s, things just happened on a whim. Everything was new and highly experimental back then. Nobody knew the best way to write or design programs, how to organize them, or which coding styles were most readable. The whole field of computer science had just been invented.
According to Wikipedia, the term "class" probably originates from the Simula language, a very old experimental language from the 1960s.
Notably "class", "object" and "instance" are in themselves very broad and fuzzy words that in a general English context could mean anything. As are "variable", "structure", "subroutine", "function", "member", "property", "encapsulation" and so on. The pattern is clear: these are all originally incredibly vague and nondescript terms.
So basically, someone back in the 1960s or so picked a name for a term and it stuck. It was not based on some deep, far-sighted rationale about how computer programs would be designed in the distant future, 60 years later. Actual object orientation as we know it didn't really emerge until the 1980s-1990s, and when it did, it reused already existing terms.
Why object-oriented instead of class-oriented?
tl;dr
Because you can "do OOP" without classes.
Long answer
A class is one possible way to implement object-oriented programming. But it's not the only one.
OOP is a programming paradigm: a particular "style" or way of designing programs. First of all, let's remember that there is no single, unique, agreed-upon, controversy-free, canonical definition of OOP. But most people "usually (tend to) kinda agree" on three basic pillars:
- encapsulation: don't expose an object's internal details to other objects.
- inheritance: one object can acquire the properties and methods of another object.
- polymorphism: the ability to occur in several different forms.
As I said, there's no consensus on the correct definition. Some add a fourth pillar ("abstraction"), and we could debate the definition for hours, but that's beside the point. In case you're interested, there's already a post here with a more detailed discussion.
Anyway, you can achieve all those things (encapsulation, polymorphism, etc) without using classes.
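To make the encapsulation part concrete, here is a minimal sketch that hides data behind a plain function and a closure, with no class involved (the makeCounter name and its fields are invented for this example):

function makeCounter() {
  var count = 0; // private: only the functions below can see it

  return {
    increment: function () { count += 1; },
    value: function () { return count; }
  };
}

var counter = makeCounter();
counter.increment();
console.log(counter.value()); // 1
console.log(counter.count);   // undefined - the data itself is not exposed

The returned object exposes behaviour, but the count can only be reached through that behaviour, which is essentially what encapsulation asks for.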
As another example, JavaScript uses prototypes, rather than classes, to implement inheritance.
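A rough sketch of that, assuming nothing but Object.create and two purely illustrative objects:

// A plain object acting as the "base".
var shape = {
  describe: function () {
    return "a shape with area " + this.area();
  }
};

// An object whose prototype is `shape`: it inherits `describe`.
var rectangle = Object.create(shape);
rectangle.height = 2;
rectangle.width = 3;
rectangle.area = function () {
  return this.height * this.width;
};

console.log(rectangle.describe()); // a shape with area 6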
JavaScript has a way to define classes using the class keyword, but that's just syntactic sugar, because in the end they're all "special" functions:
class Rectangle {
  constructor(height, width) {
    this.height = height;
    this.width = width;
  }
}

console.log(typeof Rectangle); // function
You can also achieve an object-oriented design with languages that have no classes, such as C - see an example here.
an object is an instance of a class.
In a general way, an object is something (a structure, a container, etc) that contains data (fields/attributes/properties) and "behaviour" (usually in the form of methods, that manipulate the data).
In a more general way, an object can be a variable, data structure, function/method, etc. Basically, any region of memory that has a value and can be accessed by some identifier.
For languages that have classes, though, an object is usually defined as an instance of a class.
But different languages can have different definitions. For example, MDN defines a JavaScript object as "a collection of properties". And you can create an object without using classes (considering that the language doesn't actually have classes, I mean "without using the class keyword"):
var obj = {
  somefield: 1,
  somemethod: function () {
    console.log(this.somefield);
  }
};

obj.somemethod(); // 1
Python, on the other hand, defines an object as an "abstraction for data", which means that all data is represented by objects and relations between those objects (even a single number such as 1 is also an object).
C11, section 3.15, defines an object as a "region of data storage in the execution environment, the contents of which can represent values".
Anyway, some languages define an object as an instance of a class, but the actual definition is much broader and can vary from language to language. It's not restricted to classes.
A class is just a mechanism that some languages provide to achieve OOP goals (encapsulation, polymorphism, inheritance, etc). In languages that have classes, I recognize that they make things easier.
But that doesn't mean they're the only way to do it: you can do OOP without classes. That's why the paradigm isn't called "class-oriented programming".
Object and Class aren't necessarily the same thing. Back in the 1980s, when object-oriented programming started to be talked about by practicing software engineers writing real production code, the word "class" didn't come up much. That got popularized by C++ and Java, where class actually has a specific meaning.
In its basic form, object-oriented programming is a means of tying a group of subroutines to the template of a data structure. You could have different instances of these structures, and the routines appeared to belong to them. Instead of calling a routine to add 5 to some number, the object-oriented way is to tell the number to "go add 5 to yourself". The compiler would be aware of this. It would generate code to call the add-5 routine and pass it the specific data to act upon in a standard way.
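In today's notation, that idea might look roughly like this (a toy sketch in JavaScript, purely for illustration; the counter object and its add routine are made up):

// Free-standing routine: the caller picks the routine and passes the data.
function add(value, amount) {
  return value + amount;
}

// Object-oriented style: the data carries its routine, and we ask it to act.
var counter = {
  value: 0,
  add: function (amount) {
    this.value += amount; // `this` is the specific data being acted upon
  }
};

counter.add(5);             // "go add 5 to yourself"
console.log(counter.value); // 5
console.log(add(0, 5));     // 5 - same result, different style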
Note that the above doesn't require anything called a "class". Java and C++ made this basic concept popular with built-in support, added additional constructs around it, and used names like "class" and "method". Young-uns today who never had to trudge to school barefoot in the snow, uphill both ways, forget that these concepts weren't engraved on stone tablets handed to us via some mystic ritual.
Various names in computing were used in different ways, especially early on. Often what we have today is the names used by one popular implementation. That doesn't make it the only right way in the general world of computer science.
I understand that in object oriented programming, an object is an instance of a class.
Not necessarily. JavaScript has famously supported object-oriented programming since its creation, but did not have class syntax until 2015 (ES2015).
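For instance, a pre-2015 JavaScript "class" was typically just a constructor function plus a prototype, along these lines (a minimal sketch; the Point name is only an example):

// Pre-ES2015 style: a constructor function instead of a class declaration.
function Point(x, y) {
  this.x = x;
  this.y = y;
}

// Behaviour shared through the prototype.
Point.prototype.distanceTo = function (other) {
  var dx = this.x - other.x;
  var dy = this.y - other.y;
  return Math.sqrt(dx * dx + dy * dy);
};

var p = new Point(0, 0);
console.log(p.distanceTo(new Point(3, 4))); // 5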