Ranked soft sets

This article defines ranked soft sets and establishes their fundamental theory. This new model of uncertain knowledge is a non‐numerical yet powerful improvement of soft sets. The model relies on a qualitative improvement of the basic parameterized description posed by soft sets. We define relations between ranked soft sets and some existing models (N‐soft sets, fuzzy soft sets, probabilistic soft sets) that enhance the soft set spirit with the help of additional quantities. Primary contributions to their development include set‐theoretic operations and representation theorems, at a theoretical level; and scores and aggregation operators, at a practical level. Finally we design a multi‐person decision‐making strategy for data in the form of ranked soft sets that takes advantage of these elements.

to define bipolar fuzzy sets, whose spirit led to bipolar soft sets (Mahmood, 2020). It also prompted Kamacı and Petchimuthu (2020) to design bipolar N-soft sets. And Ramot et al. (2002) introduced complex fuzzy sets, which subsequently motivated the appearance of mixed models like complex Fermatean fuzzy N-soft sets and complex picture fuzzy N-soft sets. Also, bipolar complex fuzzy soft sets (Mahmood et al., 2022) and complex bipolar fuzzy N-soft sets (Farooq et al., 2022) were soon defined.
In continuation of these research efforts, ranked soft sets will be defined in this paper. They are useful as an intermediate model that is less demanding than N-soft sets or fuzzy soft sets. This means that ranked soft sets can operate under more general informational bases. At the same time, they are a natural improvement of soft sets, whereby we are allowed to state that we are confident that certain alternatives satisfy (or do not satisfy) a property more than others. Hence, they are a qualitative, rather than quantitative, enhancement of the soft set spirit. In other words, contrary to the case of N-soft sets or fuzzy soft sets, no numbers are added to the basic description given by a soft set. Nevertheless, this description is improved with qualitative assessments: within the spirit of soft sets, a ranked soft set allows us to handle multiple levels of detail without resorting to numerical evaluations.
Studies from several fields motivate the new model. A first motivation is multi-scale information systems (Bittner & Stell, 2003; Wu & Leung, 2011). They help produce stratified rough sets attending to less or more detailed versions of data. In the context of hesitancy, Alcantud (2022a) has shown how the idea underlying our proposal produces ranked hesitant fuzzy sets. From the perspective of social choice theory, voting systems combining approval (i.e., yes/no opinions about an issue) and preferences (i.e., rankings) have been the subject of many interesting studies (Brams, 2008; Brams & Sanver, 2009; Sanver, 2010).
Ranked soft sets will be the subject of both theoretical and practical analyses in this article.
At a theoretical level, we investigate their relationships with other popular models, namely numerically enhanced generalizations of soft sets.
Fundamental elements (null, full ranked soft sets) and operations (union and intersection) are the germ of a set-theoretic analysis. Besides, we define when an N-soft set represents a ranked soft set. Then two representation theorems directly link ranked soft sets with N-soft sets in competing lines of thought.
At a practical level, one should reckon that some operators are needed to use ranked soft sets in applications. In relation with this goal, we discuss scores and aggregation operators. The latter tool is inspired by Alcantud et al. (2022), who have recently posed the problem of aggregation in the setting of N-soft sets. In order to provide flexible and reliable solutions for that problem, their main tool was an adapted form of the popular OWA (for 'ordered weighted averaging') aggregation operator (Yager, 1988). Here we shall show that in the context of ranked soft sets, OWA-like operators can also be defined with the help of this tool and the corresponding representation theorems. A sensitivity analysis clarifies the role of the weights in this procedure. Representation theorems, scores, and aggregation operators will be the building blocks of an adaptable strategy for group decision-making in the framework of ranked soft sets. The individual case will be the subject of a separate inspection.
The paper is divided into the following sections. Some preliminary concepts are recalled in Section 2. Section 3 introduces ranked soft sets. It also establishes relationships with the literature and gives examples. Section 4 provides theoretical background for the new model (inclusive of set-theoretic operations and representation theorems). Section 5 puts forward further elements for practical analyses of ranked soft sets, like scores and aggregation operators with a sensitivity analysis. They will take part in the group decision-making strategies produced in Section 6.
Our conclusions are given in Section 7.
In addition, we include a list of symbols at the end of the paper.

| PRELIMINARIES
Unless otherwise stated, henceforth $U = \{o_1, o_2, \ldots, o_p\}$ denotes a fixed set of $p \geq 1$ alternatives. Also, $E = \{t_1, t_2, \ldots, t_q\}$ denotes a fixed set of $q \geq 1$ parameters or attributes.
Definition 1. (Fatimah et al., 2018). Let $N$ be an integer number greater than 1. An $N$-soft set over $U$ is a triple $(F, E, N)$, where $G = \{0, 1, \ldots, N-1\}$ and $F$ is a mapping from $E$ to $2^{U \times G}$. It is further requested that $F$ satisfies that for each $t \in E$ and $o \in U$, exactly one pair $(o, g_t) \in U \times G$ exists for which $g_t \in G$ and $(o, g_t) \in F(t)$.
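To make the 'exactly one graded pair per object' requirement of Definition 1 concrete, here is a small Python sketch; the data and function names are our own illustrations, not notation from the paper:

```python
def is_n_soft_set(F, U, N):
    """Check Definition 1: F maps each attribute to a subset of U x G with
    G = {0, ..., N-1}, and every object of U receives exactly one grade."""
    G = set(range(N))
    for pairs in F.values():
        if any(o not in U or g not in G for (o, g) in pairs):
            return False                      # a pair escapes U x G
        if sorted(o for (o, _) in pairs) != sorted(U):
            return False                      # some object graded 0 or 2+ times
    return True

U = ["o1", "o2", "o3"]
F = {"t1": {("o1", 2), ("o2", 0), ("o3", 3)},   # a 4-soft set: grades in {0,1,2,3}
     "t2": {("o1", 1), ("o2", 1), ("o3", 0)}}
print(is_n_soft_set(F, U, 4))                   # True
print(is_n_soft_set(F, U, 3))                   # False: grade 3 is not admissible
```

Note that the same data fail to be a 3-soft set only because one grade leaves the admissible range; the uniqueness condition is independent of $N$.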
When $N = 2$, we obtain a soft set (Molodtsov, 1999). This concept is usually defined as a pair $(F', E)$ where $F'$ is a mapping $F' : E \to \mathcal{P}(U)$. It is concisely presented as the set of pairs $\{(t, F'(t)) \mid t \in E\}$.
To visualize the embedding of soft sets into N-soft sets, it is convenient to reformulate the latter model in the following alternative manner:
Definition 2. Let $\Gamma(U)$ consist of all evaluation functions associating admissible grades to the objects in $U$, that is, all mappings from $U$ to $G = \{0, 1, \ldots, N-1\}$. Then an $N$-soft set over $U$ can be regarded as a pair $(F, E)$ where $F$ is a mapping $F : E \to \Gamma(U)$.
For the next concept we shall need $\mathcal{D}(U)$, that is, the set which consists of all probability distributions on $U$. It is a subset of $\mathcal{F}(U)$, which is the set which consists of all fuzzy sets on $U$.
Definition 3. (Fatimah et al., 2019; Zhu and Wen, 2010). A probabilistic soft set over $U$ is a pair $(D, E)$ where $D$ is a mapping from $E$ to the set of all probability distributions on $U$, that is, $D : E \to \mathcal{D}(U)$.
Because $\mathcal{D}(U) \subseteq \mathcal{F}(U)$, probabilistic soft sets are a special type of fuzzy soft sets with a different semantic interpretation. Let us recall the structure of the latter model:
Definition 4. (Maji et al., 2001). A fuzzy soft set over $U$ is a pair $(\mu, E)$ where $\mu$ is a mapping from $E$ to the set of all fuzzy sets on $U$, that is, $\mu : E \to \mathcal{F}(U)$.
In Example 1, the agent can use a total of four ordered degrees, hence $N = 4$.
T A B L E 2 Tabular representation of the 4-soft set $(F_1, E, 4)$ in Example 1
Thus, for example, in the analysis of the fifth attribute $t_5$, $D_1(t_5)$ is a probability distribution on the elements of $U$ defined by the following rules: $o_1$ has probability 0.4, $o_3$ has probability 0.1, and $o_6$ has probability 0.5. The probability associated with the other three objects is 0.
Clearly, this mathematical object can be identified with a fuzzy soft set over U.
To define the OWA operator on $r$-dimensional vectors (Yager, 1988), we need $w = (w_1, \ldots, w_r)$, a non-negative weighting vector with the standard normalization condition $w_1 + \cdots + w_r = 1$. Then $\mathrm{OWA}_w(v_1, \ldots, v_r) = \sum_{i=1}^{r} w_i d_i$ (1), where $d_i$ is the $i$th largest member of the list of numbers $\{v_1, \ldots, v_r\}$.
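In Python, the OWA operator reads as follows; this is a minimal sketch, and the function name is ours:

```python
def owa(weights, values):
    """OWA operator (Yager, 1988): reorder the inputs decreasingly,
    then take the weighted sum with the fixed weighting vector."""
    if len(weights) != len(values):
        raise ValueError("weights and values must have the same length")
    if any(w < 0 for w in weights) or abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must be non-negative and sum to 1")
    d = sorted(values, reverse=True)   # d[i-1] is the i-th largest value
    return sum(w * di for w, di in zip(weights, d))

# The weight in position i always multiplies the i-th largest input:
print(owa([0.5, 0.3, 0.2], [3, 1, 2]))   # 0.5*3 + 0.3*2 + 0.2*1
```

The defining feature is that weights attach to positions in the reordered list, not to the inputs themselves; permuting the inputs never changes the output.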

| RANKED SOFT SETS
The new model that we shall introduce will be defined with the help of some auxiliary elements, namely, the ranked partitions of a set. Ranked partitions are a special type of corrupted ranked partitions, which are defined as follows:
Definition 5. The set of corrupted ranked partitions of $U$ is $\mathcal{R}^*(U) = \{(V_0, V_1, \ldots, V_k) \mid k \geq 0, \; V_0 \cup V_1 \cup \cdots \cup V_k = U, \; V_i \cap V_j = \emptyset \text{ whenever } i \neq j\}$. Each element $(V_0, V_1, \ldots, V_k)$ of $\mathcal{R}^*(U)$ is a corrupted ranked partition of $U$.
As hinted above, a more structured way of partitioning a set is based on ranked partitions:
Definition 6. The set of ranked partitions of $U$ is $\mathcal{R}(U)$, which consists of the corrupted ranked partitions $(V_0, V_1, \ldots, V_k)$ such that $V_1, \ldots, V_k$ are all nonempty.
By construction, $\mathcal{R}(U) \subseteq \mathcal{R}^*(U)$. A corrupted ranked partition of $U$ where all elements are nonempty, with the possible exception of $V_0$ which may be empty, becomes a ranked partition of $U$. If all elements are nonempty and we remove the order, that is, we keep $\{V_0, V_1, \ldots, V_k\}$ instead of the ordered list $(V_0, V_1, \ldots, V_k)$, then we have a standard partition of $U$. However, in corrupted ranked partitions, the empty set is allowed to appear at any position.
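These two notions are easy to check mechanically. The following Python sketch, with helper names of our own, represents a corrupted ranked partition as a list of sets, with position 0 playing the role of $V_0$:

```python
def is_corrupted_ranked_partition(blocks, universe):
    """Ordered list (V_0, V_1, ..., V_k) of pairwise disjoint subsets of the
    universe whose union is the whole universe; empty blocks may appear
    at any position."""
    universe = set(universe)
    if sum(len(b) for b in blocks) != len(universe):
        return False                 # some overlap or missing elements
    return set().union(*blocks) == universe

def is_ranked_partition(blocks, universe):
    """As above, but only V_0 is allowed to be empty."""
    return (is_corrupted_ranked_partition(blocks, universe)
            and all(len(b) > 0 for b in blocks[1:]))

U = {"x", "y", "z"}
print(is_ranked_partition([set(), {"x"}, {"y", "z"}], U))            # True
print(is_ranked_partition([{"x"}, set(), {"y", "z"}], U))            # False
print(is_corrupted_ranked_partition([{"x"}, set(), {"y", "z"}], U))  # True
```

The second example fails only because the empty set sits at a position other than the first; as a corrupted ranked partition it is perfectly valid.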
T A B L E 3 The tabular representation of the probabilistic soft set in Example 2
We are now ready to define the new model and study its relationship with existing models. We defer all remaining examples until Section 3.4 in order to avoid cluttering up the basic definitions and results with extraneous material.

| Introducing ranked soft sets
Definition 7. A ranked soft set over $U$ is a pair $(R, E)$ where $R$ is a mapping from $E$ to the set of all ranked partitions of $U$, that is, $R : E \to \mathcal{R}(U)$.
The semantic interpretation of a ranked soft set $(R, E)$ over $U$ is as follows. Like N-soft sets, fuzzy soft sets, and other models, it gives a parameterized description of the objects in $U$ according to the attributes in $E$. Unlike these models, however, these descriptions are not numerical.
Let us concentrate on the description of $U$ by a parameter $t \in E$. It consists of a ranked partition $(V_0, V_1, \ldots, V_k)$ of $U$. We interpret that the elements in $V_0$, if there are any, do not satisfy the property described by $t$. All the elements in $V_j$ ($j > 0$) satisfy this property with the same degree or strength, or alternatively, our confidence that they satisfy $t$ is the same and it is not null. When $o_j \in V_j$, $o_i \in V_i$, and $j > i$, we interpret that $o_j$ satisfies property $t$ to a larger degree or with more strength than $o_i$, or alternatively, we are more confident that $o_j$ satisfies $t$ than we are about $o_i$.
The next section shows how we can represent the new model efficiently.

| Abbreviated notation and tabular representation
A ranked soft set $(R, E)$ over $U$ will be represented by the following shorthand notation that captures the novel parameterized description of the universe of objects: $(R, E) = \{(t, (V^t_0, V^t_1, \ldots, V^t_{k(t)})) \mid t \in E\}$ (4). In this representation it is implicit that $(V^t_0, V^t_1, \ldots, V^t_{k(t)}) \in \mathcal{R}(U)$ for each $t \in E$, by Definition 7.
The level of $(R, E)$ represented by Equation (4) is $\max_{t \in E} k(t)$.
We shall also use a visual representation of $(R, E)$ by a table. The columns (by convention) are associated with the attributes in $E$. The column representing the evaluations under $t$ gives the various $V^t_i$ subsets for $i = 1, \ldots, k(t)$, ordered from top to bottom as $V^t_{k(t)}, \ldots, V^t_1$, if they exist (recall that we admit $k(t) = 0$). Under a bottom horizontal line we display $V^t_0$, which may be the empty set if there are no elements that do not satisfy $t$. Table 4 summarizes the tabular representation of a general ranked soft set. Section 3.4 below illustrates the abbreviated notation and representation by tables with some examples. Particularly, Example 4 gives a complete description of the elements that we use to describe ranked soft sets efficiently. On a technical front, Section 3.3.4 explains that any ranked soft set can be characterized by a family of mappings with a precise structure.

| Relationship with the literature
In this section we show how N-soft sets and fuzzy soft sets induce ranked soft sets by similar natural procedures. Because probabilistic soft sets are embedded into fuzzy soft sets, they also induce ranked soft sets through the application of the procedure for fuzzy soft sets. Both procedures ensure a direct relationship between the original and derived models.
We shall also investigate relationships with labelled partitions, a concept arising in the analysis of certain granular structures (namely, stratified rough sets).

| Relationships with soft sets
Ranked soft sets are a sophisticated, non-numerical enhancement of soft sets. Notice the following two natural relationships:
1. Every soft set $(F', E)$ produces a ranked soft set $(R(F'), E)$, which is defined as follows: for each $t \in E$, $R(F')(t) = (U \setminus F'(t), F'(t))$ when $F'(t) \neq \emptyset$, and $R(F')(t) = (U)$ otherwise. Except in the case when $(F', E)$ is the null soft set, the level of the ranked soft set $(R(F'), E)$ associated with $(F', E)$ is 1. Informally, there is at most one distinctive quality among the objects in $U$ when we consider a fixed attribute $t$: we consider that all elements that do not fail to satisfy $t$ must satisfy $t$ to the same extent. This is the natural adaptation of the spirit of soft sets to our ranked improvement.
2. Conversely, we can produce soft sets from any ranked soft set. Consider a ranked soft set $(R, E)$ described as in Equation (4). Then the soft set $(F(R), E)$ defined by $F(R)(t) = U \setminus V^t_0$ for each $t \in E$ is formed by all the elements for which it is not true that they fail to satisfy $t$. Notice that in this approximate description by $t$ we are gathering elements for which evidence may exist that their degrees of satisfaction are different, as captured by the ranked partition $R(t)$. In conclusion, this transformation comes at the cost of a loss of information, unless $(R, E)$ has level 1. In this case we can retrieve $(R, E)$ from $(F(R), E)$ without any informational loss.
T A B L E 4 The tabular representation of a ranked soft set $(R, E)$ represented by Equation (4). The subsets $V^{t_1}_0, \ldots, V^{t_q}_0$ at the bottom line may be empty. All other subsets above that line are non-empty. However, at any given column $t_i$, there may be no subset above the bottom line (case $k(t_i) = 0$).
We proceed to extend this discussion to the case of N-soft sets. Preliminarily, Section 3.3.2 presents a procedure that defines the ranked soft set induced by an N-soft set. We shall return to the converse problem in Section 4.2, in the framework of a comprehensive theoretical analysis of ranked soft sets.

| The ranked soft set induced by an N-soft set
Without loss of generality, in this section we assume that N-soft sets are expressed in the more convenient form of Definition 2.
The next result shows how any N-soft set produces a (uniquely defined) ranked soft set that is closely related to it. Of course, this construction does not always produce ranked soft sets with level 1, as was the case for soft sets.
Proposition 1. Let $(F, E, N)$ be an N-soft set over $U$. Then there exists a ranked soft set $(R^F, U)$ such that if we write $R^F(t) = (V^{t,F}_0, V^{t,F}_1, \ldots, V^{t,F}_{k(t)})$, then for each $t \in E$ one obtains: $V^{t,F}_0 = \{o \in U \mid F(t)(o) = 0\}$, two objects belong to the same $V^{t,F}_j$ exactly when $F(t)$ assigns them the same evaluation, and $o \in V^{t,F}_j$, $o' \in V^{t,F}_i$ with $j > i$ exactly when $F(t)(o) > F(t)(o')$.
Proof. The proof is constructive. For each $t \in E$, we proceed recursively: we first collect the objects whose evaluation is zero, then the objects with the least non-zero evaluation, then those with the next non-zero evaluation. We continue in this fashion, and eventually the process will end. This will happen when all the objects in $U$ have been assigned. ▪
We say that $(R^F, U)$ constructed in Proposition 1 is the ranked soft set induced by the N-soft set $(F, E, N)$.
In simple terms, the argument of the proof of Proposition 1 goes as follows. First, $V^{t,F}_0$ selects all the objects whose evaluation under attribute $t$ is zero (if there are any). Then $V^{t,F}_1$ selects the objects for which $t$ is satisfied with the least non-zero evaluation (if there are any left). If we have not exhausted $U$, then $V^{t,F}_2$ selects the objects for which $t$ is satisfied with the second least non-zero evaluation (if there are any left). We continue in this way until we have picked out all the objects in $U$.
Observe that Proposition 1 can be applied to soft sets, too (case $N = 2$). In this case there are at most 2 steps for each attribute $t \in E$. First, $V^{t,F}_0$ selects the objects that do not satisfy $t$ (if there are any). Then $V^{t,F}_1$ selects the objects that satisfy $t$ (if there are any), because all the elements whose evaluations are not zero must be evaluated with a 1; hence they are all minimizers of the function $F(t) \in \Gamma(U)$ among the objects with non-zero evaluations.
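The recursive construction behind Proposition 1 boils down to grouping objects by their grades and listing the groups in increasing order of grade, with the zero-graded objects placed first. A Python sketch for a single attribute, with names of our own (`evaluation` plays the role of an evaluation function in $\Gamma(U)$):

```python
def induced_ranked_partition(evaluation):
    """One attribute of Proposition 1: `evaluation` maps each object of U to
    its grade in {0, ..., N-1}.  Returns the list (V_0, V_1, ..., V_k): V_0
    holds the zero-graded objects (possibly empty) and each subsequent block
    holds the objects sharing one non-zero grade, in increasing grade order."""
    v0 = {o for o, g in evaluation.items() if g == 0}
    nonzero = sorted({g for g in evaluation.values() if g > 0})
    return [v0] + [{o for o, g in evaluation.items() if g == grade}
                   for grade in nonzero]

# A 4-soft set evaluation: o2 and o4 share the top grade.
print(induced_ranked_partition({"o1": 0, "o2": 2, "o3": 1, "o4": 2}))
# The soft set case (N = 2) needs at most the two steps described above:
print(induced_ranked_partition({"a": 1, "b": 0, "c": 1}))
```

Only the relative order of the grades matters here; two evaluation functions with the same level sets induce the same ranked partition.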
Remark 2. Another telling aspect of Proposition 1 is that it helps to link ranked soft sets with other models extending N-soft sets.
Indeed, it is known that fuzzy N-soft sets (Akram et al., 2018), hesitant N-soft sets (Akram et al., 2019a), hesitant fuzzy N-soft sets (Akram et al., 2019b), and more sophisticated models extending N-soft sets can induce N-soft sets. Thus if these processes are followed by the application of Proposition 1, then we can induce ranked soft sets from these models too.

| The ranked soft set induced by a fuzzy soft set
Our next result shows how any fuzzy (or probabilistic) soft set generates a uniquely defined ranked soft set that is closely related to it.
Proposition 2. Let $(\mu, E)$ be a fuzzy soft set over $U$. Then there exists $(R^\mu, U)$, a ranked soft set, such that if we write $R^\mu(t) = (V^{t,\mu}_0, V^{t,\mu}_1, \ldots, V^{t,\mu}_{k(t)})$, then for each $t \in E$ one has: $V^{t,\mu}_0$ collects the objects with zero membership degree under $t$, two objects belong to the same $V^{t,\mu}_j$ exactly when their membership degrees under $t$ coincide, and blocks with higher index correspond to higher membership degrees. The proof is constructive; it is a simple replication of the proof of Proposition 1.
We say that $(R^\mu, U)$ constructed in Proposition 2 is the ranked soft set induced by the fuzzy soft set $(\mu, E)$.
In simple terms, the construction of $(R^\mu, U)$ from $(\mu, E)$ described by the proof of Proposition 2 goes as follows: first, $V^{t,\mu}_0$ selects all the objects whose membership degree under attribute $t$ is zero (if there are any). Then $V^{t,\mu}_1$ selects the objects for which $t$ is satisfied with the least non-zero membership degree (if there are any left). If we have not exhausted $U$, then $V^{t,\mu}_2$ selects the objects for which $t$ is satisfied with the second least non-zero membership degree (if there are any left). We proceed in the same fashion recursively, and we are finished once all the objects in $U$ have been selected.
As said above, Proposition 2 can also be applied to any probabilistic soft set $(D, E)$ because it can be identified with a fuzzy soft set due to the inclusion $\mathcal{D}(U) \subseteq \mathcal{F}(U)$. In this way we define $(R^D, U)$, the ranked soft set induced by the probabilistic soft set $(D, E)$.
Remark 3. Similarly to the case of Remark 2, which concerned extensions of the N-soft set model, Proposition 2 applies to models extending fuzzy soft sets too. For example, hesitant fuzzy soft sets induce fuzzy soft sets (Wang et al., 2014, Section 4). Thus if this process is followed by the application of Proposition 2, then we can induce ranked soft sets from hesitant fuzzy soft sets easily.

| Relationships with labelled partitions

Bittner and Stell (2003) used labelled partitions to redefine equivalence relations. With them, they defined granular partitions, which may be treated as an extended form of equivalence relations allowing for more than two levels of detail. In turn, Wu and Leung (2011) employed labelled partitions to define block-labelled rough sets. With these remarkable antecedents in mind, we shall recall the basic concepts and produce relationships with the model defined here.
Suppose that $X$ and $K$ are sets. A surjective mapping $f : X \to K$ is called a $K$-labelled partition. The elements of $K$ are called cells or labels. It is well known that an equivalence relation can be characterized by a partition, consisting of blocks or equivalence classes. What $K$-labelled partitions add to this notion is the labelling of the blocks by cells. Intuitively, $f$ captures which elements of $X$ are housed in each of the cells. And surjectivity is a reflection of the idea that blocks must be nonempty, that is, cells are meaningful only when they are occupied by at least one element.
A partial mapping from $X$ to $K$ is an assignment such that for each $x \in X$, there exists at most one element in $K$ that is associated with $x$. Let us write $\pi(x) = k$ when $k$ is the element associated with $x$; then we summarize the expression of the partial mapping by $\pi : X \mapsto K$. Now the idea of a surjective partial mapping $\pi : X \mapsto K$ modifies the idea of partition by doing two things: it associates some elements to specific cells (in such a way that all cells contain elements), but some elements are not put into any cell.
It is easy to observe that every ranked soft set $(R, E)$ over $U$ is characterized by a family of (not necessarily surjective) mappings from $U$ to sets of labels: for each $t \in E$, let $\pi_t(o) = j$ exactly when $o \in V^t_j$. Clearly, the family of mappings $\{\pi_t \mid t \in E\}$ permits us to retrieve $(R, E)$ by a similar token.
Conversely, suppose that we have a family of partial mappings from $U$ to sets of labels, one for each attribute. The proof of Proposition 1 explains how we can produce a ranked soft set from this information.

| Examples
Our first example in this section illustrates the utilization of abbreviated notation and tabular representations for a natural visualization of ranked soft sets. We also compute levels of ranked soft sets. The derivation of the respective soft sets encapsulating a part of the information given by the ranked soft sets defined here is studied separately in Example 5.
Example 4. Consider two ranked soft sets $(R_1, E)$ and $(R_2, E)$ over $U$, and let us describe them in the shorthand form given by Equation (4). In view of these descriptions, we observe that the levels of $(R_1, E)$ and $(R_2, E)$ are 2 and 3, respectively.
By reference to Table 4, Table 5 shows their respective tabular representations.
Example 5. In continuation of Example 4, let us compute the soft sets naturally induced by $(R_1, E)$ and $(R_2, E)$ using the process defined in Section 3.3.1. Some direct computations yield the respective soft sets.
The next example illustrates the concept and relationship that Proposition 1 has presented.
Example 6. Consider the 4-soft set $(F_1, E, 4)$ in Example 1. Then the tabular representation of the ranked soft set $(R^1, E)$ induced by $(F_1, E, 4)$ through the process described in Proposition 1 is shown in Table 6. We observe that $(R^1, E)$ has level 3.
The next example illustrates the concept and relationship that Proposition 2 has presented.
Example 7. Consider the probabilistic soft set $(D_1, E)$ in Example 2. Then the tabular representation of the ranked soft set $(R^2, E)$ induced by $(D_1, E)$ through the process described in Proposition 2 is shown in Table 7. We observe that $(R^2, E)$ has level 4.

| THEORETICAL ANALYSIS
A proper theoretical analysis of our new model requires the specification of the fundamental set-theoretic operations. We do that in Section 4.1.
Then Section 4.2 defines the concept of representation of a ranked soft set by an N-soft set. Also, it gives representation theorems whereby this relationship is guaranteed by two different processes.
T A B L E 5 The tabular representation of the ranked soft sets $(R_1, E)$ and $(R_2, E)$ in Example 4

| Set-theoretic operations
Remark 1 has explained that every corrupted ranked partition produces a ranked partition in a simple manner. We just need to remove the empty sets appearing at positions that are not the first one. We shall take advantage of this trivial construction to correctly define set-theoretic operations on ranked soft sets.
A complete set-theoretic analysis requires us to define some basic concepts. Two ranked soft sets over the same set of objects $U$ are equal when for each $t \in E$, $k_1(t) = k_2(t)$ and $V^{t,1}_j = V^{t,2}_j$ for all $j = 0, 1, \ldots, k_1(t)$. We shall also need to establish the concepts of a 'null' and a 'full' ranked soft set. We say that $\{(t, (U)) \mid t \in E\}$ is the null ranked soft set, and that $\{(t, (\emptyset, U)) \mid t \in E\}$ is the full ranked soft set. The null ranked soft set declares that no object satisfies any of the attributes. The full ranked soft set, however, declares that with respect to any fixed attribute, all objects are equally satisfactory. We can never tell apart objects by their respective degrees of satisfaction of an attribute, no matter how we measure this quality and what attribute we consider.
A more thorny issue is the correct construction of the intersection and union of ranked soft sets. The concepts that we shall define use Remark 1 to circumvent the problem that the natural intersection and union of ranked partitions produce corrupted ranked partitions.
To this purpose, let us consider two ranked soft sets $(R_1, E)$ and $(R_2, E)$ over $U$, written as $R_i(t) = (V^{t,i}_0, \ldots, V^{t,i}_{k_i(t)})$ for each $t \in E$ and $i = 1, 2$. We define two auxiliary mappings $R_\cap$ and $R_\cup$ from $E$ to the set of all corrupted ranked partitions of $U$ as follows: for each $t \in E$, $R_\cap(t) = (V^{t,\cap}_0, \ldots, V^{t,\cap}_{\min(k_1(t), k_2(t))})$ with $V^{t,\cap}_j = \bigcup_{\min(a,b) = j} (V^{t,1}_a \cap V^{t,2}_b)$, and $R_\cup(t) = (V^{t,\cup}_0, \ldots, V^{t,\cup}_{\max(k_1(t), k_2(t))})$ with $V^{t,\cup}_j = \bigcup_{\max(a,b) = j} (V^{t,1}_a \cap V^{t,2}_b)$.
Let $\bar{R}_\cap$ and $\bar{R}_\cup$ be the mappings from $E$ to $\mathcal{R}(U)$ defined by the application of Remark 1 to $R_\cap$ and $R_\cup$, respectively. We are ready to establish the rules producing intersections and unions of arbitrary ranked soft sets:

Definition 8. The intersection and union of $(R_1, E)$ and $(R_2, E)$ are the ranked soft sets $(\bar{R}_\cap, E)$ and $(\bar{R}_\cup, E)$ defined above, respectively.
The next two examples illustrate the application of Definition 8. In the first one, $R_\cap$ and $R_\cup$ can already be used to define ranked soft sets. The second shows why we need to resort to $\bar{R}_\cap$ and $\bar{R}_\cup$, that is, to Remark 1, in order to produce well-defined concepts in general.
Example 8. In continuation of Example 4, let us compute the intersection and union of $(R_1, E)$ and $(R_2, E)$ using the process defined in Definition 8. Some direct computations yield the results. We emphasize the fact that in this case, both $R_\cap = \bar{R}_\cap$ and $R_\cup = \bar{R}_\cup$ hold true. In other words, the intersection/union of the ranked partitions $(\{x, z\}, \{y, u, v\})$ and $(\{x\}, \{z, u\}, \{y, v\})$, and of $(\{z, v\}, \{x, y\}, \{u\})$ and $(\emptyset, \{x, y\}, \{z\}, \{u, v\})$, are in fact ranked partitions.
T A B L E 6 The tabular representation of the ranked soft set $(R^1, E)$ induced by $(F_1, E, 4)$ in Example 1 (cf., Example 6)
T A B L E 7 The tabular representation of the ranked soft set $(R^2, E)$ induced by $(D_1, E)$ in Example 2 (cf., Example 7)
Example 9. Suppose $U = \{x, y, z\}$ and $E = \{t, t'\}$. Let us compute the intersection and union of two ranked soft sets $(R_1, E)$ and $(R_2, E)$. Direct computations produce the auxiliary mappings $R_\cap$ and $R_\cup$. Using Remark 1 to remove the empty sets that appear at positions other than the first one, we get the mappings $\bar{R}_\cap : E \to \mathcal{R}(U)$ and $\bar{R}_\cup : E \to \mathcal{R}(U)$, with which we obtain the intersection and the union. In this case, neither $R_\cap = \bar{R}_\cap$ nor $R_\cup = \bar{R}_\cup$ holds true. We can neither use $R_\cap$ to define intersection, nor $R_\cup$ to define union.
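One plausible reading of the auxiliary mappings, which reproduces the partitions reported in Example 8, sends each object to the block indexed by the minimum (for the intersection) or maximum (for the union) of its block indices in the two ranked partitions, after which Remark 1 deletes the empty blocks at positions other than the first. A Python sketch under that reading, with a function name of our own:

```python
def combine(p1, p2, pick):
    """Blockwise combination of two ranked partitions of the same universe:
    each object lands in the block indexed by pick(i, j), where i and j are
    its block indices in p1 and p2 (pick=min: intersection, pick=max: union).
    Empty non-leading blocks are then removed (Remark 1)."""
    idx1 = {o: i for i, block in enumerate(p1) for o in block}
    idx2 = {o: j for j, block in enumerate(p2) for o in block}
    blocks = [set() for _ in range(pick(len(p1), len(p2)))]
    for o in idx1:
        blocks[pick(idx1[o], idx2[o])].add(o)
    return [blocks[0]] + [b for b in blocks[1:] if b]

# The second pair of ranked partitions discussed in Example 8:
p1 = [{"z", "v"}, {"x", "y"}, {"u"}]
p2 = [set(), {"x", "y"}, {"z"}, {"u", "v"}]
print(combine(p1, p2, min))   # the intersection: no empty blocks arise here
print(combine(p1, p2, max))   # the union: its leading block is empty
```

The leading block is always kept, even when empty, exactly as ranked partitions allow; only empty blocks at later positions are dropped.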

| Representations by N-soft sets, and representation theorems
Proposition 1 has explained how a ranked soft set can be generated from each N-soft set, in such a way that an intimate relationship exists between both elements. The target of this section is to explore the converse process, which is established as follows:
Definition 9. We say that the N-soft set $(F, E, N)$ represents the ranked soft set $(R, E)$ when $(R^F, U)$ constructed in Proposition 1 coincides with $(R, E)$.
As we shall show, representations by N-soft sets always exist. In point of fact we shall produce two representation theorems. Both give explicit constructions of N-soft sets representing a given ranked soft set.
Theorem 1. (First representation theorem). Let $(R, E)$ be a ranked soft set whose level is $l$. Then the mapping $F^R$ given by Equation (19), namely $F^R(t)(o) = j$ exactly when $o \in V^t_j$, defines an $(l+1)$-soft set representing $(R, E)$.
Proof. The claim can be checked directly. ▪
Theorem 2. (Second representation theorem). Let $(R, E)$ be a ranked soft set. Define $\tilde{p} = \max_{t \in E} |U \setminus V^t_0| = p - \min_{t \in E} |V^t_0|$. Then the $(\tilde{p}+1)$-soft set given by Equation (20) represents $(R, E)$.
Proof. The claim follows from tedious but routine computations. ▪
Let us illustrate the application of these two alternative representations with an example:
Example 10. Consider the case of the ranked soft sets $(R_1, E)$ and $(R_2, E)$ in Example 4 (see Table 5 for their tabular representations).
We first compute the representations of $(R_1, E)$ that Theorems 1 and 2 guarantee. Since the level of $(R_1, E)$ is $l = 2$, the first representation theorem constructs a 3-soft set representing it. A routine appeal to Equation (19) produces a representation by $(F^1, E, 3)$. Now because $|V^{t'}_0| > |V^t_0| = 1$, the second representation theorem constructs a 5-soft set representing $(R_1, E)$. Notice that in this case, $\tilde{p} = p - 1 = |U| - 1 = 4$. A routine appeal to Equation (20) produces an alternative representation by $(F^1, E, 5)$. Let us now focus on $(R_2, E)$. Since its level is $l = 3$, the first representation theorem constructs a 4-soft set representing $(R_2, E)$. By a routine application of Equation (19) we come up with a representation by $(F^2, E, 4)$. Now because $V^{t'}_0 = \emptyset$, the second representation theorem constructs a 6-soft set representing $(R_2, E)$. Indeed, in this case one has $\tilde{p} = p = |U| = 5$.
A routine application of Equation (20) produces an alternative representation by $(F^2, E, 6)$. Later on we shall give tabular descriptions of these representations in Tables 11 and 13, in the context of the problem posed in Problem 1 below.
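Reading Equation (19) as 'the grade of an object is the index of its block', Theorem 1 and Proposition 1 act as mutually inverse constructions attribute by attribute. The following Python sketch checks this round trip under that reading; the helper names are ours:

```python
def represent_t1(partition):
    """Equation (19), one attribute: each object gets its block index as
    grade, so a ranked partition of level k yields grades in {0, ..., k}."""
    return {o: j for j, block in enumerate(partition) for o in block}

def induced_ranked_partition(evaluation):
    """Proposition 1, one attribute: zero-graded objects first, then the
    remaining objects grouped by increasing grade."""
    v0 = {o for o, g in evaluation.items() if g == 0}
    nonzero = sorted({g for g in evaluation.values() if g > 0})
    return [v0] + [{o for o, g in evaluation.items() if g == grade}
                   for grade in nonzero]

# Round trip: representing and then inducing recovers the ranked partition.
partition = [{"x"}, {"z", "u"}, {"y", "v"}]
assert induced_ranked_partition(represent_t1(partition)) == partition
```

The converse composition need not be the identity: distinct N-soft sets, such as the outputs of Theorems 1 and 2, can induce one and the same ranked soft set.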
In relation with the competing representation theorems, the next two particular situations are worth mentioning: 1. When $(R, E)$ is a ranked soft set whose level is 1, the first representation theorem produces a soft set (or more precisely, a 2-soft set, which can be identified with a soft set). The output corresponds to the analysis performed in Section 3.3.1; hence the first representation theorem extends that observation. However, this is not the case for the second representation theorem.
2. When $(R, E)$ is a ranked soft set for which $|V^t_j| = 1$ for all possible $j$ and $t$, then both representation theorems produce the same output.
Example 11. Consider the case of a ranked soft set $(R, E)$ over $U = \{x, y, z\}$. Its level is 1, so it belongs to the first particular situation described above.
Also, $\tilde{p} = \max_{t \in E} |U \setminus V^t_0| = p = 3$.
According to the first representation theorem, a 2-soft set $(F^R, E, 2)$ on $U = \{x, y, z\}$ represents $(R, E)$. If we use the second representation theorem, we conclude that a 4-soft set $(F^R, E, 4)$ on $U = \{x, y, z\}$ represents $(R, E)$ too. Example 12. Consider the case of the ranked soft sets $(R_1, E)$ and $(R_2, E)$ in Example 9. Both are in the second particular situation described above.

| PRACTICAL ANALYSIS
This section contains several elements that can take ranked soft sets to a practical level. Scores and aggregation operators are of particular note in applications. We proceed to produce both tools in the next sections. Their practical implementation will become apparent in Section 6.

| Scores
Informally, scores are numerical assessments of the information acquired about an object or alternative. This tool appears to stand in contrast to ranked soft sets, whose main characteristic is a complete absence of numbers. In such uncharted waters, we shall circumvent the problem with a strategy that combines representation theorems (cf., Theorems 1 and 2) with scores of N-soft sets.
Central to the use of scores in N-soft set theory is the concept of extended weighted choice value (EWCV) of an alternative in an N-soft set.
It was defined in the founding paper by Fatimah et al. (2018), and it has been utilized in a multi-agent context too. To define it, we assume that a vector of weights $(w_1, \ldots, w_q)$ weighs the values of the parameters in $T$. We hence assume $w_i \geq 0$ for each $i$ plus $w_1 + \cdots + w_q = 1$.
Then the EWCV of $o_j$ in an N-soft set $(F, T, N)$, represented in the notation of Table 1, is $\sigma^w(o_j) = \sum_{i=1}^{q} w_i \, g_{ji}$ (22), where $g_{ji}$ denotes the grade of $o_j$ under the attribute $t_i$. We shall produce scores of ranked soft sets by the sequential application of a representation theorem, which produces an N-soft set that induces the original ranked soft set (Definition 9), and the EWCV. In practical terms, we resort to Equation (22) applied to the output of either Theorem 1 or Theorem 2. A general definition follows:
Definition 10. Let $(R, E)$ be a ranked soft set over $U$. We say that a mapping $s : U \to [0, +\infty)$ is a $w$-score of $(R, E)$ when there exists an N-soft set representing $(R, E)$ such that, for each $j = 1, \ldots, p$, $s(o_j)$ is the EWCV of $o_j$ in that N-soft set.
We emphasize that this general strategy is adaptable at two levels: one has a sample of N-soft sets representing the ranked soft set to choose from, and also the weights of the attributes can be adjusted. At any rate, these numerical evaluations of the alternatives allow us to rank them by a complete preorder in a natural manner.
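Reading the EWCV as the weighted sum of an alternative's grades across the attributes, a score computation and the resulting complete preorder are immediate; this Python sketch follows that reading, with names of our own:

```python
def ewcv(grades, weights):
    """Extended weighted choice value of one alternative: `grades` lists its
    grades under t_1, ..., t_q in an N-soft set, and `weights` is the
    non-negative attribute weight vector summing to 1."""
    if len(grades) != len(weights):
        raise ValueError("one weight per attribute is required")
    if any(w < 0 for w in weights) or abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("invalid weight vector")
    return sum(w * g for w, g in zip(weights, grades))

def rank_by_score(scores):
    """Order the alternatives from highest to lowest score (ties may occur),
    yielding the complete preorder mentioned above."""
    return sorted(scores, key=scores.get, reverse=True)

# Two attributes weighted (1/3, 2/3); higher grades pull the score up.
scores = {"o1": ewcv([0, 2], [1/3, 2/3]),    # 4/3
          "o2": ewcv([3, 1], [1/3, 2/3])}    # 5/3
print(rank_by_score(scores))                 # ['o2', 'o1']
```

Adjusting the weights, or swapping in the grades from a different N-soft set representation of the same ranked soft set, changes the scores but not the mechanics.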
For the time being, we fix a vector of weights $w = (w_1, \ldots, w_q)$ associated with the parameters in $T$.

| T1-weighted-scores
In this section we explore how we can implement Definition 10 through Theorem 1. We shall obtain a tool that permits us to rank the objects characterized by a ranked soft set from highest to lowest scores. Of course, ties might occur.
Definition 11. Let $(R, E)$ be a ranked soft set over $U$ whose level is $l$. Suppose that $(F^R, E, l+1)$ is the $(l+1)$-soft set defined by Equation (19). The T1-$w$-score of $(R, E)$ is the mapping $s^w_1 : U \to [0, +\infty)$ such that for each $j = 1, \ldots, p$, $s^w_1(o_j)$ is the EWCV of $o_j$ in $(F^R, E, l+1)$.
Let us illustrate Definition 11 with an example.
Example 13. Consider the case of the ranked soft set $(R_2, E)$ in Examples 4 and 10. Let us suppose that the attribute $t'$ is twice as important as $t$. Then we must use a vector of weights $w = (\frac{1}{3}, \frac{2}{3})$.
Using the first representation theorem, Example 10 produced $(F^2, E, 4)$, which represents $(R_2, E)$. Then Table 8 shows both its tabular representation and the EWCVs associated with $w$ by Equation (22). These figures are, by definition, the T1-$w$-scores of the objects in $U$. The alternatives are ranked accordingly by a complete preorder $\succsim_1$; hence we conclude $u \sim_1 v \succ_1 z \succ_1 y \succ_1 x$.

5.1.2 | T2-weighted-scores
This section explores the case where Theorem 2 implements Definition 10. We shall replicate the analysis of Section 5.1.1.
Definition 12. Let (R, E) be a ranked soft set over U. Suppose that (F_R, E, e_p + 1) is the (e_p + 1)-soft set defined by Equation (20). The T2-w-score of (R, E) is the mapping s_2^w : U → [0, +∞) such that for each j = 1, …, p, s_2^w(o_j) is the EWCV of o_j in (F_R, E, e_p + 1).

TABLE 8 Tabular representation of the 4-soft set (F₂, E, 4) in Examples 10 and 13, and the EWCVs associated with the alternatives when w = (1/3, 2/3).
Let us revisit Example 13 with this new perspective.
Example 14. In the situation of Example 13, we decide to use T2-w-scores instead of T1-w-scores.
Using the second representation theorem, Example 10 produced (F₂, E, 6), which represents (R₂, E). Then Table 9 shows both its tabular representation and the EWCVs associated with w by Equation (22). These figures are, by definition, the T2-w-scores of the objects in U. The alternatives are ranked accordingly by a complete preorder ≽₂; hence we conclude u ∼₂ v ≻₂ y ≻₂ z ≻₂ x.
Now we can compare the outputs of Examples 13 and 14, that is, u ∼₁ v ≻₁ z ≻₁ y ≻₁ x versus u ∼₂ v ≻₂ y ≻₂ z ≻₂ x. The same input, namely the ranked soft set (R₂, E) in Examples 4 and 10, has produced two very similar rankings. Both the top and bottom positions are occupied by the same objects. This is an indication of consistency in the general approach, which is flexible at two levels: the choice of the N-soft set representing the input, and the weights of the attributes.

5.2 | Aggregation operators
The setting in this section is a list of ranked soft sets (R₁, E), …, (R_r, E) over a common set of alternatives U. We wonder which ranked soft set could act as a reasonable aggregate representative of these ranked soft sets.
Our strategy is to combine representations of the ranked soft sets with aggregation operators for N-soft sets, which Alcantud et al. (2022) pioneered. A crucial contribution of their work is a suitably modified form of the OWA aggregation operator (Yager, 1988) defined in Equation (1).
Here we shall not need such a sophisticated adjustment, since we will only retain the order derived from the outputs, not the evaluations themselves. Hence our aggregation tool will be a standard OWA operator. An OWA operator weighs the evaluations according to their relative ordering. This is exactly the spirit of ranked soft sets, which explains our choice of this tool.
TABLE 9 Tabular representation of the 6-soft set (F₂, E, 6) in Examples 10 and 14, and the EWCVs associated with the alternatives when w = (1/3, 2/3).

Our strategy can be presented in the following manner. First, we represent the ranked information by (F₁, E, N), …, (F_r, E, N), a list of N-soft sets producing the inputs. We do this by either Theorem 1 or Theorem 2. Suppose that the resulting N-soft sets are defined as in Table 10. Then we apply Equation (1) cell by cell with a suitable weighting vector. This vector captures our attitude toward aggregation, which ranges from fully optimistic to totally pessimistic. Finally, for every fixed attribute, we use the figures obtained for all the alternatives under this attribute in order to rank them by decreasing values. Only the alternatives with zero aggregate value go to V_0^{t_j}. Obviously, when r = 1, that is, when we aggregate one ranked soft set only, our procedure returns that ranked soft set.
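The cell-by-cell aggregation step can be sketched as follows. This is a minimal illustration with hypothetical expert tables (not those of Problem 1); `owa` implements the standard OWA operator of Equation (1), which sorts the values in decreasing order before taking the weighted sum.

```python
def owa(values, weights):
    # Ordered weighted average (Yager, 1988): sort the values in
    # decreasing order, then take the weighted sum.
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Three hypothetical expert tables over U = {x, y} with attributes
# (t, t′); these are not the tables of Problem 1.
tables = [
    {"x": (0, 2), "y": (2, 1)},
    {"x": (1, 3), "y": (1, 2)},
    {"x": (0, 2), "y": (3, 2)},
]
w = (0.5, 0.25, 0.25)  # moderately optimistic attitude

# Aggregate each cell across the three tables.
aggregate = {
    o: tuple(owa([t[o][k] for t in tables], w) for k in range(2))
    for o in tables[0]
}
print(aggregate)  # {'x': (0.5, 2.5), 'y': (2.25, 1.75)}
```

Per attribute, ranking the alternatives by descending aggregate value, and sending the zero-valued alternatives to the bottom class, recovers a ranked soft set as described above.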
In the next two sections we set our sights on the formulations with the representations obtained by the two representation theorems proven in Section 4.2. They will be respectively called T1-OWA and T2-OWA aggregation operators, names coined by combining the terms 'first theorem' and 'OWA', or 'second theorem' and 'OWA'. Then we shall briefly discuss the scope of our approach to aggregation in Section 5.2.3, where we explain how alternative approaches to aggregation can be designed.
A particular instantiation of the problem may help us set our goals. Thus, we shall illustrate the application of our methodologies with the corresponding solutions to the following problem:

Problem 1. Consider the ranked soft sets (R₁, E) and (R₂, E) in Examples 4 and 10. Table 5 gives their tabular representations.
We assume in our stylized problem that two experts have submitted their assessments by means of (R₁, E) and (R₂, E). In addition, we shall use the ranked soft set (R₃, E) submitted by a third agent, which is also defined on U = {x, y, z, u, v}. A routine appeal to Equation (19) produces a representation of (R₃, E) by (F₃, E, 4), and a routine appeal to Equation (20) produces an alternative representation of (R₃, E) by (F₃, E, 5). Our problem consists of aggregating (R₁, E), (R₂, E) and (R₃, E).
To this purpose, we fix a vector of weights w = (1/2, 1/4, 1/4). These figures are used to weigh the evaluations in relation to their relative ordering as follows: in each case, the top evaluation weighs twice as much as the other two evaluations, which are equally important. Hence w captures a moderately optimistic attitude toward aggregation.
Let us proceed to describe the two variations of the general methodology suggested above. In both cases, we shall apply the mechanism to Problem 1.

5.2.1 | T1-OWA aggregation operators
In this case, the first step consists of the transformation of the ranked information into multinary data by the first representation theorem. Hence, we need (F₁, E, 3), (F₂, E, 4) and (F₃, E, 4), which are the respective representations of (R₁, E), (R₂, E) and (R₃, E), as seen in Problem 1. Their tabular representations are displayed in Table 11. A second step computes the cell-by-cell aggregation by the OWA operator associated with the vector of weights w = (1/2, 1/4, 1/4).
The resulting figures are shown in Table 12 (left). For example, to calculate the evaluation of y for the attribute t, we aggregate the vector of values (2, 1, 3) obtained from the corresponding cells in Table 11. We order the vector in decreasing order, (3, 2, 1), and compute (1/2)·3 + (1/4)·2 + (1/4)·1 = 9/4. Finally, for each attribute, we rank the elements by their descending aggregate values. Since x has a zero value under t, it is put in V_0^t. However, V_0^{t′} = ∅ because no element receives a zero value under t′. The resulting ranked soft set (R̃, E) is shown in Table 12 (right).

5.2.2 | T2-OWA aggregation operators

In this case, the first step transforms the ranked information by the second representation theorem; the tabular representations of the resulting N-soft sets are displayed in Table 13. In the second step we compute the cell-by-cell aggregation by the OWA operator associated with the vector of weights w = (1/2, 1/4, 1/4).
The resulting figures are shown in Table 14 (left). For example, to calculate the evaluation of u for the attribute t′, we aggregate the vector of values (3, 5, 4) obtained from the corresponding cells in Table 13. We order the vector in decreasing order, (5, 4, 3), and compute (1/2)·5 + (1/4)·4 + (1/4)·3 = 17/4. Finally, for each attribute, we rank the elements by their descending aggregate values. Since x has a zero value under t, it is put in V_0^t. However, V_0^{t′} = ∅ because no element receives a zero value under t′. The resulting ranked soft set (R̄, E) is shown in Table 14 (right). This is the ranked soft set obtained when we aggregate (R₁, E), (R₂, E) and (R₃, E) with the T2-OWA methodology.

TABLE 11 Tabular representation of (F₁, E, 3), (F₂, E, 4) and (F₃, E, 4) on U = {x, y, z, u, v} with the common set of attributes E = {t, t′}. Note: They are the respective representations of (R₁, E), (R₂, E) and (R₃, E), the input of Problem 1, by the first representation theorem.

TABLE 12 Left: the results of the aggregation of the values in Table 11.
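Both worked cell computations can be checked numerically with a standard OWA operator; the value vectors (2, 1, 3) and (3, 5, 4) and the weights w = (1/2, 1/4, 1/4) are taken from the text, and exact fractions avoid rounding.

```python
from fractions import Fraction

def owa(values, weights):
    # Ordered weighted average: sort the values in decreasing order,
    # then take the dot product with the weights.
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

w = (Fraction(1, 2), Fraction(1, 4), Fraction(1, 4))
print(owa([2, 1, 3], w))  # evaluation of y under t (T1-OWA step): 9/4
print(owa([3, 5, 4], w))  # evaluation of u under t′ (T2-OWA step): 17/4
```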
We emphasize the fact that (R̄, E) = (R̃, E).

5.2.3 | Brief discussion and some alternative approaches to aggregation

The approach to aggregation that we have discussed so far is adaptable for two reasons. First, we can select any N-soft sets that represent the original ranked soft sets: not only can we choose between Theorem 1 and Theorem 2, but we can also avail ourselves of representations obtained in any other fashion. Secondly, the weights permit us to adopt attitudes reflective of the needs of the decision-maker.
In order to provide benchmark aggregation operators, we have defined two precise methodologies in Sections 5.2.1 and 5.2.2. We have observed that, with a common vector of weights, both the T1-OWA and T2-OWA strategies produce the same solution to Problem 1. This speaks to the robustness of the general strategy, although more experiments could be done to test sensitivity to the choice of weights. It is easy to guess that the weights determine the solution to a certain extent, which we confirm in Section 5.2.4.
That being established, we emphasize that even if we keep to the use of N-soft sets to represent the ranked soft sets submitted by various experts, there is no need to use OWA operators to aggregate them. We can use, for example, weighted arithmetic or geometric means, or any other aggregator, since the figures are only important as a proxy for the relative fulfilment of attributes. In these cases, the weights play a different role: they account for experts having different levels of importance. So the choice is between implementing attitudes toward risk (optimistic/pessimistic evaluations) and the importance of the specialists.
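As a contrast with the OWA-based procedure, here is a minimal sketch of the two alternatives just mentioned. The weights now attach to the experts themselves, so no reordering of the values takes place; the grades and weights are illustrative.

```python
import math

def weighted_arithmetic(values, weights):
    # Expert-importance weights multiply each expert's grade directly;
    # unlike an OWA, the values are NOT sorted first.
    return sum(w * v for w, v in zip(weights, values))

def weighted_geometric(values, weights):
    # Weighted geometric mean; a grade of 0 forces the aggregate to 0.
    return math.prod(v ** w for v, w in zip(values, weights))

expert_weights = (0.5, 0.25, 0.25)  # the first expert counts double
grades = (2, 1, 3)                  # one cell across three experts
print(weighted_arithmetic(grades, expert_weights))  # 2.0
print(weighted_geometric(grades, expert_weights))
```

Because the figures only serve as a proxy for the relative fulfilment of attributes, either aggregate can be ranked per attribute exactly as in the OWA-based procedure.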
Of course, one could also resort to any other aggregation operator for N-soft sets that may be produced in the future. But we insist that, at this stage, the aggregation need not produce another N-soft set in order to achieve a ranked list of alternatives.

5.2.4 | Sensitivity analysis
The purpose of this section is to corroborate that the selection of weights influences the final result of the aggregation process. This is in fact quite valuable, since it confirms that the general methodology we have devised reacts to the attitude of the decision-maker.
Consider first the strategy of Section 5.2.1, that is, the T1-OWA aggregation operators associated with w₁, w₂ and w₃. Table 15 shows the respective results of aggregating the representations of the inputs by the first representation theorem when the OWA operator uses each vector of weights. Then Table 16 shows the respective solutions derived from this information. We can now compare these three outputs, also with the ranked soft set (R̃, E) shown in Table 12 (right).

Now we consider the strategy of Section 5.2.2, that is, the T2-OWA aggregation operators associated with the same vectors of weights w₁, w₂ and w₃. Table 17 shows the respective results of aggregating the representations of the inputs by the second representation theorem when we apply the OWA operator with each vector. Then Table 18 shows the respective solutions derived from this information. We can now compare these three outputs, also with the ranked soft set (R̄, E) shown in Table 14 (right).
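The vectors w₁, w₂ and w₃ are not reproduced in this excerpt; the sketch below instead uses three standard attitude vectors (all weight on the maximum, the plain mean, all weight on the minimum) to show how the same hypothetical cell aggregates differently under each attitude.

```python
def owa(values, weights):
    # Ordered weighted average: sort decreasingly, then weighted sum.
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

cell = [2, 1, 3]  # one cell of three hypothetical expert tables
attitudes = {
    "optimistic":  (1.0, 0.0, 0.0),   # all weight on the maximum
    "neutral":     (1/3, 1/3, 1/3),   # plain arithmetic mean
    "pessimistic": (0.0, 0.0, 1.0),   # all weight on the minimum
}
for name, w in attitudes.items():
    print(name, owa(cell, w))
```

Since different attitude vectors change the aggregate of every cell, they can also change the final per-attribute rankings, which is the point of the sensitivity analysis.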

6 | MAKING DECISIONS WITH A GROUP OF RANKED SOFT SETS
In this section the setting is the same as in Section 5.2, that is, a list of ranked soft sets (R₁, E), …, (R_r, E) over U. Now we wonder which alternative(s) from U should be chosen in view of this group information.
It is important to emphasize that the case r = 1 (i.e., the individual case) has been solved in Section 5.1 with the help of weighted scores.
Examples 13 and 14 illustrate the application of the two strategies for individual decision-making with ranked soft sets presented in Section 5.1.
Both strategies are adaptable because they use weights for the attributes.
So now we shall focus on the case of more than one ranked soft set (i.e., r > 1). We have shown in Section 4.2 that we can produce (F₁, E, N), …, (F_r, E, N), respective N-soft sets over U representing the ranked soft sets (R₁, E), …, (R_r, E). Both Theorem 1 and Theorem 2 serve this purpose. In this section these N-soft sets are represented by the notation in Table 10.
The WAOWA score of o_i ∈ U at {(F₁, T, N), …, (F_r, T, N)} is defined in terms of F_w, the OWA operator associated with w, whose definition is recalled in Equation (1).
Armed with these tools, we are ready to define a flexible strategy for the selection of the best alternative(s) from our list of ranked soft sets. It is expressed by Algorithm 1. This algorithm is a natural extension of our individual decision-making strategy: it resorts to one of the two representation theorems (just as the individual case uses T1- or T2-weighted-scores), but here we are enabled to use weights for both the experts and the attributes separately.
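The WAOWA formula itself is not reproduced in this excerpt. The sketch below is one plausible reading consistent with the description above, assuming the score applies an OWA (attitude weights) across the experts' grades for each attribute and then takes an attribute-weighted sum; the tables and both weight vectors are hypothetical.

```python
def owa(values, weights):
    # Ordered weighted average: sort decreasingly, then weighted sum.
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

def waowa_score(tables, obj, owa_w, attr_w):
    # Assumed form of the WAOWA score: an OWA (attitude weights owa_w)
    # across the experts' grades for each attribute, followed by an
    # attribute-weighted sum (attr_w).
    return sum(
        aw * owa([t[obj][k] for t in tables], owa_w)
        for k, aw in enumerate(attr_w)
    )

# Hypothetical expert tables over U = {x, y} with attributes (t, t′).
tables = [
    {"x": (0, 2), "y": (2, 1)},
    {"x": (1, 3), "y": (1, 2)},
    {"x": (0, 2), "y": (3, 2)},
]
owa_w = (0.5, 0.25, 0.25)  # attitude toward aggregation
attr_w = (1/3, 2/3)        # t′ twice as important as t

scores = {o: waowa_score(tables, o, owa_w, attr_w) for o in tables[0]}
best = max(scores, key=scores.get)
print(best)  # y
```

Selecting the alternative(s) with maximal score mirrors the final step of the group strategy; with a single expert table, the score collapses to a weighted score as in the individual case.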
As said above, in the case of individual decision-making (i.e., r = 1) we obtain a weighted score in Step 4. So Algorithm 1 is a legitimate extension of the individual decision-making strategy defined above in this section.
The following illustrative example shows the application of the steps described in Algorithm 1:

Problem 2. Consider the ranked soft sets (R₁, E), (R₂, E) and (R₃, E) in Problem 1. Now our problem consists of selecting the optimal alternative(s).

7 | CONCLUSIONS AND LINES FOR FUTURE RESEARCH